Liz Holliday

Communications Specialist
Online Work

Combining Two Loves: Technology and Teaching at Columbia Nursing’s New State-of-the-Art Fuld Simulation Center


Dr. Kellie Bryant recently joined Columbia University School of Nursing and is responsible for day-to-day operations of the new state-of-the-art Helene Fuld Health Trust Simulation Center. The center will occupy two floors in Columbia Nursing’s new building and is expected to open to students in the fall of 2017. The new 16,000-square-foot space is more than quadruple the size of the school’s current simulation laboratory.

In her new role, Bryant is responsible for the operation of the center, including developing simulation activities, integrating simulation into the curriculum, training faculty, teaching simulation sessions, and evaluating programs. The role is tailor-made for this self-described “techie” who also has a love of teaching. Her entry into simulated learning was somewhat by chance, but Bryant says she has never looked back since that day in 2008. She was immediately intrigued by how simulated learning could not only supplement but enhance the student learning experience.

Bryant received her BS in nursing from Stony Brook University, completed the Advanced Practice Nursing Program as a Perinatal/Women’s Health Nurse Practitioner at SUNY Stony Brook, and obtained her doctorate in nursing practice from Case Western Reserve University.

What is simulation?

Simulation in general is a strategy to replicate a real-life situation. At Columbia Nursing, we are using simulation to replicate the clinical environment so students can practice the skills needed to become a health care provider. We are replicating everything from the hospital and outpatient rooms to the role of the patient, using high-tech Human Patient Simulators (HPS), called manikins. In some cases we’re also using standardized patients, real people who come in and play the role of a patient or a patient’s family member. The learning activity involves students caring for these patients, who have various medical conditions, using the knowledge and skills they have learned in the classroom and clinical setting. Our goal is to allow them to enhance their skills in a safe learning environment. That way, students and new graduates will have an easier transition into clinical practice and their new roles.

How will the simulation center be used at Columbia Nursing?

The simulation center is one of the largest in this area, particularly for a nursing school. When a student walks into one of the simulation rooms in the new Helene Fuld Health Trust Simulation Center, it will resemble a real patient care environment. This includes rooms that meet the needs of all our different programs. We will have an operating room, labor and delivery suite, outpatient exam rooms, standard hospital rooms as well as rooms that are flexible for all different types of classes.

We’ll replicate various patient scenarios that students will encounter in the clinical setting. Students will have to care for their “patient” by obtaining their medical history, performing an assessment and administering medications, going all the way through to implementing a treatment plan. Students will be required to think critically and decide how to prioritize care for their patient. If the blood pressure is low, why is it low? Could the patient be hemorrhaging? What should I do next? In addition, simulation helps students to work on their communication skills such as how they interact with other members of the health care team and with family members. We replicate all of this in the simulation center. The more opportunities for students to practice these skills, the more confident and prepared they will feel in the clinical setting.

Our simulation program will also be interprofessional, so our students will not be the only ones using it. We look forward to running simulations with other disciplines, such as medical students, nurses in the hospital setting, physical therapists, and occupational therapists.

Is there one part of the simulation center you are most excited about?

It is hard to say what my favorite is because I am excited about the entire simulation center! My specialty is women’s health so I guess I would say the labor and delivery suite will probably be my favorite. It’s fascinating to me that there is a manikin that can actually give birth.

What is it about simulation you are so passionate about?

I have always been a techie, and I have always had to have the newest gadgets. I got into simulation through a former colleague who had a small grant to start a simulation program. The school had three manikins that had been sitting in a box for a while, and the grant enabled us to develop a simulation curriculum. I started working there one day a week and fell in love with simulated learning. The experience made me realize that simulation was a great way for students to learn. Eventually a full-time position opened up, and that is how I got my first job in simulation. When I was in school, we learned a lot of skills for the first time on real patients. You can’t make mistakes there. If you were about to do something wrong, the instructors had to stop you because you were going to cause the patient harm. In simulation, students can learn from their mistakes in a safe environment. If they do something wrong, we can discuss a plan for improving their performance and give students an opportunity to practice until they get it right.

Is there anything else you want people to know about the simulation center?

I can’t wait for the center to open! We will be a simulation center of excellence and the place where other nurse educators will want to learn about simulation. We want to be the best learning environment for our students and our other stakeholders and the community. As part of NewYork-Presbyterian/Columbia University Medical Center (CUMC), our simulation center is affiliated with one of the best hospital systems in the country. And being part of CUMC provides a great opportunity for our students to learn about, from, and with other health professionals to improve health outcomes. I think that is an important piece of the center. Oh, and research! I am very excited about collaborating with my colleagues to explore ways to incorporate the simulation center into the incredible research that goes on at Columbia Nursing.


This is an article I wrote for the London publication ‘The Periscope Post’ on January 13th 2011:

Women must battle ageism to remain on TV after 40. Photo credit: ST33VO

Women have been trying to find their place in the typically male-dominated field of television broadcasting since the early 1970s. Thanks to pioneers of the business like Barbara Walters, who in 1976 became the first woman to co-anchor a network evening newscast, women have proven themselves worthy in a world full of men. And now that the glass ceiling has been broken, women are asserting their presence in the field more than ever.

However, as the recent Miriam O’Reilly case shows us, it isn’t all sunshine and roses for women in television broadcasting. The double standard of youth and beauty poses a very real threat to veteran women in the field.

O’Reilly, 53, was dropped from her BBC show, “Countryfile”, when it moved to prime time in April 2009. Although her sex discrimination case against her former employers failed, she brought an ageism suit against the BBC; it took two years, but an employment tribunal finally agreed with O’Reilly this week that the BBC unfairly dismissed her. Now, the BBC claims it “would like to discuss working with her in the future.” The fact that the announcement came only after O’Reilly took the BBC to court seems to tarnish the sincerity of the statement.

This is far from the first case of an older woman being replaced with a younger model to appease viewership. The most famous ageism case goes back to the early 1980s: long before O’Reilly won her battle against the BBC, U.S. journalist Christine Craft brought a similar suit after she was demoted from her anchoring position in 1981, only nine months in, following poor audience ratings of her appearance and demeanor. Craft claimed the reason for her demotion was that she was “too old, too unattractive, and not deferential enough to men”.

Some say that the aesthetic demands of television are blind to gender; however, this is simply not the case. While both sexes must adhere to the demands of looking good on camera, aging does not affect men in the same way as women. Last year, Britain’s Skillset revealed that “75 percent of men in TV are aged 35 or over compared to just 52 percent of women.” According to The Guardian, Skillset also found that “only one in 10 women working in television is over 50 – half are under 35.” So while it is perfectly acceptable for men to remain on air well after their hair has turned gray, women are dropped at the first sign of a wrinkle.

Women seem to get stuck serving a figurehead role on television, painted and preened to the pleasure of the onlooker. If this weren’t the case, then why would it matter if a 53-year-old were the face of a prime time slot? Even images of CBS’s Katie Couric were altered in promotional ads to make her appearance more youthful back in 2006. At the time, The New York Times reported, “As part of a cover story in its promotional magazine Watch, a picture of Ms. Couric taken at that event has been altered to give her a noticeably slimmer physique and fewer facial lines.” Does NBC’s Brian Williams have to put up with demands like that? Probably not. It seems that while older men get to be “silver foxes” or distinguished, older women are described as haggard or dowdy.

This is not to say that seasoned women over 40 do not exist on television, because they most certainly do (Oprah and Barbara Walters can attest to that). However, studies have shown that they appear less often than their younger female counterparts. This suggests that the seasoned women of broadcasting are competing not only with men for airtime, but with younger, fitter women as well.

If Oprah and Barbara can maintain viewership over the years, why don’t networks trust other women to do the same? In a “post-feminist” society where women and men are supposed to be on a level playing field, it is a shame that networks are pitting women against each other.

It can only be hoped that the BBC will learn from the mistake of firing O’Reilly. That would be the first step in appreciating, and accepting, seasoned female presenters. Ageism cases such as this have been going on for far too long, and frankly take us back to the days when women were to be seen and not heard. As it stands, women in broadcasting can expect a career as short as a professional athlete’s: a woman’s age stands in for athletic ability, and when the wrinkles start to show, it’s off to the bench, with newer, “fitter” women taking the field.

This is not to say that being a presenter should come with a lifetime guarantee; television broadcasting, after all, is not the U.S. Supreme Court (whose members cannot be fired, only impeached). However, until age starts to affect the quality of the broadcasting being transmitted, women should not have to worry about being fired over crow’s feet or a few gray hairs.


The following is an article I wrote for my former website ‘Media Flair’ in February 2011:

“Microblogging” has skyrocketed since Twitter first emerged onto the scene in 2006. It has become ingrained in our social landscape, now with millions of users worldwide. Twitter can be great for vents, rants, and links (not to mention celebrity dirt); but do tweets qualify as journalism? First off, here’s what the experts have to say…

According to Rory O’Connor of The Huffington Post, Twitter has broken news stories before the mainstream press. He cites the 2009 US Airways plane crash in the Hudson River as a prime example. It was a Twitter user, not the press, who broke the story first, despite the crash happening within sight of major international news services. Janis Krums’ tweet of a “plane in the Hudson” (complete with a TwitPic taken from his iPhone) gained nearly 40,000 views within four hours of posting. The rapid traffic actually caused the site to crash (no pun intended).

O’Connor spoke to Twitter co-founder Biz Stone, who said the news value of Twitter was realized early: “…things like earthquakes led to Twitter updates. The first Twitter report of the ground shaking during recent tremors in California, for example, came nine minutes before the first Associated Press alert”.

O’Connor suggests Twitter is best suited to citizen journalists, those who happen to be on the scene when news breaks. But I have to wonder: for those of us who are already journalists, is Twitter really the best outlet? O’Connor’s interview with Biz Stone also mentioned the news networks’ adoption of Twitter almost at its onset. Now every story published by, say, The Guardian is tweeted as soon as it goes live.

“any journalist could benefit from an internship at Twitter, […] it’s where they’ll learn how to compete for the future.”

Brian Solis, a writer for BusinessWeek’s blog, likens Twitter’s capabilities to CNN’s 24-hour news network. According to Solis, “news no longer breaks; it tweets.” Some 200 million people learn about breaking events as they happen, triggering a network effect that demonstrates the reach and velocity of what Solis calls social physics. Twitter is something that now “rivals traditional newswires.” So much so, that Solis said “any journalist could benefit from an internship at Twitter, […] it’s where they’ll learn how to compete for the future.”

But is Twitter really the future of journalism? Can Solis really expect a trainee journalist to aspire to intern at Twitter and not, say, ABC or The New York Times? Not everyone agrees that Twitter is journalism. Michael De Monte, a writer for ScribbleLive as well as BusinessWeek’s blog, says live tweeting has its limitations. Questions “can’t be answered in 140-character chunks”, and in-depth live coverage is best left to live blogging, not to Twitter. He notes that “twitter works nicely for providing links to existing stories”, but it is not ideal. As De Monte puts it, “Twitter’s limitations make it a poor medium for news coverage. How much of a story can you tell in 140 characters?”

“Twitter’s limitations make it a poor medium for news coverage. How much of a story can you tell in 140 characters?”

On that point, I tend to agree with De Monte. Is Twitter journalism? Debatable. Is it a great way to promote journalism and gain a wider readership? Absolutely. In 140 characters it is hard to convey anything but the most superficial aspects of a story. However, Twitter’s ability to give (essentially) a headline for a story and a link to the longer version is where the journalism really comes in. I follow multiple news agencies and read stories from them throughout the day. The ability to retweet also allows users not only to gain information, but to share it with their followers as well.

One thing these articles overlooked is the importance of networking on Twitter. It allows a journalist to follow colleagues around the world. Not only is this important for staying up to date, but it can also help in finding details and contacts for future stories. When I hit a dead end on a story, I often tweet about it and ask my journalist followers to offer suggestions or contacts.

Twitter is more of a right-hand man to journalism than a standalone journalistic medium. To offer an analogy, Twitter is the Robin to journalism’s Batman: Twitter enhances, links, spreads, and helps journalism, but on its own it doesn’t give enough depth to do the job correctly. As for Brian Solis’ claim that Twitter is where journalists will “learn how to compete for the future”, only time will tell.

Follow Liz Holliday on Twitter: @presentliz


The following is an article I wrote for my former website back in January 2011:

Twitter is a popular social networking tool, but can it land you a better job? Photo credit: David Saunders

Celebrities do it, companies do it, and so do your neighbor and roommate; with 145 million reported users in 2010, your mom probably does it too. Tweeting has become the most effective marketing tool since the ancient Egyptians first started using billboards. People can now sit at home (or wherever their smartphones have service) and market themselves, their product, and their company, all 140 characters at a time.

Many claim that, when used effectively, Twitter can be the key to your future. Numerous blogs, articles, and reports have been written claiming it will lead to success. But is the concept of jobs through Twitter a bit of a stretch? Can Twitter actually get you a job?

YES, Tweet and Ye Shall Receive:

Laura Raines of Atlanta Business News says that a few years ago she would have been skeptical of gaining jobs through social networking, but not anymore. Raines explains that the appeal of Twitter lies in its accessibility: “tweets are short (140 characters), to-the-point bits of data.” Moreover, she claims tweets can help you find jobs not advertised elsewhere. “Employers are using social media to find and check out talent,” Raines writes. And apparently Twitter will not only help you find jobs, but work as a “personal agent” as well. Raines says “it can amplify your personal brand, take your network global, give you access to cutting-edge knowledge and resources, and showcase your credibility and marketability.”

YES, But Keep It Fresh and Clean:

Not everyone agrees with Raines, however. Forbes blogger Susan Gunelius believes Twitter may actually prevent you from landing a job. Gunelius said “it is estimated that over half of hiring is based to some extent on social media research.” This means what you post online could prevent you from getting your foot in the door at your next job interview. She notes that content on social networks should be kept clean (grandmother-approved), kept calm (no angry rants), kept honest (cross your heart and hope to die), and kept quasi-censored (Twitter overshares might come back to haunt you). According to Gunelius, “most corporate human resources departments and hiring managers review job applicants’ online reputations and content before they are contacted for interviews.” So save yourself some grief and keep your dirt and dish for your diary.

Gunelius isn’t the only one who thinks a clean Twitter account could help you look good to potential employers. The Wall Street Journal’s Jonnelle Marte advises readers never to tweet “about anything you wouldn’t want your boss or mother to see, and tell your friends to keep their tweets to you appropriate.” Personal anecdotes are fine, according to Marte; however, they should be kept to a minimum.

NO, “Social Media Bible-Thumpers Are Nuts”:

Not everyone believes in the ‘jobs through Twitter’ revolution, especially during a recession. Social Media Explorer blogger Jason Falls likens social media fanatics to evangelical Christians. While Falls agrees social media may help project a certain personal image, most companies are not “inside the tech bubble or hip to social media” and would not even consider using Twitter over an old-fashioned interview. According to Falls, “the social media job security utopia doesn’t exist for most of the free world.” Moreover, he believes you should network, but not online. As he says, “the well-networked don’t apply for jobs, they go get them.” If you want a job, Falls suggests the old-fashioned methods: face-to-face interviews, endless phone calls, voice mails, and applications. Falls believes it is best to stick to the proven methods and avoid social network soliciting, because “executives that are making hiring and firing decisions probably think the social media bible-thumpers are nuts.”

NO, Jobs Come from Humans, Not Twitter:

Clue Wagon’s Kerry Scott agrees with Falls that Twitter will not magically get users jobs (after all, it isn’t a school guidance counselor). She does admit that it can be “a great tool for a job hunter,” though it doesn’t just happen overnight. Scott writes that those attempting to find work through Twitter usually “sit quietly, waiting for lightning to strike.” With this kind of ‘let it come to me’ mentality, Scott writes, “there is no chance that you are going to get a job through Twitter. NO CHANCE.” The only way to get a job is through “humans.” Scott believes you must put yourself out there, find the right people in your field to follow, and engage in dialogue. As Scott states, “the nice thing about Twitter is that the culture allows you to just…start talking.”

CONCLUSION, What We’ve Learned:

Thus, it appears Twitter will not hand you a job on a golden platter (although wouldn’t that be great?). If you want a job, it will take work and perseverance; however, social media doesn’t seem to hurt the process (assuming your Twitter account is kept clean). By following the right people, and social networking the crap out of yourself, Twitter has the potential to help you spot the jobs and get your foot in the door.

ONLINE WORK: Media Flair: ‘Pimp My Blog’

The following is a lecture series entitled ‘Pimp My Blog’ that I covered for my former website back in November 2010:

‘Pimp My Blog’ Lecture Series, City University London 2010

By now, journalists know the importance of a personal blog: it helps get a name out there, can potentially lead to a job, or can simply be a fulfilling pastime. However, how does one make the leap from personal blogger to famous blogger? A panel lecture entitled “PimpMyBlog” at City University London attempted to answer this question on Tuesday night.

The lecture series set out to teach journalism students at the university how to utilize social media (apps, widgets, etc.) to get their blogs, and their journalism, noticed. The event included lectures by a who’s who of the social media field. MediaFlair attended the lecture and has summarized the highlights in these four posts:

Tim Glanfield: Link to get noticed

Karl Schneider: Blog with passion

Patrick Smith: “Just do it”

Martin Stabe: How to pimp your online footprint