In order to understand this guest post, you need to know two things:
- Antifragile is an amazing book by Nassim Nicholas Taleb.
- George Kovacs is one of the most incredible airway teachers and thinkers I have met since hanging with Rich Levitan. He will soon be on the podcast; until then, enjoy his guest post.
Are the Principles of Antifragility Relevant to the Practice of Emergency Medicine? Issues of Error Mitigation, Competence and Patient Safety
by George Kovacs MD MHPE FRCPC
I’ve been reading the book Antifragile: Things That Gain from Disorder by Nassim Taleb. It was recommended to me by a close friend and colleague who has a much higher intellectual capacity and a better attention span than I do (1). The word antifragile didn’t exist before Taleb coined it. As he explains, it is best appreciated by first agreeing on a known, defined term: “fragile.” A fragile economy is one that doesn’t take risks, one that is very calculated and avoids error because of its assumed inevitable negative consequences. Resilience or toughness may describe the opposite of fragility, and in many contexts would be considered a positive system attribute. The term antifragile goes further, describing a system that appreciates the value of error, one that looks outside the box, is willing to take risks, and understands the consequences and great potential benefit of failure. An antifragile system doesn’t accept building a new tower that will merely withstand another airplane crash; it builds a tower that will withstand a much greater threat. In an antifragile system, error is a necessary ingredient for healthy growth, and we as individuals need to know when and how to embrace it.
This book is about systems and economies, but one can’t help thinking how we as individuals may benefit from being antifragile. Versions of the saying “from crisis comes opportunity” have been around for centuries, and while true much of the time, it depends on how close you are to the charged wire (2). Taleb discusses how individual suffering and error, while benefiting the collective, are never without measurable individual (micro) consequences. As we know in medicine, population benefits don’t necessarily translate to the individual. So how does this concept of antifragility apply to the individual? How does it relate to what we observe in our Emergency Medicine environment?
I think most of us have observed a change in clinician decision-making, in which learners and practitioners are increasingly uncomfortable with uncertainty. We are more and more risk averse, whether from fear of litigation, increased public expectations, or, more likely, a combination of the above (plus other complex factors). Unfortunately this is eroding our ability to be critical thinkers and risk takers, necessary and inherent traits of a good emergency physician and responsible system gatekeeper. Our increasing reliance on technology is nowhere better exemplified than by the FAA report claiming that pilot skills are being lost because of an over-dependence on automation technologies (3).
In the world of Emergency Medicine, arguably of equal or greater stakes, critical skills are being lost too. It has become almost unacceptable for a patient with right lower quadrant pain to leave the emergency department without imaging. Many would consider a return-visit diagnosis of appendicitis without a CT at the initial presentation an assessment failure. Unfortunately we, like our pilot counterparts, are very likely witnessing an unintended decay of our assessment skills (and a few procedural skills, e.g. direct laryngoscopy) as we increasingly depend on technology. This new standard of care is in part a result of our error-intolerant collective and our quest for certainty. It also reflects the need for diagnostic immediacy, an approach that may not be in any given patient’s best interest (4).
The chase for answers may make the patient happy when they leave our care, but at the same time it may lead to iatrogenic morbidity and premature demise. In fact, there are data suggesting patient satisfaction may be inversely related to outcome (5). Are we fostering the creation of the fragile clinician? Are we creating a clinician who hasn’t learned to take risks, or who hasn’t experienced enough error to grow and become appropriately confident and competent? Much of our clinical decision-making educational effort is appropriately directed at teaching how to avoid error. Can we teach antifragility?
We know that volume influences confidence and competence. Regardless of what some educators might say, numbers do matter (6,7). The importance of volume, however, is often misinterpreted. From a “fragile” perspective it means that repeated exposure will eventually make us better and help produce the intended outcome of competency. The more important lesson, however, is the antifragile one: volume increases our exposure to negative outcomes, and these experiences are at least as important as, if not more important than, volume alone. As Twain said, “Good judgment is the result of experience, and experience the result of bad judgment.”
Taleb again describes the difference between resilience and antifragility by the scope of tolerance: resilience is a state of “safe” competence that can manage a defined, measurable negative influence, whereas the antifragile system must withstand much greater stressors to prove successful in a growing economic ecosystem. Similarly, when learning a skill, do we want our learners to become just competent, measured as minimally safe? There is reasonable evidence that one of the most effective means of maintaining competence is to “overlearn,” to push competence beyond “just enough,” a tactic that positively affects the slope of skill decay (8). Maintenance of competence is not enough for the antifragile, just as neutral growth is unacceptable in the market. The challenge is to deliberately seek out stressors throughout our careers, as complacency will lead to incompetence.
Simulation has stepped up as a potential partial solution to these challenges by providing “permission to fail.” However, antifragile simulation requires failure to have real consequence. As Mike Tyson said, “We all got plans, ’til we get punched in the mouth.” Simulation is about content and context. The content includes the educational material and equipment used in providing simulation. The context is the decision environment where we put our thoughts and actions on the line. Too often simulation is about “stuff” and technology, where high fidelity is treated as synonymous with expensive equipment and elaborate centres. Simulation should simulate real life, full of error with consequence. We can’t stay out of the ring because something bad may happen.
The fragility train is likely unstoppable. The growth in medical technology we are witnessing will not go away. The lawyers will not retreat, and, justifiably, the population will not reduce its expectations. We, however, have to appreciate that this path is not without consequence. Although we may feel better teaching and understanding error in the hope that it will improve patient safety and outcomes, as stated above, that alone is not enough.
Finally, we need to recognize that we cannot all be antifragile. It has been well recognized that too much consequence, for certain individuals and systems, can and will cause breakdown and failure in and of itself (9). This doesn’t mean the solution is to reduce stress and create more placid environments. It means that we should recognize the benefits of antifragility and the risks of fragility as we strive to provide better patient care. We have all, at some point, been humbled by failure. It is time to recognize that “to err is human” and perhaps both necessary and antifragile.
Random thoughts on a Tuesday night
1. Taleb, Nassim. Antifragile: Things That Gain From Disorder. New York: Random House, 2012.
2. “A crisis is an opportunity riding the dangerous wind.” Traditional Chinese proverb, author unknown.
3. Federal Aviation Administration. Fact Sheet – Report on the Operational Use of Flight Path Management Systems. November 21, 2013. (http://www.faa.gov/news/fact_sheets/news_story.cfm?newsId=15434)
4. Berdahl CT, Vermeulen MJ, Larson DB, Schull MJ. Emergency department computed tomography utilization in the United States and Canada. Ann Emerg Med. 2013 Nov;62(5):486-494.e3.
5. Fenton JJ, Jerant AF, Bertakis KD, Franks P. The cost of satisfaction: a national study of patient satisfaction, health care utilization, expenditures, and mortality. Arch Intern Med. 2012 Mar 12;172(5):405-11.
6. Ericsson KA, Nandagopal K, Roring RW. Toward a science of exceptional achievement: attaining superior performance through deliberate practice. Ann N Y Acad Sci. 2009 Aug;1172:199-217.
7. Gladwell, M. (2008). The 10,000 Hour Rule. In Gladwell M, (Ed), Outliers: The Story of Success (pp. 35-68). New York: Little, Brown and Company.
8. Pusic MV, Kessler D, Szyld D, Kalet A, Pecaric M, Boutis K. Experience curves as an organizing framework for deliberate practice in emergency medicine learning. Acad Emerg Med. 2012 Dec;19(12):1476-80.
9. Croskerry P, Law JA, Kovacs G. Human Factors in Airway Management. In: Kovacs G, Law JA (eds.) Airway Management in Emergencies. Second edition. 2011 People’s Medical Publishing House-USA.