Thinking About Thinking: Discussing Bias Within Audiology

As audiologists, we must be able to quickly obtain results and synthesize relevant clinical information to care for our patients properly. Reasoning through available information is an important part of clinical decision making. How we reach our decisions can affect our test procedures, the patient's diagnosis, the shape of their treatment plan, referrals to ENT physicians or other allied healthcare professionals, and how we use the resources available to us. In our day-to-day work we are constantly bombarded by information in our environment. Sometimes we can sit down and deliberate over each piece of information available to us; other times we make quick decisions and adapt on the fly, taking the snapshot of information in front of us and applying our previous experience. This problem-solving strategy, called a heuristic, is what often allows us to reach a quick judgment. While heuristics help us navigate the myriad choices we face each day, they are not always correct. Relying on shortcuts can lead to poor judgment and, in some cases, bias our decision making.

What does this mean for us as hearing healthcare professionals? If we as audiologists don't challenge the heuristics that guide our decision making, our cognitive biases may lead to ineffective care and poorer health outcomes for our patients. Many types of cognitive bias have been well studied in psychological research, and it is important that we make ourselves aware of these potential pitfalls so that we can become better clinicians. Below is a table adapted from Shlonsky et al. (2019) listing a few biases commonly documented in the medical field.

Cognitive Bias: Definition

  • Affective: Tendency for a decision to be influenced by affective states (e.g., emotions, feelings, etc.)
  • Anchoring: Relying excessively on an initial piece of information (i.e., the "anchor") to make subsequent judgments during a decision-making process
  • Ascertainment: Shaping a decision or judgment based on a prior expectation
  • Attentional: Tendency for perceptions or judgments to be influenced by a person's recurring thoughts at that time
  • Availability: Making judgments about the likelihood of an event based on how readily examples come to mind
  • Confirmation: Tendency to search for, interpret, favor, and/or recall information that confirms an existing belief or hypothesis
  • Framing Effect: Being disproportionately influenced by how a problem is described (e.g., with positive or negative semantics)
  • Overconfidence: Tendency to think one knows more than one does, especially when placing faith in opinions without gathering necessary supporting evidence


Preventable medical errors are estimated to cost the US healthcare system $20 billion annually. Misdiagnosis in medicine has been explored in critically acclaimed texts such as To Err Is Human: Building a Safer Health System and How Doctors Think, and medical errors have even made it to the big screen in documentaries such as HBO's Bleed Out (2018). Yet in a recent analysis in the Journal of the American Academy of Audiology (JAAA), Galvin et al. (2019) found no peer-reviewed studies examining strategies to reduce bias in the audiology clinic. Best practices continue to be a hot-button issue within audiology and are a topic often explored within our Academy: one that is heavily researched, was likely emphasized throughout your educational coursework, is increasingly prevalent on social media, and is one our patients are increasingly aware of as they walk through the doors of our clinics. While research on this topic continues, what can we do in our day-to-day practice to reduce cognitive bias and ensure we are sticking to best practices?

How can we start to address these issues as they pertain to the work that we do? First, we should do our best to listen to the opinions of our coworkers and patients. Our intuitions are very useful, but not always correct. Reminding ourselves of the importance of being open-minded, and accepting that we are not always right, is perhaps the first step in talking about the biases that may be influencing our work. In a recent column in Signal & Noise by Brian Taylor,1 the following suggestions were made to help check our automatic thinking and avoid blind spots in routine clinical testing:

  1. Collecting a thorough case history and performing a comprehensive audiological assessment
  2. Using checklists to ensure that every step of the evaluation has been conducted
  3. Conducting all tests in a manner that reflects current best practices
  4. Taking a diagnostic “time out” to review your work and ensure you are not missing details
  5. Sharing all findings with referring physicians, including when findings indicate a possible nonbenign condition
  6. Conducting meetings or grand rounds with staff to discuss challenging or unusual cases
  7. Using Big Data to build your own set of normative data for each test you conduct

More research is needed on how bias can affect other aspects of our scope of practice, such as hearing aid dispensing, vestibular assessment, and other specialized testing within our profession. Take a moment to think about your own clinical practice and how you make quick decisions. Sometimes it is good for all of us to pause, consider how we think, both on our own and with others, and walk through how we reach our clinical impressions.

Works Cited

  • Taylor B. Avoiding clinical blind spots with good audiology. Hearing Health & Technology Matters Web site. https://hearinghealthmatters.org/hearingeconomics/2017/audiology-clinical-blind-spots/. Updated 2017.
  • Groopman J. How doctors think. First Mariner Books; 2008.
  • Institute of Medicine. To Err Is Human: Building a Safer Health System. Washington, DC: National Academies Press; 2000.
  • Shlonsky A, Featherston R, Galvin K, et al. Interventions to mitigate cognitive biases in the decision making of eye care professionals: A systematic review. Optometry and Vision Science. 2019;96(11):818-82. https://search.proquest.com/docview/2310675514. doi: 10.1097/OPX.0000000000001445.
  • Galvin KL, Featherston RJ, Downie LE, et al. A systematic review of interventions to reduce the effects of cognitive biases in the decision-making of audiologists. Journal of the American Academy of Audiology. 2019. https://www.ncbi.nlm.nih.gov/pubmed/31287054. doi: 10.3766/jaaa18096.

Additional Resources & Further Reading

  • Take a quiz! There are plenty of websites out there that will try to pinpoint what may be steering your decisions. This one only takes a few minutes: https://www.ideastogo.com/articles-on-innovation/quiz-what-cognitive-bias-are-you
  • A quick YouTube infographic video on heuristics: https://www.youtube.com/watch?v=ReFqFPJHLhA
  • Hosford-Dunn H. Audiologists not immune to Dunning-Kruger effect. Hearing Health & Technology Matters Web site. https://hearinghealthmatters.org/hearingeconomics/2017/dunning-kruger-effect/. Updated 2017.
  • Saposnik G, Redelmeier D, Ruff CC, Tobler PN. Cognitive biases associated with medical decisions: A systematic review. BMC Medical Informatics and Decision Making. 2016;16(1):138. https://www.ncbi.nlm.nih.gov/pubmed/27809908. doi: 10.1186/s12911-016-0377-1.

 

Eric Bostwik, AuD, is currently an Instructor and Clinical Audiologist at Temple University Hospital. He earned both his Bachelor of Arts and his Doctor of Audiology degrees at the University of Wisconsin-Madison. His current clinical scope includes comprehensive diagnostic testing, vestibular evaluation, electrophysiological testing, and auditory rehabilitation (hearing aids and implantable devices).