
June 14, 2023

Part 3: AI & Risk to Empathy, Compassion and Trust in Healthcare

Impact of Artificial Intelligence (AI) on Patient-Centered Care

Written by Joanne Byron, BS, LPN, CCA, CHA, CHCO, CHBS, CHCM, CIFHA, CMDP, COCAS, CORCM, OHCC, ICDCT-CM/PCS

This article follows Part 1 – Basics of Artificial Intelligence (AI) and Healthcare Compliance, published by AIHC on June 6, 2023, and Part 2 – Who Regulates Healthcare AI? AI is advancing rapidly, so we encourage you to reference the new Artificial Intelligence article category for the latest published by AIHC. The COVID pandemic proved that patients like telehealth and other alternatives to face-to-face encounters for routine health needs. But can AI replace the empathy, compassion and trust instilled during a personal encounter with a health care professional? This article explores recent research on AI and patient-centered outcomes.

As AI is integrated into patient care and into medical coding, billing and accounts receivable management, will we risk losing the human connection of empathy and compassion that builds trust in the physician-patient relationship? I embarked on a short research project to explore this topic and share my findings with AIHC members and affiliates.

Empathy, compassion and trust are fundamental values of a patient-centered, relational model of health care. As AI advances the delivery of health care and improves the diagnosis and treatment of our patients, is this promising technology providing greater efficiency and more free time for health-care professionals to focus on the human side of care, including fostering trust relationships and engaging with patients with empathy and compassion? Or is it simply freeing time to see more patients and increase the revenue stream?

A June 2023 abstract was posted to the National Institutes of Health (NIH) National Library of Medicine, entitled “Artificial Intelligence in Health: Enhancing a Return to Patient-Centered Communication.” The authors emphasize concerns around AI in the delivery of health care related to ethics, privacy, data representation, and the potential elimination of physicians.

The article states “However, AI cannot replicate a physician’s knowledge and understanding of the patient as a person and the conditions in which he or she lives. Therefore, provider-patient communication will be paramount in providing safe and effective health care.”

In April 2023, the NIH posted “The impact of artificial intelligence on the person-centered, doctor-patient relationship: some problems and solutions.” The authors agree that AI can free up time for doctors and facilitate person-centered doctor-patient relationships. However, “… there is very little concrete evidence on their impact on the doctor-patient relationship or on how to ensure that they are implemented in a way which is beneficial for person-centered care.”

  • Patient-centered outcomes research (PCOR) compares the impact of two or more preventive, diagnostic, treatment, or health care delivery approaches on health outcomes, including those that are meaningful to patients.

Given the importance of empathy and compassion in the practice of person-centered care, the authors conducted a literature review and found that, in addition to empathy and compassion, shared decision-making and trust relationships emerged as key values.

Using AI tools can have a positive impact on person-centered doctor-patient relationships, according to the article, when:

  1. using AI tools in an assistive role; and
  2. adapting medical education.

“Artificial intelligence and the doctor-patient relationship: expanding the paradigm of shared decision making” is a June 2023 NIH article emphasizing how AI-based clinical decision support systems (CDSS) are rapidly becoming more prevalent in healthcare, playing an important role in diagnostic and treatment processes. For this reason, AI-based CDSS have an impact on the doctor-patient relationship, shaping decisions with their suggestions.

The article posits that we may be on the verge of a paradigm shift, in which the doctor-patient relationship is no longer a dyad but a triad. AI implementations may instead foster the inappropriate paradigm of paternalism. Understanding how AI relates to doctors and influences doctor-patient communication is essential to promoting more ethical medical practice. Both doctors’ and patients’ autonomy need to be considered in light of AI.

A successful AI case related to patient-centered outcomes was located on HealthIT.gov, the website for the Office of the National Coordinator for Health Information Technology (ONC). ONC completed a project in September 2021, “Training Data for Machine Learning to Enhance Patient-Centered Outcomes Research Data Infrastructure.”

Through this project, ONC, in partnership with NIH and the National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK), advanced the application of AI/ML in patient-centered outcomes research (PCOR) by generating high-quality training datasets for a chronic kidney disease (CKD) use case: predicting mortality within the first 90 days of dialysis. This case was selected because mortality in the first 90 days of dialysis initiation in ESKD/ESRD patients remains notably high, and the use case included joint clinician-patient informed decision making. PCOR researchers can build on the foundational work completed through this project, extend these methods to a wider array of use cases, and advance the application of ML to enhance PCOR infrastructure.
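To make the use case concrete for non-technical readers, the sketch below shows, in very simplified form, what a 90-day mortality risk estimate from a trained model can look like in practice. The feature names, weights, and threshold here are illustrative assumptions invented for this example; they are not the actual ONC/NIDDK model, which was built from curated clinical training datasets.

```python
# Illustrative sketch only: a toy risk score for 90-day dialysis mortality.
# All features and weights below are hypothetical, NOT from the ONC/NIDDK
# project. A real model would be trained on curated clinical datasets.

def mortality_risk_score(age: int, albumin_g_dl: float,
                         prior_hospitalizations: int) -> float:
    """Return a crude risk score between 0.0 and 1.0 (illustrative only)."""
    score = 0.0
    if age >= 75:
        score += 0.4                       # advanced age raises risk
    if albumin_g_dl < 3.5:
        score += 0.3                       # low serum albumin raises risk
    score += 0.1 * min(prior_hospitalizations, 3)  # capped contribution
    return min(score, 1.0)

def flag_for_shared_decision_making(score: float, threshold: float = 0.5) -> bool:
    """A high score would prompt a clinician-patient conversation, not an
    automated decision -- keeping the clinician in the loop."""
    return score >= threshold
```

The design point, consistent with the article's theme, is that the model output feeds a joint clinician-patient conversation rather than replacing it: the flag triggers a discussion, not a treatment decision.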

Conclusion

Working in health care requires adapting to constant change, as technology and software advancements force not only providers, but also IT professionals and health care administrators, to stay ahead of what is coming.

It is important to be fiscally responsible; however, do we want to live in a world where we can only speak to a machine about questions on our medical bills or concerns regarding a treatment plan? Where is the humanity in that?

We must move forward with integrating AI into our lives. But as we do, it is important to re-evaluate whether and how empathy, compassion and trust can be incorporated and practiced within a health-care system where artificial intelligence is increasingly used. Most importantly, society needs to re-examine what kind of health care it ought to promote.

AIHC will continue to post articles related to artificial intelligence with regard to healthcare compliance. Click Here for additional articles on various HIPAA topics. Click Here for articles relating to Artificial Intelligence. Visit the AIHC Certifications page for online compliance learning opportunities.
