
Dancing with uncertainties in the era of artificial intelligence


In this commentary, a medical student reflects on the promise of artificial intelligence (AI) in mitigating physician burnout and moral injury. The rapid introduction of AI technologies may challenge medical professionals, especially those engaged in the transdisciplinary care of children with disabilities.

Embarking on my medical education felt like entering an ominous void; the unfamiliarity was pervasive, especially as a first-generation college student and Salvadoran American who started my career as a dance artist, performing and teaching primarily in the Houston area. At night, I worked as a scribe in the emergency department. This sneak peek into healthcare provided me with valuable knowledge and insight, but it also put me at the forefront of the struggles that healthcare workers face, particularly burnout and moral injury. Even before I applied to medical school, several doctors and nurses warned me about pursuing a career in medicine. Although uncertainty presented itself at every point of my journey, I continued to navigate my path. Moreover, the culmination of these uncertainties during and after the COVID-19 pandemic has led to more challenges in healthcare that we collectively need to solve.

One such problem is burnout. More than half of healthcare workers experience burnout, a rate significantly higher than in other sectors. As a public health crisis, burnout has also disproportionately impacted workers of color and contributes to healthcare disparities. Burnout and worker turnover were estimated to cost $4.6 billion annually pre-COVID, and that number continues to rise [1]. Medical errors are associated with physician burnout, and the rate of suicide is 5–7 times higher among physicians than in the general population [2]. Institutional, policy, and research efforts to combat this problem remain suboptimal.

New applications of emerging technologies like artificial intelligence (AI) may be part of a multifaceted solution. For example, AI could be leveraged to identify high-risk workers through individual screening and prediction models [3, 4]. It could then be used to create and administer personalized tools for a more comprehensive, individualized approach to improving outcomes.

AI has already shown great utility in medicine and has been applied in several clinical settings. Current uses include the identification of cardiac dysrhythmias such as atrial fibrillation, glycemic monitoring, seizure detection, and medical imaging, with additional promise in research settings [5]. Despite the progress that has been made, much still needs to be explored, tested, and validated. Some studies in this domain lack reproducibility and generalizability [6]. Although AI may bring efficiency and speed, certain populations may be at a disadvantage, potentially excluded from its benefits. As someone with a dual interest in pediatrics and physical medicine and rehabilitation, I cannot help but think about how these populations will be affected.

In recent years, both the Centers for Disease Control and the United States Census have noted an increase in the prevalence of children with disabilities [7, 8]. Moreover, the population of children with medical complexity is growing at a rate of 5% per year in the United States [9]. These unique patients highlight the importance and value of personalized care. Predictive models continue to improve each year. One such model has shown promise in the retrospective identification of pediatric patients with complex health needs who are at risk for future hospitalizations. However, as patients require increasingly intricate care, the models may not properly adapt [10]. Algorithmic medicine, when applied to particular patient populations, may not fit previously set parameters, perhaps introducing yet another detriment to our complex healthcare system. Uncertainty about unfamiliar technologies in practice and research, along with the challenges they will bring, remains. Nonetheless, trainees like me look forward to learning and engaging in the solution as we advance through our medical careers.

Although the future continues to unveil and shroud itself in more mystery each day, we must be ready to adapt to the dynamic scene and forge ahead. John Allen Paulos, an American mathematician, once said, “Uncertainty is the only certainty there is, and knowing how to live with insecurity is the only security.” Dancers often embrace uncertainty, finding comfort in it, especially when it comes to improvisation. We lead, and sometimes we follow. We act and we react; our movements remain purposeful. For me, I will continue with our lovely, yet fickle dance partner and see what comes of it.

Conflict of interest

The authors have no conflicts of interest to disclose.


Acknowledgments

I would like to thank the NYU Langone Health Clinical Translational Science Institute staff, with special thanks to my mentors (co-authors) of this article for their guidance throughout my training there and beyond. Lastly, thank you to the Robert A. Winn Clinical Investigator Pathway Program for this opportunity as I prepare for a gap year with the Fulbright U.S. Student Program to study dance science in the United Kingdom prior to returning to apply to combined Pediatrics–Physical Medicine and Rehabilitation residency programs.



References

[1] Levine D. U.S. Faces Crisis of Burned-Out Health Care Workers. US News & World Report; 2021.

[2] Ventriglio A, Watson C, Bhugra D. Suicide among doctors: A narrative review. Indian J Psychiatry. 2020;62(2):114–120. doi: 10.4103/psychiatry.IndianJPsychiatry_767_19

[3] Dwyer DB, Falkai P, Koutsouleris N. Machine Learning Approaches for Clinical Psychology and Psychiatry. Annu Rev Clin Psychol. 2018;14:91–118. doi: 10.1146/annurev-clinpsy-032816-045037

[4] D'Hotman D, Loh E. AI enabled suicide prediction tools: a qualitative narrative review. BMJ Health Care Inform. 2020;27(3):e100175. doi: 10.1136/bmjhci-2020-100175

[5] Briganti G, Le Moine O. Artificial Intelligence in Medicine: Today and Tomorrow. Front Med (Lausanne). 2020;7:27. doi: 10.3389/fmed.2020.00027

[6] Haibe-Kains B, Adam GA, Hosny A, et al. Transparency and reproducibility in artificial intelligence. Nature. 2020;586(7829):E14–E16. doi: 10.1038/s41586-020-2766-y

[7] Zablotsky B, Black LI, Maenner MJ, et al. Prevalence and Trends of Developmental Disabilities among Children in the United States: 2009–2017. Pediatrics. 2019;144(4):e20190811. doi: 10.1542/peds.2019-0811

[8] Young NAE. Childhood Disability in the United States: 2019. United States Census Bureau; 2021.

[9] Agostiniani R, Nanni L, Langiano T. Children with medical complexity: the change in the pediatric epidemiology. Journal of Pediatric and Neonatal Individualized Medicine. 2014;3(2):e030230. doi: 10.7363/030230

[10] Ming DY, Zhao C, Tang X, et al. Predictive Modeling to Identify Children With Complex Health Needs At Risk for Hospitalization. Hosp Pediatr. 2023;13(5):357–369. doi: 10.1542/hpeds.2022-006861