Optimising Artificial Intelligence Ultrasound

Anastasia Jones1,2, Ryan Tang2, Anahita Dabo‑Trubelja3, Cindy B. Yeoh1,2, Leshawn Richards1,2, Vijaya Gottumukkala4

1Department of Anesthesiology, Moffitt Cancer Center, Tampa, FL, 2Department of Oncologic Sciences, Morsani College of Medicine, University of South Florida, Tampa, FL, 3Department of Anesthesiology, Memorial Sloan Kettering Cancer Center, New York, NY, 4Department of Anesthesiology, MD Anderson Cancer Center, Houston, Texas, USA

Address for correspondence: Dr. Anastasia Jones, MD, 12902 Magnolia Drive, MCB‑Anest, Tampa, FL 33612. E‑mail: Anastasia.Jones@moffitt.org

Submitted: 31-May-2024; Revised: 20-Aug-2024; Accepted: 22-Aug-2024; Published: 26-Oct-2024

Website: https://journals.lww.com/ijaweb
DOI: 10.4103/ija.ija_578_24

ABSTRACT

Artificial intelligence (AI) was once considered avant‑garde. However, AI permeates every industry today, impacting work and home lives in many ways. While AI‑driven diagnostic and therapeutic applications already exist in medicine, a chasm remains between the potential of AI and its clinical applications. This article reviews the status of AI‑powered ultrasound (US) applications in anaesthesiology and perioperative medicine. A literature search was performed for studies examining AI applications in perioperative US. AI applications for echocardiography and regional anaesthesia are the most robust and well‑developed. While applications are available for lung imaging and vascular access, AI programs for airway and gastric US imaging have yet to become available. Legal and ethical challenges associated with AI applications need to be addressed and resolved over time. AI applications are beneficial in the context of education and training. While low‑resource settings may benefit from AI, the financial burden is a considerable limiting factor.
function [Figure 2].[11] Another helpful feature of Kosmos is the immediate feedback during image acquisition regarding probe placement, tilt, and rotation. Feedback on image optimisation helps the user become skilful over time and is valuable in settings where POCUS experts are not available for consultation.[12] In a single‑centre randomised study by Baum et al.,[13] novice users with POCUS devices equipped with AI functionality had a shorter apical 4‑chamber acquisition time and higher image‑quality scores.

Figure 2: Transthoracic echocardiogram parasternal long‑axis image obtained with Kosmos. AI‑powered labelling appropriately labels structures: right ventricle, left ventricular outflow tract, left ventricle, mitral valve, and left atrium. AO = aorta, AV = aortic valve, IVS = interventricular septum, LA = left atrium, LV = left ventricle, LVOT = left ventricular outflow tract, MV = mitral valve, RV = right ventricle

interpretation of prospective real‑time lung US from an average of 68.1% to 93.4% (P < 0.001).[15] In the wake of the coronavirus disease 2019 (COVID‑19) pandemic, another study used AI‑guided B‑line quantification to detect COVID‑19 pneumonia.[16] Kuroda et al. found that this AI function correlated well with computed tomography findings of pneumonia, with an accuracy of 94.5% for 12 zones (bilateral anterior, lateral, and posterior chest) and 83.9% for eight zones (bilateral anterior and lateral chest).
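To illustrate the kind of zone‑based comparison reported by Kuroda et al., the short sketch below computes per‑zone agreement between AI lung‑US reads and a CT reference under the 12‑zone and eight‑zone conventions. It is a toy example only: the zone labels and the `ai_reads`/`ct_reads` dictionaries are invented for illustration and do not reproduce the study's data or code.

```python
# Toy illustration of zone-wise agreement between AI lung-US reads and CT,
# using the 12-zone convention (anterior/lateral/posterior, upper/lower, both sides).
# The example labels below are invented - they are not data from Kuroda et al.
SIDES = ("right", "left")
REGIONS = ("anterior", "lateral", "posterior")
LEVELS = ("upper", "lower")

ZONES_12 = [f"{s}_{r}_{l}" for s in SIDES for r in REGIONS for l in LEVELS]
ZONES_8 = [z for z in ZONES_12 if "posterior" not in z]   # 8-zone protocol omits posterior zones

def zone_agreement(ai_reads: dict, ct_reads: dict, zones) -> float:
    """Fraction of zones where the AI lung-US read matches the CT reference."""
    matches = sum(ai_reads[z] == ct_reads[z] for z in zones)
    return matches / len(zones)

if __name__ == "__main__":
    ai = {z: ("pneumonia" if "lower" in z else "normal") for z in ZONES_12}                  # invented
    ct = {z: ("pneumonia" if "lower" in z and "left" in z else "normal") for z in ZONES_12}  # invented
    print(f"12-zone agreement: {zone_agreement(ai, ct, ZONES_12):.1%}")
    print(f" 8-zone agreement: {zone_agreement(ai, ct, ZONES_8):.1%}")
```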
Vascular‑access US
In the operating room, US is commonly used for vascular access. For practised clinicians, using US for common procedures, such as central venous and arterial access, is second nature. For less‑practised clinicians (i.e. those without the US or interventional skills to obtain central femoral venous access, especially during critical haemorrhage), a handheld AI‑guided US device has been developed to direct them.[22] This device (1) identifies the femoral vein, (2) directs the user in image optimisation, (3) identifies a safe needle‑insertion point, (4) deploys the needle, and (5) prompts the user to advance the guidewire for catheter placement. So far, this device has only been tested on phantoms and porcine models.[22] Yet, continued research and improvement on similar AI‑guided technologies could provide support for trainees or novice clinicians in low‑resource settings. However, the potential benefits of AI‑guided technologies must be weighed against the risks for inexperienced operators.
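One way to picture the five‑step guidance sequence described above is as a simple state machine, sketched below. All names (`detector`, `actuator`, and their methods) are hypothetical placeholders for illustration; this is not the API of the device reported by Brattain et al.[22]

```python
# Toy state-machine sketch of the five-step guidance workflow described above.
# `detector` and `actuator` are hypothetical placeholders, not the device's API.
from enum import Enum, auto

class Step(Enum):
    FIND_VEIN = auto()
    OPTIMISE_IMAGE = auto()
    SELECT_INSERTION_POINT = auto()
    DEPLOY_NEEDLE = auto()
    ADVANCE_GUIDEWIRE = auto()
    DONE = auto()

def run_guidance(frame_source, detector, actuator):
    """Walk the workflow, advancing only when each step's check passes."""
    step, target = Step.FIND_VEIN, None
    for frame in frame_source:                               # stream of US frames
        if step is Step.FIND_VEIN and detector.femoral_vein_visible(frame):
            step = Step.OPTIMISE_IMAGE
        elif step is Step.OPTIMISE_IMAGE and detector.view_adequate(frame):
            step = Step.SELECT_INSERTION_POINT
        elif step is Step.SELECT_INSERTION_POINT:
            target = detector.safe_insertion_point(frame)    # image coordinates
            step = Step.DEPLOY_NEEDLE
        elif step is Step.DEPLOY_NEEDLE:
            actuator.deploy_needle(target)
            step = Step.ADVANCE_GUIDEWIRE
        elif step is Step.ADVANCE_GUIDEWIRE:
            actuator.prompt_user("Advance guidewire for catheter placement")
            step = Step.DONE
            break
    return step
```

The point of the sketch is only that each stage gates the next, mirroring how such a device walks an inexperienced operator through the procedure one verified step at a time.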
AI in US‑guided regional anaesthesia
The field of RA relies heavily on US to acquire images and guide procedures. The advantages of using US in RA include portability, absence of radiation, and direct, real‑time visualisation of anatomic structures and of local anaesthetic spread.[23,24] However, the success of a block is often operator‑dependent, with outcomes affected by factors such as a physician's anatomic knowledge, inattentional blindness, fatigue, and technique, and by anatomical challenges such as obesity, trauma, and subcutaneous emphysema.[25] Introducing AI guidance for US imaging with real‑time detection of key structures, such as nerves, muscles, fascia, and blood vessels, could significantly improve physician performance. Current AI models for these applications include deep convolutional neural networks; for example, the U‑Net architecture, developed in 2015 by Olaf Ronneberger and colleagues at the University of Freiburg in Germany, can segment greyscale images.[26] With this technology, machine learning platforms integrated into US systems, such as ScanNav Anatomy Peripheral Nerve Block (Intelligent Ultrasound, Cardiff, UK) and Nerveblox (Pajunk, Geisingen, Germany), have been developed. ScanNav Anatomy Peripheral Nerve Block was approved by the FDA in 2022; it creates colour overlays of key anatomical structures to assist physicians in performing RA [Figure 3]. In an April 2021 clinical study of ScanNav, experts reviewed the application of the technology and found that the programme identified and recognised key structures with 95%–100% accuracy.[27] Nerveblox, developed in 2020, presents colour‑labelled anatomical structures on US images to guide RA procedures.[26,28] Various machine learning models are also being developed to further improve target detection and tracking algorithms.[24]
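To make the segmentation‑and‑overlay approach concrete, the sketch below defines a miniature U‑Net‑style encoder-decoder in PyTorch and blends its predicted class mask onto a greyscale frame as a colour overlay. It is a minimal illustration of the general technique only; the class labels (nerve, artery, muscle), the network size, and all function names are assumptions, and it does not represent the ScanNav or Nerveblox models.

```python
# Minimal U-Net-style encoder-decoder for ultrasound structure segmentation,
# plus a colour-overlay step. Illustrative only - not a vendor's model.
import torch
import torch.nn as nn

def block(c_in, c_out):
    return nn.Sequential(
        nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(c_out, c_out, 3, padding=1), nn.ReLU(inplace=True),
    )

class TinyUNet(nn.Module):
    """One downsampling level with a skip connection - the core U-Net idea."""
    def __init__(self, n_classes=4):           # e.g. background, nerve, artery, muscle
        super().__init__()
        self.enc = block(1, 16)
        self.down = nn.MaxPool2d(2)
        self.mid = block(16, 32)
        self.up = nn.ConvTranspose2d(32, 16, 2, stride=2)
        self.dec = block(32, 16)                # 32 = 16 (skip) + 16 (upsampled)
        self.head = nn.Conv2d(16, n_classes, 1)

    def forward(self, x):
        e = self.enc(x)                         # encoder features kept as skip connection
        m = self.mid(self.down(e))              # bottleneck at half resolution
        d = self.dec(torch.cat([self.up(m), e], dim=1))
        return self.head(d)                     # per-pixel class logits

def colour_overlay(grey, mask, palette, alpha=0.4):
    """Blend a per-pixel class mask onto a greyscale frame (H, W) in [0, 1]."""
    rgb = grey.unsqueeze(-1).repeat(1, 1, 3)    # (H, W, 3) greyscale image
    colours = palette[mask]                     # look up an RGB colour per pixel
    return (1 - alpha) * rgb + alpha * colours

if __name__ == "__main__":
    net = TinyUNet()
    frame = torch.rand(1, 1, 128, 128)          # stand-in greyscale ultrasound frame
    mask = net(frame).argmax(dim=1)[0]          # (128, 128) predicted class per pixel
    palette = torch.tensor([[0.0, 0.0, 0.0],    # background: no tint
                            [1.0, 1.0, 0.0],    # nerve: yellow
                            [1.0, 0.0, 0.0],    # artery: red
                            [0.0, 0.0, 1.0]])   # muscle: blue
    overlay = colour_overlay(frame[0, 0], mask, palette)
    print(overlay.shape)                        # torch.Size([128, 128, 3])
```

A production system would train such a network on expert‑annotated scans and run it on every incoming frame; the skip connection and the per‑pixel class head shown here are the essential U‑Net ideas behind that workflow.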
One of the main advantages of AI in US‑guided RA is structure identification for non‑expert users.[27,29,30] However, more than accurate structure identification is needed to ensure a safe and effective block. Accurate imaging and identification of pertinent structures are merely a foundational component of RA procedures. A safe needle trajectory, needle visualisation, placement, and injection of local anaesthetic into the appropriate location are all crucial components of an operator‑dependent, safe, and effective block. NeedleTrainer was built into ScanNav Anatomy US systems to bridge this gap in developing procedural skills. The programme uses retractable needles and augmented‑reality technology to simulate procedural conditions in a human patient. This system could be used to improve needling skills in regional nerve blocks as well as vascular access.[26,31]

A study by Bowness et al.[30] explored the utility of assistive AI for US scanning in RA and showed that both experts and non‑experts valued it for learning and teaching US scanning for RA. Interestingly, non‑experts were more likely to give positive feedback on using assistive AI, while experts observing non‑experts performing a procedure were more likely to report increased risk and safety concerns.[30] Risk and safety concerns are expected when implementing any new technology. Regardless of the technology used, proceduralists performing RA blocks should be aware of potential risks and understand how to mitigate and address complications. Regrettably, this type of understanding is often relegated to experienced providers and is outside the arsenal of a novice clinician.
As AI‑assisted technologies are still in the preliminary stages, detection and tracking errors and image misinterpretation are risks, especially in patients with anatomical variations or abnormal anatomy, such as in trauma. There are also limitations in the technology used for detecting osseous structures. Block complications may include block failure, needle trauma, haematoma, nerve injury, pleural injury, peritoneal injury, and local anaesthetic systemic toxicity.[24] In addition to clinical challenges, the legal, financial, and ethical questions associated with implementing AI are still evolving. There are multiple unresolved ethical questions, such as: who is responsible for complications when AI is being utilised? What happens if AI and clinician assessments are directly opposed? Additional causes of concern are legal responsibility for system errors and the use of faulty data when training these models. The White House of the United States released the 'Blueprint for an AI Bill of Rights' as a guide for AI developers and users; however, legal regulation is still limited.[32]

improve patient care. In the future, AI will integrate into daily clinical practice as a tool for clinicians; therefore, anaesthesiologists must collaborate closely with engineers and scientists in AI systems development.

CONCLUSION

Assistive AI has great potential for successful and widespread implementation in US in anaesthesiology because of its capacity to aid image acquisition, image optimisation, video interrogation, and feedback. While incorporation into clinical practice has lagged because of multiple barriers, AI tools for US will succeed first as educational and personal‑improvement tools. As AI solutions become increasingly robust, AI programs for US have the potential to revolutionise and transform anaesthesiology and perioperative medicine. Still, they will require practising clinicians to engage actively and collaborate with AI developers.
6. Sechopoulos I, Teuwen J, Mann R. Artificial intelligence for breast cancer detection in mammography and digital breast tomosynthesis: State of the art. Semin Cancer Biol 2021;72:214-25.
7. Repici A, Badalamenti M, Maselli R, Correale L, Radaelli F, Rondonotti E, et al. Efficacy of real-time computer-aided detection of colorectal neoplasia in a randomized trial. Gastroenterology 2020;159:512-20.e7.
8. Repici A, Spadaccini M, Antonelli G, Correale L, Maselli R, Galtieri PA, et al. Artificial intelligence and colonoscopy experience: Lessons from two randomised trials. Gut 2022;71:757-65.
9. Biffi C, Salvagnini P, Dinh NN, Hassan C, Sharma P, Group GIGCS, Cherubini A. A novel AI device for real-time optical characterization of colorectal polyps. NPJ Digit Med 2022;5:84. doi: 10.1038/s41746-022-00633-6
10. Naji A, Chappidi M, Ahmed A, Monga A, Sanders J. Perioperative point-of-care ultrasound use by anesthesiologists. Cureus 2021;13:e15217. doi: 10.7759/cureus.15217
11. Kosmos by EchoNous. EchoNous, Inc; c2015-2024. Available from: https://echonous.com/kosmos-plus/. [Last accessed in July 2024].
12. Serrano RA, Smeltz AM. The promise of artificial intelligence-assisted point-of-care ultrasonography in perioperative care. J Cardiothorac Vasc Anesth 2024;38:1244-50.
13. Baum E, Tandel MD, Ren C, Weng Y, Pascucci M, Kugler J, et al. Acquisition of cardiac point-of-care ultrasound images with deep learning: A randomized trial for educational outcomes with novices. CHEST Pulmonary 2023;1:100023.
14. Bhoil R, Ahluwalia A, Chopra R, Surya M, Bhoil S. Signs and lines in lung ultrasound. J Ultrason 2021;21:e225-33.
15. Nhat PTH, Van Hao N, Tho PV, Kerdegari H, Pisani L, Thu LNM, et al. Clinical benefit of AI-assisted lung ultrasound in a resource-limited intensive care unit. Crit Care 2023;27:257. doi: 10.1186/s13054-023-04548-w
16. Kuroda Y, Kaneko T, Yoshikawa H, Uchiyama S, Nagata Y, Matsushita Y, et al. Artificial intelligence-based point-of-care lung ultrasound for screening COVID-19 pneumoniae: Comparison with CT scans. PLoS One 2023;18:e0281127. doi: 10.1371/journal.pone.0281127
17. Baldawi M, Ghaleb N, McKelvey G, Ismaeil YM, Saasouh W. Preoperative ultrasound assessment of gastric content in patients with diabetes: A meta-analysis based on a systematic review of the current literature. J Clin Anesth 2024;93:111365.
18. Valla FV, Tume LN, Jotterand Chaparro C, Arnold P, Alrayashi W, Morice C, et al. Gastric point-of-care ultrasound in acutely and critically ill children (POCUS-ped): A scoping review. Front Pediatr 2022;10:921863.
19. Perlas A, Mitsakakis N, Liu L, Cino M, Haldipur N, Davis L, et al. Validation of a mathematical model for ultrasound assessment of gastric volume by gastroscopic examination. Anesth Analg 2013;116:357-63.
20. Tacken MCT, van Leest TAJ, van de Putte P, Keijzer C, Perlas A. Ultrasound assessment of gastric volumes of thick fluids: Validating a prediction model. Eur J Anaesthesiol 2021;38:1223-9.
21. Carsetti A, Sorbello M, Adrario E, Donati A, Falcetta S. Airway ultrasound as predictor of difficult direct laryngoscopy: A systematic review and meta-analysis. Anesth Analg 2022;134:740-50.
22. Brattain LJ, Pierce TT, Gjesteby LA, Johnson MR, DeLosa ND, Werblin JS, et al. AI-enabled, ultrasound-guided handheld robotic device for femoral vascular access. Biosensors (Basel) 2021;11:522. doi: 10.3390/bios11120522
23. Marhofer P, Greher M, Kapral S. Ultrasound guidance in regional anaesthesia. Br J Anaesth 2005;94:7-17.
24. Viderman D, Dossov M, Seitenov S, Lee MH. Artificial intelligence in ultrasound-guided regional anesthesia: A scoping review. Front Med (Lausanne) 2022;9:994805. doi: 10.3389/fmed.2022.994805
25. Lloyd J, Morse R, Taylor A, Phillips D, Higham H, Burckett-St Laurent D, et al. Artificial intelligence: Innovation to assist in the identification of sono-anatomy for ultrasound-guided regional anaesthesia. Adv Exp Med Biol 2022;1356:117-40.
26. Mika S, Gola W, Gil-Mika M, Wilk M, Misiołek H. Artificial intelligence-supported ultrasonography in anesthesiology: Evaluation of a patient in the operating theatre. J Pers Med 2024;14:310. doi: 10.3390/jpm14030310
27. Bowness J, Varsou O, Turbitt L, Burkett-St Laurent D. Identifying anatomical structures on ultrasound: Assistive artificial intelligence in ultrasound-guided regional anesthesia. Clin Anat 2021;34:802-9.
28. Larkin HD. FDA approves artificial intelligence device for guiding regional anesthesia. JAMA 2022;328:2101. doi: 10.1001/jama.2022.20029
29. Gungor I, Gunaydin B, Oktar SO, M Buyukgebiz B, Bagcaz S, Ozdemir MG, et al. A real-time anatomy identification via tool based on artificial intelligence for ultrasound-guided peripheral nerve block procedures: An accuracy study. J Anesth 2021;35:591-4.
30. Bowness JS, El-Boghdadly K, Woodworth G, Noble JA, Higham H, Burckett-St Laurent D. Exploring the utility of assistive artificial intelligence for ultrasound scanning in regional anesthesia. Reg Anesth Pain Med 2022;47:375-9.
31. Shevlin SP, Turbitt L, Burckett-St Laurent D, Macfarlane AJ, West S, Bowness JS. Augmented reality in ultrasound-guided regional anaesthesia: An exploratory study on models with potential implications for training. Cureus 2023;15:e42346. doi: 10.7759/cureus.42346
32. Hine E, Floridi L. The blueprint for an AI bill of rights: In search of enaction, at risk of inaction. Minds and Machines 2023;33:285-92.
33. Naaz S, Asghar A. Artificial intelligence, nano-technology and genomic medicine: The future of anaesthesia. J Anaesthesiol Clin Pharmacol 2022;38:11-7.
34. Char DS, Burgart A. Machine-learning implementation in clinical anesthesia: Opportunities and challenges. Anesth Analg 2020;130:1709-12.