Digitalization and Artificial Intelligence in (Heart-) Medicine
New Technologies in Healthcare
Peer-review

Review Article
Issue
2023/06
DOI: https://doi.org/10.4414/cvm.2023.1236103712
Cardiovasc Med. 2023;26(06):190-192

Affiliations
Luzerner Kantonsspital, Lucerne, Switzerland: (a) Department of Anesthesiology, Emergency Medical Service and Pain Therapy; (b) Division of Cardiac Surgery

Published on 22.11.2023

Abstract

Digitalization of the healthcare sector is changing its landscape. Artificial intelligence is bound to have a major impact on care providers and their interactions with their patients. As promising as new technologies may be, there will be side effects. New technologies will place new and higher demands on their users and will require an extremely high level of expertise to check the plausibility of their output. Alongside gains in efficiency and effectiveness, the demands on the humans in healthcare will also rise.
Keywords: Artificial intelligence; big data; digitalization; electronic health record; telemedicine

Introduction

Digitalization has taken on a whole new meaning in cardiology. It is no longer just a buzzword but rather a term for the next industrial revolution. In a narrow sense, digitalization is the transformation of analogue data (e.g., text, images) into digital form and the subsequent use of that data.
Digitalization is understood as the use of artificial intelligence (AI), big data analysis and clouds, robotization and automation as well as social media and platforms. These technologies can be used in almost all economic sectors and business areas, and are increasingly present in our everyday lives.
Medicine tends to be rather reserved and conservative in the adoption of new technologies, even though digitalization, and AI in particular, could support and relieve doctors in their everyday work. Digitalization in medicine is already happening today, e.g., in the analysis of complex image files in radiology [1]. However, the integration of digitalization and AI into clinical workflows is not yet widespread. Studies on optimizing outcomes are still scarce, but wider adoption is probably only a question of time.

History

Interestingly, the concepts of digitalization and AI were already developed in the 1950s. Researchers were working on thinking machines and saw the potential for medicine and medical practice [2-5]. However, the computer-based systems of the time were rudimentary, did not lead to success, and expectations were often disappointed [6]. Only recent advances in chip design, and thus in computing capacity and the ability to process and analyze large amounts of data, have made the application of powerful algorithms possible. They are now capable of quickly and reliably extracting relevant information from a huge pool of data and applying it for further use. Digitalization and the application of AI have already begun to change our everyday lives.
Physicians should not close their minds to this development but should consider the possibilities of integrating these technologies to improve patient care and quality of life [1, 7-9]. If the development is left exclusively to commercial companies, there is a risk that they will shape applications according to their own agenda without considering physicians' or patients' actual needs.

Gartner Hype Cycle

The Gartner hype cycle (GHC) is a graphical representation of the maturity and adoption of a new technology or application [10]. Figure 1 shows such a GHC. Interest in a new technology is greatest at the beginning, but because the usability or commercial benefit is low, disillusionment spreads accordingly. In the next step, things slowly improve and the technology becomes more important as it is integrated into the workflow. Ideally, a "mature" technology develops. Digitalization, and AI in particular, has gone through such a process, but only a few applications have reached the later phases (the slope of enlightenment and the plateau of productivity).
Figure 1: Gartner Hype Cycle. Graphical representation of the maturity and adoption of a technology or application [10]. Typical phases include the innovation trigger, the peak of inflated expectations, the trough of disillusionment, the slope of enlightenment and the plateau of productivity.

Medical Applications

Digitalization is already being used in medicine in many ways [11] (fig. 2). For example, electronic health records (EHRs) have replaced paper-based records and allow doctors to access a patient's medical history, notes, test results, images, and other relevant information at any time. EHRs can easily be shared between different healthcare providers and, as such, could bring important improvements in patient care. Another key technology is telemedicine. It uses digital technology to connect doctors with patients over a distance. It became particularly popular during the COVID-19 pandemic. Telemedicine enables virtual consultations, monitoring of patient conditions and other forms of care. It provides a link between patients and healthcare providers without the need for a physical visit to the doctor's office or a hospital. This can be important for elderly patients and those with chronic diseases.
Figure 2: Main technologies of digitalization in medicine.
Medical imaging today is completely digitalized: digital X-rays, computed tomography (CT), magnetic resonance imaging, and echocardiography, among others. Combining digital data with advanced analysis technologies can improve the accuracy and speed of diagnosis, allowing doctors to optimize patient care. Mobile health applications can help in collecting data such as vital signs, medication adherence and other relevant information to monitor patients' health. These devices provide daily health data and active monitoring, in contrast to a monthly or yearly clinical checkup. They include heart rate sensors, exercise tracking, blood glucose monitoring, information about hydration, blood pressure and more. In addition, such health apps can provide educational resources for patients.
AI may have the greatest potential to transform medicine. It is used to simulate human knowledge and to analyze complex medical data and biological processes. Several publications have already addressed the potential optimizations offered by AI [1, 7-9, 12, 13]. For example, AI is capable of reading and analyzing electrocardiograms, retinal images, CT scans or skin changes quite accurately. Today, such analyses are not yet perfect and usually require an experienced doctor to validate them. However, they can accelerate diagnoses and improve the accuracy of complex ones. The main goal of AI in biomedicine is to establish relationships between patient health data, diagnostics, treatment programs and outcomes. AI search engines could analyze the course of a disease based on the EHR, test results and imaging data, provide early warning of potential complications and suggest appropriate early treatment options. Critical care medicine and anesthesia are particularly well suited to integrate such AIs into their clinical practice given the large amount of data collected in daily practice, but many other areas could benefit as well [14].
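To make the early-warning idea above concrete, the following is a minimal sketch of a logistic risk model over a few vital signs. The feature choice, weights, bias and threshold are entirely hypothetical and not clinically validated; a real AI system would learn such parameters from large patient datasets rather than have them set by hand.

```python
import math

# Illustrative, hand-picked weights for hypothetical vital-sign features.
# In a deployed system these would be learned from data, not hard-coded.
WEIGHTS = {"heart_rate": 0.03, "systolic_bp": -0.02, "lactate": 0.8}
BIAS = -3.0

def risk_score(heart_rate: float, systolic_bp: float, lactate: float) -> float:
    """Combine vital signs into a 0..1 'risk of deterioration' score."""
    z = (BIAS
         + WEIGHTS["heart_rate"] * heart_rate
         + WEIGHTS["systolic_bp"] * systolic_bp
         + WEIGHTS["lactate"] * lactate)
    return 1.0 / (1.0 + math.exp(-z))  # logistic (sigmoid) function

def early_warning(score: float, threshold: float = 0.5) -> bool:
    """Flag the patient for clinical review if the score crosses the threshold."""
    return score >= threshold

# A stable patient scores low; a deteriorating one triggers the warning.
stable = risk_score(heart_rate=70, systolic_bp=120, lactate=1.0)
deteriorating = risk_score(heart_rate=120, systolic_bp=85, lactate=4.5)
```

The point of the sketch is the architecture, not the numbers: a continuous score computed from routinely collected data, compared against a threshold, is what allows such a system to raise an alarm before a scheduled checkup would.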
Of course, the majority of patients will prefer to interact with a doctor and receive a diagnosis from a real person rather than from a "chatbot". Medicine is not just data and science that can be replaced by AI. Factors such as empathy, social interaction and the support of the doctor's presence can be crucial for the patient's well-being and health. Nevertheless, it has been demonstrated that digitalization and AI can support doctors and improve patient care. AI could also help to ask patients about their symptoms via a "chatbot" in advance of a visit, analyze vital parameters and laboratory values, and make recommendations for diagnosis and therapy. This may seem futuristic and may even be undesirable, but it is not far-fetched given today's healthcare bottlenecks, financial pressures and shortage of specialists. Patients could be seen in a timely manner, and further diagnostic and treatment steps could be initiated. In this context, it is interesting to note that a "chatbot" was used as early as 1964 by Joseph Weizenbaum at the Massachusetts Institute of Technology's AI Laboratory [6, 15]. It has taken almost 60 years for this to become a reality.
By automatically drafting notes, consultation or operation reports, and letters based on the EHR and sending them to the referring doctor, AI and chat programs could help reduce doctors' workload.

Dangers and Limitations

As promising and useful as AI can be, it is important to recognize the potential weaknesses and dangers of these technologies. Any AI is only as good as the data it is trained on. Today, a lot of AI training data comes from the internet and is therefore susceptible to manipulation, or "data poisoning" (fig. 3). This means that an algorithm can be fed harmful and false data, even intentionally, and thus made to behave in an undesirable and manipulated way. According to Florian Tramèr of ETH Zurich, this is a real problem, since potentially anyone with a computer connected to the internet can manipulate AI applications through data poisoning [16].
Figure 3: Illustration of the mechanism of a "poisoning attack" in artificial intelligence. The attacker identifies the source of the raw data set used to train the model and then poisons the data to compromise the accuracy of the resulting machine learning model.
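A toy example may help make the poisoning mechanism concrete. The sketch below trains a deliberately simple nearest-centroid classifier on one-dimensional data, then injects a handful of mislabeled ("poisoned") points; the shifted class centroid is enough to flip the prediction for a borderline sample. The data and model are contrived purely for illustration and bear no relation to any real medical AI system.

```python
def centroid(values):
    """Mean of a list of one-dimensional measurements."""
    return sum(values) / len(values)

def train(samples):
    """Nearest-centroid 'model': one centroid per class label (0 or 1)."""
    by_class = {0: [], 1: []}
    for x, label in samples:
        by_class[label].append(x)
    return centroid(by_class[0]), centroid(by_class[1])

def predict(x, c0, c1):
    """Assign x to the class whose centroid is nearer."""
    return 0 if abs(x - c0) <= abs(x - c1) else 1

# Clean training data: class 0 clusters near 1.0, class 1 near 9.0.
clean = [(0.0, 0), (1.0, 0), (2.0, 0), (8.0, 1), (9.0, 1), (10.0, 1)]

# Poisoned points: class-1-like values deliberately mislabeled as class 0.
poison = [(9.0, 0), (10.0, 0), (11.0, 0)]

c0_clean, c1_clean = train(clean)        # centroids 1.0 and 9.0
c0_bad, c1_bad = train(clean + poison)   # class-0 centroid drifts to 5.5

# A borderline sample at 7.0 is classified correctly by the clean model
# but misclassified once the poisoned points have shifted the centroid.
clean_pred = predict(7.0, c0_clean, c1_clean)      # -> class 1 (correct)
poisoned_pred = predict(7.0, c0_bad, c1_bad)       # -> class 0 (wrong)
```

Real poisoning attacks target far more complex models, but the principle is the same: a small fraction of corrupted training data can silently move a decision boundary without any change to the algorithm itself.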
In medical applications, there is the question of what constitutes the norm in daily practice (against which, for example, a CT image is compared) and whether the AI training data could lead to a distorted picture of the real world. These standards are certainly difficult to define and are part of ongoing discussions.
As with any new technology, drug or device therapy, robust and well-designed clinical trials are needed to truly demonstrate whether AI-assisted care can improve patient management and outcomes compared to conventional care. Well-executed trials will demonstrate not only the benefits, but also the limitations, potential risks and unintended consequences of AI use in real-world clinical settings. The path of AI is fascinating but can only be taken if scientific support is provided early, before widespread commercialization. The application of AI, like that of other technologies, must be subject to continuous scientific monitoring and meet the requirements of scientific rigor.

Ethical and Legal Dilemmas

Every new technology requires adaptation in the sense of a coevolution of culture and society: technolution. To harness the power of big data for the benefit of individual patients and the whole population, data protection laws need to be refocused. Questions about who "owns" data and how data can be used to innovate and improve services need to be addressed. AI introduces new vulnerabilities and risks, including cybersecurity threats and the potential for malicious use. As AI advances, there is an increasing potential for autonomous systems to make recommendations and even decisions without human intervention. This raises questions about the appropriate level of human control over AI systems and about liability, particularly in medicine.

Conclusions

Digitalization, AI and related technologies are inexorably entering and transforming our everyday lives, including healthcare. As service providers, we will have to evolve and adapt with the technologies. Many aspects of this development are fascinating, and as such, we firmly believe that these technologies will support rather than replace physicians, potentially improving outcomes and allowing more time for the much-needed human interaction in our daily practice.
Prof. Peter Matt
Division of Cardiac Surgery
Luzerner Kantonsspital
Spitalstrasse
CH-6000 Luzern
peter.matt[at]luks.ch
1 El Naqa I, Haider MA, Giger ML, Ten Haken RK. Artificial Intelligence: reshaping the practice of radiological sciences in the 21st century. Br J Radiol. 2020 Feb;93(1106):20190855.
2 Turing A. Computing machinery and intelligence. Mind. 1950 Oct;LIX(236):433-60.
Conflict of Interest Statement
No financial support and no other potential conflict of interest relevant to this article was reported.