
February 2022, Volume 72, Issue 2

Editorial

About artificial intelligence…

Sarwat Hussain (Professor of Radiology, University of Massachusetts, USA)

Early computers were used as automatic calculation tools. Later, as the Industrial Revolution progressed, manufacturing devices were developed to automate more complex tasks, such as guiding weaving patterns on looms. It was not until the 1950s that computer science began to be established as a defined academic discipline. Early on, researchers realized that, given the computer's ability to process logic, it might be possible to programme computers to mimic mental capabilities generally associated with basic human intelligence and intellect. In 1956, this attribute was termed Artificial Intelligence (AI) by John McCarthy, an American computer scientist.1 Little happened for several decades. In 1997, computer science once again came into the limelight, when massive worldwide media coverage was given to a computer, IBM's Deep Blue, beating the reigning world chess champion, Garry Kasparov. In a six-game match, the computer won 2-1, with three draws.2 A computer winning a game that requires intellect and analytical thinking brought the world's attention to the extraordinary capabilities of computers. Investment in AI research subsequently surged, and the convergence of the following three elements brought AI to its current prominence:

1. Enhanced computing power using Graphics Processing Units (GPUs) and Tensor Processing Units (TPUs) in place of the conventional Central Processing Unit (CPU). GPUs, originally developed for computer gaming, can process data in parallel and perform 3-D rendering (a brief illustrative sketch follows this list).

2. Development of high-powered, easy-to-use programming languages such as Python, Java and Lisp.3

3. The emergence of extremely large data sets generated daily by the billions of 'smart' devices and other machines connected to the internet. Collectively, these sources of data are called the Internet of Things (IoT). The body of accumulated data is known as Big Data; by one estimate, it grows by 2.5 quintillion bytes every day. Protected personal information from business, healthcare, banking and national security is encrypted within Big Data.4
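
As a purely illustrative sketch of the first two elements, the snippet below times the same large matrix multiplication on the CPU and, where one is present, on a GPU. PyTorch is assumed here only as a convenient example of a Python library that can hand work to a GPU; it is not part of the original discussion.

import time
import torch  # assumed for illustration; any GPU-capable library would do

def time_matmul(device, n=4000):
    # Create two large random matrices directly on the chosen device.
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    if device == "cuda":
        torch.cuda.synchronize()  # make sure the GPU is idle before timing
    start = time.perf_counter()
    _ = a @ b  # one large matrix multiplication, run in parallel on a GPU
    if device == "cuda":
        torch.cuda.synchronize()  # wait for the GPU to finish its work
    return time.perf_counter() - start

print(f"CPU time: {time_matmul('cpu'):.3f} s")
if torch.cuda.is_available():
    print(f"GPU time: {time_matmul('cuda'):.3f} s")  # typically far faster
else:
    print("No GPU detected; the comparison needs a CUDA-capable card.")

On typical hardware the GPU run finishes many times faster, which is precisely the parallelism that modern AI workloads exploit.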

Information in Big Data is processed using a multitude of specialized algorithms to derive inferences. These assist evidence-based decision making in all walks of life. The discipline of collecting, curating and processing data to derive actionable analytics has become a distinct branch of computer science, Data Science, which forms a central pillar of AI.
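
To make "collection, curation and processing" concrete, here is a minimal, hypothetical sketch in Python using the pandas library; the records are invented solely for illustration and do not represent real data.

import pandas as pd

# Collect: raw records, as they might arrive from a hospital information system.
raw = pd.DataFrame({
    "ward": ["Surgery", "Surgery", "Medicine", "Medicine", "Surgery"],
    "stay_days": [4, 7, 3, None, 10],   # one record has a missing value
    "readmitted": [0, 1, 0, 0, 1],
})

# Curate: drop records with missing values so they do not distort the analysis.
curated = raw.dropna(subset=["stay_days"])

# Process: derive a simple, actionable summary for decision makers.
summary = curated.groupby("ward").agg(
    mean_stay_days=("stay_days", "mean"),
    readmission_rate=("readmitted", "mean"),
)
print(summary)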

Currently, AI is defined as an approach to make a computer demonstrate human-like intelligence: rational thinking, learning from experience and problem solving. This definition will likely change with further advances in AI.5 In the literature, the commonly used terms Machine Learning (ML), Deep Learning (DL) and Artificial Neural Network (ANN) are frequently used synonymously with AI.6

ML, also called predictive analytics, is a statistical method for automatically processing large amounts of data to uncover hidden patterns and trends. As the data are processed, the computer learns without being explicitly programmed. The data may originate from any enterprise: in business, ML predicts future growth from customers' buying habits; in healthcare, ML analytics help reduce hospital costs through post-surgical risk stratification that shortens hospital stays and reduces readmissions. DL, on the other hand, is a subtype of AI and ML, but more complex. DL functions through the ANN, a co-design of hardware and software. Inspired by human neural connections, an ANN is made up of dozens of interconnected layers of artificial neurons (the hardware), and each layer has multiple nodes. Each node in every layer is connected with all the nodes in the subsequent layer, passing binary digital information, and each connection is assigned a numerical weight (the software). The values of the data, together with the weights of the connections, form the basis for calculating complex probabilities to arrive at a result. If the network's result is not correct, then through "back-propagation" the connection weights are adjusted until the neural network is optimized. This is how the ANN learns from failure and feedback to make smarter decisions.7
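
As a purely illustrative aid, the toy sketch below shows these ideas in code: a tiny network of nodes with weighted connections learns the simple XOR pattern, and back-propagation nudges the weights whenever the output is wrong. Python and NumPy are assumed here only for demonstration; this is not the design of any clinical or commercial system.

import numpy as np

rng = np.random.default_rng(0)

# Training data: two binary inputs and the target output (their XOR).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Connection weights between layers, initialised randomly and then learned.
W1 = rng.normal(size=(2, 4))   # input layer  -> hidden layer (4 nodes)
W2 = rng.normal(size=(4, 1))   # hidden layer -> output node

learning_rate = 1.0
for _ in range(5000):
    # Forward pass: every node passes its weighted signal to the next layer.
    hidden = sigmoid(X @ W1)
    output = sigmoid(hidden @ W2)

    # Error between the network's answer and the correct answer.
    error = y - output

    # Back-propagation: adjust the connection weights to shrink the error.
    d_output = error * output * (1 - output)
    d_hidden = (d_output @ W2.T) * hidden * (1 - hidden)
    W2 += learning_rate * hidden.T @ d_output
    W1 += learning_rate * X.T @ d_hidden

print(np.round(output, 2))  # approaches [[0], [1], [1], [0]] as the network learns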

Curated statistics from Big Data can be processed and analyzed through ML, DL and ANN.8 Conclusions derived from these processes underpin business analytics, production efficiencies, financial automation, quality improvement and healthcare processes, to name a few. In diagnostics, for instance, AI applications have been shown to improve the accuracy of diagnosis in medical imaging through computer-aided detection (CAD), segmentation, image reconstruction, organ perfusion mapping, radiomics and more. Additionally, AI-based computer analysis of histopathology slides, skin conditions and diabetic retinopathy affords specialist-level diagnoses to primary care doctors and nurse practitioners. In laboratory medicine, AI-based automation of body-fluid handling, quality improvement and alerts for abnormal results are some of the ML applications. In clinical medicine, AI has been successfully employed in reducing the length of hospital stay and readmissions following procedures, and in helping to ensure timely maintenance of hospital equipment. Analysis of data from electronic medical records (EMR), Hospital Information Systems (HIS) and Radiology Information Systems (RIS) is being used to improve patient care, education and research. AI-based analytics are now routinely used for administrative, financial and planning decisions.9

Most physicians are uncomfortable with the subject of AI because of the complexity of its terminology and unfamiliarity with the benefits of AI applications in healthcare. These barriers can be removed by learning the commonly used terms and their applications. Some of these terms, mentioned above, are ML, DL, ANN and GPU, along with examples of natural language processing (NLP) such as chatbots.10

In many low-income countries, adoption of AI technology in healthcare happens by default: AI applications come embedded in radiology and laboratory equipment. Computed tomography, magnetic resonance imaging and mammography machines are usually preloaded with CAD and advanced image processing, and laboratory equipment comes with similar automation. At the societal level, the use of AI chatbots such as the iPhone's Siri is almost universal.11

AI is projected to add $15 trillion to the global economy by the year 2030, up from $600 million in 2016. This fiscal boon will occur mostly in high-income countries (HICs).12 Early adoption of AI has the potential to yield large dividends for the economies of low- and middle-income countries (LMICs), and may even become a great economic equalizer. For AI research to take root in any country, all data in the public and private domains must be collected in, or converted into, digital format. LMICs therefore stand to derive tremendous benefit from early investment in AI infrastructure and research.

The scarcity of digital data and the cost of hardware are roadblocks to deploying AI technologies, and realizing their immense potential benefits, in the developing world.13 A change in approach is required if progress is to be achieved by LMICs: the deterrents to adopting AI, notably the paucity of digital data, should instead be treated as research questions to be solved. If research continues to be based on HIC models of abundant data, the roadblocks for LMICs will persist, even when such research is conducted by data scientists in the LMICs.

At the public health level, AI-based techniques have been described for tracking infants using fingerprint technology to ensure proper vaccination coverage.14 Point-of-care diagnosis can be important in the early detection of disease outbreaks, and AI-based systems can automate the clinical diagnosis of multiple conditions ranging from dengue fever to cataracts.15,16

Middle-income countries such as Pakistan are rich in young and highly intelligent human resources; up to two-thirds of Pakistan's population is under 30 years of age. Enough of these individuals must be trained to realize the full economic and human-services potential of AI technology. In the LMICs, institutions of higher learning must take the lead in offering AI education. At the individual level, all educated persons, including teachers, doctors, engineers and bankers, need to develop AI consciousness through self-education. The larger the countrywide grassroots footprint of the AI movement, the greater the opportunity of producing a world-class workforce and researchers for the benefit of society.

A question commonly asked is, "Will AI replace skilled workers in the future?" The answer would have to be that skilled workers who are AI-educated will replace those who are not, especially among those seeking work in HICs.17 Currently, the role of AI is entirely assistive and supportive. The complexity, legality and ethics of certain professions will probably prevent AI from substantially taking over human functions.

As AI evolves, it is important to recognize that it is not a panacea for society; significant issues remain to be addressed. Potential risks to society include the absence of consensus on ethics, safety and privacy in the AI domain, the insidious entry of racial bias into the judiciary and banking, and adversarial attacks on public utilities and national security installations, to name a few.18 But in the final analysis, AI is here to stay and will bring about quantum change in the world. The release of GPT-3 is one example of dramatic AI-based automation. Readers are advised to watch the fascinating YouTube video entitled "gpt-interview".19

 

Acknowledgement: The author acknowledges the encouragement and feedback received from Dr. Muhammad Fareed Suri and Dr. Junaid Bhatti, members of the APPNA MERIT Committee on Artificial Intelligence.

 

References

 

1. McCarthy J. Professor John McCarthy. Available from: jmc.stanford.edu.

2. Mann CJH. Deep Blue - An Artificial Intelligence Milestone. Kybernetes 2004; 33(1). https://doi.org/10.1108/k.2004.06733aae.008

3. Prat CS, Madhyastha TM, Mottarella MJ. Relating Natural Language Aptitude to Individual Differences in Learning Programming Languages. Sci Rep 2020; 10: 3817. https://doi.org/10.1038/s41598-020-60661-8

4. Taherkordi A, Eliassen F, Horn G. From IoT Big Data to IoT Big Services. Proceedings of the Symposium on Applied Computing 2017; pp 485-491.

5. Artificial intelligence. Available from: https://www.britannica.com/technology/artificial-intelligence. Retrieved August 2021.

6. Indolia S, Goswami AK, Mishra SP, Asopa P. Conceptual Understanding of Convolutional Neural Network - Deep Learning Approach. Procedia Computer Science 2018; 132: 679-688.

7. Gavin E. Machine Learning: An Introduction. 2018. Available from: https://towardsdatascience.com/what-are-the-types-of-machine-learning. Cited on 21 November 2021.

8. Artem O. What is Deep Learning and How it Works. 2019. Available from: https://contentsimplicity.com/what-is-deep-learning-and-how-does-it-work.

9. Mun SK, Wong KH, Lo SB, Li Y, Bayarsaikhan S. Artificial Intelligence for the Future Radiology Diagnostic Service. Front Mol Biosci 2021; 7: 614258. Published 2021 Jan 28. doi:10.3389/fmolb.2020.614258

10. Hutson M. AI Glossary: Artificial intelligence, in so many words. Science 2017; 357: 19. DOI: 10.1126/science.357.6346.1919.

11. Palanica A, Flaschner P, Thommandram A, Li M, Fossat Y. Physicians' Perceptions of Chatbots in Health Care: Cross-Sectional Web-Based Survey. J Med Internet Res 2019; 21: e12887.

12. Holmes F. AI Will Add $15 Trillion To The World Economy By 2030. Forbes, February 25, 2019.

13. Ali A, Qadir Q, Rasool R, Sathiaseelan A, Zwitter A, Crowcroft A. Big Data for Development: Applications and Techniques. Big Data Analytics 2016; 1: 1.

14. Jain AJ, Arora SS, Best-Rowden L, Cao K, Sudhish PS, Bhatnagar A, Koda A. In: Proceedings of the Eighth International Conference on Information and Communication Technologies and Development, June 2016; Article No. 29, pp 1-4. https://doi.org/10.1145/2909609.2909612

15. Caicedo-Torres W, Paternina A, Pinzón H. Machine Learning Models for Early Dengue Severity Prediction. In: Montes y Gómez M, Escalante H, Segura A, Murillo J (eds). Advances in Artificial Intelligence - IBERAMIA 2016. Lecture Notes in Computer Science, vol 10022. Springer, Cham. https://doi.org/10.1007/978-3-319-47955-2_21

16. Chretien JP, Burkom HS, Sedyaningsih ER, Larasati RP, Lescano AG, Mundaca C, Blazes DL, Munayco CV, Coberly JS, Ashar RJ, et al. Syndromic surveillance: Adapting innovations to developing settings. PLoS Medicine 2008; 5: 3.

17. Martin S. Will Artificial Intelligence Replace the Human Doctors in the Future? Is it True? Available from: https://medicalfuturist.com/5-reasons-artificial-intelligence-wont-replace-physicians. Cited on 21 November 2021.

18. Parnas DL. The Real Risk of Artificial Intelligence. Communications of the ACM 2017; 60(10): 27-31.

19. What It's Like To Be a Computer: An Interview with GPT-3. Available at: www.youtube.com/results?search_query=gpt+3+interview. Cited on 21 November 2021.
