Signify Research predicts the market for machine learning in medical imaging will exceed $2 billion by 2023, saying, "AI will transform the diagnostic imaging industry, both in terms of enhanced productivity, increased diagnostic accuracy, more personalized treatment planning, and ultimately, improved clinical outcomes."
Innovations are emerging fast. University of Illinois researchers have developed a technology that uses a smartphone camera and AI to conduct optical spectroscopy and check for infectious diseases. Meanwhile, a Rice University-led team is developing wearable and point-of-care microscopes that use sensors and machine imaging to non-invasively aid in the diagnosis and monitoring of dozens of health conditions that at present can only be tested using biopsies or blood tests.
Regulation and the need for clinical validation slow the wider deployment of new solutions. However, a slightly more relaxed regulatory environment is emerging (such as the FDA's De Novo pathway, which enabled Apple's ECG feature on Apple Watch). This willingness to develop less burdensome paths to market for innovative digital health tools, combined with the ongoing rapid evolution of sensors, operating systems and machine intelligence, means new solutions are emerging worldwide. Most combine sensor technologies with mobile apps to deliver significant medical benefits, bringing treatment within more people’s reach, as these examples show:
In the operating theatre
Excessive bleeding after childbirth kills tens of thousands of women each year, but a revolutionary iPad-based system may help prevent this. Triton Sponge counts the surgical sponges used during an operation, applying machine vision and AI to estimate blood loss. The system can spot false positives (such as when the same sponge is presented twice) and helps theatre staff keep track of the sponges used, protecting against incidents in which sponges are left inside patients. Triton Sponge is already approved and saving lives in operating rooms across the U.S.
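To give a flavour of the machine-vision step involved, here is a minimal sketch of colorimetric blood-loss estimation. The calibration constants, the linear model and the `estimate_blood_ml` function are all invented for illustration; real systems like Triton Sponge use validated, far more sophisticated models.

```python
# Hypothetical sketch: estimating blood on a surgical sponge from pixel
# colour. All constants and the linear model are invented for illustration.

def estimate_blood_ml(pixels):
    """Estimate millilitres of blood on a sponge from (R, G, B) pixels.

    Toy colour index: the more the red channel dominates the green and
    blue channels, the more haemoglobin is assumed to be present.
    """
    if not pixels:
        return 0.0
    # Average "redness" across the sponge image.
    redness = sum(max(0, r - (g + b) / 2) for r, g, b in pixels) / len(pixels)
    # Invented linear calibration: redness of 127.5 maps to ~50 ml.
    ML_PER_REDNESS_UNIT = 50 / 127.5
    return redness * ML_PER_REDNESS_UNIT

# A nearly white (clean) sponge patch vs. a deep-red (soaked) one.
clean = [(250, 245, 240)] * 4
soaked = [(180, 30, 25)] * 4
assert estimate_blood_ml(clean) < 5
assert estimate_blood_ml(soaked) > 40
```

In practice such a system must also correct for lighting, sponge saturation and dilution by other fluids, which is where the trained vision model earns its keep.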
Cart-based ultrasound scanners can cost $50,000 to $100,000. Butterfly IQ reduces both size and cost considerably: a handheld ultrasound scanner about the size of an electric razor, priced at around $2,000. It uses AI and machine imaging to detect blood flow, pregnancy, cancerous tumors and more – the company's Chief Medical Officer even diagnosed his own cancer using the device. The HIPAA-compliant Butterfly IQ displays its imaging on a smartphone screen, and images are stored in the cloud, where medical professionals can access them for diagnosis. There's plenty of interest: Butterfly IQ raised $250 million in its most recent funding round.
Millions of people need daily urine testing. This is an onerous process: testing takes place at a lab or other point of care and demands the services of a trained professional, who must then provide results to the patient's doctor. Israel's Healthy.io has developed a smartphone-based home testing kit that accelerates this process and reduces its cost. It uses machine vision and AI to capture images with the smartphone's camera and deliver results as accurate as those from a lab-based urinalysis analyzer. The kit is clinically approved for sale in Israel, the EU and the U.S., and is used at Salford Royal NHS Foundation Trust's virtual renal clinic.
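The core machine-vision idea behind camera-read urinalysis can be sketched simply: match the observed colour of each dipstick pad against a reference chart. The chart values and the `classify_pad` function below are invented for illustration; Healthy.io's actual pipeline also handles lighting calibration and uses clinically validated models.

```python
# Hypothetical sketch of camera-based dipstick reading: classify a pad's
# observed colour by nearest match (Euclidean distance in RGB) against a
# reference chart. Reference colours are invented, not a real chart.

def classify_pad(observed_rgb, reference_chart):
    """Return the chart label whose RGB value is closest to the pad's."""
    def dist_sq(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(reference_chart, key=lambda lbl: dist_sq(observed_rgb, reference_chart[lbl]))

# Invented reference colours for a glucose pad.
GLUCOSE_CHART = {
    "negative": (120, 200, 220),   # pale blue
    "trace":    (140, 190, 150),   # blue-green
    "high":     (150, 110, 60),    # brown
}

assert classify_pad((125, 195, 215), GLUCOSE_CHART) == "negative"
assert classify_pad((148, 115, 70), GLUCOSE_CHART) == "high"
```

The hard part in the real product is not the matching itself but normalising for ambient light and camera differences, which is why such kits ship with a printed colour-calibration board.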
Bloomlife is a smartphone solution that combines a physical sensor with AI and an app to monitor pregnancy in real time. The sensor sits on the mother's abdomen, accurately monitoring and recording pregnancy patterns: it watches uterine activity for contraction frequency, duration, patterns and trends, helping patients tell the difference between normal twinges and pre-birth contractions. Babyndex is another good example of AI applied to fertility and birth. Developed in Hungary and FDA-approved for use in the U.S., it uses the smartphone camera and machine intelligence to analyze dried saliva samples, detecting the rise in estrogen levels that signals when women are at their most fertile.
EzLab is a smartphone-compatible tool that lets users take and test pathogen samples in the field, receiving analysis that is 90 percent accurate within minutes. It consists of a smartphone app, cloud software, and a portable field microscope that is both easy to use and robust enough for war zones. The microscope carries its own micro-sensing optics system and works with a proprietary cloud-based image-analyzing neural network. Cheaper and more portable than standard testing equipment, this innovative smartphone-based solution means tests that usually take days, as samples are returned to labs for analysis, can now be completed in the field in minutes.
There are already numerous smartphone solutions that claim to detect skin cancers such as melanoma, and the technology is advancing, as we reported in Why AI is becoming the disease detective. In 2016, researchers at the University of Texas Health Science Center at Houston developed a technology that uses a smartphone camera to diagnose non-melanoma skin cancers, delivering accuracy of between 60 and 90 percent.
Comments from U.S. regulators suggest the rate at which these solutions are introduced will increase rapidly. FDA Commissioner Scott Gottlieb, MD, and Jeff Shuren, MD, JD, Director of the FDA's Center for Devices and Radiological Health, said: "Due to the great promise of these technologies and the rapid pace of change, the FDA is working to modernize our regulatory approach to better enable and more efficiently spur innovation in this novel area to improve the health and quality of life of consumers and patients."
Find out more about how Orange Healthcare is enabling the development of mobile health solutions here.
Jon Evans is a highly experienced technology journalist and editor who has been writing for a living since 1994. These days you can read his regular Computerworld AppleHolic and opinion columns. Jon is also technology editor for men's interest magazine Calibre Quarterly, and news editor for MacFormat, the UK's biggest Mac magazine. He is particularly interested in the impact of technology on the creative spark at the heart of the human experience. In 2010 he won an American Society of Business Publication Editors (Azbee) Award for his work at Computerworld.