With global healthcare expenditure exceeding $7T per year — accounting for nearly 10% of global GDP — Apple’s interest in the space is unsurprising.
A core part of the tech giant’s broader healthcare strategy is developing AI and machine learning solutions, from algorithms to detect early signs of Alzheimer’s to computer vision systems that developers can use to build AI healthcare apps.
Apple’s active installed base of 1B+ devices gives the company immediate access to a large audience and a wealth of health data, which can power artificial intelligence systems that help users monitor and manage their health. At its September 15 event, the tech giant also announced new Apple Watch health features, including a blood oxygen sensor and improved sleep tracking.
Its hardware infrastructure includes sensors that collect patient data on a daily basis, making the company a coveted partner for researchers, care providers, and developers interested in creating new AI and ML applications.
In this brief, we examine Apple’s healthcare AI initiatives, its data and app ecosystem, and what challenges and opportunities await the tech giant moving forward.
Table of contents
- Apple’s foray into consumer health with AI
- How Apple is enabling third-party AI applications
- What’s next
Apple’s foray into consumer health with AI
Apple’s AI strategy revolves around using sensors and algorithms integrated with the iPhone and Apple Watch to collect and analyze medical data.
According to an Apple job posting, the company plans to identify new areas where machine learning can convert its vast amount of sensor-captured health data into actionable health insights.
The Apple Watch, in particular, has risen to the forefront of Apple’s healthcare vision with its ability to monitor users’ heart rates, respiration, sleep, and movement disorders.
Heart monitoring
Apple has clinically tested two kinds of algorithms to alert users to atrial fibrillation (AFib), an irregular heart rhythm and risk factor for stroke that’s notoriously difficult to diagnose (a simplified sketch of the underlying idea follows this list):
- PPG-based algorithms, based on data collected using LEDs and light sensors in the Apple Watch that detect changes in blood volume in the user’s wrist, and
- ECG-based algorithms, based on data collected by electrodes on the back of the Apple Watch.
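Both approaches ultimately look for irregularity in the timing between heartbeats. The sketch below is a minimal illustration of that idea only, not Apple’s actual algorithm: it assumes a window of inter-beat intervals (a hypothetical input, in seconds) and flags it as irregular when a simple variability statistic exceeds a threshold.

```swift
import Foundation

/// Illustrative heuristic only: flags a window of inter-beat intervals (in seconds)
/// as irregular when their coefficient of variation exceeds a threshold.
/// Clinically validated AFib detectors are far more sophisticated than this.
func isRhythmIrregular(interBeatIntervals: [Double],
                       variabilityThreshold: Double = 0.12) -> Bool {
    guard interBeatIntervals.count > 1 else { return false }

    // Mean and variance of the interval series.
    let mean = interBeatIntervals.reduce(0, +) / Double(interBeatIntervals.count)
    let variance = interBeatIntervals
        .map { ($0 - mean) * ($0 - mean) }
        .reduce(0, +) / Double(interBeatIntervals.count)

    // Coefficient of variation: spread relative to the average interval.
    let coefficientOfVariation = sqrt(variance) / mean
    return coefficientOfVariation > variabilityThreshold
}

// Example: a steady ~60 bpm rhythm vs. an erratic one.
let steady = [1.00, 0.98, 1.02, 1.01, 0.99]
let erratic = [0.70, 1.30, 0.85, 1.15, 0.60]
print(isRhythmIrregular(interBeatIntervals: steady))   // false
print(isRhythmIrregular(interBeatIntervals: erratic))  // true
```

Real-world detectors, like those evaluated in the studies below, are clinically validated rather than relying on a single variability statistic.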
To test PPG-based algorithms, Apple partnered with Stanford Medicine on the Apple Heart Study in 2017. It is the largest AFib screening study conducted to date, enrolling over 400,000 smartwatch users in the span of 8 months.
