When product teams think about health data, they typically picture a wearable on a user’s wrist — an Apple Watch streaming heart rate, an Oura Ring tracking sleep stages, a Fitbit counting steps. The assumption is that meaningful health data requires a dedicated device.
That assumption leaves out most of your users.
Global smartphone penetration has reached 79.5%, with over 6.6 billion people owning smartphones [1]. In the US, it’s 91% [1]. Wearable ownership is a fraction of that — even the most optimistic estimates put it well below 30% globally. If your app’s health features only work with a wearable, you’ve excluded the majority of your potential user base before they open the app.
This article explains what modern smartphones can passively capture about a user’s health, how the sensor capabilities differ between iOS and Android, and what’s technically required to turn raw sensor data into actionable health signals.
What’s inside a modern smartphone
Every smartphone shipped in the last decade contains sensors that were originally designed for screen rotation, navigation, and power management — but that produce data streams directly relevant to health monitoring.
Accelerometer
Measures acceleration forces along three axes (x, y, z). This is the primary sensor for step counting, activity detection, movement intensity estimation, and — when the phone is stationary during sleep — breathing rate estimation.
Research has shown that smartphone accelerometers can detect sleep apnea events with F1 scores of 0.89–0.96 for moderate-to-severe cases, correlating strongly with polysomnography (r = 0.90–0.96) when the phone is placed on the abdomen [2].
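To make the step-counting role concrete, the classic starting point is peak detection on the acceleration magnitude: a step shows up as a spike in √(x² + y² + z²), and a refractory gap suppresses double counting. A minimal sketch, assuming 50 Hz sampling and gravity-inclusive readings in m/s² — the function name and thresholds are illustrative, not production-tuned values:

```python
import math

def count_steps(samples, threshold=11.0, min_gap=0.3, rate_hz=50):
    """Naive peak-detection step counter over 3-axis accelerometer samples.

    samples: list of (x, y, z) tuples in m/s^2, gravity (~9.81) included.
    A step is counted when the acceleration magnitude crosses above
    `threshold` and at least `min_gap` seconds have passed since the
    previously counted step (a simple debounce against double counting).
    """
    steps = 0
    last_step_t = -min_gap
    for i, (x, y, z) in enumerate(samples):
        t = i / rate_hz
        mag = math.sqrt(x * x + y * y + z * z)
        if mag > threshold and (t - last_step_t) >= min_gap:
            steps += 1
            last_step_t = t
    return steps
```

Production pedometers add band-pass filtering, adaptive thresholds, and orientation invariance on top of this idea — which is exactly why the hardware-accelerated platform counters described below are preferable when available.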
Gyroscope
Measures rotational velocity around three axes. Combined with the accelerometer, it significantly improves activity recognition accuracy — distinguishing between standing and sitting, and classifying vigorous activities more reliably than either sensor alone [3].
Barometer
Measures atmospheric pressure, primarily used for altitude estimation. In health contexts, it detects floor changes (stairs climbed) and can improve activity classification by distinguishing between flat walking and elevation gain.
GPS and location services
Provides latitude, longitude, altitude, and speed. For health monitoring, GPS data reveals mobility patterns — time spent at home, number of locations visited, travel distance, and movement consistency. These patterns are strongly associated with mental health outcomes [4][5].
Ambient light sensor
Measures environmental light levels. In health contexts, this provides signals about sleep environment (dark room vs screen exposure before bed) and circadian alignment.
Motion coprocessor (iOS)
Apple’s M-series motion coprocessor continuously classifies user activity — walking, running, cycling, stationary, or automotive — using minimal power. The coprocessor runs independently of the main CPU and buffers its data, so activity history is recorded even while your app isn’t running [6].

Metadata signals
Beyond hardware sensors, smartphone usage patterns themselves carry health information: screen on/off times (sleep timing), interaction frequency (behavioral patterns), typing speed and patterns (motor function), and notification response latency (engagement and alertness).
What each platform provides
iOS: Core Motion and HealthKit
iOS provides two complementary frameworks for passive health data:
Core Motion gives direct access to sensor data through classes like CMMotionActivityManager and CMPedometer [6]:
- Activity recognition — walking, running, cycling, stationary, automotive — classified continuously by the motion coprocessor
- Step counting — hardware-accelerated, available as real-time updates or historical queries
- Distance estimation — derived from step count and stride length
- Floor counting — using barometer data
- Historical queries — up to 7 days of retroactive activity and step data
Apps declare their motion-data usage via the NSMotionUsageDescription Info.plist key; iOS prompts the user for permission on first access, and apps can then query activity history without having been running during that period.
HealthKit aggregates data from Core Motion and any connected wearables into a unified store. Your app can read step counts, walking/running distance, flights climbed, and exercise minutes from HealthKit — which includes contributions from both the iPhone’s sensors and any paired Apple Watch.
For background operation, iOS allows apps to register for HealthKit background delivery (hourly updates) and use background processing tasks to periodically collect and transmit data.
Android: Activity Recognition and Health Connect
Android provides similar capabilities through different APIs:
Activity Recognition API (part of Google Play Services) classifies user activity — in vehicle, on bicycle, on foot, walking, running, still, tilting, unknown — and can deliver updates even when the app is in the background.
Step counter and step detector sensors are available through the Android Sensor API. The step counter provides a cumulative count since last device reboot; the step detector fires an event with each detected step.
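Because the step counter is cumulative since boot, turning readings into per-interval steps means diffing successive values and handling the reset to zero that follows a reboot. A minimal sketch of that bookkeeping — the function name is ours, and a real implementation would persist the previous reading across app restarts:

```python
def step_delta(prev_count, new_count):
    """Steps taken between two cumulative step-counter readings.

    The Android step counter resets to zero on device reboot, so a new
    reading lower than the previous one signals a restart; in that case
    the new reading itself approximates the steps since the reset.
    """
    if new_count >= prev_count:
        return new_count - prev_count
    return new_count  # counter reset on reboot
```

Steps taken while the device was powered off are unavoidably lost; this heuristic only prevents the delta from going negative or wildly overcounting after a reboot.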
Health Connect aggregates data from various sources into a centralized store, similar to HealthKit. Apps can read and write activity, sleep, and other health data types.
Background limitations on Android are more complex. Battery optimization varies by manufacturer — Samsung, Xiaomi, and OnePlus each apply different restrictions that can throttle or kill background processes. Apps must request an exemption via the ACTION_REQUEST_IGNORE_BATTERY_OPTIMIZATIONS intent and implement robust retry logic for when collection is interrupted anyway.
Platform comparison for passive collection
| Capability | iOS | Android |
|---|---|---|
| Activity type recognition | Core Motion (M-coprocessor) | Activity Recognition API |
| Step counting | Core Motion + HealthKit | Sensor API + Health Connect |
| Floor/elevation detection | Barometer via Core Motion | Barometer via Sensor API |
| Historical activity queries | Up to 7 days retroactive | Varies by implementation |
| Background reliability | Consistent across all iPhones | Varies by manufacturer |
| Sleep inference from device usage | Screen lock/unlock + motion | Screen state + motion |
| Location-based behavioral signals | Core Location (significant changes) | Fused Location Provider |
From raw signals to health metrics
Raw sensor data — accelerometer readings, step counts, GPS coordinates — isn’t directly useful for health-aware product features. The value comes from what you derive from it.
Activity metrics
- Daily step count — total and by time-of-day distribution
- Active minutes — time spent in walking, running, or cycling states
- Sedentary time — extended periods in stationary state
- Activity intensity distribution — light vs moderate vs vigorous, derived from accelerometer magnitude
- Active hours — hours of the day with meaningful movement (similar to Apple Watch’s stand hours)
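The intensity-distribution metric above can be sketched directly: the standard deviation of the acceleration magnitude over a short window is a crude but serviceable intensity proxy. A minimal illustration — the thresholds here are placeholders, not validated cut-points, and the function name is ours:

```python
import math
import statistics

def intensity_label(window, light=0.5, vigorous=3.0):
    """Classify a window of (x, y, z) accelerometer samples by intensity.

    Uses the population standard deviation of the acceleration magnitude
    (m/s^2) as the intensity proxy: a stationary phone produces a nearly
    constant magnitude (~gravity), while vigorous movement produces large
    swings around it.
    """
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in window]
    sd = statistics.pstdev(mags)
    if sd < light:
        return "sedentary"
    if sd < vigorous:
        return "light-to-moderate"
    return "vigorous"
```

Real pipelines calibrate such thresholds per device and validate against reference measures; the point is only that the raw-to-metric step is ordinary signal statistics, not exotic machinery.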
Sleep-related signals
Without a wearable, smartphones can’t directly measure sleep stages or physiological sleep metrics. But they can infer sleep-adjacent signals:
- Sleep window — time between last phone interaction at night and first interaction in the morning
- Sleep regularity — consistency of sleep/wake timing across days
- Nocturnal movement — accelerometer data when the phone is on or near the bed
- Pre-sleep screen exposure — screen time in the hour before detected sleep onset
- Sleep environment — ambient light levels during the detected sleep period
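The sleep-window signal in particular reduces to finding the longest overnight gap in phone interactions. A minimal sketch, assuming a sorted list of screen-unlock timestamps for one roughly 24-hour period — the function name and the 4-hour minimum are our illustrative choices:

```python
from datetime import datetime, timedelta

def infer_sleep_window(unlock_times, min_gap_hours=4):
    """Infer the nightly sleep window as the longest gap between
    consecutive screen unlocks, provided it exceeds `min_gap_hours`.

    unlock_times: sorted list of datetime objects.
    Returns (sleep_start, sleep_end) or None if no qualifying gap exists.
    """
    best = None
    for a, b in zip(unlock_times, unlock_times[1:]):
        gap = b - a
        if gap >= timedelta(hours=min_gap_hours) and (
            best is None or gap > best[1] - best[0]
        ):
            best = (a, b)
    return best
```

In practice you would corroborate the gap with nocturnal stillness from the accelerometer, since a long unlock gap alone can't distinguish sleep from simply ignoring the phone.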
These aren’t replacements for wearable-derived sleep staging, but they provide meaningful signals about sleep behavior and circadian alignment — metrics that correlate with health outcomes in research.
Behavioral and mobility patterns
Digital phenotyping research has demonstrated that smartphone sensor data can surface behavioral patterns associated with health conditions [4][5][7]:
- Home time — percentage of time spent at primary location; increased home time is an early predictor of depression severity, detectable up to two weeks in advance [5]
- Location diversity — number of distinct locations visited; reduced diversity correlates with anxiety and depression [4]
- Mobility radius — daily travel distance; contraction correlates with worsening mental health
- Routine consistency — regularity of daily patterns; disruption signals stress or health changes
- Social context — time spent in locations associated with social activity vs isolation
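Two of the metrics above — home time and mobility radius — can be computed from nothing more than a day's GPS fixes and the haversine distance. A minimal sketch; the function names, the 200 m home radius, and the centroid-based radius definition are our illustrative choices (published studies vary in how they define both):

```python
import math

def haversine_km(p, q):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * 6371 * math.asin(math.sqrt(a))

def home_time_fraction(points, home, radius_km=0.2):
    """Fraction of GPS fixes that fall within `radius_km` of home."""
    near = sum(1 for p in points if haversine_km(p, home) <= radius_km)
    return near / len(points)

def mobility_radius_km(points):
    """Maximum distance from the day's centroid — a simple mobility radius."""
    lat_c = sum(p[0] for p in points) / len(points)
    lon_c = sum(p[1] for p in points) / len(points)
    return max(haversine_km(p, (lat_c, lon_c)) for p in points)
```

A production system would first cluster fixes to identify the home location rather than assume it is known, and would weight by dwell time rather than counting raw fixes.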
A systematic review analyzing 36 studies found that smartphone passive sensing using GPS, accelerometer, and usage patterns achieved meaningful prediction accuracy for stress, anxiety, and mild depression in non-clinical populations, with 78% of studies using machine learning approaches [4].
The accuracy question
A fair assessment: smartphones are less precise than wearables for most health metrics, but they’re accurate enough for many product use cases — and they reach dramatically more users.
Step counting
Research shows wearable devices are more accurate than smartphone apps for step counting, with approximately 30% better precision in sustained measurements [8]. However, the gap narrows for trend detection: if you need to know whether a user is becoming more or less active this week compared to last, smartphone step data is directionally reliable even if the absolute number is less precise.
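Trend detection of this kind is deliberately tolerant of absolute error: compare week-over-week means and treat small relative changes as noise. A minimal sketch, with an illustrative 10% tolerance and a function name of our choosing:

```python
def weekly_trend(this_week, last_week, tolerance=0.1):
    """Compare mean daily steps week-over-week.

    A relative change within `tolerance` counts as stable, which absorbs
    the per-day imprecision of smartphone step counts while still
    surfacing genuine shifts in activity level.
    """
    a = sum(this_week) / len(this_week)
    b = sum(last_week) / len(last_week)
    change = (a - b) / b
    if change > tolerance:
        return "more active"
    if change < -tolerance:
        return "less active"
    return "stable"
```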
Activity classification
Both iOS and Android motion activity recognition achieve high accuracy for basic activity types (walking, running, stationary, automotive). Combined accelerometer + gyroscope approaches reduce classification errors significantly [3]. The main failure mode is edge cases: slow walking may be classified as stationary, and some activities (e.g., weightlifting while standing) don’t map cleanly to standard categories.
Sleep inference
Smartphone-derived sleep timing (from screen/motion patterns) correlates with self-reported sleep times but is inherently less precise than wearable-derived sleep staging. The phone can tell you roughly when someone went to bed and woke up, but not how much time they spent in deep sleep or REM. For circadian regularity and sleep consistency metrics, smartphone data is robust.
Behavioral signals
Mobility and behavioral pattern metrics from GPS and usage data are unique to smartphones (wearables typically don’t track location) and have strong research backing for mental health signal detection [4][5][7].
Why smartphone-only health data matters
Reach
The math is simple. If your health features require a wearable, you serve the minority of your users who own one. If your features work with a smartphone alone, you serve everyone.
For consumer apps — especially in fitness, wellness, insurance, and lifestyle categories — this is often the difference between a niche feature used by power users and a core experience available to your entire user base.
Onboarding friction
Requiring a wearable connection during onboarding is a drop-off point. Users who don’t own a compatible device (or don’t have it paired) see a feature they can’t use. Smartphone-only data collection starts working the moment the SDK is initialized — no pairing, no device purchase, no compatibility check.
Global markets
In markets outside North America and Western Europe, wearable penetration is significantly lower while smartphone penetration is high and growing [1]. An app targeting Southeast Asian, Latin American, or African markets that requires a wearable for health features is excluding most of its addressable market.
Complement, don’t replace
The best implementation treats smartphone data as the baseline that works for everyone, with wearable data as an enhancement for users who have compatible devices. Users with an Apple Watch or Oura Ring get richer signals (heart rate, HRV, sleep stages). Users with only a smartphone still get meaningful health insights from activity, mobility, sleep timing, and behavioral patterns.
This layered approach means every user gets value from day one, and the experience improves as the user connects additional data sources.
The engineering challenge
Building a passive health data collection system that works reliably across iOS and Android is substantially harder than it appears:
Background processing is the core difficulty. Both platforms restrict background execution to preserve battery life. iOS is more predictable but limits background delivery to hourly intervals. Android varies wildly by manufacturer — a background service that works perfectly on a Pixel may be killed within minutes on a Xiaomi device with aggressive battery optimization.
Sensor sampling and battery trade-offs. Higher sampling rates produce better data but drain battery faster. The system needs to balance data quality against user-visible battery impact — a health data SDK that noticeably reduces battery life will be uninstalled.
Data processing on-device vs in the cloud. Raw accelerometer data at 50Hz generates substantial volume. Processing it into activity classifications, step counts, and behavioral features should happen on-device to reduce data transmission, protect privacy, and minimize latency. This requires efficient signal processing algorithms running within mobile resource constraints.
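The volume argument is easy to see with numbers: 50 Hz × 3 axes × a float each is millions of values per day, while per-window summary features are a few values per window. A minimal sketch of the windowing step, assuming fixed-rate samples — names and window length are illustrative:

```python
import math
import statistics

def windowed_features(samples, rate_hz=50, window_s=10):
    """Reduce raw accelerometer samples to per-window summary features.

    Each non-overlapping window of `rate_hz * window_s` samples collapses
    to two numbers (mean and stdev of the acceleration magnitude),
    shrinking the data leaving the device by roughly that factor.
    """
    size = rate_hz * window_s
    feats = []
    for i in range(0, len(samples) - size + 1, size):
        mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples[i:i + size]]
        feats.append({"mean": statistics.fmean(mags), "sd": statistics.pstdev(mags)})
    return feats
```

Real on-device pipelines extract richer features (spectral energy, peak frequency, axis correlations), but the compression-before-transmission principle is the same.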
Cross-platform normalization. iOS and Android sensors produce different data at different rates in different formats. The accelerometer on a Samsung Galaxy S24 behaves differently from an iPhone 15, which behaves differently from a Google Pixel 8. Normalizing these into consistent health metrics requires device-specific calibration and extensive testing.
Permission and consent management. Motion data, location data, and health data each require separate permissions with different user-facing descriptions. The permission flow must explain why each data type is needed without overwhelming the user or triggering privacy concerns.
Build vs integrate
For most product teams, the question isn’t whether passive smartphone health data is valuable — it’s whether to build the collection and processing pipeline in-house or integrate an existing solution.
Building in-house gives you full control but requires deep expertise in mobile sensor processing, cross-platform development, background execution edge cases, and health data normalization. It’s a multi-month engineering investment with permanent maintenance obligations (see Build vs Buy: The True Cost of Health Data Infrastructure).
Health data APIs that provide on-device SDKs handle the collection, normalization, and processing — delivering computed metrics (biomarkers, scores, archetypes) to your backend. Your team consumes the outputs and builds product features on top.
The right choice depends on whether passive health data collection is your core differentiator or a capability that enables your core product. For most teams, it’s the latter.
References
1. TechRT. (2025). Smartphone Statistics 2025: Ownership, Usage. https://techrt.com/smartphone-statistics/
2. Kühnel, M., et al. (2025). Detection of sleep apnea using smartphone-embedded inertial measurement unit. Scientific Reports. https://doi.org/10.1038/s41598-025-99801-3
3. Hartmann, Y., et al. (2022). Smartphone-Based Activity Recognition Using Multistream Movelets Combining Accelerometer and Gyroscope Data. Sensors, 22(7). https://pmc.ncbi.nlm.nih.gov/articles/PMC9002497/
4. JMIR mHealth and uHealth. (2024). Smartphone digital phenotyping for detecting stress, anxiety, and mild depression in non-clinical populations: a systematic review. https://mhealth.jmir.org/2024/1/e40689
5. Sükei, E., et al. (2023). Differential temporal utility of passively sensed smartphone features for depression and anxiety symptom prediction: a longitudinal cohort study. npj Mental Health Research. https://doi.org/10.1038/s44184-023-00041-y
6. Apple. CMMotionActivityManager — Core Motion. Apple Developer Documentation. https://developer.apple.com/documentation/CoreMotion/CMMotionActivityManager
7. Meyerhoff, J., et al. (2025). Smartphone digital phenotyping in mental health disorders: A review of raw sensors utilized, machine learning processing pipelines, and derived behavioral features. Psychiatry Research. https://doi.org/10.1016/j.psychres.2025.116473
8. Lee, S., et al. (2020). Accuracy of Mobile Applications versus Wearable Devices in Long-Term Step Measurements. Sensors, 20(21), 6293. https://doi.org/10.3390/s20216293