Software Development Engineer - Location Technologies, Sensing & Connectivity
Apple
Software Engineering
Cupertino, CA, USA
USD 147,400-272,100 / year + Equity
Posted on Mar 28, 2026
Our mission is to personalize the user experience on Apple devices based on where you go, when, and what those places mean to you. You're experiencing our work whenever you see a suggested location in Maps or Calendar, or browse your Memories in Photos or Journal. We're working for you whenever your phone engages Do Not Disturb While Driving or remembers where you parked. We're the Location Context team, and we build the location intelligence backbone powering Maps Visited Places, Siri location suggestions, and predictive features across the OS. We're looking for engineers who love solving hard problems at the intersection of location state estimation, on-device machine learning, and privacy-preserving systems.

Are you excited by any of these challenges?

- Building location state estimators that fuse GPS, WiFi, IMU, and altimeter data to understand not just where users are, but what floor of a building they're on
- Designing ML models to infer the semantics of a place and forecast where the device will go next, entirely on-device with strict power and memory budgets
- Developing clustering algorithms and data pipelines that process billions of location events while preserving user privacy
- Optimizing system performance at massive scale, where a 1% edge case impacts 10 million devices and a power regression of 0.1% matters
- Collaborating with Maps, Siri, Photos, HomeKit, Journal, and Safety teams to power features that require deep contextual understanding

If this sounds like you, read on.
In this role, you'll develop the next frontier of location intelligence, in partnership with teams across sensing, Siri, Maps, and system frameworks. You'll work on problems from research through production deployment:

- Design and implement location state estimation algorithms that fuse multi-modal sensor data (GPS, WiFi positioning, accelerometer, altimeter, barometer) to build a rich understanding of user context and mobility patterns
- Develop on-device machine learning models for place inference, route prediction, and behavioral forecasting that operate within strict power and memory constraints
- Build data processing pipelines that aggregate, filter, and cluster real-world sensor data on mobile devices, balancing intelligence with resource constraints
- Implement sophisticated algorithms for background location awareness and semantic understanding, then integrate them into production code running on hundreds of millions of devices
- Collect and analyze real-world datasets to train models, validate performance, and iterate on algorithm design
- Test rigorously. Dogfood your work. Collect metrics across diverse user populations and edge cases. An issue that affects 1% of a billion devices is a big issue.
- Optimize for the full system: CPU, memory, power consumption, and radio usage. Our software needs to provide a high level of intelligence while sipping battery; this is one of the most exciting engineering challenges in mobile computing.

A dedication to users' privacy and security is core to how Apple does business. We want their devices to exhibit the high level of intelligence and proactivity that can only come from deep contextual understanding. We don't want their sensitive data coming back to Apple or being exposed to third parties. Other companies solve similar problems in very different ways. Our way is more work. We believe it's worth it.
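To give a concrete (and deliberately simplified) flavor of the state estimation described above, here is a minimal sketch of a one-dimensional Kalman filter that fuses barometer-derived altitude with an occasional GPS altitude fix. The type, noise values, and measurements are invented for illustration; this is not our production estimator.

```swift
import Foundation

// Hypothetical illustration: a 1D Kalman filter tracking altitude, fusing frequent
// barometer-derived readings with an occasional (noisier) GPS altitude fix.
// All names and noise parameters here are invented for the sketch.
struct AltitudeKalmanFilter {
    private(set) var estimate: Double   // current altitude estimate (meters)
    private(set) var variance: Double   // uncertainty of the estimate (m^2)
    let processNoise: Double            // expected drift in altitude variance per step

    init(initialEstimate: Double, initialVariance: Double, processNoise: Double = 0.05) {
        self.estimate = initialEstimate
        self.variance = initialVariance
        self.processNoise = processNoise
    }

    // Predict step: altitude is assumed roughly constant, so only uncertainty grows.
    mutating func predict() {
        variance += processNoise
    }

    // Update step: fold in one measurement with its own variance.
    // A barometer reading might use a small variance, a GPS altitude a larger one.
    mutating func update(measurement: Double, measurementVariance: Double) {
        let gain = variance / (variance + measurementVariance)   // Kalman gain
        estimate += gain * (measurement - estimate)
        variance *= (1 - gain)
    }
}

// Example: start from a GPS fix, refine with barometer samples, then fold in another GPS fix.
var filter = AltitudeKalmanFilter(initialEstimate: 12.0, initialVariance: 25.0)
for baroAltitude in [14.2, 14.0, 13.8, 14.1] {
    filter.predict()
    filter.update(measurement: baroAltitude, measurementVariance: 0.5)
}
filter.predict()
filter.update(measurement: 10.0, measurementVariance: 16.0)   // occasional GPS altitude
print(String(format: "altitude = %.1f m (var %.2f)", filter.estimate, filter.variance))
```

The real problem is far richer (multi-dimensional state, motion models, WiFi positioning, outlier rejection, power budgets), but the same predict/update structure is at its core.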
- Conceptualize, explore, and define new inferential and predictive location- and motion-based capabilities for Apple's platforms
- Design and implement location state estimation algorithms, sensor fusion techniques, and ML models for on-device inference
- Develop clustering and pattern recognition algorithms to identify significant locations, routes, and behavioral patterns from noisy sensor data (a toy clustering sketch follows this list)
- Build and optimize data processing pipelines that operate within strict power and memory budgets on mobile hardware
- Collect, curate, and analyze real-world datasets of varying size and complexity to validate algorithm performance
- Integrate algorithms into production code (Objective-C, Swift, C++), working within daemon and framework architectures
- Profile and optimize system performance: measure CPU, memory footprint, power consumption, and latency; iterate to improve
- Collaborate across teams (Maps, Siri, Photos, Health, Safety) to understand requirements and deliver capabilities that enable compelling user experiences
- Write robust, maintainable code. Test thoroughly. Address edge cases. Build systems that scale to billions of devices.
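To make the clustering item above concrete, below is a deliberately simplified, hypothetical Swift sketch: a density-based pass (in the spirit of DBSCAN) that groups repeated visit coordinates into candidate significant locations. The Visit type, flat-earth distance approximation, and thresholds are all invented for illustration and are not our production pipeline.

```swift
import Foundation

// Hypothetical sketch: density-based clustering over visit coordinates, grouping
// repeated visits into candidate "significant locations".
struct Visit { let latitude: Double; let longitude: Double }

// Approximate distance in meters between two nearby visits (good enough for small clustering radii).
func approxDistanceMeters(_ a: Visit, _ b: Visit) -> Double {
    let metersPerDegreeLat = 111_320.0
    let dLat = (a.latitude - b.latitude) * metersPerDegreeLat
    let dLon = (a.longitude - b.longitude) * metersPerDegreeLat * cos(a.latitude * .pi / 180)
    return (dLat * dLat + dLon * dLon).squareRoot()
}

// Returns a cluster label per visit (-1 means noise), given an epsilon radius in meters
// and a minimum neighborhood size. This is the classic DBSCAN expansion loop, unoptimized.
func clusterVisits(_ visits: [Visit], epsilon: Double, minPoints: Int) -> [Int] {
    var labels = [Int](repeating: -2, count: visits.count)   // -2 = unvisited, -1 = noise
    var cluster = -1
    func neighbors(of i: Int) -> [Int] {
        visits.indices.filter { approxDistanceMeters(visits[i], visits[$0]) <= epsilon }
    }
    for i in visits.indices where labels[i] == -2 {
        let seed = neighbors(of: i)
        if seed.count < minPoints { labels[i] = -1; continue }
        cluster += 1
        labels[i] = cluster
        var queue = seed
        var index = 0
        while index < queue.count {
            let j = queue[index]; index += 1
            if labels[j] == -1 { labels[j] = cluster }        // border point joins the cluster
            if labels[j] != -2 { continue }                   // already assigned elsewhere
            labels[j] = cluster
            let jNeighbors = neighbors(of: j)
            if jNeighbors.count >= minPoints { queue.append(contentsOf: jNeighbors) }
        }
    }
    return labels
}

// Example: three visits near one spot, plus one outlier across town.
let visits = [
    Visit(latitude: 37.3349, longitude: -122.0090),
    Visit(latitude: 37.3350, longitude: -122.0091),
    Visit(latitude: 37.3348, longitude: -122.0089),
    Visit(latitude: 37.7749, longitude: -122.4194),
]
print(clusterVisits(visits, epsilon: 50, minPoints: 2))       // [0, 0, 0, -1]
```

On device, the interesting work is in what this sketch leaves out: streaming updates, noisy and missing fixes, memory and power budgets, and keeping the resulting clusters private to the device.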
- 5+ years of experience developing commercial software, preferably systems-level or embedded software running on resource-constrained devices
- Strong programming skills in C, C++, Objective-C, or Swift, with a solid foundation in algorithms, data structures, and computational complexity
- Working knowledge of statistics and probability, including comfort with histograms, probability distributions, Bayesian inference, and hypothesis testing
- Experience evaluating and optimizing system performance: memory footprint, CPU usage, power consumption, and I/O
- Deep expertise in location technologies: GPS/GNSS positioning, WiFi-based localization, indoor positioning, sensor fusion for state estimation, or IMU-based dead reckoning. If you've built location estimators that fuse multiple sensor modalities, we especially want to hear from you.
- Experience with machine learning for time-series data, spatial data, or behavioral prediction (a toy next-place predictor is sketched after this list). On-device ML experience (model size optimization, quantization, power-efficient inference) is a strong plus.
- Background in signal processing, Kalman filtering, particle filters, or other probabilistic state estimation techniques.
- Experience with clustering algorithms (DBSCAN, hierarchical clustering, etc.) and unsupervised learning applied to spatial or temporal data.
- Track record of shipping production systems that operate at scale under resource constraints (mobile, embedded, or edge computing environments).
- Strong collaboration skills and ability to work effectively across teams with diverse expertise. At Apple, you'll partner closely with teams in sensing, connectivity, privacy, and application frameworks. You'll need to communicate clearly, plan collaboratively, and execute flexibly.
- Experience with performance profiling tools (Instruments, dtrace, etc.) and systematic optimization of CPU, memory, and power usage.
- Experience with large-scale data analysis for offline algorithm development, model validation, and performance evaluation across diverse user populations.
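To make the behavioral-prediction item concrete, here is a hypothetical, deliberately simplified Swift sketch: a first-order Markov model that counts place-to-place transitions and predicts the most likely next place. The place labels and transition history are invented; a real on-device predictor would look quite different.

```swift
import Foundation

// Hypothetical sketch of "where next?" forecasting as a first-order Markov chain over
// place identifiers: count observed transitions, then predict the most likely successor.
struct NextPlacePredictor {
    private var transitionCounts: [String: [String: Int]] = [:]

    // Record one observed transition from `place` to `nextPlace`.
    mutating func observe(from place: String, to nextPlace: String) {
        transitionCounts[place, default: [:]][nextPlace, default: 0] += 1
    }

    // Most frequently observed successor of `place`, with its empirical probability.
    func predictNext(after place: String) -> (place: String, probability: Double)? {
        guard let counts = transitionCounts[place], !counts.isEmpty else { return nil }
        let total = counts.values.reduce(0, +)
        let best = counts.max { $0.value < $1.value }!
        return (best.key, Double(best.value) / Double(total))
    }
}

// Example: a small, made-up sequence of visited places across a few days.
let history = ["home", "coffee", "office", "gym", "home",
               "home", "coffee", "office", "home",
               "home", "office", "gym", "home"]

var predictor = NextPlacePredictor()
for (current, next) in zip(history, history.dropFirst()) {
    predictor.observe(from: current, to: next)
}
if let prediction = predictor.predictNext(after: "office") {
    print("after office: \(prediction.place) (p = \(prediction.probability))")
}
```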
Apple is an equal opportunity employer that is committed to inclusion and diversity. We seek to promote equal opportunity for all applicants without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, Veteran status, or other legally protected characteristics. Learn more about your EEO rights as an applicant.