Senior Software Engineer, SDLC Analytics
Apple
Software Engineering, Data Science
Cupertino, CA, USA
USD 181,100-318,400 / year + Equity
Posted on Apr 12, 2026
Do you love building elegant systems that make engineering teams measurably better? Do you like transforming complex data into actionable insights? Are you passionate about applying software engineering rigor to the measurement and optimization of the development lifecycle itself? As part of our Software Development Life Cycle (SDLC) team, you'll architect and build a comprehensive analytics platform that provides deep telemetry across the entire software development lifecycle. Your work will empower engineering teams across Apple to quantify their performance, identify bottlenecks, and continuously improve their development workflows.
We are seeking an experienced Software Architect or Senior Software Engineer with a strong background in building data-intensive systems to join our team. The ideal candidate brings production-grade software engineering discipline to data work: architecting the data models for our scalable SDLC event platform, which serves as the backbone for our data pipelines, and engineering scalable data infrastructure and analytics capabilities across one of the world's largest software organizations. Modern software development at Apple spans multiple specialized platforms: source control, build systems, deployment orchestration, artifact management, and observability tooling. Each generates rich telemetry, but analyzing these signals in isolation yields limited insight. You'll solve this by engineering a unified analytics platform that correlates events across the entire development pipeline, turning fragmented data into a coherent picture of engineering effectiveness.
- Data Model & Schema Engineering: Engineer unified schemas representing SDLC entities and events across all platforms. Define and standardize event contracts using Apple's CDEvents specification, model relationships between applications, services, deployments, and incidents, and architect a data model that cleanly supports both real-time streaming and batch analytics workloads.
- SDLC Platform Integrations: Design, build, and own data pipelines that reliably ingest events from source control, build systems, deployment services, test platforms, artifact registries, and monitoring systems, with fault tolerance and schema evolution designed in from the start. Leverage the CDEvents Platform architecture for event ingestion, implement robust validation and enrichment pipelines, and handle both real-time streaming (Kafka) and API-based data access patterns with production-quality reliability and observability.
- Analytics Infrastructure & Database Architecture: Architect and implement scalable database solutions optimized for SDLC analytics — including time-series storage for deployment events and metrics, dimensional modeling for applications and teams, event streaming for real-time dashboards, and a data warehouse layer for historical analysis and trend identification. Apply software engineering best practices: versioning, testing, CI/CD, and operational runbooks.
- Analysis, Tooling & Reporting: Build first-class analytics capabilities to measure engineering effectiveness through DORA metrics (Deployment Frequency, Lead Time to Change, Change Failure Rate, Mean Time to Recover). Partner with stakeholders to deliver intuitive visualizations, implement anomaly detection, and ship custom analytics tooling that drives continuous improvement across engineering organizations.
- Minimum 7 years of relevant industry experience
- Strong software engineering foundation with significant experience building and operating production data systems
- Proven track record of designing, shipping, and maintaining internal or external data-intensive products end-to-end
- Deep expertise in data modeling, including time-series and dimensional modeling approaches
- Hands-on experience with event streaming platforms (e.g., Kafka) and pipeline orchestration frameworks (e.g., Airflow, Spark)
- Proficiency in Java and Python; strong SQL skills for data modeling and pipeline development; comfort working across the full data stack from ingestion to serving
- Solid understanding of database technologies for both operational and analytical workloads
- BS in Computer Science or a related technical field
- Experience building analytics or telemetry platforms for software development workflows
- Hands-on familiarity with SDLC tooling — Git, build systems, CI/CD pipelines, deployment orchestration
- Strong software design skills: ability to write clean, testable, maintainable code in a collaborative engineering environment
- Some experience applying AI-powered solutions to the analysis of CI/CD data
- Strong problem-solving and analytical skills
- Ability to work well in a team and communicate effectively with both technical and non-technical stakeholders
- Self-motivated and well-organized, with a demonstrated ability to take ownership and drive ambiguous projects to completion
Apple is an equal opportunity employer that is committed to inclusion and diversity. We seek to promote equal opportunity for all applicants without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, Veteran status, or other legally protected characteristics. Learn more about your EEO rights as an applicant.