Senior Data Management Professional - Data Engineering - Third Party Index
Bloomberg
Software Engineering, Data Science
New York, NY, USA
Bloomberg runs on data. Our products are fueled by powerful information. We combine data and context to paint the whole picture for our clients, around the clock - from around the world. In Data, we are responsible for delivering this data, news and analytics through innovative technology - quickly and accurately. We apply problem-solving skills to identify innovative workflow improvements, and we implement technology solutions to enhance our systems, products and processes.
Our Team:
The Third Party Index team is responsible for managing third-party market benchmark and custom index data that powers some of Bloomberg's most widely used market analysis and portfolio management workflows. Within this group, the Automation team partners with Market Benchmarks, Custom Indices, Product, and Engineering to improve how index data is onboarded, processed, validated, and delivered. The team uses deep index provider knowledge and technical expertise to build scalable workflow solutions, strengthen data quality, and generate operational insights that enhance the client experience across Terminal and Enterprise offerings.
The Role:
The Third Party Index Automation team is seeking a driven and technically strong Data Engineer to design and scale automated data workflows that support the onboarding, transformation, validation, and delivery of third-party index data. You will build robust pipeline solutions, identify opportunities to improve operational efficiency, and partner across Data, Product, and Engineering to implement systematic improvements to our end-to-end processing environment. As a Data Management Professional, you will help develop business outcome-based data strategies that optimize the value of data for our customers and improve data operations.
We’ll expect you to:
● Design, build, and maintain scalable data pipelines and workflow solutions for third-party benchmark and custom index datasets
● Automate data onboarding, validation, enrichment, and exception-management processes to improve timeliness, accuracy, and operational scale
● Analyze existing processes to identify bottlenecks, reduce manual touchpoints, and deliver workflow improvements that enhance efficiency and control
● Collaborate with subject matter experts across Data, as well as partners in Product and Engineering, to define requirements and deliver automation initiatives
● Develop monitoring, controls, and operational analyses that surface data quality, workflow performance, and processing risks
● Use programming and data engineering techniques to process large, complex datasets and support reliable downstream data delivery
● Support root cause analysis and remediation for data issues and client inquiries, and help strengthen the quality of our data products over time
● Provide mentorship within the Third Party Index team by helping bridge domain expertise and technical implementation
You’ll need to have:
● A BA/BS degree or higher in Computer Science, Engineering, Mathematics, Finance, or a related field, or equivalent professional work experience
● 4+ years of experience in data engineering, data analysis, and/or financial data operations
● Strong programming and scripting skills, with hands-on experience in Python, SQL, and data pipeline development
● Proven ability to build and optimize ETL or workflow automation solutions and to analyze, validate, and improve large datasets
● Experience working with structured and time-series data, including designing controls to support data quality and operational monitoring
● Experience troubleshooting production workflows, performing root cause analysis, and implementing durable process improvements
● Demonstrated ability to combine strong technical engineering skills with business and product understanding
● Strong analytical and problem-solving abilities, with a collaborative approach and passion for data-driven decision-making
We’d love to see:
● Knowledge of index data, benchmark methodologies, custom baskets, or related financial market datasets
● Experience with cloud or distributed data processing technologies and modern workflow orchestration practices
● Experience building dashboards, operational reporting, or data quality monitoring solutions for business-critical processes
Does this sound like you?
Apply if you think we're a good match. We'll get in touch to let you know what the next steps are.