Lead Cloud Development Engineer - Snowflake Platform Development Engineer

DTCC

Chennai, Tamil Nadu, India · Hyderabad, Telangana, India
Posted on Aug 27, 2025

Job Info

  • Job Identification: 211271
  • Job Category: Information Technology
  • Posting Date: 08/26/2025, 08:02 AM
  • Locations: RMZ Nexity, Hyderabad, Telangana, 500032, IN; 3rd Floor, Block A, Global Info City Park, Chennai, Tamil Nadu, 600096, IN
  • Job Schedule: Full time
  • Salary Range: Commensurate with experience
  • Featured Opportunities: No
  • FLSA Status: Exempt

Job Description

Are you ready to make an impact at DTCC?

Do you want to work on innovative projects, collaborate with a dynamic and supportive team, and receive investment in your professional development? At DTCC, we are at the forefront of innovation in the financial markets. We're committed to helping our employees grow and succeed, and we believe you have the skills and drive to make a real impact. We foster a thriving internal community and strive to create a workplace that reflects the world we serve.

Pay and Benefits:

  • Competitive compensation, including base pay and annual incentive
  • Comprehensive health and life insurance and well-being benefits, based on location
  • Pension / Retirement benefits
  • Paid Time Off, Personal/Family Care leave, and other leaves of absence when needed to support your physical, financial, and emotional well-being.
  • DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (onsite Tuesdays, Wednesdays and a third day unique to each team or employee).

The Impact you will have in this role:

As a vital contributor to DTCC’s Snowflake Data Platform Engineering team, you will advance our enterprise data capabilities by packaging and integrating the latest Snowflake features into a secure, scalable, and resilient common platform. Through automation, disaster recovery validation, and close collaboration with application development teams, you will help ensure DTCC’s cloud-based data infrastructure remains robust, secure, and capable of supporting the flawless clearing of trillions of transactions between market participants each week. Your work will directly improve operational efficiency, accelerate innovation, and reinforce the reliability of the data services that position DTCC as the essential player in the world’s financial markets.

We are seeking a skilled and motivated Snowflake Data Platform Engineer to join our data platform engineering team. The successful candidate will be responsible for packaging new Snowflake features for consumption by multiple application development teams within the organization. The role involves integrating features into the Snowflake security architecture, validating non-functional requirements such as disaster recovery and backup for Snowflake objects, and developing Snowflake administration utilities in Python or Bash. It also includes supporting our operations team with complex issues and collaborating with Snowflake to diagnose problems and optimize the performance and availability of DTCC’s Snowflake environment. You will work closely with lead customers on our application development teams to support the introduction of new platform capabilities into the operational environment.
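
For illustration only, here is a minimal sketch of the kind of Python administration utility described above, assuming the snowflake-connector-python package and hypothetical environment-variable names for credentials; it is not DTCC tooling, just an example of a routine warehouse and replication status check.

    """Illustrative Snowflake administration utility (hypothetical example)."""
    import os

    import snowflake.connector  # pip install snowflake-connector-python


    def main() -> None:
        # Connection details come from the environment; variable names are illustrative.
        conn = snowflake.connector.connect(
            account=os.environ["SNOWFLAKE_ACCOUNT"],
            user=os.environ["SNOWFLAKE_USER"],
            password=os.environ["SNOWFLAKE_PASSWORD"],
            role=os.environ.get("SNOWFLAKE_ROLE", "SYSADMIN"),
        )
        try:
            cur = conn.cursor()

            # Report each warehouse and its current state (STARTED / SUSPENDED).
            cur.execute("SHOW WAREHOUSES")
            for row in cur.fetchall():
                print(f"warehouse={row[0]} state={row[1]}")

            # List databases enabled for replication, a typical DR-readiness check.
            cur.execute("SHOW REPLICATION DATABASES")
            for row in cur.fetchall():
                print(row)
        finally:
            conn.close()


    if __name__ == "__main__":
        main()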

Your Primary Responsibilities:

  • Package and integrate new Snowflake features into a common data platform for use by multiple application development teams.
  • Integrate Snowflake features into the Snowflake security architecture.
  • Validate non-functional requirements, including disaster recovery and backup for Snowflake objects.
  • Automate Snowflake environment deployment and administration using SQL, Python or Bash utilities.
  • Collaborate with Application Development teams to ensure seamless integration and deployment of Snowflake platform capabilities.
  • Utilize CI/CD tools such as GitLab, Bitbucket, and Jenkins to manage the configuration and execution of Snowflake utilities.
  • Perform Snowflake and AWS administration tasks using Terraform or Ansible automation tools.
  • Provide escalated Snowflake support to DTCC’s Cloud Data Operations team.
  • Develop Snowflake platform feature documentation, training materials and job aids for our operations staff.
  • Support Disaster Recovery and Operational Rotation activities to validate the effectiveness of DTCC’s business continuity planning.
  • Provide Snowflake implementation and integration advice and consulting to application development teams.
  • Report defects and problems to Snowflake and work with DTCC’s Snowflake account team to validate and implement suggested fixes.

Qualifications:

  • Minimum of six years of related experience.
  • Bachelor's degree or equivalent experience preferred.
  • Proven experience with Snowflake data platform administration. A current SnowPro Core certification is strongly desired. More advanced Snowflake platform certifications are ideal.
  • Experience with SQL programming and/or relational database administration on mainstream database platforms (Snowflake, Oracle, MS SQL Server, MySQL, PostgreSQL).
  • Proficiency in Python and Bash script development on Linux.
  • Basic proficiency with Terraform and Ansible.
  • Experience with CI/CD toolchains including GitLab, Bitbucket and Jenkins.
  • Basic AWS IaaS administration skills; experience with Azure or Google Cloud Platform is also acceptable.
  • Basic Linux system administration skills.
  • Familiarity with Agile/Scrum project management methodology and associated workflow tools such as JIRA.
  • Familiarity with ITSM processes (Incident, Problem, Change, Service Request fulfillment) and supporting tools such as ServiceNow.

Preferred Qualifications:

  • Snowflake integration experience with enterprise Identity and Access Management (IAM) technologies such as Microsoft Entra ID (formerly Azure Active Directory) or HashiCorp Vault.
  • Experience integrating Snowflake monitoring features with observability tools such as Splunk, Grafana, or Dynatrace.
  • Experience integrating Snowflake with ITSM tools such as ServiceNow or CMDB tools like CloudAware.
  • Experience with Linux batch scheduling software such as AutoSys or equivalent.
  • Familiar with FinOps concepts and Snowflake cost/performance optimization strategies.
  • Experience with Business Analytics applications such as Power BI, Looker, or Amazon QuickSight.
  • Experience with the usage and administration of Snowpark Container Services.
  • Experience with the usage and administration of Snowpipe streaming data ingestion.
  • Experience with the usage and administration of Snowflake Cortex AI services.

Talents Needed for Success:

  • Excellent problem-solving skills and attention to detail.
  • Strong communication and collaboration skills.
  • Ability to identify, isolate, and assess relevant factors in the feature development process and understand their implications.
  • The capacity to make sound, timely, and effective decisions, even under pressure or in uncertain situations.
  • The highest standards of professional conduct, ownership, and behavior.
  • Initiative - The ability to take action in the absence of specific direction.
  • Curiosity.
  • A love for data processing, databases, data analytics, and deriving insights from data.

Actual salary is determined based on the role, location, individual experience, skills, and other considerations. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
