Data Engineer - Commerce Hub

Fiserv

Software Engineering, Data Science
Nenagh, Co. Tipperary, Ireland
Posted on Oct 9, 2025

You deserve to do what you love, and love what you do – a career that works as hard for you as you do. At Fiserv, we are more than 40,000 #FiservProud innovators delivering superior value for our clients through leading technology, targeted innovation and excellence in everything we do. You have choices – if you strive to be a part of a team driven to create with purpose, now is your chance to Find your Forward with Fiserv.

Responsibilities

Requisition ID: R-10373334
Date posted: 10/03/2025
End date: 10/31/2025
City: Nenagh
State/Region: Tipperary
Country: Ireland
Additional locations: Dublin
Location type: Onsite

Calling all innovators – find your future at Fiserv.

We’re Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we’re involved. If you want to make an impact on a global scale, come make a difference at Fiserv.

Job Title

Data Engineer - Commerce Hub

What does a successful Data Engineer do?

You will be part of, and support, agile teams in the EMEA data analytics domain by designing and building cutting-edge data migration, data integration, data replication and data streaming systems to ensure we make data available in Snowflake with excellent quality and speed. You will be responsible for architecting and implementing very large-scale data intelligence solutions around the Snowflake Data Warehouse, so solid experience in architecting, designing and operationalising large-scale data and analytics solutions on the Snowflake Cloud Data Warehouse is required.

In collaboration with a multidisciplinary delivery team, you will be responsible for the deployment, monitoring, troubleshooting and maintenance of critical data-driven solutions in production.

What you will do:

  • Develop ETL pipelines into and out of the data warehouse using a combination of Java/Scala/Python Spark jobs for data transformation and aggregation (see the sketch after this list)
  • Write SQL queries against Snowflake

  • Provide production support for Data Warehouse issues such as data load problems and transformation/translation problems

  • Develop unit tests for transformations and aggregations

  • Develop production-grade real-time or batch data integrations between systems

  • Process events from Kafka in real time using stream processing and land them in the data warehouse (also covered in the sketch after this list)

  • Design and build data pipelines of medium to high complexity

  • Translate BI and reporting requirements into database and report designs

  • Understand data transformation and translation requirements and determine which tools to leverage to get the job done

  • Design and build machine learning pipelines of medium to high complexity

  • Apply practices such as continuous integration and test-driven development to enable the rapid delivery of working code

  • Deploy production-grade data pipelines, data infrastructure and data artifacts as code

  • Develop estimates for data-driven solutions

  • Communicate technical, product and project information to stakeholders

  • Establish standards of good practice such as coding standards and data governance

  • Peer review code developed by others
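
As an illustration of the pipeline and streaming work described in this list, here is a minimal PySpark sketch of a batch aggregation job and a Kafka streaming ingest. All paths, topic names, schemas and connection details are hypothetical placeholders rather than actual Fiserv systems, and the streaming source assumes the Spark Kafka integration package is available.

```python
# Minimal PySpark sketch: a batch aggregation plus a Kafka streaming ingest.
# All paths, topics and table names are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("commerce-hub-etl-sketch").getOrCreate()

# --- Batch ETL: read raw transactions, aggregate, write to a staging area ---
transactions = spark.read.parquet("s3://example-bucket/raw/transactions/")  # hypothetical path

daily_totals = (
    transactions
    .withColumn("txn_date", F.to_date("txn_timestamp"))
    .groupBy("merchant_id", "txn_date")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("txn_count"),
    )
)

# Parquet keeps the sketch self-contained; in practice the staged output
# would be loaded into Snowflake (e.g. via the Spark-Snowflake connector
# or Snowpipe).
daily_totals.write.mode("overwrite").parquet("s3://example-bucket/staged/daily_totals/")

# --- Streaming ingest: parse JSON payment events from Kafka and land them ---
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("merchant_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")   # placeholder brokers
    .option("subscribe", "payment-events")                # placeholder topic
    .option("startingOffsets", "latest")
    .load()
    .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream
    .format("parquet")                                    # stand-in for the warehouse landing zone
    .option("path", "s3://example-bucket/streamed/payment_events/")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/payment_events/")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```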

What you will need to have:

  • BSc, BTech, or BE in Computer Science, Engineering, or a related discipline

  • A relevant professional qualification, such as AWS Certified Big Data, SnowPro Core, or another data engineering certification

  • Strong hands-on development background creating Snowpipe pipelines and building complex data transformations and manipulations using Snowpipe and SnowSQL (see the Snowflake sketch after this list)

  • Hands-on experience with Snowflake external table concepts, staging, the Snowflake task scheduler, and performance tuning

  • Good understanding of Snowflake Time Travel and zero-copy cloning, network policies, clustering, and tasks

  • 5+ years of experience working in an enterprise big data environment

  • Deep knowledge of Spark, Kafka, and data warehouses such as Snowflake, Hive, or Redshift

  • Hands-on experience in the development, deployment and operation of data technologies and platforms such as:
      • Integration using APIs, microservices and ETL patterns
      • Low-latency/streaming, batch and micro-batch processing
      • Data platforms such as Hadoop, Hive, Redshift or Snowflake
      • Cloud services such as AWS
      • Cloud query services such as Athena
      • DevOps platforms such as GitLab
      • Containerisation technologies such as Docker and Kubernetes
      • Orchestration solutions such as Airflow

  • Deep knowledge of key non-functional requirements such as availability, scalability, operability, and maintainability

  • Deep knowledge of SQL

  • Operating system knowledge, particularly Linux

  • Ability to plan, highlight, and implement possible improvements for existing and new applications
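
For illustration, the sketch below shows the kinds of Snowflake operations referenced above: a COPY INTO load from a stage (the statement Snowpipe automates for continuous ingestion), a Time Travel query, and a zero-copy clone, all issued through the Snowflake Python connector. The account, credentials, stage and table names are placeholders only, not actual Fiserv objects.

```python
# Minimal sketch of Snowflake operations using snowflake-connector-python.
# Connection parameters and object names are illustrative placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",      # placeholder account identifier
    user="example_user",
    password="example_password",
    warehouse="ANALYTICS_WH",
    database="COMMERCE_DB",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()

    # Bulk-load staged files into a table; Snowpipe automates this kind of
    # COPY INTO statement for continuous ingestion from a stage.
    cur.execute("""
        COPY INTO daily_totals
        FROM @raw_stage/daily_totals/
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)

    # Time Travel: query the table as it looked one hour ago.
    cur.execute("""
        SELECT COUNT(*)
        FROM daily_totals AT(OFFSET => -3600)
    """)
    print("row count one hour ago:", cur.fetchone()[0])

    # Zero-copy clone: create an instant copy for testing without
    # duplicating the underlying storage.
    cur.execute("CREATE OR REPLACE TABLE daily_totals_test CLONE daily_totals")
finally:
    conn.close()
```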

What would be good to have:

  • Experience migrating to Snowflake

  • Hands on experience with Oracle RDBMS

  • Exposure to StreamSets, dbt, or other ETL tools

#LI-1IB

Thank you for considering employment with Fiserv. Please:

  • Apply using your legal name
  • Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).

Our commitment to Diversity and Inclusion:

Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.

Note to agencies:

Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

Warning about fake job posts:

Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.

