Senior Software Engineer
Wells Fargo
About this role:
Wells Fargo is seeking a Senior Software Engineer (Data Engineering) to join the CALM (Corporate Asset and Liability Management) Data Engineering team within the Enterprise Functions Technology (EFT) organization. In this role, you will be responsible for designing, developing, optimizing, and maintaining metadata‑driven, scalable, high‑performance data engineering frameworks that power critical financial risk processes across Corporate Treasury.
You will work independently to build resilient data pipelines, APIs, wrappers, and supporting components to enable reliable data ingestion, transformation, validation, and delivery across cloud and on‑prem ecosystems. This position plays a key role in Data Center exit migrations, DPC onboarding, and enterprise-wide modernization initiatives.
The role requires deep technical expertise, hands‑on problem‑solving, and technical leadership in distributed data engineering, cloud platforms, data quality, and performance engineering.
In this role, you will:
- Lead moderately complex initiatives and deliverables within technical domain environments
- Contribute to large-scale strategic planning
- Design, code, test, debug, and document for projects and programs associated with technology domain, including upgrades and deployments
- Review moderately complex technical challenges that require an in-depth evaluation of technologies and procedures
- Resolve moderately complex issues and lead a team to meet existing client needs or potential new client needs while leveraging a solid understanding of the function, policies, procedures, or compliance requirements
- Collaborate and consult with peers, colleagues, and mid-level managers to resolve technical challenges and achieve goals
- Lead projects, act as an escalation point, and provide guidance and direction to less experienced staff
Required Qualifications:
- 4+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
Desired Qualifications:
- Additional years of Software Engineering experience, or equivalent (industry experience, training, military experience, education).
- Hands-on experience with Python, SQL, and bash scripting for automation.
- Strong experience building big data pipelines using Apache Spark, Hive, and Hadoop.
- Experience with Autosys/Airflow or similar orchestration tools.
- Working knowledge of REST APIs, Object Storage, Dremio, and CI/CD pipelines.
- Strong troubleshooting and problem‑solving capabilities.
- Solid foundation in data modeling (conceptual/logical/physical) and database design.
Platform & DevOps
- Cloud-native engineering experience — serverless, managed Spark, event-driven architectures.
- Familiarity with containerization (Docker, K8s) and workflow operators.
- Strong experience implementing test automation for data pipelines (unit, contract, integration tests).
Data Lakehouse & Storage
- Hands‑on with optimization techniques: clustering, Z‑ordering, vectorized IO (Parquet/ORC), compaction.
- Experience implementing Medallion architectures and governed ingestion zones.
Advanced Data Engineering
- Experience architecting pipelines using distributed systems patterns (shuffle optimization, spill, broadcast, storage layouts).
- Experience with streaming frameworks like Spark Structured Streaming or Apache Flink.
Data Quality & Governance
- Knowledge of data governance platforms (Collibra, Alation, Purview).
- Understanding of financial data controls, validation rules, reconciliation checks, and compliance (SOX/PCI).
- Experience implementing lineage, observability, and drift detection.
GenAI for Data Engineering
- Experience applying GenAI for metadata extraction, data anomaly detection, automated documentation, or pipeline optimization.
Domain Expertise
- Exposure to financial risk, treasury functions, or Asset & Liability Management (ALM) processes.
Job Expectations:
- Deliver high-quality engineering outcomes during Data Center exit migrations and DPC onboarding, ensuring validations, automation, and production readiness.
- Collaborate with cross-functional teams to build scalable, high‑performance data solutions using Python, SQL, Spark, Iceberg, Dremio, and Autosys.
Data Engineering & Pipeline Development
- Design, build, test, deploy, and maintain large-scale structured and unstructured data pipelines using Python, SQL, Apache Spark, and modern data lake/lakehouse technologies.
- Develop and optimize metadata-driven pipelines, wrappers, ingestion frameworks, and validation layers to support CALM data workflows.
- Build and maintain high-quality ELT/ETL pipelines following best practices in reliability, performance, observability, and reusability.
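For illustration only, the following is a minimal sketch of a metadata-driven ingestion step of the kind described above, assuming PySpark and a hypothetical JSON config; the paths, table names, and columns are placeholders, not the actual CALM framework.

```python
# Minimal metadata-driven ingestion sketch (illustrative only; config schema,
# paths, and table names are hypothetical placeholders).
import json
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("metadata_driven_ingest_demo").getOrCreate()

# Example pipeline metadata: source location, format, target table, and
# the columns a lightweight validation layer should expect.
pipeline_config = json.loads("""
{
  "source_path": "s3a://example-bucket/raw/positions/",
  "source_format": "parquet",
  "target_table": "bronze.positions",
  "required_columns": ["account_id", "as_of_date", "balance"]
}
""")

def run_ingestion(cfg: dict) -> None:
    # Read the source as described by metadata rather than hard-coded logic.
    df = spark.read.format(cfg["source_format"]).load(cfg["source_path"])

    # Fail fast if expected columns are missing before delivering downstream.
    missing = [c for c in cfg["required_columns"] if c not in df.columns]
    if missing:
        raise ValueError(f"Missing required columns: {missing}")

    # Deliver to the governed target table.
    df.writeTo(cfg["target_table"]).append()

run_ingestion(pipeline_config)
```

Driving ingestion from a config like this is what lets one framework serve many feeds without per-source code changes.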
Distributed Computing & Lakehouse Engineering
- Engineer and optimize Spark pipelines for large‑scale batch and streaming workloads (partitioning, caching, Catalyst optimization, AQE, Tungsten).
- Work with open table formats such as Iceberg, Delta, or Hudi for versioned data, time-travel, compaction, and schema evolution.
- Implement Medallion (Bronze/Silver/Gold) architecture patterns for modern lakehouse systems.
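As a rough, hypothetical illustration of the Bronze-to-Silver step in a Medallion layout (table names and cleansing rules are invented for the example):

```python
# Illustrative Bronze -> Silver refinement step in a Medallion architecture.
# Table names and cleansing rules are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("medallion_demo").getOrCreate()

# Bronze: raw, append-only data as landed from the source.
bronze = spark.read.table("bronze.positions")

# Silver: deduplicated, typed, and validated records.
silver = (
    bronze
    .dropDuplicates(["account_id", "as_of_date"])
    .withColumn("as_of_date", F.to_date("as_of_date"))
    .filter(F.col("balance").isNotNull())
)

# Write the refined layer; open table formats such as Iceberg or Delta would
# also allow MERGE, snapshot-based writes, and time-travel on this table.
silver.writeTo("silver.positions").overwritePartitions()
```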
Data Quality, Testing & Observability
- Implement automated data quality frameworks using tools such as Great Expectations / Deequ or custom validators.
- Build data health monitoring frameworks with SLAs/SLOs, anomaly detection, and lineage capture.
- Ensure strong validation layers during Data Center exits, migration programs, and DPC onboarding.
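A minimal sketch of custom validation checks of the kind such a framework might run; this is not the Great Expectations or Deequ API, and the column names and thresholds are hypothetical:

```python
# Simple custom data-quality checks of the kind a validation layer might run.
# Column names and thresholds are illustrative placeholders.
from pyspark.sql import DataFrame
from pyspark.sql import functions as F

def check_not_null(df: DataFrame, column: str) -> dict:
    """Return a pass/fail result for a not-null rule on one column."""
    nulls = df.filter(F.col(column).isNull()).count()
    return {"rule": f"not_null({column})", "failed_rows": nulls, "passed": nulls == 0}

def check_row_count(df: DataFrame, minimum: int) -> dict:
    """Guard against silently empty or truncated loads."""
    rows = df.count()
    return {"rule": f"row_count>={minimum}", "passed": rows >= minimum, "rows": rows}

def run_checks(df: DataFrame) -> list[dict]:
    # In a real framework these results would feed SLA/SLO dashboards,
    # anomaly detection, and lineage/observability tooling.
    return [
        check_not_null(df, "account_id"),
        check_not_null(df, "as_of_date"),
        check_row_count(df, minimum=1),
    ]
```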
API & Microservice Development
- Build RESTful and metadata APIs using Python frameworks (FastAPI/Flask) to enable secure, governed data access.
- Collaborate with application teams to integrate data access patterns and platform services.
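A brief, hypothetical FastAPI sketch of a governed metadata endpoint; the route, model fields, and catalog contents are invented for illustration:

```python
# Minimal metadata API sketch using FastAPI; the endpoint and fields are
# hypothetical examples, not an existing service contract.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="dataset-metadata-demo")

class DatasetMetadata(BaseModel):
    name: str
    layer: str   # e.g. bronze / silver / gold
    owner: str
    row_count: int

# In practice this would be backed by a metadata store or data catalog.
_CATALOG = {
    "positions": DatasetMetadata(
        name="positions", layer="silver", owner="data-eng", row_count=1_250_000
    ),
}

@app.get("/datasets/{dataset_name}", response_model=DatasetMetadata)
def get_dataset(dataset_name: str) -> DatasetMetadata:
    """Return governed metadata for a dataset, or 404 if unknown."""
    meta = _CATALOG.get(dataset_name)
    if meta is None:
        raise HTTPException(status_code=404, detail="dataset not found")
    return meta
```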
Cloud & Platform Engineering
- Design and deploy data pipelines in cloud platforms (AWS, Azure, GCP) leveraging managed compute, orchestration, and storage.
- Build CI/CD workflows and infrastructure automation using Jenkins, GitHub Actions, Azure DevOps, Terraform, or Helm.
- Apply secure engineering principles including IAM, secrets management, encryption standards, and network controls.
Orchestration & Scheduling
- Build resilient orchestration flows using Autosys or equivalent tools.
- Apply modular design with retries, alerts, SLAs, and workflow dependency management.
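An illustrative Airflow-style DAG showing retries, an SLA, and dependency management; the job names and schedule are hypothetical, and an equivalent flow could be defined as Autosys jobs and dependencies:

```python
# Illustrative Airflow DAG with retries, an SLA, and task dependencies.
# DAG/task names and the schedule are hypothetical examples.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

default_args = {
    "retries": 2,                          # automatic retry on transient failure
    "retry_delay": timedelta(minutes=10),
    "sla": timedelta(hours=2),             # alert if a task runs past its SLA
}

def ingest():
    print("ingest raw data")

def validate():
    print("run data-quality checks")

with DAG(
    dag_id="daily_ingest_demo",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    default_args=default_args,
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    validate_task = PythonOperator(task_id="validate", python_callable=validate)

    # Validation runs only after ingestion completes successfully.
    ingest_task >> validate_task
```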
Collaboration & Delivery
- Work with cross-functional Agile teams (Product, Architecture, QA, Treasury SMEs).
- Analyze technical requirements, evaluate design alternatives, and provide recommendations aligned with enterprise standards.
- Independently deliver complex engineering tasks and contribute to architecture/roadmap discussions.
Posting End Date:
23 Feb 2026
*Job posting may come down early due to volume of applicants.
We Value Equal Opportunity
Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic.
Employees support our focus on building strong customer relationships balanced with a strong risk mitigating and compliance-driven culture which firmly establishes those disciplines as critical to the success of our customers and company. They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit’s risk appetite and all risk and compliance program requirements.
Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process.
Applicants with Disabilities
To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo.
Drug and Alcohol Policy
Wells Fargo maintains a drug free workplace. Please see our Drug and Alcohol Policy to learn more.
Wells Fargo Recruitment and Hiring Requirements:
a. Third-Party recordings are prohibited unless authorized by Wells Fargo.
b. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.