GCP Datalake - Senior Engineer

Apply now »

Posted On: 16 Feb 2026

Location: Noida, UP, India

Company: Iris Software

Why Join Iris?
Are you ready to do the best work of your career at one of India’s Top 25 Best Workplaces in the IT industry? Do you want to grow in an award-winning culture that truly values your talent and ambitions?
Join Iris Software — one of the fastest-growing IT services companies — where you own and shape your success story.
 
About Us  
At Iris Software, our vision is to be our clients’ most trusted technology partner, and the first choice for the industry’s top professionals to realize their full potential.
With over 4,300 associates across India, the U.S.A., and Canada, we help our enterprise clients thrive with technology-enabled transformation across financial services, healthcare, transportation & logistics, and professional services.
Our work spans complex, mission-critical applications built with the latest technologies: high-value Application & Product Engineering, Data & Analytics, Cloud, DevOps, Data & MLOps, Quality Engineering, and Business Automation.

Working with Us
At Iris, every role is more than a job — it’s a launchpad for growth.
Our Employee Value Proposition, “Build Your Future. Own Your Journey.” reflects our belief that people thrive when they have ownership of their career and the right opportunities to shape it.
We foster a culture where your potential is valued, your voice matters, and your work creates real impact. With cutting-edge projects, personalized career development, continuous learning and mentorship, we support you to grow and become your best — both personally and professionally.
Curious what it’s like to work at Iris? Watch this video for an inside look at the people, the passion, and the possibilities.

Job Description

1. Data Engineering:

Cloud Platform Expertise: Strong knowledge of Google Cloud Platform (GCP) services, including Cloud Composer, Data Fusion, BigQuery, Cloud Storage, and Dataflow.

Data Pipeline Development: Experience building and maintaining data pipelines with technologies such as Apache Airflow, Python, SQL, and other ETL tools.

Data Modeling & Schema Design: Ability to design and implement efficient data models and schemas for data ingestion, storage, and processing.

Data Quality & Validation: Expertise in ensuring data quality, validating data integrity, and implementing data quality checks within the pipeline.

Troubleshooting & Debugging: Proficiency in identifying and resolving issues within the pipeline, including performance bottlenecks, data inconsistencies, and error handling.

CI/CD & Automation: Experience with continuous integration and continuous delivery (CI/CD) pipelines for automating data pipeline deployments and updates.
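The data-quality and validation skills above can be sketched as a simple pipeline gate. The function, field names, and rules below are illustrative assumptions, not part of the role itself:

```python
# Minimal sketch of a data-quality gate a pipeline task might run before
# loading a batch. Field names and rules here are illustrative only.

def validate_batch(records, required_fields=("id", "amount")):
    """Return (valid_records, errors) after basic integrity checks."""
    valid, errors = [], []
    seen_ids = set()
    for i, rec in enumerate(records):
        # Completeness check: required fields must be present and non-null.
        missing = [f for f in required_fields if rec.get(f) is None]
        if missing:
            errors.append((i, f"missing fields: {missing}"))
            continue
        # Uniqueness check: reject duplicate primary keys within the batch.
        if rec["id"] in seen_ids:
            errors.append((i, "duplicate id"))
            continue
        seen_ids.add(rec["id"])
        valid.append(rec)
    return valid, errors

batch = [
    {"id": 1, "amount": 10.0},
    {"id": 1, "amount": 12.5},   # duplicate id
    {"id": 2, "amount": None},   # missing amount
]
valid, errors = validate_batch(batch)
```

In an orchestrated pipeline, a check like this would typically run as its own task so that failures block downstream loads and surface in monitoring.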

 

2. Data Integration & Connectivity:

API Integration: Expertise in integrating with various APIs (e.g., Salesforce, Workday, PeopleSoft, Siplast) and understanding API security, authentication, and data formats.

Data Transformation & Manipulation: Skill in transforming data using methods such as cleaning, aggregation, and filtering, and in applying the appropriate transformation technique for each case.

Database Technologies: Experience with relational databases.
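The transformation and relational-database skills above can be illustrated together in one small sketch; the table, rows, and query below are assumptions for demonstration, using an in-memory SQLite database rather than any system named in this posting:

```python
import sqlite3

# Illustrative SQL transformation: clean out bad rows, then aggregate.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txns (account TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO txns VALUES (?, ?)",
    [("A", 100.0), ("A", 50.0), ("B", 75.0), ("B", None)],  # NULL = bad row
)

# Filter NULL amounts (cleaning), then sum per account (aggregation).
rows = conn.execute(
    "SELECT account, SUM(amount) FROM txns "
    "WHERE amount IS NOT NULL GROUP BY account ORDER BY account"
).fetchall()
```

The same filter-then-aggregate pattern carries over to warehouse SQL such as BigQuery, where pushing the cleaning step into the query avoids moving bad rows downstream.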

 

3. Application & Platform Support:

Troubleshooting & Problem Solving: Ability to identify and resolve issues related to application performance, data integration, and platform stability.

Monitoring & Alerting: Experience in setting up monitoring systems to track pipeline performance, data quality, and potential failures.

Incident Management: Skill in handling and managing incidents, communicating updates to stakeholders, and following incident resolution procedures.

Documentation & Communication: Clear communication skills and ability to document procedures, best practices, and technical documentation for the team and stakeholders.

 

4. Domain Expertise:

Business Understanding: A deep understanding of the business processes and data requirements related to the specific data pipelines.

Data Understanding: Knowledge of the data structures, formats, and relationships within the various source systems.

Technical Collaboration: Ability to effectively collaborate with other teams (e.g., business analysts, data analysts, application developers) to understand requirements and troubleshoot issues.

Enterprise Applications: Prior experience with SAP, Salesforce, and similar systems, especially in the finance domain.

 

5. Additional Skills:

Version Control (Git): Understanding and experience with version control systems like Git for managing code and pipeline changes.

Cloud Security: Knowledge of cloud security best practices and implementing security measures for data pipelines and cloud infrastructure.

Mandatory Competencies

Cloud - GCP - Cloud Build, Google Cloud Deploy, Dataflow, Cloud Run
Cloud - GCP - Cloud Data Fusion, Dataproc, BigQuery, Cloud Composer, Cloud Bigtable
Cloud - Azure - Azure Data Factory (ADF), Azure Databricks, Azure Data Lake Storage, Event Hubs, HDInsight
Development Tools and Management - CI/CD
Programming Language - Python - Apache Airflow
Database - Database Programming - SQL
DevOps/Configuration Mgmt - Git
Enterprise Applications - ERP - PeopleSoft
Behavioral - Communication and collaboration

Perks and Benefits for Irisians
Iris provides world-class benefits for a personalized employee experience. These benefits are designed to support the financial, health, and well-being needs of Irisians for holistic professional and personal growth.
