Test Automation Data - Senior Engineer

Apply now »

Posted On: 12 Feb 2026

Location: Noida, UP, India

Company: Iris Software

Why Join Iris?
Are you ready to do the best work of your career at one of India’s Top 25 Best Workplaces in the IT industry? Do you want to grow in an award-winning culture that truly values your talent and ambitions?
Join Iris Software — one of the fastest-growing IT services companies — where you own and shape your success story.
 
About Us  
At Iris Software, our vision is to be our clients’ most trusted technology partner, and the first choice for the industry’s top professionals to realize their full potential.
With over 4,300 associates across India, the U.S.A., and Canada, we help our enterprise clients thrive with technology-enabled transformation across financial services, healthcare, transportation & logistics, and professional services.
Our work spans complex, mission-critical applications built with the latest technologies, across high-value areas such as Application & Product Engineering, Data & Analytics, Cloud, DevOps, Data & MLOps, Quality Engineering, and Business Automation.

Working with Us
At Iris, every role is more than a job — it’s a launchpad for growth.
Our Employee Value Proposition, “Build Your Future. Own Your Journey.” reflects our belief that people thrive when they have ownership of their career and the right opportunities to shape it.
We foster a culture where your potential is valued, your voice matters, and your work creates real impact. With cutting-edge projects, personalized career development, continuous learning and mentorship, we support you to grow and become your best — both personally and professionally.
Curious what it’s like to work at Iris? Watch this video for an inside look at the people, the passion, and the possibilities.

Job Description

•    Design and develop automation scripts for: 
o    API testing (REST/JSON)
o    ETL pipeline validation
o    Data transformation checks in Databricks
•    Build and maintain reusable automation components using Python and PySpark.
•    Automate regression suites covering DBT model outputs, schema validations, and upstream/downstream data layers.
•    Implement automated data validation across: 
o    Databricks Delta Lake
o    SQL Server sources
o    AWS S3 raw layers
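The cross-layer validation above typically reduces to reconciling one layer against another. A minimal sketch in Python, assuming hypothetical column names — in a real suite the rows would come from Spark/Delta, SQL Server, or S3 queries rather than in-memory dicts:

```python
# Layer-to-layer reconciliation sketch: compare row counts, an amount
# checksum, and key coverage between a source and a target data layer.
from decimal import Decimal

def reconcile(source_rows, target_rows, key, amount_col):
    """Return a list of discrepancy messages (empty list = layers match)."""
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(f"row count mismatch: {len(source_rows)} vs {len(target_rows)}")
    # Decimal avoids float drift when summing monetary columns.
    src_sum = sum(Decimal(str(r[amount_col])) for r in source_rows)
    tgt_sum = sum(Decimal(str(r[amount_col])) for r in target_rows)
    if src_sum != tgt_sum:
        issues.append(f"checksum mismatch on {amount_col}: {src_sum} vs {tgt_sum}")
    missing = {r[key] for r in source_rows} - {r[key] for r in target_rows}
    if missing:
        issues.append(f"keys missing in target: {sorted(missing)}")
    return issues

# Hypothetical sample layers standing in for fetched query results:
source = [{"id": 1, "amount": "10.50"}, {"id": 2, "amount": "4.25"}]
target = [{"id": 1, "amount": "10.50"}, {"id": 2, "amount": "4.25"}]
print(reconcile(source, target, key="id", amount_col="amount"))  # → []
```

In practice each returned discrepancy would become a pytest assertion failure with the message attached as data evidence.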

 

Data Engineering Test Automation
•    Automate validation of DBT transformations (tests, snapshots, seed data checks).
•    Build SQL-based and script-based automation for: 
o    Data reconciliation
o    Aggregation validation
o    Schema evolution testing
o    Data freshness checks
•    Use Databricks APIs or automation tools to validate notebook runs and workflows.
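Of the checks listed above, data freshness is the simplest to sketch. A hedged example, assuming the latest load timestamp has already been fetched (e.g. via `SELECT MAX(loaded_at) ...`) and an SLA agreed with the team:

```python
# Data freshness check: fail when the newest record is older than the SLA.
from datetime import datetime, timedelta, timezone

def is_fresh(max_loaded_at, sla, now=None):
    """Return True when the latest load falls within the freshness SLA."""
    now = now or datetime.now(timezone.utc)
    return (now - max_loaded_at) <= sla

# Fixed "now" so the example is deterministic:
now = datetime(2026, 2, 12, 12, 0, tzinfo=timezone.utc)
print(is_fresh(datetime(2026, 2, 12, 11, 0, tzinfo=timezone.utc),
               timedelta(hours=2), now))  # True: loaded 1h ago, SLA 2h
print(is_fresh(datetime(2026, 2, 11, 9, 0, tzinfo=timezone.utc),
               timedelta(hours=2), now))  # False: loaded 27h ago
```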
API & Integration Automation
•    Develop API automation scripts for: 
o    Data ingestion
o    Data consumption
o    Metadata services
•    Use tools like Postman/Newman or Python requests for automated Web/API testing.
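An API automation check of this kind usually validates status, content type, and required fields of a JSON response. A sketch with a hypothetical ingestion endpoint and field names — the validator is plain Python so it can run against either a live `requests` response or a recorded one in CI:

```python
# Validate an ingestion API response: status code, content type, and
# presence of required JSON fields. Field names are illustrative only.
def validate_ingestion_response(status_code, headers, payload,
                                required_fields=("batch_id", "row_count", "status")):
    errors = []
    if status_code != 200:
        errors.append(f"unexpected status: {status_code}")
    if "application/json" not in headers.get("Content-Type", ""):
        errors.append("response is not JSON")
    for field in required_fields:
        if field not in payload:
            errors.append(f"missing field: {field}")
    return errors

# Canned response; in a live test these would come from
# resp = requests.get(...) -> resp.status_code, resp.headers, resp.json():
errors = validate_ingestion_response(
    200, {"Content-Type": "application/json; charset=utf-8"},
    {"batch_id": "b-42", "row_count": 1000, "status": "LOADED"})
print(errors)  # → []
```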
CI/CD Integration
•    Integrate automation suites with: 
o    AWS CodePipeline / GitHub Actions / Jenkins / GitLab CI
•    Configure pipelines to run tests automatically for every code push, DBT model change, or Databricks workflow update.
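As one illustration of such a pipeline, a GitHub Actions workflow (names, paths, and requirements file all hypothetical) might re-run the automation suite whenever DBT models or tests change:

```yaml
name: data-quality-checks          # hypothetical workflow name
on:
  push:
    paths:
      - "dbt/**"                   # re-run when DBT models change
      - "tests/**"
jobs:
  run-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install -r requirements.txt
      - run: pytest tests/ --junitxml=report.xml
```

An equivalent trigger can be expressed in Jenkins, GitLab CI, or AWS CodePipeline; the key idea is path-filtered triggers plus a machine-readable test report.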
Agile Collaboration
•    Participate actively in Agile ceremonies—stand-ups, sprint planning, grooming, retros.
•    Work closely with Data Engineers, DBT Developers, Cloud Engineers, and Product Owners.
•    Provide automation insights, effort estimates, and feasibility judgments.
Defect Management
•    Log defects with clear data evidence in Jira.
•    Collaborate with teams to identify root causes (pipeline logic, DBT model, AWS service failure, etc.).
•    Maintain traceability between requirements → test cases → automated scripts.
Quality Improvement & Standards
•    Enhance test coverage and reliability by contributing to: 
o    Automation strategy
o    Data testing best practices
o    Test data generation utilities
o    Error handling and logging improvements
•    Advocate for quality-first development in a data engineering environment.
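One of the utilities mentioned above, test data generation, can be sketched briefly. A hedged example with hypothetical column names; the fixed seed makes generated data reproducible so automated checks stay deterministic:

```python
# Deterministic synthetic test rows: same seed -> same data on every run.
import random

def make_rows(n, seed=0):
    rng = random.Random(seed)  # seeded RNG for reproducibility
    return [
        {"id": i,
         "amount": round(rng.uniform(1, 100), 2),
         "region": rng.choice(["NA", "EU", "APAC"])}
        for i in range(n)
    ]

rows = make_rows(3)
print(len(rows), rows[0]["id"])  # 3 0
```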

Technical Skills:
•    Strong experience in API and Automation Testing.
•    Hands-on experience automating data validations for: 
o    Snowflake
o    Databricks (SQL/PySpark)
o    SQL Server
•    Understanding of DBT (models, tests, documentation, lineage).
•    Strong SQL skills (joins, CTEs, window functions, reconciliations).
•    Experience with Python automation frameworks (pytest, unittest) or Java equivalents (TestNG, JUnit).
•    Exposure to AWS data services: 
o    S3, Glue, Lambda, Step Functions, Athena, EMR (optional)
•    CI/CD exposure (GitHub Actions, Jenkins, AWS CodePipeline).
•    Experience with test management tools (Zephyr, Jira).

Mandatory Competencies

QA/QE - QA Automation - ETL Testing
Beh - Communication
Big Data - PySpark
Cloud - AWS - S3, S3 Glacier, EBS
Development Tools and Management - Postman
QA/QE - QA Manual - API Testing
DevOps/Configuration Mgmt - GitLab, GitHub, Bitbucket
QA/QE - QA Automation - Python
Database - SQL Server - SQL Packages
Development Tools and Management - CI/CD

Perks and Benefits for Irisians
Iris provides world-class benefits for a personalized employee experience. These benefits are designed to support the financial, health, and well-being needs of Irisians for holistic professional and personal growth.
