Cloud Data - Senior Engineer

Apply now »

Posted On: 12 Apr 2024

Location: Noida, UP, India

Company: Iris Software

Why Join Us?
Are you inspired to grow your career at one of India’s Top 25 Best Workplaces in the IT industry? Do you want to do the best work of your life at one of the fastest-growing IT services companies? Do you aspire to thrive in an award-winning work culture that values your talent and career aspirations?

It’s happening right here at Iris Software.

 

About Iris Software
At Iris Software, our vision is to be our clients’ most trusted technology partner, and the first choice for the industry’s top professionals to realize their full potential.

With over 4,300 associates across India, the U.S.A., and Canada, we help our enterprise clients thrive with technology-enabled transformation across financial services, healthcare, transportation & logistics, and professional services.

Our work spans complex, mission-critical applications built with the latest technologies, including Application & Product Engineering, Data & Analytics, Cloud, DevOps, Data & MLOps, Quality Engineering, and Business Automation.

 

Working at Iris
Be valued, be inspired, be your best.
At Iris Software, we invest in and create a culture where colleagues feel valued, can explore their potential, and have opportunities to grow.
Our employee value proposition (EVP) is about “Being Your Best” – as a professional and as a person. It is about being challenged by work that inspires you, being empowered to excel and grow in your career, and being part of a culture where talent is valued. We’re a place where everyone can discover and be their best version.

Job Description

Primary Role Function:
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Experience with AWS cloud services: EC2, Glue, RDS, Redshift.
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with object-oriented/functional scripting languages: Python.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
- Create data tools for analytics and data science team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Write high-quality, well-documented code according to accepted standards, based on user requirements.

Knowledge:
- Thorough knowledge of design and analysis methodologies and application development processes
- Solid knowledge of databases
- Programming experience with extensive business knowledge
- University degree in Computer Science, Engineering, or equivalent industry experience
- Solid understanding of SDLC and QA requirements

Mandatory Competencies

Cloud - AWS
Data on Cloud - AWS S3
DevOps - Cloud AWS
ETL - AWS Glue
Behavioral - Communication and collaboration
