Cloud Data Lead


Posted On: 21 Aug 2024

Location: Noida, UP, India

Company: Iris Software

Why Join Us?
Are you inspired to grow your career at one of India’s Top 25 Best Workplaces in the IT industry? Do you want to do the best work of your life at one of the fastest-growing IT services companies? Do you aspire to thrive in an award-winning work culture that values your talent and career aspirations? It’s happening right here at Iris Software.

 

About Iris Software
At Iris Software, our vision is to be our clients’ most trusted technology partner, and the first choice for the industry’s top professionals to realize their full potential. With over 4,300 associates across India, the U.S.A., and Canada, we help our enterprise clients thrive with technology-enabled transformation across financial services, healthcare, transportation & logistics, and professional services.

Our work covers complex, mission-critical applications built with the latest technologies, spanning high-value Application & Product Engineering, Data & Analytics, Cloud, DevOps, Data & MLOps, Quality Engineering, and Business Automation.

 

Working at Iris
Be valued, be inspired, be your best.
At Iris Software, we invest in and create a culture where colleagues feel valued, can explore their potential, and have opportunities to grow.
Our employee value proposition (EVP) is about “Being Your Best” – as a professional and as a person. It is about being challenged by work that inspires you, being empowered to excel and grow in your career, and being part of a culture where talent is valued. We’re a place where everyone can discover and be their best version.

Job Description

Job Title: Cloud Data Engineer (Azure, Databricks, Data Factory)
Description: Experienced Cloud Data Engineer responsible for designing and developing Data Pipelines to populate a Cloud Data Lakehouse hosted on Azure Databricks.
Location: Offshore to the United Kingdom
Reports to: Enterprise Applications Architect / Data Lakehouse Technical Lead
Role Summary:
A Cloud Data Engineer is required to assist in the ongoing development of a centralised Data Lakehouse hosted on Azure Databricks. The role works closely with other members of the Data Lakehouse Team as well as the Architecture, Data Strategy and Transformation Teams, and plays an important part in the overall Data Lakehouse effort. The engineer should be able to progressively learn all aspects of Databricks and to explore new ways of implementing requirements.
Key Characteristics
• Highly passionate, self-motivated and driven individual with a strong understanding of the core aspects of a Databricks Data Lakehouse
• A strong desire to improve existing solutions and to introduce new and innovative ways of managing data at Christie’s
• A good communicator with an open and transparent style of working, able to influence and operate effectively with both technical and business stakeholders; strong interpersonal and facilitation skills, and the ability to set technical direction when appropriate
Key interactions:
• Within the CTG team: Architects Team, Data Lakehouse/Data Science Team, Infrastructure Team
• Within the wider Christie’s business: Information Strategy Business Stakeholders
• Business Stakeholders within their business area(s) when required, especially when assessing data needs
Key Tasks & Responsibilities (duties include but are not limited to the following)
• Development of highly performant and scalable Data Pipelines from various sources into a Data Lakehouse (an illustrative sketch follows this list)
• Applying best practice to data modelling across the Data Lakehouse zones
• Implementing CI/CD across all artifact types (Databricks Workflows, Azure Functions, Data Factory, Python Wheel)
• Processing, cleansing, and verifying the integrity of data used for analysis
• Performing ad-hoc analysis and presenting results in a clear manner
• Clarifying/explaining data to senior business stakeholders in business-friendly terms
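To illustrate the kind of pipeline work these duties describe, here is a minimal PySpark sketch of a bronze-to-silver Databricks load with basic cleansing. It is an illustration only, not Christie’s or Iris Software’s actual code; all paths and table and column names (/mnt/landing/sales/, bronze.sales_raw, silver.sales, sale_id, amount) are hypothetical placeholders.

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession named `spark` already exists; this line keeps
# the sketch self-contained if run elsewhere with Delta Lake configured.
spark = SparkSession.builder.getOrCreate()

# Land raw source files in the bronze zone as a Delta table, stamping ingestion time.
raw = spark.read.format("json").load("/mnt/landing/sales/")  # hypothetical source path
(
    raw.withColumn("_ingested_at", F.current_timestamp())
    .write.format("delta")
    .mode("append")
    .saveAsTable("bronze.sales_raw")  # hypothetical bronze table
)

# Cleanse and verify on the way into the silver zone: de-duplicate,
# enforce a non-null business key, and cast the amount to a numeric type.
cleansed = (
    spark.table("bronze.sales_raw")
    .dropDuplicates(["sale_id"])
    .filter(F.col("sale_id").isNotNull())
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
)
cleansed.write.format("delta").mode("overwrite").saveAsTable("silver.sales")
```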
Education and Professional Membership
• Educated to degree level, preferably in Computer Science or a related subject
• Azure certifications, especially in Databricks and Spark
Experience/ Skills Requirements
• Strong experience of Azure Databricks and Spark/SQL
• Strong experience in Python development
• Strong experience of using CI/CD pipelines within Azure DevOps
• Excellent understanding of the Delta Lake concept and Data Modelling Guidelines
• Excellent Code/Repository/Environment management skills
• Experience of Azure Data Factory
• Experience in Reporting and Visualisation Tools (Tableau, Power BI)
• Experience with Data Warehousing projects
• Experience with monitoring, supporting and optimizing production workloads (see the maintenance sketch after this list)
• Experience of Azure Purview/Databricks Unity Catalog is advantageous
• Excellent communication and interpersonal skills
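As a brief illustration of the Delta Lake and production-workload items above, the sketch below shows a typical Delta merge (upsert) followed by routine table maintenance. It is a hedged example only; the table and column names and the retention value (silver.sales, bronze.sales_raw, sale_id, 168 hours) are assumptions, not details taken from this posting.

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Incrementally upsert the latest cleansed batch into a curated Delta table.
target = DeltaTable.forName(spark, "silver.sales")  # hypothetical target table
updates = spark.table("bronze.sales_raw").dropDuplicates(["sale_id"])

(
    target.alias("t")
    .merge(updates.alias("u"), "t.sale_id = u.sale_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)

# Routine production maintenance: compact small files and clear stale files
# older than the retention window.
spark.sql("OPTIMIZE silver.sales")
spark.sql("VACUUM silver.sales RETAIN 168 HOURS")
```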

Mandatory Competencies

ETL - Azure Data Factory
Data Science - Databricks
Behavioural - Communication
Database - SQL
Data on Cloud - Azure Data Lake (ADL)

Perks and Benefits for Irisians

At Iris Software, we offer world-class benefits designed to support the financial, health and well-being needs of our associates to help achieve harmony between their professional and personal growth. From comprehensive health insurance and competitive salaries to flexible work arrangements and ongoing learning opportunities, we're committed to providing a supportive and rewarding work environment.
Join us and experience the difference of working at a company that values its employees' success and happiness.
