Data Engineer - ETL Tools | Bangalore | 15+ Yrs

  • CareerZgraph
  • India, B...
  • 15 - 20 Yrs

Job Closed

Job Description

  • Lead and mentor a team of data engineers in designing, developing, and maintaining efficient and reliable ETL pipelines.
  • Develop advanced scripts using Python for data extraction, transformation, and loading (ETL) processes.
  • Architect robust ETL solutions to support diverse data integration needs and complex business logic.
  • Translate business requirements into technical designs, ensuring solutions meet functional and performance expectations.
  • Set up and oversee monitoring systems and provide production support for Data Warehouse issues, such as data load failures and transformation/translation problems.
  • Thoroughly understand data transformation and translation requirements and determine the most effective tools for the job.
  • Design and automate modern data pipelines using a mix of cloud-based and on-premises technologies.
  • Effectively communicate complex results and technical designs to both technical and non-technical audiences, including business functional owners and stakeholders.
  • Promote continuous improvement in all aspects of the team's operations.
  • Perform additional duties and responsibilities as assigned to support the team's objectives.

Job Requirements

  • Bachelor's Degree in Computer Science or a related field with relevant experience.
  • Experience in developing SQL code in MS SQL/Oracle or other databases.
  • Experience in Python scripting for data processing and ETL tasks.
  • 10+ years of experience with ETL tools such as SSIS, ODI, SAP Data Services, Informatica, or similar.
  • Extensive exposure and working experience in Azure Synapse, Azure Data Factory (ADF), or similar technologies.
  • Exposure to modern integration tools such as Boomi, Celigo, Talend, or similar.
  • Proven experience working with APIs, Lambda functions, and designing complex ETL processes that extract data from varied sources such as RDBMS, flat files, XML, and JSON.
  • Ability to analyze and optimize poorly performing queries/ETL mappings, with a strong understanding of performance tuning.
  • Extensive experience with cloud services like AWS EMR, EC2, IAM, S3, and CloudFormation, as well as databases like DB2, Oracle, and Hive.
  • Deep knowledge of data warehouse architectures and data modelling principles.
  • Demonstrated ability to work independently and as a team leader, with a strong belief in automated testing and version control.
  • Exceptional analytical, problem-solving, and communication skills, with client-facing experience.
  • Proven ability to architect ETL solutions, translate business requirements into technical design, and effectively communicate with business functional owners and stakeholders.

Location

Bangalore Urban, Karnataka, India