
Data Engineer - ETL/Data Warehousing | Gurgaon/Gurugram | 4+ Yrs

  • HuQuo
  • India, G...
  • 4 - 9 Yrs

Job Description

  • Define data strategies to meet the demands of business requirements
  • Gather technical requirements for the organization's data needs from diverse teams
  • Design data architecture, models, and go-forward data solutions for various teams across the organization
  • The person will be part of the Data Solutions team for a major insurance client. He/she will work with different stakeholders as an SME for data engineering; a typical workday will involve working with stakeholders in an individual contributor role.
  • Engages in technical problem solving across multiple technologies; often develops new solutions and recommends technologies that can be leveraged to create data solutions
  • You will develop, construct, test, and maintain data architectures for the data platform, databases, and analytical/reporting systems.
  • You will partner with other technology platform teams to leverage innovative and new technology for delivering solutions that best fit internal data needs for various analytical solutions
  • You will develop and manage code independently and guide other team members on its development and maintenance

Job Requirements

  • 4+ years of experience in data engineering: SQL, DWH (Redshift), Python (PySpark), Spark, and associated data engineering workloads
  • Experience with AWS core technologies: Lambda, S3, EMR, Glue, Redshift, Step Functions
  • Experience with building and supporting cloud-based ETL (Extract, Transform, Load) data pipelines
  • Excellent communication & presentation skills
  • Should be able to work on a problem independently and prepare client-ready deliverables with minimal or no supervision
  • Experience working in an agile environment, across the development life cycle, and with diverse stakeholders (e.g., IT, business, project managers)
  • Ability to develop, enhance data models and identify ETL optimization opportunities.
  • Exposure to ETL tools is an advantage
  • Should have a strong grasp of advanced SQL functionality (joins, nested queries, procedures, PL/SQL)
  • Strong grasp of Python libraries and concepts: PySpark, NumPy, pandas, functions, constructors, etc.
  • Strong ability to translate functional specifications / requirements to technical requirements
  • Bachelor's/Master's degree in economics, mathematics, actuarial sciences, computer science/engineering, operations research, or related analytics areas; candidates with BA/BS degrees in the same fields from top-tier academic institutions are also welcome to apply
  • Strong and in-depth understanding of data engineering fundamentals
  • Exposure to designing ETL data pipelines and data analysis
  • Exposure to end-to-end data lifecycle management
  • Superior analytical and problem-solving skills
  • Outstanding written and verbal communication skills
  • Able to work in a fast-paced, continuously evolving environment and ready to take on challenges
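For candidates gauging fit, a minimal sketch of the kind of ETL transform the requirements above describe — aggregating records by a business dimension with pandas (all dataset, column, and value names here are hypothetical, not from the client):

```python
import pandas as pd

# Extract stand-in: hypothetical raw insurance policy records
policies = pd.DataFrame({
    "policy_id": ["P1", "P2", "P3"],
    "line_of_business": ["auto", "home", "auto"],
    "premium": [1200.0, 800.0, 950.0],
})

# Transform: aggregate premium and policy counts per line of business
summary = (
    policies.groupby("line_of_business", as_index=False)
    .agg(total_premium=("premium", "sum"),
         policy_count=("policy_id", "count"))
)

# Load stand-in: a real pipeline would write this to Redshift/S3;
# here we just materialize the result for illustration
result = dict(zip(summary["line_of_business"], summary["total_premium"]))
# result == {"auto": 2150.0, "home": 800.0}
```

In production, the same groupby/agg pattern maps directly onto PySpark's `DataFrame.groupBy(...).agg(...)`, which is where the Spark and EMR experience listed above comes in.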


Gurgaon, Haryana, India