Must-Have:
- 10+ years of software development and delivery experience.
- 5-7 years of experience building large-scale data applications.
- Strong problem-solving and analytical skills.
- Experience leading a team of 6-7 members or managing 2-3 smaller teams.
- Proficiency in coding, debugging, performance tuning, and deploying applications to production.
Should have good working experience with:
- The Hadoop ecosystem (HDFS, Hive, YARN, Avro/Parquet).
- Spark for batch processing.
- Building ETL pipelines.
- Proficiency in Python or Java.
- Cloud-based data lakes and data warehousing.
- Exposure to Snowflake, Redshift, or other distributed data warehousing systems.
- Agile methodology.
- Ability to quickly learn and adapt to new technologies.
- Ownership of project planning, design, implementation, UAT, and delivery.
- Excellent communication and coordination skills.
- Knowledge (preferably hands-on) of UNIX environments, CI/CD tools, and scheduling tools such as Autosys and Airflow.
- Ability to integrate quickly into the team and work independently towards team goals.
Nice to Have:
- Experience with AWS, Azure, or GCP (preferably AWS).
- Familiarity with Snowflake.
Locations:
Bengaluru, Karnataka, India
Hyderabad, Telangana, India
Pune, Maharashtra, India