Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Proficiency in Apache Spark 3.x and Scala programming.
Hands-on experience with Delta Lake implementation.
Strong knowledge of streaming solutions, especially for IoT, using Spark Streaming.
Expertise in Kafka for building data pipelines.
Proven track record of designing and optimizing Spark applications.
Strong problem-solving and debugging skills.
Excellent communication and collaboration abilities.
Job Responsibilities
Develop, implement, and maintain Spark applications for data processing and analytics.
Design and optimize Spark jobs for performance and scalability.
Implement Delta Lake solutions for efficient data storage and management.
Build streaming solutions for the Internet of Things (IoT) using Spark Streaming (see the sketch after this list).
Apply Kafka expertise to design and manage data pipelines.
Collaborate with cross-functional teams to understand and address data processing requirements.
Troubleshoot and debug issues in Spark applications and pipelines.
Stay up to date with industry best practices and emerging technologies in the big data and Spark ecosystems.
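
For illustration only, here is a minimal sketch of the kind of pipeline these responsibilities describe: a Spark Structured Streaming job in Scala that consumes IoT events from Kafka and appends them to a Delta Lake table. The broker address, topic name, and file paths are hypothetical placeholders, and the Kafka connector and Delta Lake libraries are assumed to be on the classpath.

  import org.apache.spark.sql.SparkSession

  object IotIngest {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder()
        .appName("iot-kafka-to-delta")
        // Standard Delta Lake session configuration.
        .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
        .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
        .getOrCreate()

      // Read the raw IoT event stream from Kafka (placeholder broker and topic).
      val events = spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")
        .option("subscribe", "iot-events")
        .load()
        .selectExpr(
          "CAST(key AS STRING) AS device_id",
          "CAST(value AS STRING) AS payload",
          "timestamp")

      // Append to a Delta table; the checkpoint directory lets the job
      // recover its Kafka offsets and sink state after a restart.
      events.writeStream
        .format("delta")
        .option("checkpointLocation", "/tmp/checkpoints/iot-events")
        .outputMode("append")
        .start("/tmp/delta/iot_events")
        .awaitTermination()
    }
  }

Tuning a job like this for performance and scalability typically revolves around trigger intervals, partitioning, and checkpoint layout.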
Location
Gurgaon, Haryana, India
About Company
HuQuo (derived from Human Quotient) is the next stage in the evolution of HR. In an industry thirsty for innovation, HuQuo is a fresh perspective on people and organizations. At HuQuo, we do not look at human beings as mere resources, but as an ocean of untapped potential. Human Quotient goes beyond IQ and EQ; it is the holistic representation of all human potential, and we manage it strategically. We believe that every organization has a human-like individuality that can nurture and help flourish the individuals who resonate with it. Our endeavor is to bring such high-resonance individuals and organizations together. We will achieve this through extensive use of analytics and deep immersion to understand the core of the organizations and people we work with.