Agile process (Scrum cadences, roles, deliverables) and working experience with Azure DevOps, Jira, or a similar tool
Strong in / should have:
Designing and implementing a fully operational, production-grade, large-scale data solution on the Snowflake Data Warehouse.
Hands-on experience building productionized data ingestion and processing pipelines using Java (or Spark, Scala, Python, etc.).
Expertise and excellent understanding of Snowflake internals and of integrating Snowflake with other data processing and reporting technologies.
Experience writing stored procedures/functions using Snowflake Scripting or JavaScript.
Deploying Snowflake features such as data sharing, events, and lakehouse patterns.
Leveraging Snowflake utilities, SnowSQL, Snowpipe, and big-data modeling techniques using Python.
Data migration from RDBMS to the Snowflake cloud data warehouse.
Dimensional modeling (star and snowflake schemas) as well as NoSQL data stores, methods, and approaches.
Developing ETL workflows/pipelines into and out of the data warehouse using a combination of Python and Snowpipe.
Understanding data pipelines and modern approaches to automating them with cloud-based implementations.
Basic understanding of Snowflake architecture (storage and virtual warehouses).
Good understanding of RBAC (role-based access control) for user and access management.
Good understanding of structured and semi-structured data formats (XML, ORC, JSON, Parquet, Avro).
Good experience creating file formats and external/internal stages to connect to cloud storage (AWS S3, Azure Blob Storage, etc.), and using zero-copy cloning.
Good understanding of Snowflake's Time Travel and Fail-safe features.
Ability to translate BI and reporting requirements into database and report designs.
Understanding data transformation and translation requirements, and knowing which tools to leverage to get the job done.
Good to have: basic knowledge of Snowflake billing and setting up resource monitors.
Good to have: knowledge of other cloud data warehousing solutions such as AWS Redshift and Google BigQuery.
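To make the ingestion bullets above concrete, the sketch below shows the kind of DDL a Snowflake engineer issues to create a file format, point an external stage at cloud storage, and COPY staged files into a table. All object names (my_parquet_format, raw_events_stage, the S3 URL, etc.) are hypothetical placeholders, and the statements are built as plain strings so the sketch runs without a Snowflake connection.

```python
def create_file_format_sql(name: str, file_type: str) -> str:
    """DDL for a named file format (e.g. PARQUET, JSON, AVRO)."""
    return f"CREATE OR REPLACE FILE FORMAT {name} TYPE = {file_type};"


def create_external_stage_sql(name: str, url: str, file_format: str) -> str:
    """DDL for an external stage pointing at cloud storage (S3, Azure Blob)."""
    return (
        f"CREATE OR REPLACE STAGE {name} "
        f"URL = '{url}' FILE_FORMAT = {file_format};"
    )


def copy_into_sql(table: str, stage: str) -> str:
    """COPY INTO loads files from a stage into a target table."""
    return f"COPY INTO {table} FROM @{stage};"


# Hypothetical ingestion flow: format -> stage -> load.
statements = [
    create_file_format_sql("my_parquet_format", "PARQUET"),
    create_external_stage_sql(
        "raw_events_stage", "s3://my-bucket/events/", "my_parquet_format"
    ),
    copy_into_sql("raw_events", "raw_events_stage"),
]

for stmt in statements:
    print(stmt)
```

In practice the same COPY INTO statement is what a Snowpipe definition wraps for continuous, event-driven loading.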
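The RBAC bullet can likewise be illustrated with the GRANT statements an administrator would run to set up a read-only reporting role. Role, warehouse, schema, and user names here are hypothetical, and the statements are emitted as strings so the sketch runs offline.

```python
def rbac_grants(role: str, warehouse: str, schema: str, user: str) -> list:
    """Build the statements for a minimal read-only role grant chain."""
    return [
        f"CREATE ROLE IF NOT EXISTS {role};",
        f"GRANT USAGE ON WAREHOUSE {warehouse} TO ROLE {role};",
        f"GRANT USAGE ON SCHEMA {schema} TO ROLE {role};",
        f"GRANT SELECT ON ALL TABLES IN SCHEMA {schema} TO ROLE {role};",
        f"GRANT ROLE {role} TO USER {user};",
    ]


# Hypothetical setup: a reporting role granted to one analyst user.
for stmt in rbac_grants("reporting_reader", "reporting_wh",
                        "analytics.public", "alice"):
    print(stmt)
```

Granting privileges to roles rather than directly to users is the core of Snowflake's RBAC model: access is managed by moving users in and out of roles.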
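Finally, the Time Travel and zero-copy cloning bullets boil down to two short SQL forms: querying a table AT a past offset, and CLONE-ing it without copying storage. Table names and the 30-minute offset are hypothetical; the statements are built as strings so the sketch runs without a connection.

```python
def time_travel_select_sql(table: str, minutes_ago: int) -> str:
    """Query a table as of a past point using Time Travel.

    The AT (OFFSET => n) clause takes seconds relative to now, so the
    offset is negative for a point in the past.
    """
    return f"SELECT * FROM {table} AT (OFFSET => -{minutes_ago * 60});"


def zero_copy_clone_sql(new_table: str, source_table: str) -> str:
    """CLONE creates a zero-copy clone sharing the source's storage."""
    return f"CREATE TABLE {new_table} CLONE {source_table};"


# Hypothetical recovery scenario: inspect the table as it was 30 minutes
# ago, then snapshot it with a zero-copy clone.
print(time_travel_select_sql("orders", 30))
print(zero_copy_clone_sql("orders_backup", "orders"))
```

Fail-safe, by contrast, is not queryable this way; it is a Snowflake-operated recovery window that begins after the Time Travel retention period ends.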
Job Responsibilities
Snowflake data engineers/developers will be responsible for architecting and implementing very large-scale data-intelligence solutions on the Snowflake Data Warehouse.
Solid experience with, and understanding of, architecting, designing, and operationalizing large-scale data and analytics solutions on the Snowflake Cloud Data Warehouse is a must.
Location
Gurgaon, Haryana, India
About Company
HuQuo (derived from Human Quotient) is the next stage in the evolution of HR. In an industry thirsty for innovation, HuQuo is a fresh perspective on people and organizations. At HuQuo, we do not look at human beings as mere resources, but as an ocean of untapped potential. Human Quotient goes beyond IQ and EQ: it is the holistic representation of all human potential, and we manage it strategically. We believe that every organization has a human-like individuality that can help nurture and flourish those individuals who resonate with it. Our endeavor is to bring such high-resonance individuals and organizations together. We achieve this through extensive use of analytics and deep immersion to understand the core of the organizations and people we work with.