Location: Ahmedabad, India
Job Type: Full Time
- Minimum 3 years' experience architecting and implementing large-scale data intelligence solutions around Snowflake Data Warehouse.
- Experience developing ETL pipelines into and out of the data warehouse using a combination of DBT, Airflow, and Python (a minimal illustrative sketch follows this list).
- Experience writing SQL queries against Snowflake (see the connector example after the list).
- Experience developing DAGs in Airflow and transformation queries in DBT.
- Knowledge of testing tools.
- Familiarity with modern, cloud-based approaches to automating data pipelines, and the ability to test and document implementations.
- Experience building production-scale data ingestion and processing pipelines using Java, Spark, Scala, and Python.
- Expertise in databases such as MSSQL, MySQL, Oracle, or DB2.
- Excellent understanding of Snowflake internals.
- Experience integrating Snowflake with other data processing tools and reporting technologies.
- Excellent presentation and communication skills.
- Should be certified as SnowPro Core, SnowPro Advanced: Data Engineer, or SnowPro Advanced: Data Analyst.
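
To illustrate the Airflow and DBT work described above, here is a minimal, purely illustrative sketch of the kind of DAG this role involves: a daily Airflow (2.4+) DAG that triggers dbt to run and then test the warehouse transformations. The DAG id, project path, and schedule are hypothetical placeholders, not part of any actual project.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="snowflake_elt_example",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+ parameter name
    catchup=False,
) as dag:
    # Extract/load steps would normally precede this; omitted for brevity.
    # Assumes a dbt project checked out at /opt/dbt/analytics (placeholder path).
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/analytics && dbt run",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/analytics && dbt test",
    )
    dbt_run >> dbt_test  # run the models, then validate them with dbt tests
```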
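
Likewise, a minimal sketch of querying Snowflake from Python using the snowflake-connector-python package, the kind of SQL-against-Snowflake work the role calls for. The account, credentials, warehouse, and table names are placeholders only; in practice credentials would come from a secrets manager or an Airflow connection.

```python
import snowflake.connector

# All connection parameters below are hypothetical placeholders.
conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",  # placeholder account locator
    user="ETL_USER",
    password="...",                # never hard-code real credentials
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # A typical aggregation pushed down to the warehouse for execution.
    cur.execute(
        """
        SELECT order_date, SUM(amount) AS daily_revenue
        FROM raw.orders
        GROUP BY order_date
        ORDER BY order_date
        """
    )
    for row in cur.fetchall():
        print(row)
finally:
    conn.close()
```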