- Good command of English (both written and verbal) is essential.
- 3+ years of experience in data engineering.
- 2+ years of hands-on experience with Snowflake or Databricks Lakehouse as the primary platform.
- Proven track record in end-to-end migration projects (e.g., Oracle/SQL Server → Snowflake, Hadoop → Databricks).
- Data Warehousing & Modeling: Kimball/Star Schema, Data Vault, Lakehouse patterns.
- SQL: Advanced query design, tuning, and optimization.
- ETL/ELT: dbt (mandatory), plus Fivetran, Matillion, or Airbyte.
- Workflow Orchestration: Airflow, Databricks Workflows, Step Functions.
- Programming: Strong in Python; exposure to Scala/Java for Spark pipelines.
- Cloud Platforms: AWS (Redshift, Glue, S3, Lambda, IAM), GCP (BigQuery, Dataflow), Azure (Synapse).
- Governance & Security: RBAC, encryption, PII handling; familiar with Snowflake access controls / Databricks Unity Catalog.
- Performance Tuning: Query optimization, storage tuning, resource scaling.
- Ecosystem: Kafka, Kinesis, or Pub/Sub (streaming); Looker, Tableau, or Power BI (preferred).
Middle/Senior Data Engineer (Fluent in English – HCM/HN)
Deadline to apply: October 31, 2025