Middle/Senior Data Engineer (Fluent in English – HCM/HN)

Deadline to apply: October 31, 2025

JOB DESCRIPTION

Join our team to design, build, and optimize modern data platforms on Snowflake or Databricks Lakehouse. You’ll work on end-to-end migration projects, build scalable pipelines with dbt, and collaborate with data scientists, analysts, and product teams to deliver impactful data solutions.

  • Design, build, and maintain end-to-end data pipelines on Snowflake’s cloud platform.
  • Lead proof-of-concept projects and execute large-scale data migrations from legacy systems.
  • Apply best practices in data modeling, ensuring scalability and performance.
  • Optimize queries, storage, and compute resources for cost efficiency.
  • Configure and manage Snowflake security, including role-based access controls and governance.
  • Collaborate with cross-functional teams (data scientists, analysts, business stakeholders) to define and deliver data requirements.
  • Develop ETL/ELT workflows using Snowflake features and partner tools.
  • Maintain documentation for data models, flows, and processes.
  • Implement robust data quality controls and monitoring solutions.

REQUIRED SKILLS AND EXPERIENCE

  • Good command of English (written and verbal) is essential.
  • 3+ years of experience in data engineering.
  • 2+ years of hands-on experience with Snowflake or Databricks Lakehouse as the main platform.
  • Proven track record in end-to-end migration projects (e.g., Oracle/SQL Server → Snowflake, Hadoop → Databricks).
  • Data Warehousing & Modeling: Kimball/Star Schema, Data Vault, Lakehouse patterns.
  • SQL: Advanced query design, tuning, and optimization.
  • ETL/ELT: dbt (mandatory) + Fivetran, Matillion, or Airbyte.
  • Workflow Orchestration: Airflow, Databricks Workflows, Step Functions.
  • Programming: Strong in Python; exposure to Scala/Java for Spark pipelines.
  • Cloud Platforms: AWS (Redshift, Glue, S3, Lambda, IAM), GCP (BigQuery, Dataflow), Azure Synapse.
  • Governance & Security: RBAC, encryption, PII handling; familiar with Snowflake access controls / Databricks Unity Catalog.
  • Performance Tuning: Query optimization, storage tuning, resource scaling.
  • Ecosystem: Kafka/Kinesis/PubSub (streaming), Looker/Tableau/Power BI (preferred).

PREFERRED QUALIFICATIONS

  • SnowPro Core/Advanced (Snowflake).
  • Databricks Certified Data Engineer Professional.
  • AWS/GCP/Azure Architect Associate.

BENEFITS

  • Salary: up to 60M
  • Probation salary is 100% of official salary
  • 13th-month salary and performance review twice a year
  • Bonuses for special occasions each year (Labor Day, National Day, Solar New Year, Lunar New Year)
  • Project Bonus
  • IT Certificate allowance
  • Health Care Insurance
  • Social, health, and unemployment insurance in accordance with government policy
  • Company summer trips and other team-building activities held monthly and quarterly
  • Work five days per week with flexible check-in time
  • Professional, creative and dynamic working environment

CONTACT