Data Engineer (Malaysia)

Overview

Vivasoft is seeking experienced Data Engineers to design, build, and optimize large-scale data pipelines using AWS and PySpark. The role focuses on data reliability, performance, and scalability, and involves close collaboration with cross-functional teams, where the Data Engineer often serves as the primary data expert.

The ideal candidate is a self-starter who can operate independently while contributing to a broader data ecosystem.

Responsibilities:

  • Design, build, and optimize large-scale data pipelines using PySpark, SQL, and AWS data services
  • Develop and maintain ETL/ELT workflows with strong focus on data quality, lineage, and auditability
  • Work extensively with AWS Glue, S3, Redshift, and Athena
  • Implement and support data processing using Databricks, Spark SQL, and streaming platforms (e.g., Kafka)
  • Independently triage, repair, and optimize data pipelines, including safe backfills and urgent production fixes
  • Collaborate with technical and non-technical stakeholders to deliver reliable data solutions
  • Support Power BI dashboard and workflow tasks when required
  • Ensure performance, cost efficiency, and reliability across cloud data environments

Requirements (Must Have):

  • Strong hands-on experience with PySpark, SQL, and AWS data services, especially AWS Glue
  • Proven experience designing and maintaining robust ETL/ELT pipelines
  • Experience with Databricks, Spark SQL, and streaming data systems such as Kafka
  • Ability to independently troubleshoot and resolve production data issues
  • Strong problem-solving skills and ability to work autonomously
  • Excellent communication skills and ability to collaborate across teams

Should Have:

  • 4–6 years of deep experience across the AWS ecosystem (EMR, DynamoDB, Lambda)
  • Experience with Infrastructure as Code (Terraform, CloudFormation)
  • Familiarity with CI/CD and DevOps practices
  • Experience with data modeling, DDL, and data governance tools (e.g., Collibra, Unity Catalog)
  • Prior experience working with retail/e-commerce and financial services/banking data environments

Nice to Have:

  • Ability to support Power BI dashboards and workflows
  • Experience with performance tuning and cloud cost optimization
  • Experience with orchestration and transformation tools such as Airflow or dbt
  • Familiarity with monitoring and alerting for data pipelines
  • Experience working in Agile environments and global delivery models
  • AWS Solutions Architect certification or equivalent
  • Interest and ability to mentor team members and contribute to a broader data community

What we offer:

  • Opportunity to work on large-scale, cloud-native data platforms
  • High ownership role with autonomy and impact
  • Exposure to advanced AWS-based data architectures
  • Collaborative and technically driven work environment
  • Continuous learning and professional growth opportunities

Job Information:

  • Job Location: Malaysia
  • Job Type: Full-time
  • Number of Vacancies: 10
  • Salary: RM 11K–12K
  • Application Deadline: Open until filled

Send Us Your Resume

As we continue to grow, our core values stay the same and guide us through everything we do, from hiring to helping customers.

Let's build our future together.
