About Us
Live Payments is one of Australia's leading payment service providers, servicing thousands of businesses across Australia. We continue to innovate our payment solutions and expand our product offering alongside our industry-leading strategic partners, including Qantas Loyalty and Ingenico.
Role and Responsibilities
Live Payments is seeking a talented, experienced Data Engineer to join our growing, multi-national team. The Data Engineer will be responsible for designing, building, and maintaining the systems and infrastructure that enable the collection, processing, storage, and analysis of large volumes of data.
Core duties include:
Data Pipeline Development & Operations
Develop event-driven data processing solutions with proper handling of idempotency, retries, and deduplication
Design, build, and maintain cloud-native data pipelines
Implement ETL/ELT processes for moving and transforming data at scale
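To illustrate the kind of event-driven work this involves, here is a minimal sketch of idempotent event processing with deduplication and retries. The event shape, handler, and the in-memory processed-ID set are illustrative assumptions, not part of any specific Live Payments system; a production pipeline would persist processed IDs durably.

```python
def process_events(events, handler, processed_ids, max_retries=3):
    """Process events idempotently: skip duplicates, retry transient failures.

    `events` are dicts with an "id" key (an assumption for illustration);
    `processed_ids` is a set acting as a stand-in for a durable dedup store.
    """
    results = []
    for event in events:
        event_id = event["id"]
        if event_id in processed_ids:
            # Deduplication: a replayed or duplicate event is skipped,
            # so reprocessing the same stream is safe (idempotency).
            continue
        for attempt in range(max_retries):
            try:
                results.append(handler(event))
                processed_ids.add(event_id)  # mark done only after success
                break
            except Exception:
                if attempt == max_retries - 1:
                    raise  # exhausted retries: surface the failure
    return results
```

Replaying the same batch, or a batch containing duplicates, produces each result exactly once.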
Production Support & Reliability
Implement data quality checks and reconciliation processes
Set up alerting systems and troubleshoot pipeline failures
Monitor production data pipelines and respond to incidents
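A data quality and reconciliation check of the kind listed above can be as simple as comparing row counts and totals between a source and a target. This sketch assumes dict rows with an "amount" field purely for illustration.

```python
def reconcile(source_rows, target_rows, amount_key="amount", tolerance=0.0):
    """Compare row counts and amount totals between source and target.

    Returns a dict of discrepancies; an empty dict means the two
    datasets reconcile. Field names are illustrative assumptions.
    """
    issues = {}
    if len(source_rows) != len(target_rows):
        issues["row_count"] = (len(source_rows), len(target_rows))
    src_total = sum(r[amount_key] for r in source_rows)
    tgt_total = sum(r[amount_key] for r in target_rows)
    if abs(src_total - tgt_total) > tolerance:
        issues["amount_total"] = (src_total, tgt_total)
    return issues
```

A non-empty result is what would feed an alerting system or trigger an incident investigation.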
Infrastructure & Cloud Services Management
Implement monitoring and logging using CloudWatch, Azure Monitor, or Datadog
Utilise cloud object storage solutions (S3, Azure Blob Storage/ADLS) for data lake operations
Develop and maintain infrastructure as code using Terraform or CloudFormation
Work with managed database services (AWS RDS PostgreSQL/SQL Server, Azure SQL)
Development & Deployment
Collaborate with cross-functional teams on data requirements and solutions
Implement CI/CD pipelines for data infrastructure
Write advanced SQL queries and Python code for data transformations
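Combining SQL and Python for transformations, as the last duty describes, can be sketched with the standard library's in-memory SQLite. The table and column names here are illustrative assumptions.

```python
import sqlite3

def transform_daily_totals(rows):
    """Aggregate raw (day, amount) transaction rows into daily totals
    using an in-memory SQL step. Schema is illustrative only."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE txn (day TEXT, amount REAL)")
    con.executemany("INSERT INTO txn VALUES (?, ?)", rows)
    cur = con.execute(
        "SELECT day, SUM(amount) FROM txn GROUP BY day ORDER BY day"
    )
    result = cur.fetchall()
    con.close()
    return result
```

In practice the same GROUP BY pattern runs against a managed warehouse or database rather than SQLite; the sketch only shows the SQL-from-Python shape.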
Key skills
Understanding of data dependencies and reconciliation
Familiarity with CI/CD and Infrastructure as Code (Terraform, CloudFormation)
Strong understanding of ETL/ELT patterns, orchestration, and data movement at scale
Advanced SQL skills and proficient Python skills
Experience designing and operating cloud-native data pipelines
Strong experience designing batch ETL/ELT (incremental loads, watermarking, backfills)
Experience using cloud object storage: Amazon S3, Azure Blob Storage / ADLS
Solid understanding of event-driven processing (idempotency, retries, replay, event ordering, deduplication)
Monitoring, alerting, and incident troubleshooting
Data quality and reconciliation checks
Experience with cloud monitoring and logging: AWS CloudWatch, Azure Monitor, Datadog
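The incremental-load and watermarking skill above can be sketched as follows; the `updated_at` watermark column is an assumption for illustration.

```python
def incremental_extract(rows, watermark):
    """Select rows newer than the last watermark and advance it.

    Only rows with `updated_at` past the stored watermark are extracted,
    so repeated runs pick up each change once. Column name is illustrative.
    """
    new_rows = [r for r in rows if r["updated_at"] > watermark]
    # Advance the watermark to the newest extracted row; if nothing is
    # new, keep the old watermark so the next run is unaffected.
    new_watermark = max((r["updated_at"] for r in new_rows), default=watermark)
    return new_rows, new_watermark
```

Persisting the returned watermark between runs is what makes backfills and re-runs safe.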
Nice-to-have skills
Experience designing and operating data synchronisation pipelines between cloud platforms
Experience with at least one managed cloud ETL service (e.g. AWS Glue, Azure Data Factory, or equivalent), and the ability to design pipelines beyond tool-specific limitations
Experience with modern data platforms such as Snowflake or Databricks
Benefits
Clear and attractive career path opportunities
13th-month salary & performance bonus
Generous leave: 14 annual leave days, 5 sick leave days, 3 conference leave days, and 1 birthday leave day per year
Full salary during probation period
Personal learning & career development allowance (15.6 million VND/year)
Comprehensive health scheme (6.2 million VND/year)
Free parking, coffee, snacks, and more
Biannual performance reviews to support your career growth
International, dynamic & friendly working environment
Regular team dinners & team building activities
Annual health check-ups & vaccinations
Company laptop provided
Social insurance contributions based on full salary, in accordance with Vietnam Labor Law
Premium health insurance (covering spouse & children)