Data Engineering

Location: Bangkok, Thailand

Rank: All levels

Job Description

From smartphones to artificial intelligence, digital channels continue to shape how customers experience a brand. KPMG Thailand aims to help clients rethink their strategies and business models so that they can embrace the latest digital thinking. This in turn helps them become more competitive and efficient.

KPMG's technology and data team helps banks, insurers, and non-bank clients on their digital transformation journey to becoming data-driven organizations. The team provides a comprehensive range of consulting services and solutions focused on the data and technology domain, and helps clients solve strategic problems aimed at improving performance and profitability.

Responsibilities

  • Advise clients on their current-state data architecture and data sources, and define an appropriate transition path to a cloud-based platform
  • Work with customer stakeholders to understand and implement database requirements, analyze performance, and troubleshoot any existing issues
  • Coordinate with customer stakeholders to gather and consolidate business and technical requirements for designing, developing, and delivering data solutions that meet business expectations
  • Design and build large-scale enterprise data solutions and applications using one or more Azure data and analytics services in combination with third-party tools
  • Design and build production data pipelines from ingestion to consumption within a big data architecture, using Java, Python, or Scala
  • Design and implement data engineering, ingestion, and curation functions on the Azure cloud using Azure-native services or custom programming
  • Develop, maintain, improve, clean, and manipulate data in the business's operational and large-scale processing system architectures

Qualifications

What you will need:

We are looking for a motivated, technically minded individual with a track record of using a modern technology stack to build data pipeline automation. A successful candidate should have a solid background in the following:

  • Degree in Computer Engineering or Computer Science, or equivalent experience
  • Minimum 3 years of experience in full life-cycle development and implementation of enterprise data applications, big data technology, data analysis, and system development
  • 5–7 years of experience performing data exploration and feature engineering
  • Intermediate hands-on experience with cloud platforms: Azure (preferable), AWS, and GCP
  • Experience in selecting and integrating Big Data tools and frameworks to meet required capabilities
  • Experience in building cloud data pipelines from ingestion to consumption
  • Experience in Python, R, Tableau, Hadoop, SQL
  • Experience with NoSQL databases such as HBase, Cassandra, or MongoDB
  • Experience with the design and implementation of ETL/ELT frameworks for complex data warehouses and marts
  • Strong internal-team and external-client stakeholder management with a collaborative, diplomatic, and flexible style; able to work effectively in a matrixed organization
  • Excellent presentation skills, including strong oral and written capabilities in English
  • Self-motivated, results-oriented individual with the ability to progress multiple priorities concurrently

#LI-CD1