about the company
our client is a leading financial institution that provides innovative solutions, focusing on customer-centric services supported by digital advancements.
about the role
you will play a crucial role in building, optimizing, and maintaining data infrastructure that supports various business intelligence, analytics, and machine learning initiatives. This position involves designing scalable data pipelines, integrating diverse data sources, and ensuring efficient data flow between systems.
about the job
- lead the process of gathering requirements for data-related projects
- design, build, and maintain data pipelines to enable effective ETL using Oracle and MS SQL
- manage and implement batch processing jobs to support daily operations, ensuring accurate and timely data processing
- use Kafka to manage real-time data streaming and integrate it with core systems
- collaborate with development teams, business analysts, and stakeholders to meet data requirements and deliver projects on schedule
- identify risks in data projects, escalating issues or suggesting contingency plans when needed
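The ETL and batch-processing duties above can be sketched as a minimal extract-transform-load job. This is an illustrative sketch only: sqlite3 stands in for the Oracle/MS SQL databases named in the posting, and the table and column names (`raw_transactions`, `account_totals`) are hypothetical.

```python
import sqlite3

def run_batch_etl(conn):
    """Minimal batch ETL: extract raw rows, aggregate, load results."""
    cur = conn.cursor()
    # Extract: read raw transaction rows from the (hypothetical) source table.
    rows = cur.execute(
        "SELECT account_id, amount FROM raw_transactions"
    ).fetchall()
    # Transform: aggregate amounts per account.
    totals = {}
    for account_id, amount in rows:
        totals[account_id] = totals.get(account_id, 0) + amount
    # Load: write the aggregates into the reporting table.
    cur.executemany(
        "INSERT INTO account_totals (account_id, total) VALUES (?, ?)",
        sorted(totals.items()),
    )
    conn.commit()
    return totals

# Example run against an in-memory database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_transactions (account_id TEXT, amount REAL);
    CREATE TABLE account_totals (account_id TEXT, total REAL);
    INSERT INTO raw_transactions VALUES
        ('A1', 100.0), ('A1', 50.0), ('A2', 25.0);
""")
totals = run_batch_etl(conn)
print(totals)  # {'A1': 150.0, 'A2': 25.0}
```

In a production pipeline of the kind described, the same extract/transform/load split would run as a scheduled batch job against the actual Oracle and MS SQL sources, with the transform step pushed into SQL where performance tuning allows.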
knowledge, skills and experience
- bachelor's degree in computer science, information technology, data science, or a related field, or equivalent practical experience
- at least 2 years of practical experience in data engineering, particularly with Oracle and MS SQL
- proficient in SQL with experience in performance tuning for data pipelines
- experience with Kafka or similar real-time data streaming platforms
- familiarity with batch processing and designing data pipelines
- strong understanding of ETL processes and data integration methodologies
- experience in data modeling and designing schemas
...
preferred qualifications
- familiarity with big data technologies and frameworks (e.g., Hadoop, Spark) and their use in data engineering
- experience with cloud-based data services and platforms like AWS, Azure, or Google Cloud
- knowledge of CI/CD tools and practices related to data engineering, including Jenkins, GitLab CI/CD, or Airflow
- understanding of data security and compliance requirements, particularly concerning financial data
how to apply
interested candidates may contact Hua Hui at +6017 960 0313 for a confidential discussion.