about the company
You will be joining a global leader in the financial services industry that provides innovative solutions to support long-term financial well-being.
...
about the role
You will be responsible for designing and implementing robust data solutions leveraging Medallion architecture and modern cloud platforms. This includes developing and maintaining extraction pipelines, managing data flows, and scaling the data lake to integrate multiple sources efficiently. You will play a critical role in building scalable, high-performance data systems that support analytics and business needs.
about the job
- design, develop, and maintain scalable data pipelines and architectures for large-scale data processing
- implement ETL/ELT processes to transform raw data into structured formats for analytics and reporting
- optimize and maintain data lakes and data warehouses for performance, reliability, and scalability
- ensure data quality, consistency, and security across all pipelines and systems
- manage and integrate data from multiple sources, including APIs, databases, and third-party tools
- monitor and troubleshoot data workflows to identify and resolve performance bottlenecks
knowledge, skills and experience
- strong proficiency in Python and SQL
- hands-on experience with ETL/ELT processes and tools such as Apache Spark for large-scale data processing
- experience with Azure (preferred) or other cloud platforms
- familiarity with DevOps and pipeline deployment (good to have)
- basic understanding of tools like Power BI for presenting data insights (good to have)
how to apply
Interested candidates may contact Hua Hui at +6017 960 0313 for a confidential discussion.