Are you an experienced ETL Developer with expertise in Azure Data Lake, Databricks, Python, and ETL tools like SSIS? Do you have hands-on experience building data ingestion pipelines, designing data warehouses, and optimizing data transformations? Our client seeks a Business Intelligence Specialist to design, develop, and implement scalable ETL solutions using Azure Data Factory, Databricks, and SQL Server. You will be responsible for data integration, transformation, and automation in a cloud-based data architecture.
Advantages
- Hybrid work model (minimum 2 days onsite per week)
- Work on cloud-based data solutions using Azure, Databricks, and SQL Server
- Hands-on experience with Oracle GoldenGate, Change Data Capture (CDC), and ETL automation
- Opportunity to collaborate with cross-functional teams in a dynamic Agile environment
- Competitive compensation and contract stability
Responsibilities
Design, develop, and implement data ingestion pipelines from the Oracle source to Azure Data Lake and Databricks, covering both the initial load and incremental ETL. The tools used are:
- Oracle GoldenGate (knowledge and experience are an asset) for data ingestion and Change Data Capture (currently in the final stages of proof of concept).
- Azure Data Factory (good knowledge) to orchestrate pipeline execution.
- Azure Databricks/PySpark (expert Python/PySpark knowledge required) to build transformations of raw (bronze) data into the curated (silver) and datamart (gold) zones; a sketch of this pattern follows this list.
- PowerDesigner (asset) to read and maintain data models.
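To make the medallion flow concrete, here is a minimal PySpark sketch of promoting GoldenGate change records from bronze to silver with a Delta Lake MERGE. The table names and columns (customer_id, op_type, commit_ts) are illustrative assumptions, not the client's actual schema.

# Minimal bronze-to-silver CDC merge sketch (all names are hypothetical).
# Assumes GoldenGate lands change records in a bronze Delta table with an
# op_type column ('I'/'U'/'D') and a commit timestamp for ordering.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()
bronze = spark.read.table("bronze.customer_changes")

# Keep only the latest change per business key.
w = Window.partitionBy("customer_id").orderBy(F.col("commit_ts").desc())
latest = (bronze.withColumn("rn", F.row_number().over(w))
                .filter("rn = 1")
                .drop("rn"))

silver = DeltaTable.forName(spark, "silver.customer")
(silver.alias("t")
 .merge(latest.alias("s"), "t.customer_id = s.customer_id")
 .whenMatchedDelete(condition="s.op_type = 'D'")
 .whenMatchedUpdateAll(condition="s.op_type <> 'D'")
 .whenNotMatchedInsertAll(condition="s.op_type <> 'D'")
 .execute())

The same MERGE pattern extends to the silver-to-gold step, with dimensional modeling applied on top.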
As well as:
- Review requirements, source data tables, and relationships to identify optimal data models and transformations.
- Review existing on-prem design to produce design and migration steps.
- Design data ingestion mechanisms and transformations to update Delta Lake zones (bronze, silver, and gold), using GoldenGate as CDC.
- Work with the IT partner on GoldenGate configuration, providing direction and how-to guidance.
- Prepare design artifacts and process diagrams; understand and update dimensional data models and source-to-target mapping (STTM) documents.
- Analyze data, including physical model mapping from the data source to the datamart model.
- Understand data requirements and recommend changes to the data model.
- Develop scripts to build physical models and create schema structures.
- Access Oracle and SQL Server environments, using SSIS and other development tools to analyze legacy solutions slated for migration.
- Proactively communicate with leads on any changes required to conceptual, logical and physical models, and communicate and review dependencies and risks.
- Develop ETL strategies and solutions for different sets of data modules.
- Create physical-level design documents and unit test cases.
- Develop Databricks notebooks and deployment packages for incremental and full loads.
- Develop test plans and perform unit testing of pipelines and scripts.
- Assess data quality and conduct data profiling (a first-pass profiling sketch follows this list).
- Troubleshoot performance and ETL load issues, checking log activity for each individual package and transformation.
- Participate in Go Live planning and production deployment, and create production deployment steps and packages.
- Create design and release documentation.
- Provide Go Live support and review after Go Live.
- Review existing ETL processes and tools, and provide recommendations for improving performance and reducing ETL timelines.
- Review infrastructure and any performance issues for overall process improvement.
- Provide knowledge transfer to Ministry staff and develop documentation on the work completed.
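For the data-quality and profiling responsibilities above, a first pass often looks like the following PySpark snippet, which reports null and distinct counts per column. The table name is a placeholder; real profiling would add value ranges, referential checks, and volume trends.

# Quick column profile for a Delta table (table name is a placeholder).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.read.table("silver.customer")
total = df.count()

for c in df.columns:
    nulls = df.filter(F.col(c).isNull()).count()   # null count
    distinct = df.select(c).distinct().count()     # cardinality
    print(f"{c}: {nulls}/{total} null, {distinct} distinct")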
Qualifications
Must Haves:
- 3+ years with Azure Data Lake and data warehouses, including building Databricks notebooks.
- 3+ years with ETL tools such as Microsoft SSIS, including stored procedures.
- 3+ years in Python and PySpark.
Desired Skills:
- 3+ years of experience working with SQL Server, SSIS, and T-SQL development.
- Experience building data ingestion and change data capture using Oracle GoldenGate.
- Experience building databases, data warehouses, and data marts, and working with incremental and full loads.
- Experience with ETL tools such as Azure Data Factory and SQL Server Integration Services.
- Experience working with MS SQL Server and other RDBMS (Oracle, PL/SQL).
- Experience in dimensional data modeling and with modeling tools, e.g. PowerDesigner.
- Experience with snowflake and star schema models.
- Experience designing data warehouse solutions using slowly changing dimensions (a Type 2 sketch follows this list).
- Experience with Delta Lake concepts and Medallion architecture (bronze/silver/gold).
- Understanding of data warehouse architecture, dimensional data, and fact models.
- Experience analyzing, designing, developing, testing, and documenting ETL from detailed and high-level specifications, and assisting in troubleshooting.
- Ability to use SQL for tasks beyond data transformation (DDL, complex queries).
- Good knowledge of database and Delta Lake performance optimization techniques.
- Experience working in an Agile environment, using DevOps tools for user stories, code repository, test plans and defect tracking.
- Ability to assist in requirements analysis and design specifications.
- Work closely with Designers, Business Analysts and other Developers.
- Liaise with Project Managers, Quality Assurance Analysts and Business Intelligence Consultants.
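For the slowly-changing-dimension item above, a common Type 2 pattern on Delta Lake is to expire the current row and append a new version. The dimension schema below (is_current, valid_from, valid_to, and the name/address attributes) is a hypothetical example, not the client's model.

# SCD Type 2 sketch with Delta Lake (hypothetical dimension schema).
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()
updates = spark.read.table("staging.customer_updates")  # placeholder

dim = DeltaTable.forName(spark, "gold.dim_customer")

# Step 1: close out current rows whose tracked attributes changed.
(dim.alias("d")
 .merge(updates.alias("u"),
        "d.customer_id = u.customer_id AND d.is_current = true")
 .whenMatchedUpdate(
     condition="d.name <> u.name OR d.address <> u.address",
     set={"is_current": "false", "valid_to": "current_timestamp()"})
 .execute())

# Step 2: insert new current versions for changed or brand-new keys.
# Assumes the updates columns align with the dimension's attributes.
current_keys = (spark.read.table("gold.dim_customer")
                .filter("is_current = true")
                .select("customer_id"))
new_rows = (updates.join(current_keys, "customer_id", "left_anti")
            .withColumn("is_current", F.lit(True))
            .withColumn("valid_from", F.current_timestamp())
            .withColumn("valid_to", F.lit(None).cast("timestamp")))
new_rows.write.format("delta").mode("append").saveAsTable("gold.dim_customer")

Closing and inserting in two steps keeps the MERGE condition simple; a single-MERGE variant over a staged union of rows is also common.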
General Skills:
- Azure Data Factory.
- Oracle GoldenGate.
- SQL Server.
- Oracle.
- Ability to present technical solutions to business users.
Assets:
- Knowledge and experience building data ingestion, history tracking, and change data capture using Oracle GoldenGate.
Summary
Our client is looking for a Business Intelligence Specialist – ETL Developer with expertise in Azure Data Lake, Databricks, and ETL tools like SSIS. If you have experience in data pipeline automation, cloud-based data integration, and optimizing ETL workflows, this role offers an exciting opportunity to work on large-scale enterprise data solutions in a hybrid work environment.
Also, remember that updating your profile on Randstad.ca helps us find you faster when we have roles that match your skills! So even if this role isn’t for you, please update your profile so we can find you!
We look forward to supporting you in your job search! Good luck!
Randstad Canada is committed to fostering a workforce reflective of all peoples of Canada. As a result, we are committed to developing and implementing strategies to increase the equity, diversity and inclusion within the workplace by examining our internal policies, practices, and systems throughout the entire lifecycle of our workforce, including its recruitment, retention and advancement for all employees. In addition to our deep commitment to respecting human rights, we are dedicated to positive actions to effect change to ensure everyone has full participation in the workforce free from any barriers, systemic or otherwise, especially equity-seeking groups who are usually underrepresented in Canada's workforce, including those who identify as women or non-binary/gender non-conforming; Indigenous or Aboriginal Peoples; persons with disabilities (visible or invisible); and members of visible minorities, racialized groups and the LGBTQ2+ community.
Randstad Canada is committed to creating and maintaining an inclusive and accessible workplace for all its candidates and employees by supporting their accessibility and accommodation needs throughout the employment lifecycle. We ask that all applicants identify any accommodation requirements by sending an email to accessibility@randstad.ca to ensure their ability to fully participate in the interview process.