- 8+ years of experience in data warehousing, data integration, or data engineering projects.
- In-depth knowledge and hands-on development experience building end-to-end data pipelines on Google Cloud Platform (GCP), including ingestion, processing, storage, and data validation.
- Experience working with core GCP data technologies: BigQuery, Composer/Airflow, and GCS.
- Strong Python & SQL coding experience [Must].
- Strong programming foundation in Python, including debugging and performance analysis.
- Good knowledge of other GCP services such as Dataplex, Cloud Data Fusion, and Dataflow.
- Experience with platforms such as Oracle, Informatica, DB2, Cloudera, and Collibra is preferred.
- Experience migrating a legacy EDW to the cloud is preferred.
- Experience working in all phases of development of data applications: requirement gathering, design, development, data validation and testing, deployment, and maintenance.
who we are looking for
Location: Prague, Czech Republic
Employment Type: Permanent
Hybrid Work Model: 2–3 days per week in the office, the rest remote.
Language Proficiency: English; Czech preferred.
MANDATORY SKILLS:
Scala, Python, or Java, as well as GCP or AWS experience.
how to apply
Have questions about the position first? Feel free to contact us!
Or simply apply to the job offer and send us your CV, and we will contact you with more details.
To see all our open positions, go directly to www.randstad.cz