About the Role
We are seeking a Senior GCP Data Engineer (Databricks) to join our team at Xebia Poland. This role is perfect for someone who is passionate about cloud-based solutions and has a strong background in data engineering and backend development. You will be responsible for designing, building, and deploying scalable infrastructure with a focus on distributed systems. Your work will involve creating architecture patterns for data processing, workflow definitions, and system integrations using Big Data and Cloud technologies.
Key Responsibilities
- Design, build, and deploy at-scale infrastructure with a focus on distributed systems.
- Develop and maintain architecture patterns for data processing and system integrations.
- Evaluate and translate technical designs into workable solutions and technical specifications.
- Drive the creation of reusable artifacts and establish automated processes for data analysis and model development.
- Collaborate with analysts and data scientists to understand how pipeline and schema changes affect downstream data models.
- Write efficient and well-organized software to ship products in an iterative release environment.
- Promote good software engineering practices across the team.
- Communicate effectively with both technical and non-technical audiences.
- Define data retention policies and monitor performance to advise on necessary infrastructure changes.
Your Profile
- 3+ years of experience with GCP (BigQuery, Dataflow, Pub/Sub, Bigtable, Dataproc, Cloud Storage, Google Kubernetes Engine).
- 5+ years of experience in data engineering or backend/fullstack software development.
- Strong SQL skills and Python scripting proficiency.
- Experience with data transformation tools such as Databricks and Spark.
- Familiarity with data manipulation libraries like Pandas, NumPy, and PySpark.
- Ability to structure and model data in both relational and non-relational forms.
- Experience with CI/CD tooling (GitHub, Azure DevOps, Harness).
- Good verbal and written communication skills in English.
Nice to Have
- Experience with Apache Hadoop and data modeling tools such as dbt.
- Familiarity with enterprise Data Warehouse solutions, preferably Snowflake.
- Experience with ETL tools such as Informatica, Talend, DataStage, Stitch, or Fivetran.
- Experience in containerization and orchestration (Docker, Kubernetes).
- Cloud certification (Azure, AWS, GCP).
Work Requirements
- Must be able to work from the European Union region and have a valid work permit.
Recruitment Process
- CV Review
- HR Call
- Interview
- Optional Client Interview
- Decision
Join us at Xebia, a place where experts grow, and be part of a team that is passionate about innovation and technological excellence.
Similar jobs
Cloud Engineer, Data and Analytics
Join Google as a Cloud Engineer in Data and Analytics, focusing on data processing and cloud solutions.
Senior Cloud Data Engineer
Senior Cloud Data Engineer role focusing on data architecture, pipeline design, and cloud platforms like AWS and Snowflake.
Software Engineer III, BigQuery, Google Cloud
Join Google Cloud as a Software Engineer III, focusing on BigQuery and data processing technologies.
Remote Data Engineer - Google BigQuery
Join Hostaway as a Remote Data Engineer, leveraging Google BigQuery and Python to optimize data infrastructure and support revenue operations.
Data Engineer with GCP and Big Data Experience
Join Nextory as a Data Engineer in Stockholm. Work with GCP, Big Data, and more to enrich lives through reading.
Solutions Engineer with Data Analytics and GCP Experience
Join Databricks as a Solutions Engineer in Amsterdam, leveraging data analytics and GCP expertise to solve complex data challenges.
Senior Data Engineer with Spark
Senior Data Engineer role focusing on Spark, Kafka, and Airflow for data platform evolution. Fully remote, competitive benefits.
Lead Data Engineer with GCP Expertise
Lead Data Engineer role in Berlin, focusing on GCP, BigQuery, and data pipelines.
Senior Data Platform Engineer
Senior Data Platform Engineer needed for Blip, focusing on Big Data management and cloud solutions. Expertise in SQL, Python, Spark, and cloud platforms required.
Senior DevOps Engineer with GCP Expertise
Senior DevOps Engineer with expertise in GCP, CI/CD, and automation for IBM in Bucharest. Advanced GCP knowledge required.
Senior Data Engineer
Join diconium as a Senior Data Engineer in Cluj-Napoca, Romania. Lead data engineering projects, manage pipelines, and support stakeholders.
Senior Data Engineer
Join Bitpanda as a Senior Data Engineer in Barcelona, enhancing data platforms with Python, SQL, and GCP.
Senior Data Engineer - Apache Spark, PySpark, Azure Databricks
Senior Data Engineer specializing in Apache Spark, PySpark, and Azure Databricks for a leading UK fintech company.
Senior Data Engineer
Join Exclaimer as a Senior Data Engineer to design and maintain scalable data systems using Python, Azure, and Kafka.
Cloud Data Engineer
Seeking a Cloud Data Engineer with expertise in AWS, Python, and CI/CD for a hybrid role in Hannover. Join our dynamic team!
Senior Data Engineer
Join OBRAMAT as a Senior Data Engineer to manage cloud infrastructure and data integration in a hybrid work environment.
Senior Data Engineer
Senior Data Engineer role in Barcelona, skilled in Python, Spark, AWS, SQL, and big data processing.
Senior Backend Engineer, Data Platform
Senior Backend Engineer needed for scaling data platform at a fast-growing SaaS company in Poland.
Data Engineer with Apache Spark Experience
Join Mapiq as a Data Engineer to build scalable data pipelines using Apache Spark in a hybrid work environment.
Senior Python Engineer
Join GlobalLogic as a Senior Python Engineer to develop AI platforms using Python and cloud services.
Senior Data Engineer
Senior Data Engineer at Procter & Gamble, Warsaw. Lead data design, collaborate on projects, and optimize data flow. Big Data, ETL, Azure expertise needed.
Cloud Data Engineer
Join NPO as a Cloud Data Engineer to enhance data platforms using GCP, Python, and more. Flexible hours and growth opportunities.
Senior Software Engineer - Data Pipeline Team
Senior Software Engineer for Data Pipeline team, remote work, expertise in Python, NoSQL, Big Data technologies.
Senior Data Engineer
Join us as a Senior Data Engineer in Lisbon to design and maintain data infrastructure. Hybrid role with flexible benefits.