Current jobs related to Senior GCP Data Engineer @ Xebia sp. z o.o. - Remote, Wrocław, Gdańsk, Rzeszów
-
Senior GCP Data Engineer @ Xebia sp. z o.o.
6 days ago
Remote, Wrocław, Gdańsk, Rzeszów, Czech Republic Xebia sp. z o.o. Full time 5+ years in a data engineering role, with hands-on experience in building data processing pipelines; experience leading the design and implementation of data pipelines and data products; proficiency with GCP services for large-scale data processing and optimization; extensive experience with Apache Airflow, including DAG creation, triggers, and...
-
Senior Data Engineer
6 days ago
Remote, Czech Republic SoftServe Full time IF YOU ARE a skilled professional with 5+ years of hands-on experience developing and optimizing ETL solutions within Google Cloud Platform (GCP); proficient with GCP services such as BigQuery, Cloud Run Functions, Cloud Run, and Dataform; experienced with SQL and Python for data processing and transformation; knowledgeable in RDBMS, particularly MS SQL...
-
Senior Data Engineer
2 weeks ago
Remote, Czech Republic beBeeDataEngineer Full time 450,000 - 850,000. Job Title: Senior Data Engineer - Data Streaming Specialist. We are seeking an experienced Senior Data Engineer to join our team and lead the development of high-quality data streaming pipelines. Key Responsibilities: Design, develop, and maintain scalable data streaming solutions using Confluent and Spark Structured Streaming. Collaborate with cross-functional...
-
Data Architect GCP @ SquareOne
6 days ago
Remote, Warsaw, Czech Republic SquareOne Full time 7+ years of experience in data architecture, database design, and data engineering. Proven expertise in Google Cloud Platform (GCP), including Dataplex, BigQuery, Dataflow (Apache Beam), and other GCP-native tools. Strong experience with Apache-based data pipelining tools (Beam, Airflow, Kafka, Spark). Expertise in data modeling (conceptual, logical,...
-
GCP HPC DevOps Engineer @ Xebia sp. z o.o.
6 days ago
Remote, Wrocław, Gdańsk, Rzeszów, Czech Republic Xebia sp. z o.o. Full time Your profile: 5+ years of experience with HPC (High-Performance Computing) environments, including SLURM workload manager, MPI, and other HPC-related software; extensive hands-on experience managing Linux-based systems, including performance tuning and troubleshooting in an HPC context; proven experience migrating and managing SLURM clusters in cloud...
-
Remote, Czech Republic Link Group Full time 5+ years of hands-on experience with GCP and public cloud infrastructure. Strong IaC background (Terraform), preferably with Azure DevOps pipelines. Experience with managed GCP services: Cloud Run, BigQuery, Dataproc, Vertex AI, GKE. Knowledge of monitoring and observability practices. Hands-on with Kubernetes, Bash, and Linux system administration. Solid...
-
Senior Data Engineer @ AVENGA
2 weeks ago
Remote, Wrocław, Czech Republic AVENGA (employment agency, KRAZ reg. no. 8448) Full time Key Requirements: 5 years of experience as a Data Engineer; proven experience in Azure Databricks (data engineering, pipelines, performance tuning, Python); Azure DevOps (Repos, Pipelines, YAML); Azure Key Vault; Azure Data Factory (optional); good-to-have knowledge of Power BI; strong analytical and problem-solving skills; excellent communication and stakeholder...
-
GCP DevOps Engineer @ Antal
2 weeks ago
Remote, Kraków, Czech Republic Antal Full time 3+ years of experience in DevOps or Cloud Engineering roles; practical experience with CI/CD tools such as Jenkins, GitHub Actions, Nexus, and Ansible; hands-on experience with cloud platforms (GCP preferred, AWS or Azure also considered); proficiency in Terraform and Infrastructure as Code (IaC) approaches; strong scripting skills in Bash and Python; very good...
-
Senior Data Engineer @ Dogtronic Solutions
6 days ago
Remote, Czech Republic Dogtronic Solutions Full time Requirements: strong Python and SQL skills; proven experience delivering Databricks PySpark pipelines into production; solid hands-on experience with Databricks; experience with cloud platforms (Azure, AWS, or GCP); comfortable working with large datasets and building robust data pipelines; clear communication skills – able to explain technical...
-
Senior BigData Engineer
6 days ago
Remote, Czech Republic SoftServe Full time IF YOU ARE skilled in SQL, Python, and building or optimizing ETL pipelines; experienced with BigQuery, Dataflow (Apache Beam) or Dataproc (Spark/PySpark), and dbt/Dataform; aware of data modeling and DWH architecture; familiar with Data Quality metrics, checks, and reporting; comfortable with DataOps practices, such as CI/CD and Terraform; open to exploring AI...

Senior GCP Data Engineer @ Xebia sp. z o.o.
2 weeks ago
- 5+ years in a data engineering role, with hands-on experience in building data processing pipelines,
- experience leading the design and implementation of data pipelines and data products,
- proficiency with GCP services for large-scale data processing and optimization,
- extensive experience with Apache Airflow, including DAG creation, triggers, and workflow optimization (see the sketch after this list),
- knowledge of data partitioning, batch configuration, and performance tuning for terabyte-scale processing,
- strong Python proficiency, with expertise in modern data libraries and frameworks (e.g., Databricks, Snowflake, Spark, SQL),
- hands-on experience with ETL tools and processes,
- practical experience with dbt for data transformation,
- deep understanding of relational and NoSQL databases, data modeling, and data warehousing concepts,
- excellent command of oral and written English,
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
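For illustration only (this sketch is not part of the posting): a minimal Airflow DAG of the kind the Airflow requirement above refers to, assuming Airflow 2.x. The dag_id, task name, and schedule are hypothetical.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_orders(ds: str, **_) -> None:
    # Placeholder extract step; a real pipeline would pull from a source
    # system and land the data in a lake or staging table.
    print(f"extracting orders for logical date {ds}")

with DAG(
    dag_id="orders_daily",            # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",       # time-based trigger; cron strings also work
    catchup=False,                    # do not backfill past dates
) as dag:
    PythonOperator(
        task_id="extract_orders",
        python_callable=extract_orders,
    )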
Work from the European Union region and a work permit are required.
Candidates must have an active VAT status in the EU VIES registry: ec.europa.eu/taxation_customs/vies.
Nice to have:
- experience with ecommerce systems and their data integration,
- knowledge of data visualization tools (e.g., Tableau, Looker),
- understanding of machine learning and data analytics,
- certification in cloud platforms (AWS Certified Data Analytics, Google Professional Data Engineer, etc.).
Who We Are
While Xebia is a global tech company, in Poland our roots come from two teams – PGS Software, known for world-class cloud and software solutions, and GetInData, a pioneer in Big Data. Today, we're a team of 1,000+ experts delivering top-notch work across cloud, data, and software. And we're just getting started.
What We Do
We work on projects that matter – and that make a difference. From fintech and e-commerce to aviation, logistics, media, and fashion, we help our clients build scalable platforms, data-driven solutions, and next-gen apps using ML, LLMs, and Generative AI. Our clients include Spotify, Disney, ING, UPS, Tesco, Truecaller, AllSaints, Volotea, Schmitz Cargobull, Allegro, and InPost.
We value smart tech, real ownership, and continuous growth. We use modern, open-source stacks, and we're proud to be trusted partners of Databricks, dbt, Snowflake, Azure, GCP, and AWS. Fun fact: we were the first AWS Premier Partner in Poland.
Beyond Projects
What makes Xebia special? Our community. We run events like the Data&AI Warsaw Summit, organize meetups (Software Talks, Data Tech Talks), and have a culture that actively supports your growth via Guilds, Labs, and personal development budgets — for both tech and soft skills. It's not just a job. It's a place to grow.
What sets us apart?
Our mindset. Our vibe. Our people. And while that's hard to capture in text – come visit us and see for yourself.
Responsibilities:
- collaborating with data engineers to ensure data engineering best practices are integrated into the development process,
- ensuring data integrity, consistency, and availability across all data systems,
- integrating data from various sources, including transactional databases, third-party APIs, and external data sources, into the data lake,
- implementing ETL processes to transform and load data into the data warehouse for analytics and reporting,
- working closely with cross-functional teams including Engineering, Business Analytics, Data Science, and Product Management to understand data requirements and deliver solutions,
- optimizing data storage and retrieval to improve performance and scalability,
- monitoring and troubleshooting data pipelines to ensure high reliability and efficiency,
- implementing and enforcing data governance policies to ensure data security, privacy, and compliance,
- developing documentation and standards for data processes and procedures.
Requirements: GCP, Apache Airflow, Python, SQL, Cloud platform.
Additionally: Training budget, Private healthcare, Multisport, Integration events, International projects, Mental health support, Referral program, Modern office, Canteen, Free snacks, Free beverages, Free tea and coffee, No dress code, Playroom, In-house trainings, In-house hack days, Normal atmosphere :).