Lead GCP Data Engineer, Architect @ Xebia sp. z o.o.
2 weeks ago
6+ years of hands-on experience in data engineering and large-scale distributed systems; proven expertise in building and maintaining complex ETL/ELT pipelines; deep knowledge of orchestration frameworks (Airflow) and workflow optimization; strong GCP cloud infrastructure experience, including GKE; expert-level programming in Python or Scala; solid understanding of Spark internals; experience with CI/CD tools (e.g., Jenkins, GitHub Actions) and infrastructure as code; familiarity with managing self-hosted tools such as Spark or Airflow on Kubernetes; experience managing a data warehouse in BigQuery; strong communication skills and a proactive, problem-solving mindset; very good command of English (min. C1).
Nice to have: working experience with messaging systems such as Kafka or Redpanda; experience with real-time data streaming platforms (e.g., Flink, Spark Structured Streaming); familiarity with ML platforms or MLOps workflows; familiarity with Kubeflow, Valido, Looker, and Looker Studio.
Work from the European Union region and a work permit are required.
Who We Are
While Xebia is a global tech company, our journey in CEE started with two Polish companies – PGS Software, known for world-class cloud and software solutions, and GetInData, a pioneer in Big Data. Today, we’re a team of 1,000+ experts delivering top-notch work across cloud, data, and software. And we’re just getting started.
What We Do
We work on projects that matter – and that make a difference. From fintech and e-commerce to aviation, logistics, media, and fashion, we help our clients build scalable platforms, data-driven solutions, and next-gen apps using ML, LLMs, and Generative AI. Our clients include Spotify, Disney, ING, UPS, Tesco, Truecaller, AllSaints, Volotea, Schmitz Cargobull, Allegro, and InPost. We value smart tech, real ownership, and continuous growth. We use modern, open-source stacks, and we’re proud to be trusted partners of Databricks, dbt, Snowflake, Azure, GCP, and AWS.
Fun fact: we were the first AWS Premier Partner in Poland.
Beyond Projects
What makes Xebia special? Our community. We run events like the Data&AI Warsaw Summit, organize meetups (Software Talks, Data Tech Talks), and have a culture that actively supports your growth via Guilds, Labs, and personal development budgets – for both tech and soft skills. It’s not just a job. It’s a place to grow. What sets us apart? Our mindset. Our vibe. Our people. And while that’s hard to capture in text – come visit us and see for yourself.
Responsibilities: designing, building, and optimizing the data ingestion pipeline to reliably deliver billions of events daily within the defined SLA; leading initiatives to improve scalability, performance, and reliability; providing support for all product teams in building and optimizing their complex pipelines; identifying and addressing pain points in the existing data platform, proposing and implementing high-leverage improvements; developing new tools and frameworks to streamline data platform workflows; driving adoption of best practices in data and software engineering (testing, CI/CD, version control, monitoring); working in close collaboration with data scientists and data analysts to help support their work in production; supporting production ML workflows and real-time streaming use cases; mentoring other engineers and contributing to a culture of technical excellence and knowledge sharing.
Requirements: GCP, Python/Scala, Airflow, ETL/ELT, GKE, Kafka, Flink, Looker.
Additionally: training budget, private healthcare, Multisport, integration events, international projects, mental health support, referral program, modern office, canteen, free snacks, free beverages, free tea and coffee, no dress code, playroom, in-house trainings, in-house hack days, normal atmosphere :).
-
Senior GCP Data Engineer @ Xebia sp. z o.o.
9 hours ago
Remote, Wrocław, Gdańsk, Rzeszów, Czech Republic Xebia sp. z o.o. Full time. 5+ years in a data engineering role, with hands-on experience in building data processing pipelines; experience in leading the design and implementation of data pipelines and data products; proficiency with GCP services for large-scale data processing and optimization; extensive experience with Apache Airflow, including DAG creation, triggers, and workflow...
-
Remote, Wrocław, Czech Republic Xebia sp. z o.o. Full time. Your profile: smart and tech-savvy engineer with 5+ years of experience working with DevOps practices and Continuous Delivery; very communicative and collaborative; understands the entire supply chain and has a holistic approach and desire to reduce the gap between development and operations; understands the software development lifecycle, application and...
-
Kraków, Wrocław, Poznań, Gdańsk, Warszawa, Czech Republic Capgemini Polska Sp. z o.o. Full time. YOUR PROFILE: good experience as an Architect and/or Technical Leader in Cloud or Big Data projects in the field of data processing and visualization (in different phases of the SDLC); practical knowledge of one of the following clouds: AWS, Azure, GCP in the areas of Storage, Compute (+Serverless), Networking, and DevOps, supported by work on commercial projects;...
-
Remote, Wrocław, Czech Republic Data Hiro sp. z o.o. Full time. 6+ years of experience in IT; 4+ years of hands-on experience with the Microsoft Power Platform, including: working with Solutions; managing multiple environments (SDLC); Deployment Pipelines; PowerApps (model-driven and canvas); Power BI (Dataverse connectivity); Power Automate (cloud flows, web services, Azure Functions); Power Pages. Strong familiarity with Azure...
-
DevOps Engineer
4 days ago
Remote, Białystok, Radom, Kraków, Gdańsk, Wrocław, Poznań, Lublin, Warszawa, Łódź, Czech Republic Moondigo Sp. z o.o. Full time. Solid hands-on experience with Google Cloud Platform (GCP); knowledge of PaaS, IaaS, and SaaS services on GCP; experience in Cloud Security Posture Management and Vulnerability Management; familiarity with Kubernetes (GKE, Anthos); understanding of Cloud SOC Operations (log analysis, detections, threat mitigation); strong knowledge of GCP networking (VPCs, firewalls,...
-
Gdańsk, Czech Republic Projekt Parking Sp. z o.o. Full time. Honesty and a clean criminal record; ability to operate electronic devices; readiness to work according to a set schedule (morning and afternoon shifts, plus weekends). Projekt Parking Sp. z o.o. is looking for a candidate for the position of PAID PARKING ZONE INSPECTOR IN GDAŃSK. We offer: - a contract of mandate; - a competitive base salary and...
-
Data Engineer @ deepsense.ai Sp. z o.o.
1 week ago
Remote, Bydgoszcz, Warsaw, Wrocław, Poznań, Gdańsk, Łódź, Czech Republic deepsense.ai Sp. z o.o. Full time. Good knowledge of Python and SQL. Experience with any of the major SQL databases (PostgreSQL preferred). Strong knowledge of cloud computing platforms (Azure/GCP/AWS). Familiarity with containerization technologies (Docker/Kubernetes). Experience with ETL and Big Data elements (Spark, Kafka, etc.). Basic experience in DevOps...
-
Cloud DevOps Engineer
1 week ago
Remote, Białystok, Radom, Kraków, Gdańsk, Wrocław, Poznań, Lublin, Warszawa, Łódź, Czech Republic Moondigo Sp. z o.o. Full time. Solid hands-on experience with Google Cloud Platform (GCP); knowledge of PaaS, IaaS, and SaaS services on GCP; experience in Cloud Security Posture Management and Vulnerability Management; familiarity with Kubernetes (GKE, Anthos); understanding of Cloud SOC Operations (log analysis, detections, threat mitigation); strong knowledge of GCP networking (VPCs, firewalls,...
-
Wrocław, Czech Republic Data Hiro sp. z o.o. Full time. 5+ years of hands-on experience in implementing, developing, or maintaining data systems in commercial environments. Solid understanding of data modeling and data architecture principles. Strong Python skills, with a focus on writing clean, maintainable code and applying solid object-oriented design principles. Advanced SQL expertise, including performance...