Data Architect GCP

2 days ago


Remote, Warsaw, Czech Republic SquareOne Full time
  • 7+ years of experience in data architecture, database design, and data engineering
  • Proven expertise in Google Cloud Platform (GCP), including: Dataplex, BigQuery, Dataflow (Apache Beam) and other GCP-native tools
  • Strong experience with Apache-based data pipelining tools (Beam, Airflow, Kafka, Spark)
  • Expertise in data modeling (conceptual, logical, physical) for structured and semi-structured data
  • Solid knowledge of ETL/ELT processes, data transformation, and cloud integration techniques
  • Strong understanding of data governance, metadata management, and security compliance within GCP
  • Excellent communication and presentation skills — ability to engage with clients and translate complex data concepts into actionable business insights
  • Proven ability to collaborate with cross-functional teams (engineers, analysts, business stakeholders) to ensure data integrity and accessibility

As a Data Architect, you will play a key role in designing and implementing scalable, cloud-native data solutions on Google Cloud Platform (GCP). You'll partner with engineering, analytics, and business teams to build reliable data architectures that support enterprise-scale analytics and compliance.

Key responsibilities:
  • Designing, implementing, and optimizing cloud-native data architectures within GCP
  • Leveraging Dataplex for data governance, cataloging, and lifecycle management
  • Building and managing Apache-based data pipelines (Beam, Airflow, Kafka, Spark) to ensure efficient, scalable data processing (a brief sketch follows below)
  • Developing and maintaining ETL/ELT workflows, focusing on both cloud-based and streaming architectures
  • Defining and enforcing data governance and compliance best practices across platforms
  • Collaborating with engineering and analytics teams to ensure data availability, reliability, and performance
  • Providing expertise in big data processing and enterprise-scale analytics solutions on GCP
  • Staying current with emerging data technologies and recommending improvements to existing architectures

Requirements: GCP, Google Cloud Platform, Data pipelines, Airflow, Kafka, Spark, ETL, Big Data, Data engineering, BigQuery, Apache Beam, Data modeling, Cloud, Security
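To give a concrete flavour of the Dataflow work listed above, here is a minimal, hypothetical Apache Beam (Python SDK) batch pipeline: it reads JSON events from Cloud Storage, reshapes them, and appends them to a BigQuery table. The project, bucket, dataset, and field names are illustrative placeholders, not details from this posting.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def parse_event(line: str) -> dict:
        # Flatten one raw JSON line into the schema of the target BigQuery table.
        record = json.loads(line)
        return {
            "event_id": record.get("id"),
            "user_id": record.get("user", {}).get("id"),
            "event_type": record.get("type"),
            "occurred_at": record.get("timestamp"),
        }


    def run() -> None:
        # Placeholder options; switch runner to "DirectRunner" to test locally
        # before submitting the job to Dataflow.
        options = PipelineOptions(
            runner="DataflowRunner",
            project="example-gcp-project",
            region="europe-central2",
            temp_location="gs://example-bucket/tmp",
        )
        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                | "ReadRawEvents" >> beam.io.ReadFromText("gs://example-bucket/raw/events-*.json")
                | "ParseJson" >> beam.Map(parse_event)
                | "DropIncomplete" >> beam.Filter(lambda row: row["event_id"] is not None)
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "example-gcp-project:analytics.events",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                )
            )


    if __name__ == "__main__":
        run()

In practice a pipeline like this would be scheduled from an orchestrator such as Apache Airflow (Cloud Composer on GCP), with its datasets catalogued and governed through Dataplex, which is the combination of tools this role emphasises.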

  • Remote, Warsaw, Czech Republic beBeeData Full time €100,000 - €125,000

    Cloud Native Data Solutions Specialist. We are seeking an experienced professional to play a critical role in designing and implementing scalable cloud-native data solutions on Google Cloud Platform (GCP). Key Responsibilities: Design and implement cloud-native data architectures within GCP. Leverage Dataplex for data governance, cataloging, and lifecycle...


  • Remote, Wrocław, Gdańsk, Rzeszów, Czech Republic Xebia sp. z o.o. Full time

    7+ years in a data engineering role, with hands-on experience in building data processing pipelines; experience in leading the design and implementation of data pipelines and data products; proficiency with GCP services for large-scale data processing and optimization; extensive experience with Apache Airflow, including DAG creation, triggers, and workflow...


  • Remote, Czech Republic beBeeDataScientist Full time €100,000 - €120,000

    Cloud Data Architect Job Opportunity. You're an ideal candidate for this position if you have: 4+ years of experience in data pipeline development and maintenance. Proven expertise in SQL database management. Extensive knowledge of cloud technologies, including data storage solutions (file systems, relational databases, MPP, NoSQL). Strong understanding of data...


  • Remote, Czech Republic Link Group Full time

    5+ years of hands-on experience with GCP and public cloud infrastructure. Strong IaC background (Terraform), preferably with Azure DevOps pipelines. Experience with managed GCP services: Cloud Run, BigQuery, Dataproc, Vertex AI, GKE. Knowledge of monitoring and observability practices. Hands-on with Kubernetes, Bash, and Linux system administration. Solid...


  • Remote, Czech Republic beBeeData Full time 1,200,000 - 1,500,000

    Lead Data Architect. We are seeking a highly experienced Lead Data Architect to join our team responsible for designing, implementing, and optimizing modern data solutions. The ideal candidate will have a strong background in data strategy, data engineering, platform architecture, modeling, business intelligence, big data, reporting, and data warehousing.

  • Enterprise Architect

    2 weeks ago


    Remote, Czech Republic beBeeEnterprise Full time 1,005,000 - 1,343,000

    Enterprise Architecture Specialist Job Description. Job Summary: As an Enterprise Architecture Specialist, you will be responsible for defining and governing the technology landscape that underpins our business strategy. You will work closely with senior team members to ensure a strong foundation in technology and IT concepts. Key Responsibilities: Support...


  • Remote, Warszawa, Czech Republic beBeeData Full time 900,000 - 1,300,000

    Data Architect Job Opportunity. We are seeking a highly skilled Data Architect to design, implement, and optimize large-scale data architectures.

  • Big Data Architect

    2 weeks ago


    Remote, Czech Republic beBeeDataArchitecture Full time 21,600 - 27,800

    Job Title: Data Architecture Specialist. About the Role: We are seeking a highly skilled Data Architecture Specialist to join our team. In this role, you will be responsible for designing and implementing data architectures that meet the needs of our business. Key Responsibilities: Develop, optimize, and maintain big data (ELT/ETL) pipelines for statistical...


  • Warsaw, Kraków, Czech Republic beBeeData Full time €85,000 - €115,000

    Job Opportunity: We are seeking a Senior Data Engineer to lead the design, implementation, and optimization of our modern data infrastructure. This role will have a strong focus on enabling analytics through clean data modelling, automation, and observability, empowering domain teams with trusted, self-serve data products. Key Responsibilities: Design robust...

  • Big Data Engineer

    2 weeks ago


    Remote, Czech Republic Link Group Full time

    Must-Have Qualifications: At least 3 years of experience in big data engineering. Proficiency in Scala and experience with Apache Spark. Strong understanding of distributed data processing and frameworks like Hadoop. Experience with message brokers like Kafka. Hands-on experience with SQL/NoSQL databases. Familiarity with version control tools like Git. Solid...