Mid Databricks Data Engineer
Posted 3 days ago
Remote · Wrocław, Gdańsk, Kraków, Poznań, Warszawa, Czech Republic · RemoDevs · Full time

We're looking for experienced Data Engineers to join a team delivering a strategic project for our client — a major leader in the Danish financial sector (insurance and pension funds). The project's goal is the full migration of existing data warehouse solutions from on-premises systems to a modern cloud platform based on Azure Databricks. You will be part of an elite team (5-8 engineers) working closely with the client's experts to transform their data landscape.

Your Responsibilities:

  • Designing and implementing ETL/ELT processes within a Medallion architecture (Bronze, Silver, and Gold layers); see the sketch after this list.
  • Transforming and modeling data to create advanced data products used in insurance rate calculations, AI, and business analytics.
  • Implementing Data Contracts to ensure data quality and consistency.
  • Implementing data domains such as Agreements, Claims, Policies, and Customers.
  • Creating tests (including automated tests) for data pipelines and data products.
  • Collaborating with analysts and business stakeholders to understand and translate requirements into technical solutions.
  • Establishing and maintaining development best practices for SQL, Python, and Azure Data Factory (ADF).
  • Planning and tracking work in the Azure DevOps environment.
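
For context, here is a minimal PySpark sketch of a Bronze-to-Silver step of the kind described above. It is a sketch only: the table names, columns, and the simple contract check are invented for illustration, not taken from the project.

    # Minimal Bronze -> Silver sketch for a Databricks Medallion pipeline.
    # Table names and columns below are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()

    # Bronze: raw claims ingested as-is (hypothetical table).
    bronze = spark.read.table("bronze.claims_raw")

    # A simple "data contract" check: fail fast if required columns are missing.
    required = {"claim_id", "policy_id", "claim_amount", "claim_date"}
    missing = required - set(bronze.columns)
    if missing:
        raise ValueError(f"Contract violation, missing columns: {missing}")

    # Silver: light cleansing and type conformance.
    silver = (
        bronze
        .dropDuplicates(["claim_id"])
        .withColumn("claim_amount", F.col("claim_amount").cast("decimal(18,2)"))
        .withColumn("claim_date", F.to_date("claim_date"))
        .filter(F.col("claim_id").isNotNull())
    )

    silver.write.mode("overwrite").saveAsTable("silver.claims")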

Requirements:

  • Proven commercial experience with Azure Databricks and Azure Data Factory (ADF).
  • Strong proficiency in SQL and Python for data engineering.
  • Hands-on experience in building data pipelines and data modeling.
  • A good command of English (min. B2 level), enabling seamless communication within an international team.
  • Experience with Agile methodologies and tools like Azure DevOps.
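
The automated-testing responsibility listed earlier often translates into unit tests for transformation logic. A hedged pytest sketch follows; the schema and the deduplication rule are invented examples, not project code.

    # Hedged pytest sketch for pipeline logic; schema and rules are invented.
    import pytest
    from pyspark.sql import SparkSession, DataFrame

    @pytest.fixture(scope="session")
    def spark():
        # Local session so the test can run outside Databricks.
        return SparkSession.builder.master("local[1]").appName("pipeline-tests").getOrCreate()

    def deduplicate_claims(df: DataFrame) -> DataFrame:
        # Transformation under test: keep one row per claim_id.
        return df.dropDuplicates(["claim_id"])

    def test_deduplicate_claims_keeps_one_row_per_claim(spark):
        raw = spark.createDataFrame(
            [("c1", 100.0), ("c1", 100.0), ("c2", 50.0)],
            ["claim_id", "claim_amount"],
        )
        assert deduplicate_claims(raw).count() == 2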

Nice to Have:

  • Knowledge of Azure Purview.
  • Familiarity with data validation tools like Great Expectations (a short sketch follows this list).
  • Certifications such as Databricks Certified Data Engineer Associate or Microsoft Azure Data Fundamentals (DP-900).
  • Previous experience in projects for the financial, insurance, or pension sectors.
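
To illustrate the Great Expectations item above, a minimal validation sketch is shown below. It uses the long-standing pandas-dataset style from pre-1.0 releases of the library (newer releases use a context-based API), and the column names and bounds are invented.

    # Hedged sketch using Great Expectations' legacy pandas-dataset API
    # (pre-1.0 releases); newer versions expose a different, context-based API.
    import pandas as pd
    import great_expectations as ge

    df = pd.DataFrame({
        "policy_id": ["p1", "p2", "p3"],
        "premium": [120.0, 95.5, 310.0],
    })

    gdf = ge.from_pandas(df)  # wrap the frame with expectation methods
    gdf.expect_column_values_to_not_be_null("policy_id")
    gdf.expect_column_values_to_be_between("premium", min_value=0, max_value=100000)

    result = gdf.validate()
    print(result.success)  # True if all expectations held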

What We Offer:

  • An attractive rate of up to 140 PLN/h (+VAT) on a B2B contract.
  • A long-term, stable contract; the project is planned for a minimum of 18 months.
  • A 100% remote work model with a flexible approach.
  • The opportunity to work on a large-scale, international cloud migration project for an industry leader.
  • Collaboration with a highly skilled team of engineers from both the client and Dateonic.
Similar Jobs:

  • Data Engineer · Remote, Czech Republic · Upvanta · Full time · posted 3 days ago
    Qualifications: Proven hands-on experience with dbt, including model development, testing, and deployment. Strong SQL proficiency and experience with at least one major data warehousing solution (Databricks, Redshift, or similar). Familiarity with version control systems (e.g., Git) and CI/CD practices. Solid understanding of data modeling...
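
    As a hedged illustration of the dbt workflow this listing names, the snippet below invokes dbt programmatically. It assumes dbt-core 1.5+ (which ships the dbtRunner API) and an already-configured dbt project; the "staging" selector is invented.

    # Hedged sketch: programmatic `dbt run` and `dbt test`, assuming
    # dbt-core 1.5+ and an existing dbt project in the working directory.
    from dbt.cli.main import dbtRunner

    runner = dbtRunner()

    run_result = runner.invoke(["run", "--select", "staging"])   # selector invented
    test_result = runner.invoke(["test", "--select", "staging"])

    print("run ok:", run_result.success, "tests ok:", test_result.success)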


  • Remote, Gdańsk, Wrocław, Warsaw, Kraków, Poznań, Czech Republic · RemoDevs · Full time
    Proven experience with Azure Databricks and Azure Data Factory (ADF). Strong skills in SQL and Python for data engineering. Experience in building pipelines and data models. Good English (minimum B2) to communicate in an international team. Experience with Agile methods and Azure DevOps. We are looking for skilled Data Engineers to join a team working on an...

  • Remote, Czech Republic · Link Group · Full time
    Proven experience in data engineering with a strong focus on streaming. Strong expertise in Confluent Kafka and Spark Structured Streaming. Hands-on experience with CDC and real-time data extraction. Solid programming background: Python, SQL, Bash, Node.js, Linux/Unix Shell. Strong knowledge of data engineering lifecycle: versioning, release management,...
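
    Since this listing centers on Kafka and Spark Structured Streaming, here is a minimal hedged read-and-decode sketch; the broker address, topic, and checkpoint path are placeholders.

    # Hedged sketch: consuming a Kafka topic with Spark Structured Streaming.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("claims-stream").getOrCreate()

    events = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
        .option("subscribe", "claims-events")              # placeholder topic
        .option("startingOffsets", "latest")
        .load()
    )

    # Kafka delivers key/value as binary; decode the value payload to text.
    decoded = events.select(F.col("value").cast("string").alias("json_payload"))

    query = (
        decoded.writeStream
        .format("console")  # console sink keeps the sketch self-contained
        .option("checkpointLocation", "/tmp/checkpoints/claims")  # placeholder
        .start()
    )
    query.awaitTermination()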

  • Data Engineer · Wrocław, Województwo dolnośląskie, Czech Republic · Ework Group · Full time · posted 7 days ago
    Proven experience in Azure Databricks and Python (data engineering, pipelines, performance tuning). Azure DevOps (Repos, Pipelines, YAML). Azure Key Vault. Azure Data Factory (optional). Good to have: knowledge of Power BI. Strong analytical and problem-solving skills. Excellent communication and stakeholder management abilities. Fluent in English (C1). Ework...

  • Remote, Wrocław, Gdańsk, Kraków, Poznań, Warszawa, Czech Republic · beBeeDataEngineer · Full time · 1 - 140
    Transforming the Data Landscape. We are seeking a skilled Data Engineer to join our team and contribute to the transformation of the data landscape for a major leader in the Danish financial sector. The ideal candidate will have a strong background in Azure Databricks, Azure Data Factory, SQL, Python, and Agile methodologies. Experience with ETL/ELT processes,...


  • Remote, Czech Republic · beBeeDataEngineer · Full time · 900,000 - 1,200,000
    Senior Azure Data Engineer with Databricks. This is a challenging role that requires a strong foundation in data engineering and experience with cloud-based data platforms. Requirements: a minimum of 3 years' experience with Azure Data Factory and Databricks, along with at least 5 years' experience in data engineering or backend software development. Strong SQL...

  • Cloud Engineer · Kraków, Gdańsk, Wrocław, Warszawa, Poznań, Czech Republic · beBeeDataEngineer · Full time · €50,000 - €85,000 · posted 1 week ago
    As a senior data engineer, you will play a crucial role in shaping our organization's data strategy. Your expertise in Azure cloud services will enable us to process extensive and complex datasets, utilizing specialized tools and platforms. Key Responsibilities: designing and implementing Azure data solutions for handling large-scale datasets; utilizing Azure...

  • Data Engineer · Wrocław, Województwo dolnośląskie, Czech Republic · SNI · Full time · posted 4 days ago
    Data & Engineering has been your world for at least 3-5 years. Cloud service platform expertise, preferably Azure. Hands-on experience working with Databricks (PySpark). Databases (like SQL Server or Netezza) hold no secrets for you. Knowledge of at least three of these services: Azure Data Factory, Azure Synapse, Databricks, Azure SQL Database, Power BI,...

  • Remote, Gdańsk, Wrocław, Warsaw, Kraków, Poznań, Czech Republic · beBeeDataEngineer · Full time · €54,560 - €72,880
    The company seeks an experienced Senior Data Engineer to enhance and improve their data environment. Key Responsibilities: design efficient ETL/ELT processes with Medallion architecture (Bronze, Silver, Gold layers) for optimal data processing. Transform and model data into actionable products for insurance pricing, AI, and analytics. Maintain data quality and...


  • Remote, Wrocław, Gdańsk, Rzeszów, Czech Republic · Xebia sp. z o.o. · Full time
    7+ years in a data engineering role, with hands-on experience in building data processing pipelines; experience in leading the design and implementation of data pipelines and data products; proficiency with GCP services for large-scale data processing and optimization; extensive experience with Apache Airflow, including DAG creation, triggers, and workflow...
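
    The Xebia listing above calls out Apache Airflow DAG creation; a minimal hedged DAG sketch follows. The DAG id, schedule, and task callables are invented for illustration.

    # Minimal Airflow DAG sketch; DAG id, schedule, and tasks are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("extract step (placeholder)")

    def transform():
        print("transform step (placeholder)")

    with DAG(
        dag_id="example_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # `schedule_interval` in Airflow releases before 2.4
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        extract_task >> transform_task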