Mid Databricks Data Engineer @ RemoDevs

20 hours ago


Remote, Wrocław, Gdańsk, Kraków, Poznań, Warszawa, Czech Republic · RemoDevs · Full time

We’re looking for experienced Data Engineers to join a team delivering a strategic project for our client, a leader in the Danish financial sector (insurance and pension funds). The project's goal is a full migration of the existing on-premises data warehouse to a modern cloud platform built on Azure Databricks. You will be part of a small, elite team (5-8 engineers) working closely with the client's experts to transform their data landscape.

Your Responsibilities:

  • Designing and implementing ETL/ELT processes within a Medallion architecture (Bronze, Silver, and Gold layers).
  • Transforming and modeling data to create advanced data products used in insurance rate calculations, AI, and business analytics.
  • Implementing Data Contracts to ensure data quality and consistency.
  • Implementing data domains such as Agreements, Claims, Policies, and Customers.
  • Creating tests (including automated tests) for data pipelines and data products.
  • Collaborating with analysts and business stakeholders to understand and translate requirements into technical solutions.
  • Establishing and maintaining development best practices for SQL, Python, and Azure Data Factory (ADF).
  • Planning and tracking work in the Azure DevOps environment.
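To give a feel for the Data Contract responsibility above, here is a minimal sketch in plain Python of what a contract check on a Silver-layer record might look like. The schema fields (`agreement_id`, `premium`, `currency`, `end_date`) and the `violations` helper are illustrative assumptions, not the client's actual model; on Databricks this logic would typically live in a pipeline expectation or a validation suite rather than in standalone code.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Field:
    """One column in a data contract: name, expected type, nullability."""
    name: str
    dtype: type
    nullable: bool = False


# The "contract": the columns downstream (Gold-layer) consumers may rely on.
# Field names here are hypothetical examples for an Agreements domain.
SILVER_AGREEMENTS_CONTRACT = [
    Field("agreement_id", str),
    Field("premium", float),
    Field("currency", str),
    Field("end_date", str, nullable=True),
]


def violations(record: dict, contract: list[Field]) -> list[str]:
    """Return human-readable contract violations for a single record."""
    problems = []
    for field in contract:
        if field.name not in record:
            problems.append(f"missing column: {field.name}")
            continue
        value = record[field.name]
        if value is None:
            if not field.nullable:
                problems.append(f"null in non-nullable column: {field.name}")
        elif not isinstance(value, field.dtype):
            problems.append(f"bad type for {field.name}: {type(value).__name__}")
    return problems
```

A conforming record yields an empty list; a record with a string premium or a missing column yields one message per violation, which a pipeline can log or use to quarantine rows.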

Requirements:

  • Proven commercial experience with Azure Databricks and Azure Data Factory (ADF).
  • Strong proficiency in SQL and Python for data engineering.
  • Hands-on experience in building data pipelines and data modeling.
  • A good command of English (min. B2 level), enabling seamless communication within an international team.
  • Experience with Agile methodologies and tools like Azure DevOps.

Nice to Have:

  • Knowledge of Azure Purview.
  • Familiarity with data validation tools like Great Expectations.
  • Certifications such as Databricks Certified Data Engineer Associate or Microsoft Azure Data Fundamentals (DP-900).
  • Previous experience in projects for the financial, insurance, or pension sectors.

What We Offer:

  • An attractive rate of up to 140 PLN/h (+VAT) on a B2B contract.
  • A long-term, stable contract: the project is planned for a minimum of 18 months.
  • A 100% remote work model with a flexible approach.
  • The opportunity to work on a large-scale, international cloud migration project for an industry leader.
  • Collaboration with a highly skilled team of engineers from both the client and Dateonic.