Data Engineer @ Ework Group

16 hours ago


Remote / Wrocław, Poland · Ework Group · Full time
✔️ Key skills:

  • Azure Databricks (PySpark, Spark SQL; Unity Catalog; Jobs/Workflows).
  • Azure data services: Azure Data Factory, Azure Key Vault, storage (ADLS), fundamentals of networking/identities.
  • Python for data engineering (APIs, utilities, tests).
  • Azure DevOps (Repos, Pipelines, YAML) and Git-based workflows.
  • Experience operating production pipelines (monitoring, alerting, incident handling, cost control).

💻 Ework Group, founded in 2000 and listed on Nasdaq Stockholm, has around 13,000 independent professionals on assignment. We are a total talent solutions provider that partners with clients in both the private and public sector, and with professionals, to create sustainable talent supply chains.

With a focus on IT/OT, R&D, Engineering and Business Development, we deliver sustainable value through a holistic and independent approach to total talent management.

By providing comprehensive talent solutions, combined with vast industry experience and excellence in execution, we form successful collaborations. We bridge clients, partners and professionals throughout the talent supply chain, for the benefit of individuals, organizations and society.

🔹 For our Client we are looking for Senior Data Engineer 🔹

Preferred candidates are based in Wrocław, but applicants from other locations in Poland will also be considered.

✔️ Main assignment:

Maintain and evolve the data flows used by the Picto application: Azure + Databricks pipelines (ADF + notebooks) that ingest data from APIs via the Ingestion Framework, transform it (PySpark/Spark SQL), and deliver trusted datasets.
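As a rough illustration of the ingest → transform → deliver flow described above (a minimal sketch in plain Python; the function names, fields and quality rule are invented for this example and are not part of the client's Ingestion Framework or Picto pipelines):

```python
import json

def ingest(api_payload: str) -> list[dict]:
    """Parse a raw API response (JSON lines) into records."""
    return [json.loads(line) for line in api_payload.splitlines() if line.strip()]

def transform(records: list[dict]) -> list[dict]:
    """Normalise field names and drop records failing a basic quality check."""
    out = []
    for r in records:
        if r.get("id") is None:  # quality gate: require a primary key
            continue
        out.append({"id": r["id"], "amount": float(r.get("amount", 0))})
    return out

raw = '{"id": 1, "amount": "9.5"}\n{"amount": "3"}\n{"id": 2}'
print(transform(ingest(raw)))
# → [{'id': 1, 'amount': 9.5}, {'id': 2, 'amount': 0.0}]
```

In the real stack the transform step would be PySpark/Spark SQL in a Databricks notebook and the orchestration would sit in ADF, but the shape of the work is the same: parse, validate, normalise, publish.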

✔️ Tech stack you’ll meet:

Azure, Databricks (PySpark/Spark SQL, Unity Catalog, Workflows), ADF, ADLS/Delta, Key Vault, Azure DevOps (Repos/Pipelines YAML), Python, SQL


✔️ Responsibilities:

  • Own day-to-day operations of Picto data pipelines (ingest → transform → publish), ensuring reliability, performance and cost efficiency.
  • Develop and maintain Databricks notebooks (PySpark/Spark SQL) and ADF pipelines/triggers; manage Jobs/Workflows and CI/CD.
  • Implement data quality checks, monitoring & alerting (SLA/SLO), troubleshoot incidents, and perform root-cause analysis.
  • Secure pipelines (Key Vault, identities, secrets) and follow platform standards (Unity Catalog, environments, branching).
  • Collaborate with BI Analysts and Architects to align data models and outputs with business needs.
  • Document datasets, flows and runbooks; contribute to continuous improvement of the Ingestion Framework.

✔️ Requirements: Azure, Databricks, PySpark, Azure Data Factory, Azure DevOps, AI
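The monitoring and SLA/SLO duties above might look something like this in practice (a hedged sketch; the thresholds, function names and alert strings are assumptions for illustration, not the client's actual SLOs or tooling):

```python
# Two typical pipeline health checks: dataset freshness against an SLO window,
# and a minimum row count to catch silently empty or truncated loads.

def check_freshness(last_load_epoch: int, now_epoch: int, slo_seconds: int = 3600) -> bool:
    """True if the dataset was refreshed within its SLO window."""
    return (now_epoch - last_load_epoch) <= slo_seconds

def check_row_count(actual: int, expected_min: int) -> bool:
    """True if the load produced at least the expected number of rows."""
    return actual >= expected_min

alerts = []
if not check_freshness(last_load_epoch=1_000, now_epoch=10_000):
    alerts.append("freshness SLO breached")
if not check_row_count(actual=0, expected_min=1):
    alerts.append("row count below minimum")
print(alerts)
# → ['freshness SLO breached', 'row count below minimum']
```

In a Databricks/ADF setup, checks like these would typically run as a final job task, with failures routed to the team's alerting channel.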
