Senior Data Engineer @ AVENGA

2 weeks ago


Wrocław, Poland AVENGA (employment agency, KRAZ no. 8448) Full time

Must-have competences & skills:
- Azure Databricks (PySpark, Spark SQL; Unity Catalog; Jobs/Workflows)
- Azure data services: Azure Data Factory, Azure Key Vault, storage (ADLS), fundamentals of networking/identities
- Advanced SQL and solid data modeling in a lakehouse/Delta setup
- Python for data engineering (APIs, utilities, tests)
- Azure DevOps (Repos, Pipelines, YAML) and Git-based workflows
- Experience operating production pipelines (monitoring, alerting, incident handling, cost control)

Nice to have: AI

Project: the project is about e-purchasing, ranging from data about buyers, sellers, and material prices, through KPI calculations, to the entire parts-ordering process. The project is divided into two teams. The business uses our SAP BO solutions and Power BI dashboards. The project processes truly massive amounts of data, measured in terabytes.

Team: every team in our company works in Scrum. Each team includes several Data Engineers, Data Analysts, and DPOs, and every team is international.

Tech stack you’ll meet: Azure, Databricks (PySpark/Spark SQL, Unity Catalog, Workflows), ADF, ADLS/Delta, Key Vault, Azure DevOps (Repos/Pipelines, YAML), Python, SQL.

Responsibilities:
- Own day-to-day operations of Picto data pipelines (ingest → transform → publish), ensuring reliability, performance, and cost efficiency.
- Develop and maintain Databricks notebooks (PySpark/Spark SQL) and ADF pipelines/triggers; manage Jobs/Workflows and CI/CD.
- Implement data quality checks, monitoring, and alerting (SLA/SLO); troubleshoot incidents and perform root-cause analysis.
- Secure pipelines (Key Vault, identities, secrets) and follow platform standards (Unity Catalog, environments, branching).
- Collaborate with BI Analysts and Architects to align data models and outputs with business needs.
- Document datasets, flows, and runbooks; contribute to continuous improvement of the Ingestion Framework.
Requirements: Azure, Databricks, ADF, PySpark, Spark SQL, data pipelines, CI/CD, SLA, Key Vault, Unity Catalog, BI, data models, Azure Databricks, Azure Data Factory, storage, ADLS, networking, SQL, data modeling, Python, data engineering, Azure DevOps, YAML, Git, AI

Additionally: international projects, cafeteria system, Multisport card, integration events, insurance, friendly atmosphere, free coffee, canteen, bike parking, free beverages, no dress code, free parking, modern office.



  • Wrocław, Poland AVENGA (employment agency, KRAZ no. 8448) Full time

    Requirements: min. 5 years of experience as a Data Engineer; proven experience in Azure Databricks (data engineering, pipelines, performance tuning); Azure DevOps (Repos, Pipelines, YAML); Python; PySpark; SQL; Streaming; Workflows; Unity Catalog; SQL Server experience desirable; excellent communication and stakeholder management abilities. We are looking for a...


  • Wrocław, Poland AVENGA (employment agency, KRAZ no. 8448) Full time

    Development in the Microsoft SQL Server BI stack (T-SQL, SSIS); understanding of various data structures – data warehouses, tabular models; development of Power BI semantic models; design and development of front-ends with SQL Server Reporting Services and Power BI; ETL development – good knowledge of designing and developing high-performing ETL packages in SSIS, ADF...


  • Wrocław, Poland AVENGA (employment agency, KRAZ no. 8448) Full time

    Minimum 5 years of professional experience as a Data Scientist, Machine Learning Engineer, or in a similar role. Proven experience working with the Databricks platform. Hands-on experience with MLflow for building experiments and deploying models to production. Strong understanding and experience with recommender system algorithms and their deployment....

  • Senior Data Engineer

    2 weeks ago


    Wrocław, Poland Experis Polska Full time

    6+ years of professional experience with on-premises ETL tools (preferably SSIS). Advanced SQL skills, including performance tuning and optimization. .NET knowledge – ability to read and write code. Should have: Python skills (at least the ability to read and understand code). Nice to have: web development knowledge (HTML). Experience with Git repository...


  • Remote, Wrocław, Poland Ework Group Full time

    Azure Databricks (PySpark, Spark SQL; Unity Catalog; Jobs/Workflows). Azure data services: Azure Data Factory, Azure Key Vault, storage (ADLS), fundamentals of networking/identities. Python for data engineering (APIs, utilities, tests). Azure DevOps (Repos, Pipelines, YAML) and Git-based workflows. Experience operating production pipelines (monitoring,...


  • Zielona Góra, Wrocław, Poland Auctane Poland Sp. z o.o. Full time

    What are we looking for? At least 5 years of proven experience in data engineering roles. A strong data engineering background in a data warehouse or data lake architecture. Experience working in AWS/GCP cloud infrastructure. Experience developing and supporting robust, automated, and reliable data pipelines in Python and SQL. Mastery of Python and SQL...


  • Wrocław, Poland Data Hiro sp. z o.o. Full time

    5+ years of hands-on experience in implementing, developing, or maintaining data systems in commercial environments. Solid understanding of data modeling and data architecture principles. Strong Python skills, with a focus on writing clean, maintainable code and applying solid object-oriented design principles. Advanced SQL expertise, including performance...



  • Senior Data Engineer

    2 weeks ago


    Remote, Warsaw, Wrocław, Białystok, Kraków, Gdańsk, Poland Addepto Full time

    What you’ll need to succeed in this role: At least 5 years of commercial experience implementing, developing, or maintaining Big Data systems. Strong programming skills in Python: writing clean code, OOP design. Strong SQL skills, including performance tuning, query optimization, and experience with data warehousing solutions. Experience in...

  • Senior Data Engineer

    2 weeks ago


    Remote, Warsaw, Wrocław, Białystok, Kraków, Gdańsk, Poland Addepto Full time

    What you’ll need to succeed in this role: At least 5 years of commercial experience implementing, developing, or maintaining Big Data systems, data governance, and data management processes. Strong programming skills in Python (or Java/Scala): writing clean code, OOP design. Hands-on experience with Big Data technologies like Spark, Cloudera Data Platform,...