Data Engineer

7 days ago


Remote, Warszawa, Wrocław, Białystok, Kraków, Gdańsk, Czech Republic | Addepto | Full time
What you'll need to succeed in this role:
  • At least 3 years of commercial experience implementing, developing, or maintaining Big Data systems, data governance and data management processes.
  • Strong programming skills in Python (or Java/Scala): clean code, OOP design.
  • Hands-on experience with Big Data technologies such as Spark, Cloudera Data Platform, Airflow, NiFi, Docker, Kubernetes, Iceberg, Trino, or Hudi.
  • Excellent understanding of dimensional data and data-modeling techniques.
  • Experience implementing and deploying solutions in cloud environments.
  • Consulting experience, with excellent communication and client management skills and a track record of working directly with clients.
  • Ability to work independently and take ownership of project deliverables.
  • Fluent English (at least C1 level).
  • Bachelor's degree in technical or mathematical studies.
Nice to have:
  • Experience with an MLOps framework such as Kubeflow or MLFlow.
  • Familiarity with Databricks, dbt or Kafka.

Addepto is a leading consulting and technology company specializing in AI and Big Data, helping clients deliver innovative data projects. We partner with top-tier global enterprises and pioneering startups, including Rolls Royce, Continental, Porsche, ABB, and WGU. Our exclusive focus on AI and Big Data has earned us recognition by Forbes as one of the top 10 AI consulting companies.

As a Data Engineer, you will have the exciting opportunity to work with a team of technology experts on challenging projects across various industries, leveraging cutting-edge technologies. Here are some of the projects we are seeking talented individuals to join:

  • Development and maintenance of a large platform for processing automotive data. A significant amount of data is processed in both streaming and batch modes. The technology stack includes Spark, Cloudera, Airflow, Iceberg, Python, and AWS.
  • Design and development of a universal data platform for global aerospace companies. This Azure- and Databricks-powered initiative combines diverse enterprise and public data sources. The platform is in the early stages of development, covering architecture and process design, with freedom in technology selection.
  • Centralized reporting platform for a growing US telecommunications company. This project involves implementing BigQuery and Looker as the central platform for data reporting. It focuses on centralizing data, integrating various CRMs, and building executive reporting solutions to support decision-making and business growth.


Discover our perks and benefits:
  • Work in a supportive team of passionate enthusiasts of AI & Big Data.
  • Engage with top-tier global enterprises and cutting-edge startups on international projects.
  • Enjoy flexible work arrangements, allowing you to work remotely or from modern offices and coworking spaces.
  • Accelerate your professional growth through career paths, knowledge-sharing initiatives, language classes, and sponsored training or conferences, including a partnership with Databricks, which offers industry-leading training materials and certifications.
  • Choose from various employment options: B2B, employment contracts, or contracts of mandate.
  • Make use of 20 fully paid days off available for B2B contractors and individuals under contracts of mandate.
  • Participate in team-building events and utilize the integration budget.
  • Celebrate work anniversaries, birthdays, and milestones.
  • Access medical and sports packages, eye care, and well-being support services, including psychotherapy and coaching.
  • Get full work equipment for optimal productivity, including a laptop and other necessary devices.
  • With our backing, you can boost your personal brand by speaking at conferences, writing for our blog, or participating in meetups.
  • Experience a smooth onboarding with a dedicated buddy, and start your journey in our friendly, supportive, and autonomous culture.
Your responsibilities:
  • Develop and maintain a high-performance data processing platform for automotive data, ensuring scalability and reliability.
  • Design and implement data pipelines that process large volumes of data in both streaming and batch modes.
  • Optimize data workflows to ensure efficient data ingestion, processing, and storage using technologies such as Spark, Cloudera, and Airflow.
  • Work with data lake technologies (e.g., Iceberg) to manage structured and unstructured data efficiently.
  • Collaborate with cross-functional teams to understand data requirements and ensure seamless integration of data sources.
  • Monitor and troubleshoot the platform, ensuring high availability, performance, and accuracy of data processing.
  • Leverage cloud services (AWS) for infrastructure management and scaling of processing workloads.
  • Write and maintain high-quality Python (or Java/Scala) code for data processing tasks and automation.

Requirements: Python, SQL, Spark, Airflow, AWS, Cloudera, CI/CD, Kubernetes, Kafka, NiFi, Trino, Hudi, Java, Scala, Docker, Databricks, MLOps, DevOps, Iceberg

Tools: Jira, Confluence, Wiki, GitHub, Agile, Scrum, Kanban

Additionally: Private healthcare, Multisport card, Referral bonus, MyBenefit cafeteria, International projects, Flat structure, Paid leave, Training budget, Language classes, Team building events, Small teams, Flexible form of employment, Flexible working hours and remote work possibility, Free coffee, Startup atmosphere, No dress code, In-house trainings.
