Junior Data Engineer

1 day ago


Remote, Warsaw, Wrocław, Białystok, Kraków, Gdańsk, Czech Republic Addepto Full time
What you’ll need to succeed in this role:
  • At least 1 year of proven commercial experience developing or maintaining Big Data systems.
  • Hands-on experience with Big Data technologies, including Databricks, Apache Spark, Airflow, and DBT (a short PySpark sketch follows this list).
  • Strong programming skills in Python: writing clean code, OOP design.
  • Experience in designing and implementing data governance and data management processes.
  • Experience implementing and deploying solutions in cloud environments (with a preference for Azure).
  • Practical knowledge of DevOps practices, including designing and maintaining CI/CD pipelines for data and ML workflows, and Terraform for Infrastructure as Code.
  • Knowledge of how to build and deploy Power BI reports and dashboards for data visualization.
  • Excellent understanding of dimensional data modeling techniques.
  • Excellent communication skills and consulting experience, including direct interaction with clients.
  • Ability to work independently and take ownership of project deliverables.
  • Bachelor’s or Master’s degree in Computer Science, Data Science, Mathematics, Physics, or a related field.
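
To make the Spark and clean-code expectations above concrete, here is a minimal, purely illustrative PySpark sketch of a small, testable transformation. The table names, columns, and job name are hypothetical assumptions, not details from this posting.

```python
# Illustrative only: a small PySpark job structured for readability and testing.
# All table and column names below are hypothetical.
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F
from pyspark.sql.window import Window


def deduplicate_latest(df: DataFrame, key: str, ts_col: str) -> DataFrame:
    """Keep only the most recent record per key, ordered by a timestamp column."""
    w = Window.partitionBy(key).orderBy(F.col(ts_col).desc())
    return (
        df.withColumn("_rn", F.row_number().over(w))
        .filter(F.col("_rn") == 1)
        .drop("_rn")
    )


if __name__ == "__main__":
    spark = SparkSession.builder.appName("curate-orders").getOrCreate()
    raw = spark.read.table("raw.orders")  # hypothetical source table
    curated = deduplicate_latest(raw, key="order_id", ts_col="updated_at")
    curated.write.mode("overwrite").saveAsTable("curated.orders")  # hypothetical target
```

Keeping the transformation in a pure function of DataFrames makes it easy to unit-test with small in-memory inputs, which is one common reading of "clean code" in a Spark context.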

Addepto is a leading consulting and technology company specializing in AI and Big Data, helping clients deliver innovative data projects. We partner with top-tier global enterprises and pioneering startups, including Rolls Royce, Continental, Porsche, ABB, and WGU. Our exclusive focus on AI and Big Data has earned us recognition by Forbes as one of the top 10 AI consulting companies.

As a Junior Data Engineer, you will have the exciting opportunity to work with a team of technology experts on challenging projects across various industries, leveraging cutting-edge technologies. Here are some of the projects we are seeking talented individuals to join:

  • Design and development of a universal data platform for global aerospace companies. This Azure- and Databricks-powered initiative combines diverse enterprise and public data sources. The platform is at an early stage of development, covering the design of architecture and processes while leaving freedom for technology selection.
  • Data Platform Transformation for an energy management association. This project addresses critical data management challenges, boosting user adoption, performance, and data integrity. The team is implementing a comprehensive data catalog, leveraging Databricks and Apache Spark/PySpark, for simplified data access and governance. Secure integration solutions and enhanced data quality monitoring, using Delta Live Tables tests, have established trust in the platform. The intermediate result is a user-friendly, secure, and data-driven platform that serves as a basis for further development of ML components.
  • Design of data transformation and downstream DataOps pipelines for a global car manufacturer. This project aims to build a data processing system for both real-time streaming and batch data (a brief sketch follows this list). We’ll handle data for business uses like process monitoring, analysis, and reporting, while also exploring LLMs for chatbots and data analysis. Key tasks include data cleaning, normalization, and optimizing the data model for performance and accuracy.
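
For flavor only, here is a hedged sketch of the streaming-plus-batch pattern the last project describes, using Spark Structured Streaming with windowed aggregation for monitoring. The Kafka broker, topic, message schema, and storage paths are illustrative assumptions, not project details.

```python
# Illustrative only: consume vehicle telemetry from Kafka, aggregate it over
# event-time windows, and append the results to a Delta table for reporting.
# Broker address, topic, schema, and paths are all hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import (DoubleType, StringType, StructField, StructType,
                               TimestampType)

spark = SparkSession.builder.appName("telemetry-stream").getOrCreate()

schema = StructType([
    StructField("vehicle_id", StringType()),
    StructField("metric", StringType()),
    StructField("value", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Streaming source: real-time telemetry (requires the Spark-Kafka connector).
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "vehicle-telemetry")          # hypothetical topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Windowed aggregation for process monitoring; the watermark bounds late data.
monitoring = (
    events.withWatermark("event_time", "10 minutes")
    .groupBy(F.window("event_time", "5 minutes"), "metric")
    .agg(F.avg("value").alias("avg_value"))
)

query = (
    monitoring.writeStream.outputMode("append")
    .format("delta")                                 # assumes Delta Lake is available
    .option("checkpointLocation", "/chk/telemetry")  # hypothetical path
    .start("/tables/telemetry_monitoring")           # hypothetical path
)
query.awaitTermination()
```

The same aggregation logic can be reused over historical data with a plain batch `spark.read`, which is the usual way such projects keep the streaming and batch paths consistent.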


Discover our perks and benefits:
  • Work in a supportive team of passionate enthusiasts of AI & Big Data.
  • Engage with top-tier global enterprises and cutting-edge startups on international projects.
  • Enjoy flexible work arrangements, allowing you to work remotely or from modern offices and coworking spaces.
  • Accelerate your professional growth through career paths, knowledge-sharing initiatives, language classes, and sponsored training or conferences, including a partnership with Databricks, which offers industry-leading training materials and certifications.
  • Choose from various employment options: B2B, employment contracts, or contracts of mandate.
  • Make use of 20 fully paid days off available for B2B contractors and individuals under contracts of mandate.
  • Participate in team-building events and utilize the integration budget.
  • Celebrate work anniversaries, birthdays, and milestones.
  • Access medical and sports packages, eye care, and well-being support services, including psychotherapy and coaching.
  • Get full work equipment for optimal productivity, including a laptop and other necessary devices.
  • With our backing, you can boost your personal brand by speaking at conferences, writing for our blog, or participating in meetups.
  • Experience a smooth onboarding with a dedicated buddy, and start your journey in our friendly, supportive, and autonomous culture.
Your responsibilities:
  • Design scalable data processing pipelines for streaming and batch processing using Big Data technologies like Databricks, Airflow, and/or Dagster.
  • Contribute to the development of CI/CD and MLOps processes.
  • Develop applications to aggregate, process, and analyze data from diverse sources.
  • Collaborate with the Data Science team on Machine Learning projects, including text/image analysis and predictive model building.
  • Develop and organize data transformations using Databricks/DBT and Apache Airflow (see the orchestration sketch below).
  • Translate business requirements into technical solutions and ensure optimal performance and quality.

Requirements: Python, SQL, ETL, Azure, Airflow, Databricks, Spark, Docker, CI/CD, Kubernetes, Kafka, Power BI, Dagster, dbt.

Tools: Jira, Confluence, Wiki, GitHub, Agile, Scrum, Kanban.

Additionally: Private healthcare, Multisport card, Referral bonus, MyBenefit cafeteria, International projects, Flat structure, Paid leave, Training budget, Language classes, Team building events, Small teams, Flexible form of employment, Flexible working hours and remote work possibility, Free coffee, Startup atmosphere, No dress code, In-house trainings.
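
To illustrate the orchestration responsibility above, here is a minimal sketch of an Airflow DAG that runs dbt transformations and tests on a daily schedule. It assumes Airflow 2.4+ and a dbt project at a hypothetical path; none of these specifics come from the posting.

```python
# Illustrative only: a daily Airflow DAG that runs and then tests a dbt project.
# The project path and schedule are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_transformations",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # `schedule` requires Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",  # hypothetical path
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )
    dbt_run >> dbt_test  # only test models after they have been built
```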

  • Remote, Wrocław, Czech Republic Comscore (via CC) Full time

    An ideal candidate would have: a solid foundation in Python programming, proficiency in SQL, and a basic understanding of statistical concepts; knowledge of testing methodologies and practical experience in API testing and data testing; experience in testing large datasets is...


  • Remote, Wrocław, Czech Republic beBeeData Full time 6,000 - 8,000

    As a Junior Linux Engineer in Big Data validation, you will join an international team and collaborate with industry leaders. Our company is a global leader in media analytics, providing insights into consumer behavior and digital engagement. We are seeking a skilled engineer to work closely with our AI and data engineering teams to ensure the quality of data,...


  • Remote, Kraków, Białystok, Wrocław, Czech Republic Grape Up Full time

    PhD or master's degree in Computer Science, Data Science, AI, or a related field; 5+ years of professional experience in Data Engineering and Big Data; proven experience in implementing and deploying solutions in AWS using the AWS stack (Redshift, Kinesis, Athena); proven experience with AWS data processing (Glue, EMR); experience with data pipelines orchestration...


  • Remote, Warszawa, Wrocław, Białystok, Kraków, Gdańsk, Czech Republic beBeeData Full time €80,000 - €100,000

    Job Title: High-Performance Data Engineer. Job Description: We are seeking an experienced and skilled High-Performance Data Engineer to join our team. As a senior data engineer, you will play a key role in designing and implementing high-performance data processing platforms for large-scale automotive data. The ideal candidate will have extensive experience...

  • Senior Data Engineer

    2 weeks ago


    Remote, Gdańsk, Wrocław, Warsaw, Kraków, Poznań, Czech Republic RemoDevs Full time

    Proven experience with Azure Databricks and Azure Data Factory (ADF). Strong skills in SQL and Python for data engineering. Experience in building pipelines and data models. Good English (minimum B2) to communicate in an international team. Experience with Agile methods and Azure DevOps. We are looking for skilled Data Engineers to join a team working on an...

  • Data Engineer

    1 day ago


    Remote, Warsaw, Wrocław, Białystok, Kraków, Gdańsk, Czech Republic Addepto Full time

    What you’ll need to succeed in this role: At least 3 years of commercial experience implementing, developing, or maintaining Big Data systems. Strong programming skills in Python: writing clean code, OOP design. Strong SQL skills, including performance tuning, query optimization, and experience with data warehousing solutions. Experience in...


  • Remote, Warsaw, Wrocław, Białystok, Kraków, Gdańsk, Czech Republic Addepto Full time

    What you’ll need to succeed in this role: 5+ years of commercial experience in designing and implementing scalable AI solutions (Machine Learning, Predictive Modeling, Optimization, NLP, Computer Vision, GenAI, LLMs, Deep Learning). Proficiency in developing ML algorithms from scratch to production deployment. Strong programming skills in Python:...


  • Senior Data

    2 weeks ago


    Remote, Kraków, Wrocław, Gdańsk, Warsaw, Czech Republic N-iX Full time

    5+ years of experience in Data Engineering or Analytics Engineering roles. Strong experience building and maintaining pipelines in BigQuery, Athena, Glue, and Airflow. Advanced SQL skills and experience designing dimensional models (star/snowflake). Experience with AWS Cloud. Solid Python skills, especially for data processing and workflow...