Senior Data Engineer

5 days ago


Remote · Warszawa · Wrocław · Białystok · Kraków · Gdańsk · Czech Republic | Addepto | Full time
What you'll need to succeed in this role:
  • 5+ years of experience in a Data Engineer or similar role.
  • Proficiency in Python for data processing and scripting.
  • Strong experience with Google Cloud Platform, especially: BigQuery, Compute Instances, IAM (Identity and Access Management).
  • Experience with Terraform and Terragrunt for infrastructure automation.
  • Hands-on experience with Dagster or similar orchestration frameworks.
  • Advanced knowledge of Looker and Looker Studio.
  • Familiarity with Docker and GitHub Actions for containerization and automation.
  • Solid understanding of dbt and SQL.
  • Experience with Airbyte or other ETL/ELT platforms.
  • Strong problem-solving skills and ability to work in cross-functional teams.
  • Excellent written and verbal communication skills in English.
  • Previous experience in a consulting or client-facing role.


Nice to have:
  • Knowledge of data modeling best practices (e.g., Kimball, Data Vault).
  • Familiarity with data quality frameworks and data observability tools.

Addepto is a leading consulting and technology company specializing in AI and Big Data, helping clients deliver innovative data projects. We partner with top-tier global enterprises and pioneering startups, including Rolls-Royce, Continental, Porsche, ABB, and WGU. Our exclusive focus on AI and Big Data has earned us recognition by Forbes as one of the top 10 AI consulting companies.

As a Senior Data Engineer, you will have the exciting opportunity to work with a team of technology experts on challenging projects across various industries, leveraging cutting-edge technologies. Here are some of the projects we are seeking talented individuals to join:

  • Building a modern data warehouse for a client in the US retail industry, enabling data-driven decisions and laying the foundations for future AI features. This role requires a consultant mindset to guide the client through the product's roadmap.
  • Development of an operational warehouse for a major automotive client, supporting near real-time processing and building foundations for integrating AI into their business processes.
  • Development and maintenance of a large platform for processing automotive data. A significant amount of data is processed in both streaming and batch modes. The technology stack includes Spark, Cloudera, Airflow, Iceberg, Python, and AWS.


Discover our perks and benefits:
  • Work in a supportive team of passionate enthusiasts of AI & Big Data.
  • Engage with top-tier global enterprises and cutting-edge startups on international projects.
  • Enjoy flexible work arrangements, allowing you to work remotely or from modern offices and coworking spaces.
  • Accelerate your professional growth through career paths, knowledge-sharing initiatives, language classes, and sponsored training or conferences, including a partnership with Databricks, which offers industry-leading training materials and certifications.
  • Choose from various employment options: B2B, employment contracts, or contracts of mandate.
  • Make use of 20 fully paid days off available for B2B contractors and individuals under contracts of mandate.
  • Participate in team-building events and utilize the integration budget.
  • Celebrate work anniversaries, birthdays, and milestones.
  • Access medical and sports packages, eye care, and well-being support services, including psychotherapy and coaching.
  • Get full work equipment for optimal productivity, including a laptop and other necessary devices.
  • With our backing, you can boost your personal brand by speaking at conferences, writing for our blog, or participating in meetups.
  • Experience a smooth onboarding with a dedicated buddy, and start your journey in our friendly, supportive, and autonomous culture.
Your responsibilities:
  • Design, develop, and maintain scalable data pipelines and ETL processes in GCP.
  • Collaborate with BI teams to develop semantic models and visualizations in Looker and Looker Studio.
  • Work with infrastructure as code using Terraform and Terragrunt.
  • Build orchestrated workflows with Dagster for reliable and repeatable data processing.
  • Integrate and manage data sources via Airbyte or similar ETL tools.
  • Optimize and maintain BigQuery datasets and compute resources.
  • Ensure high-quality data through testing, monitoring, and CI/CD using GitHub Actions.
  • Contribute to overall data architecture, best practices, and documentation.
  • Collaborate with cross-functional teams to understand business needs and translate them into effective data solutions that align with reporting and strategic goals.

Requirements: Python, SQL, GCP, Looker, Terraform, Dagster, dbt, GitHub Actions, Terragrunt, Looker Studio, Airbyte.

Tools: Jira, Confluence, Wiki, GitHub, Agile, Scrum, Kanban.

Additionally: Private healthcare, Multisport card, Referral bonus, MyBenefit cafeteria, International projects, Flat structure, Paid leave, Training budget, Language classes, Team building events, Small teams, Flexible form of employment, Flexible working hours and remote work possibility, Free coffee, Startup atmosphere, No dress code, In-house trainings.
