Data Engineer with AWS

1 week ago


Wrocław, Kraków, Gdańsk, Poznań, Warszawa, Czech Republic Capgemini Polska Sp. z o.o. Full time
  • You have hands-on experience in data engineering and can independently handle moderately complex tasks.
  • You've worked with PySpark in distributed data processing scenarios.
  • You are familiar with AWS data services such as S3, Glue, EMR, Lambda, Redshift, or similar.
  • You have experience working with an additional major cloud platform (Azure or GCP).
  • You are proficient in Python for ETL and automation tasks.
  • You communicate clearly and confidently in English.

Nice to Have

  • Strong SQL skills and understanding of data modeling.
  • Exposure to CI/CD pipelines, Terraform, or CloudFormation.
  • Familiarity with streaming technologies like Kafka or Kinesis.
  • AWS certifications.

Join our growing Insights & Data team — over 400 professionals delivering advanced, data-driven solutions. We specialize in Cloud & Big Data engineering, building scalable data architectures on AWS. We manage the full Software Development Life Cycle (SDLC) using modern frameworks, agile methodologies, and DevOps best practices.

Your responsibilities:

  • Design and build data pipelines using AWS services and PySpark.
  • Process large-scale datasets efficiently and reliably.
  • Collaborate with architects and team members to implement scalable data solutions.
  • Ensure data quality, consistency, and security in cloud environments.
  • Participate in code reviews and contribute to continuous improvement efforts.

Requirements: AWS, PySpark, SQL, CI/CD, Terraform. Additionally: Private healthcare, Training budget, Free parking, In-house trainings.

  • Cloud Data Engineer

    1 week ago


    Wrocław, Kraków, Gdańsk, Poznań, Warszawa, Czech Republic beBeeDataEngineer Full time €75,000 - €98,000

    We are seeking a highly skilled data professional with expertise in cloud-based data engineering and processing. Responsibilities: Design, build, and manage scalable data pipelines using AWS services and PySpark. Process large-scale datasets efficiently, ensuring high-quality results and reliability. Collaborate with cross-functional teams to develop innovative...


  • Remote, Wrocław, Gdańsk, Rzeszów, Czech Republic Xebia sp. z o.o. Full time

    3+ years' experience with AWS (Glue, Lambda, Redshift, RDS, S3), 5+ years' experience with data engineering or backend/fullstack software development, strong SQL skills, Python scripting proficiency, experience with data transformation tools – Databricks and Spark, data manipulation libraries (such as Pandas, NumPy, PySpark), experience in structuring and...


  • Wrocław, Gdańsk, Kraków, Poznań, Warsaw, Czech Republic Capgemini Polska Sp. z o.o. Full time

    You have hands-on experience in data engineering and are comfortable working independently on moderately complex tasks. You've worked with Databricks and PySpark in real-world projects. You're strong in Python for data transformation and automation. You've used at least one cloud platform (AWS, Azure, or GCP) in a production environment. You communicate clearly...


  • Warszawa, Mazovia, Czech Republic Link Group Full time

    5+ years of experience in Python and strong software engineering skills. Solid knowledge of AWS cloud services and best practices. Experience building scalable Spark pipelines in PySpark or Scala. Practical experience with Spark Streaming for low-latency pipelines. Familiarity with Delta Lake and modern data lakehouse architectures. Hands-on experience with Kubernetes...

  • AWS Engineer @

    2 days ago


    Warszawa, Gdańsk, Czech Republic AVENGA (Agencja Pracy, nr KRAZ: 8448) Full time

    Orchestration services: Step Functions, EventBridge, Airflow (MWAA), Lambda. Processing: Glue, EMR, EKS. Storage service: S3. Querying and analysis services: Athena. Big data ETL development and frameworks using PySpark with Python and/or Spark with Scala (nice to have). Experience working with Python/Pandas transformations. Experience with the Hadoop ecosystem: Hive,...

  • Data Engineer @

    1 week ago


    Kraków, Wrocław, Warszawa, Czech Republic Unit8 SA Full time

    As a member of agile project teams, your mission will be to build solutions and infrastructure aimed at solving the business problems of our clients. You are a proficient software engineer who knows the fundamentals of computer science and you master at least one widely adopted programming language (Python, Java, C#, C++). You know how to write...


  • Wrocław, Gdańsk, Kraków, Poznań, Warszawa, Czech Republic beBeeData Full time 900,000 - 1,200,000

    Job Title: Senior Data Engineer. Overview: We are seeking an experienced Senior Data Engineer to join our team. As a key member of our Insights & Data team, you will design and maintain robust data pipelines using Databricks and PySpark. Responsibilities: Develop and maintain data processing pipelines using Databricks and PySpark. Collaborate with...


  • Kraków, Wrocław, Warszawa, Czech Republic Unit8 SA Full time

    About You: MSc level in the field of Computer Science, Machine Learning, Applied Statistics, Mathematics, or equivalent work experience. You are a proficient software engineer who knows the fundamentals of computer science and has experience in applying a blend of software engineering, machine learning, and statistical methods to solve real-world business...


  • Kraków, Warszawa, Wrocław, Czech Republic Capgemini Polska Sp. z o.o. Full time

    YOUR PROFILE: You have hands-on experience in data engineering and are comfortable working independently on moderately complex tasks. You've worked with Databricks and PySpark in real-world projects. You're proficient in Python for data transformation and automation. You've used at least one cloud platform (AWS, Azure, or GCP) in a production environment. You...