Data Engineer with AWS
7 days ago
- You have hands-on experience in data engineering and can independently handle moderately complex tasks.
- You've worked with PySpark in distributed data processing scenarios.
- You are familiar with AWS data services such as S3, Glue, EMR, Lambda, Redshift, or similar.
- You have experience with an additional major cloud platform (Azure or GCP).
- You are proficient in Python for ETL and automation tasks.
- You communicate clearly and confidently in English.
Nice to Have
- Strong SQL skills and understanding of data modeling.
- Exposure to CI/CD pipelines, Terraform, or CloudFormation.
- Familiarity with streaming technologies like Kafka or Kinesis.
- AWS certifications.
Join our growing Insights & Data team—over 400 professionals delivering advanced, data-driven solutions. We specialize in Cloud & Big Data engineering, building scalable data architectures on AWS. We manage the full Software Development Life Cycle (SDLC) using modern frameworks, agile methodologies, and DevOps best practices.
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.
WHAT YOU'LL LOVE ABOUT WORKING HERE
- Enjoy a hybrid working model that fits your life: after completing onboarding, combine work from a modern office with ergonomic work from home, thanks to a home office package (including laptop, monitor, and chair). Ask your recruiter about the details.
- Access to over 70 training tracks with certification opportunities (e.g., GenAI, Architects, Google) on our NEXT platform. Dive into a world of knowledge with free access to the Education First language platform, Pluralsight, TED Talks, Coursera, and Udemy Business materials and courses.
- Practical benefits: private medical care with Medicover with additional packages (e.g., dental, senior care, oncology) available on preferential terms, life insurance and 40+ options on our NAIS benefit platform, including Netflix, Spotify or Multisport.
Capgemini is committed to diversity and inclusion, ensuring fairness in all employment practices. We evaluate individuals based on qualifications and performance, not personal characteristics, striving to create a workplace where everyone can succeed and feel valued.
Do you want to get to know us better? Check our Instagram — @capgeminipl or visit our Facebook profile — Capgemini Polska. You can also find us on TikTok — @capgeminipl.
ABOUT CAPGEMINI
Capgemini is a global leader in partnering with companies to transform and manage their business by harnessing the power of technology. The Group is guided every day by its purpose of unleashing human energy through technology for an inclusive and sustainable future. It is a responsible and diverse organization of over 360,000 team members globally in more than 50 countries. With its strong 55-year heritage and deep industry expertise, Capgemini is trusted by its clients to address the entire breadth of their business needs, from strategy and design to operations, fueled by the fast-evolving and innovative world of cloud, data, AI, connectivity, software, digital engineering and platforms.
Apply now
- Design and build data pipelines using AWS services and PySpark.
- Process large-scale datasets efficiently and reliably.
- Collaborate with architects and team members to implement scalable data solutions.
- Ensure data quality, consistency, and security in cloud environments.
- Participate in code reviews and contribute to continuous improvement efforts.
Requirements: Data engineering, PySpark, AWS, AWS S3, Glue, Amazon EMR, AWS Lambda, Redshift, Cloud platform, ETL, GCP, Python, SQL, Data modeling, CI/CD, Terraform, CloudFormation, Kafka, Kinesis
Additionally: Training budget, Sport subscription, Private healthcare, International projects, Free coffee, Bike parking, Free parking, Mobile phone, In-house trainings, Modern office, No dress code.