
Data Engineer with AWS
21 hours ago
Your Profile
- You have hands-on experience in data engineering and can independently handle moderately complex tasks.
- You’ve worked with PySpark in distributed data processing scenarios.
- You are familiar with AWS data services such as S3, Glue, EMR, Lambda, Redshift, or similar.
- You have experience working with an additional major cloud platform (Azure or GCP).
- You are proficient in Python for ETL and automation tasks.
- You communicate clearly and confidently in English.
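The "Python for ETL and automation" item can be sketched with a small, pure-Python cleaning step of the kind that would typically sit inside a PySpark UDF or Glue job. All names here (`clean_record`, the `id`/`amount` columns) are illustrative, not taken from the posting:

```python
import csv
import io

def clean_record(row):
    """Normalize one raw CSV row: trim whitespace and keys, coerce the
    amount to float, and drop rows with no usable id (a typical ETL step)."""
    record = {k.strip().lower(): v.strip() for k, v in row.items()}
    if not record.get("id"):
        return None  # filtered out below
    record["amount"] = float(record.get("amount") or 0.0)
    return record

def run_etl(raw_csv):
    """Parse raw CSV text and return the cleaned records."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    cleaned = (clean_record(r) for r in reader)
    return [r for r in cleaned if r is not None]

raw = "ID, Amount\n a1 , 10.5\n , 3.0\n b2 , \n"
print(run_etl(raw))  # → [{'id': 'a1', 'amount': 10.5}, {'id': 'b2', 'amount': 0.0}]
```

In a distributed setting the same per-record function would be mapped over a PySpark DataFrame rather than a local list.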
Nice to Have
- Strong SQL skills and understanding of data modeling.
- Exposure to CI/CD pipelines, Terraform, or CloudFormation.
- Familiarity with streaming technologies like Kafka or Kinesis.
- AWS certifications
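The "strong SQL skills and understanding of data modeling" item amounts to queries over dimensional schemas like the minimal star-schema sketch below. The tables and columns are hypothetical; in-memory SQLite is used only so the snippet is self-contained (on the job this would be Redshift or a similar warehouse):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Minimal dimensional model: one fact table, one dimension.
    CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE fact_orders  (order_id INTEGER PRIMARY KEY,
                               customer_id INTEGER REFERENCES dim_customer,
                               amount REAL);
    INSERT INTO dim_customer VALUES (1, 'EU'), (2, 'US');
    INSERT INTO fact_orders  VALUES (10, 1, 99.0), (11, 1, 1.0), (12, 2, 50.0);
""")

# Typical analytical query: aggregate facts by a dimension attribute.
rows = conn.execute("""
    SELECT c.region, SUM(f.amount) AS revenue
    FROM fact_orders f
    JOIN dim_customer c USING (customer_id)
    GROUP BY c.region
    ORDER BY c.region
""").fetchall()
print(rows)  # → [('EU', 100.0), ('US', 50.0)]
```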
Join our growing Insights & Data team—over 400 professionals delivering advanced, data-driven solutions. We specialize in Cloud & Big Data engineering, building scalable data architectures on AWS. We manage the full Software Development Life Cycle (SDLC) using modern frameworks, agile methodologies, and DevOps best practices.
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you’d like, where you’ll be supported and inspired by a collaborative community of colleagues around the world, and where you’ll be able to reimagine what’s possible. Join us and help the world’s leading organizations unlock the value of technology and build a more sustainable, more inclusive world.
WHAT YOU’LL LOVE ABOUT WORKING HERE
- Enjoy a hybrid working model that fits your life: after completing onboarding, combine work from a modern office with ergonomic work from home, thanks to a home office package (including laptop, monitor, and chair). Ask your recruiter about the details.
- Access to over 70 training tracks with certification opportunities (e.g., GenAI, Architects, Google) on our NEXT platform. Dive into a world of knowledge with free access to Education First languages platform, Pluralsight, TED Talks, Coursera and Udemy Business materials and trainings.
- Practical benefits: private medical care with Medicover with additional packages (e.g., dental, senior care, oncology) available on preferential terms, life insurance and 40+ options on our NAIS benefit platform, including Netflix, Spotify or Multisport.
Capgemini is committed to diversity and inclusion, ensuring fairness in all employment practices. We evaluate individuals based on qualifications and performance, not personal characteristics, striving to create a workplace where everyone can succeed and feel valued.
Do you want to get to know us better? Check our Instagram — @capgeminipl or visit our Facebook profile — Capgemini Polska. You can also find us on TikTok — @capgeminipl.
ABOUT CAPGEMINI
Capgemini is a global leader in partnering with companies to transform and manage their business by harnessing the power of technology. The Group is guided every day by its purpose of unleashing human energy through technology for an inclusive and sustainable future. It is a responsible and diverse organization of over 360,000 team members globally in more than 50 countries. With its strong 55-year heritage and deep industry expertise, Capgemini is trusted by its clients to address the entire breadth of their business needs, from strategy and design to operations, fueled by the fast-evolving and innovative world of cloud, data, AI, connectivity, software, digital engineering and platforms.
Apply now
Your Role
- Design and build data pipelines using AWS services and PySpark.
- Process large-scale datasets efficiently and reliably.
- Collaborate with architects and team members to implement scalable data solutions.
- Ensure data quality, consistency, and security in cloud environments.
- Participate in code reviews and contribute to continuous improvement efforts.

Requirements: Data engineering, PySpark, AWS, AWS S3, Glue, Amazon EMR, AWS Lambda, Redshift, Cloud platform, ETL, GCP, Python, SQL, Data modeling, CI/CD, Terraform, CloudFormation, Kafka, Kinesis

Additionally: Training budget, Sport subscription, Private healthcare, International projects, Free coffee, Bike parking, Free parking, Mobile phone, In-house trainings, Modern office, No dress code.
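The "ensure data quality, consistency, and security" responsibility usually reduces to assertable checks run on every batch before it is loaded. A minimal pure-Python sketch of that idea follows; tools like PySpark or Deequ express the same checks at cluster scale, and all field names here are illustrative:

```python
def check_batch(records, required=("id", "amount")):
    """Run simple quality checks on a batch of records and return a dict
    of violation counts. An empty dict means the batch passed all checks."""
    violations = {"missing_field": 0, "negative_amount": 0, "duplicate_id": 0}
    seen = set()
    for rec in records:
        if any(rec.get(f) in (None, "") for f in required):
            violations["missing_field"] += 1
        if isinstance(rec.get("amount"), (int, float)) and rec["amount"] < 0:
            violations["negative_amount"] += 1
        if rec.get("id") in seen:
            violations["duplicate_id"] += 1
        seen.add(rec.get("id"))
    return {k: v for k, v in violations.items() if v}

batch = [
    {"id": "a", "amount": 5.0},
    {"id": "a", "amount": -1.0},   # duplicate id, negative amount
    {"id": "b"},                   # missing amount
]
print(check_batch(batch))  # → {'missing_field': 1, 'negative_amount': 1, 'duplicate_id': 1}
```

In a pipeline, a non-empty result would typically fail the job or route the batch to a quarantine location for review.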
AWS Cloud Data Architect Position
2 days ago
Remote, Wrocław, Gdańsk, Rzeszów, Czech Republic beBeeData Full time 2,200,000 - 2,500,000
About the Role
We are seeking an experienced Cloud Data Architect to design, build, and deploy scalable data processing systems using cloud technologies. The ideal candidate will have strong experience with AWS services such as Glue, Lambda, Redshift, RDS, and S3. They will also have proficiency in Python scripting, SQL skills, and experience with data...
-
AWS Engineer @
2 weeks ago
Warszawa, Gdańsk, Czech Republic AVENGA (Agencja Pracy, nr KRAZ: 8448) Full time
Orchestration services: Step Functions, EventBridge, Airflow (MWAA), Lambda
Processing: Glue, EMR, EKS
Storage service: S3
Querying and analysis services: Athena
Big data ETL development and frameworks using: PySpark with Python and/or Spark with Scala (nice to have)
Experience with Python/Pandas transformations
Experience with the Hadoop ecosystem: Hive,...
-
Warszawa, Czech Republic Link Group Full time
5+ years of experience in Python and strong software engineering skills
Solid knowledge of AWS cloud services and best practices
Experience building scalable Spark pipelines in PySpark or Scala
Practical experience with Spark Streaming for low-latency pipelines
Familiarity with Delta Lake and modern data lakehouse architectures
Hands-on experience with...
-
Cloud Data Solutions Specialist
5 days ago
Kraków, Gdańsk, Poznań, Wrocław, Warszawa, Czech Republic beBeeDataEngineering Full time 80,000 - 120,000
As a highly skilled Data Engineer, you will play a pivotal role in designing and implementing scalable data solutions using AWS services. We are seeking an expert in data engineering to handle moderately complex tasks independently. The ideal candidate will have hands-on experience in distributed data processing with PySpark and be proficient in utilizing AWS...
-
Senior AWS Data Engineer @ Xebia sp. z o.o.
21 hours ago
Remote, Wrocław, Gdańsk, Rzeszów, Czech Republic Xebia sp. z o.o. Full time
3+ years’ experience with AWS (Glue, Lambda, Redshift, RDS, S3), 5+ years’ experience with data engineering or backend/fullstack software development, strong SQL skills, Python scripting proficiency, experience with data transformation tools – Databricks and Spark, data manipulation libraries (such as Pandas, NumPy, PySpark), experience in structuring...
-
Azure Data Engineer @
5 days ago
Kraków, Wrocław, Poznań, Gdańsk, Warszawa, Czech Republic Capgemini Polska Sp. z o.o. Full time
YOUR PROFILE
Strong experience in data engineering, including working with Azure
Strong Python for data processing and automation
Hands-on experience with one of the following: Databricks, Snowflake, or Microsoft Fabric
Strong communication skills and very good English language skills
Nice to have: strong SQL skills and experience with database...
-
High-Performance Data Engineer
1 week ago
Remote, Warszawa, Wrocław, Białystok, Kraków, Gdańsk, Czech Republic beBeeData Full time €80,000 - €100,000
Job Title: High-Performance Data Engineer
Job Description: We are seeking an experienced and skilled High-Performance Data Engineer to join our team. As a senior data engineer, you will play a key role in designing and implementing high-performance data processing platforms for large-scale automotive data. The ideal candidate will have extensive experience...
-
Senior Data Engineer with Databricks @
4 days ago
Remote, Gdynia, Warszawa, Poznań, Kraków, Wrocław, Czech Republic Idego Group Sp. z o.o. Full time
Minimum 4+ years of experience as a Data Engineer
Proven commercial experience with Databricks
Strong knowledge of AWS (nice to have)
Proficiency in Python, PySpark, and SQL
Excellent command of English, both spoken and written
We are looking for a Senior Data Engineer to join one of our clients' projects in an international environment. Our perks: work...
-
Data Engineer with Databricks
5 days ago
Kraków, Warszawa, Wrocław, Czech Republic Capgemini Polska Sp. z o.o. Full time
YOUR PROFILE
You have hands-on experience in data engineering and are comfortable working independently on moderately complex tasks.
You've worked with Databricks and PySpark in real-world projects.
You're proficient in Python for data transformation and automation.
You've used at least one cloud platform (AWS, Azure, or GCP) in a production environment.
You...