Cloud Data Engineer @ Webellian

2 days ago


Warszawa, Mazovia, Poland | Webellian | Full time
Hard skills
  • Improve and refine data ingestion and transformation pipelines (ETL/ELT).
  • Proficiency in Python with a focus on PySpark (see the short sketch after this list)
  • Excellence in SQL and relational databases
  • Experience with Databricks and Azure Data Services is a plus
  • Continuous integration, deployment, and delivery practitioner (GitHub knowledge)
  • General understanding of infrastructure, orchestration and IT security principles (especially at the enterprise level)
  • Bachelor's (BSc), Master's (MSc) or equivalent experience in a technical field (for example, Computer Science, Engineering)
  • Fluent English (written and spoken) is a must; other languages (e.g. German, French, Italian) are a plus
  • Experience in the insurance domain is a strong plus
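
For orientation only, here is a minimal PySpark sketch of the kind of ETL step these skills describe. Everything in it is an assumption made for illustration: the file paths, the claims schema, the column names and the aggregation are invented and are not taken from the customer's platform.

```python
# Illustrative sketch only (not part of the posting): a toy ETL step in PySpark.
# The paths, schema and business rule below are invented for the example.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims_etl_sketch").getOrCreate()

# Extract: read a raw CSV of (hypothetical) insurance claims
claims = spark.read.option("header", True).csv("/data/raw/claims.csv")

# Transform: cast types and apply a simple business rule in PySpark
cleaned = (
    claims
    .withColumn("claim_amount", F.col("claim_amount").cast("double"))
    .filter(F.col("claim_amount") > 0)
)

# The same logic can also be expressed in SQL against a temporary view
cleaned.createOrReplaceTempView("claims_clean")
summary = spark.sql("""
    SELECT policy_id, SUM(claim_amount) AS total_claims
    FROM claims_clean
    GROUP BY policy_id
""")

# Load: write a curated output for downstream consumption (e.g. reporting)
summary.write.mode("overwrite").parquet("/data/curated/claims_summary")
```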
Soft skills
  • DevOps mindset (you build it, you run it)
  • Ability to understand complex requirements and break them down into actionable implementation tasks, with attention to business logic
  • Ability to communicate in a results-oriented way with people from different departments and skill sets
  • Writing clear technical specifications
  • Excellent verbal communication skills
  • Leadership, autonomy and drive to grow and learn new technologies and tools
About Webellian

Webellian is a well-established Digital Transformation and IT consulting company committed to creating a positive impact for our clients. We strive to make a meaningful difference in diverse sectors such as insurance, banking, healthcare, retail, and manufacturing. Our passion for cutting-edge and disruptive technologies, as well as our shared values and strong principles, are what motivate us. We are a community of engineers and senior advisors who work with our clients across industries, playing a deep and meaningful role in accelerating and realizing their vision and strategy.

About the position

We are looking for a Cloud Data Engineer to work on a project for one of our key customers in the insurance industry. You will work in hybrid mode with teammates based in Poland and other stakeholders located worldwide, and you will be in direct contact with the business users of the solution.

What we offer
  • Contract under Polish law: B2B or Umowa o Pracę (employment contract)
  • Benefits such as private medical care, group insurance, Multisport card
  • English classes
  • Hybrid work (at least 1 day/week on-site) in Warsaw (Mokotów)
  • Opportunity to work with excellent professionals
  • High standards of work and focus on the quality of code
  • New technologies in use
  • Continuous learning and growth
  • International team
  • Pinball, PlayStation & much more (on-site)

Interested? Please click the "Apply for this job" button and send us your CV in English.

Please include the following statement: I hereby authorize Webellian Poland Sp. z o.o. to process the personal data provided in this document for realising the recruitment process pursuant to the Personal Data Protection Act of 10 May 2018 (Journal of Laws 2018, item 1000) and in agreement with Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation).

Responsibilities
  • Leverage a global data platform and enrich it with additional data capabilities
  • Design and implement solutions for complete use case data pipelines: from data ingestion and storage, through data processing and implementation of business rules, to data consumption, e.g. reporting (a short sketch follows below)
  • Define and apply best practices for development and maintenance of the platform
  • Keep up with trends and evolving technology in the big data and analytics world
  • Look for opportunities to improve performance, reliability and automation

Requirements: Python, SQL, ETL, PySpark, Azure Data Services, Databricks
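
To make the bullet about complete use case pipelines concrete, here is a hypothetical skeleton of the ingestion, business rules, and consumption layers it mentions. The table names, the premium threshold, and the reporting database are assumptions for illustration only, not the customer's actual design.

```python
# Hypothetical pipeline skeleton (assumptions only): it mirrors the
# ingestion -> business rules -> consumption stages listed above.
from pyspark.sql import SparkSession, DataFrame, functions as F

spark = SparkSession.builder.appName("use_case_pipeline_sketch").getOrCreate()


def ingest(path: str) -> DataFrame:
    """Ingestion & storage: land raw JSON records as-is."""
    return spark.read.json(path)


def apply_business_rules(raw: DataFrame) -> DataFrame:
    """Processing: deduplicate and flag high-value policies (invented rule)."""
    return (
        raw.dropDuplicates(["policy_id"])
           .withColumn("high_value", F.col("premium") > 10_000)
    )


def publish(curated: DataFrame, table: str) -> None:
    """Consumption: expose a curated table that reporting tools can query."""
    curated.write.mode("overwrite").saveAsTable(table)


if __name__ == "__main__":
    # The database name and input path are placeholders for the example.
    spark.sql("CREATE DATABASE IF NOT EXISTS reporting")
    publish(apply_business_rules(ingest("/data/raw/policies/")),
            "reporting.policies_curated")
```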
