Data Engineer @

5 days ago


Kraków, Lesser Poland, Poland · Tesco Technology · Full time

This job requires you to be based in or close to Kraków. We currently work in a hybrid model and meet in our office three days a week.

Qualifications

Mandatory skills: 

  • Data Processing: Apache Spark - Scala or Python
  • Data Storage: Apache HDFS or respective cloud alternative
  • Resource Manager: Apache Yarn or respective cloud alternative
  • Lakehouse: Apache Hive/Kyuubi or alternative
  • Workflow Scheduler: Airflow or alternative

Nice to have skills:

  • Functional programming
  • Apache Kafka
  • Kubernetes
  • Stream processing
  • CI/CD

Unsure if you fit all the criteria? Apply and give us the chance to evaluate your potential – you could be the perfect fit.

Job Description

The Data Engineering department at Tesco Technology is at the forefront of data processing within the retail and technology industry. This vital department handles a range of responsibilities, including:

  • Analyzing order and delivery data to optimize logistics processes and enhance delivery efficiency.
  • Managing critical data related to customer orders, suppliers, and products to ensure the seamless flow of our fulfillment operations.
  • Upholding data integrity and security during the processing of order and delivery-related information.

In this role, you will take charge of expanding and refining our data and data pipeline architecture. Additionally, you will be instrumental in optimizing data flow and collection to cater to the needs of cross-functional teams.

This role calls for a high level of self-direction and the ability to effectively support the data requirements of multiple teams, systems, and products.

Company Description

Tesco is a leading multinational retailer, with more than 330,000 colleagues.

Our software is used by millions of people across several countries every day. Whether it's the tills and websites our customers use, or the systems our colleagues and partners use, you'll play your part in keeping it running like a well-oiled machine. And when a business problem pops up? You and the creative minds in our team will be challenged to solve it.

As a Tech Hub, we cooperate within the group of Tesco Technology Hubs located in the UK, Poland, Hungary, and India.

What our colleagues like the most at Tesco:

  • We develop our own products
  • We make an impact, operating at a large scale
  • We are trusted with accountability and treated with respect
  • We cooperate and support each other
  • We work in small teams alongside great colleagues
  • We can develop and learn new things

Additional Information

Hybrid working
We know life looks a little different for each of us. That's why at Tesco, we always welcome chats about different flexible working options. Some people are at the start of their careers, some want the freedom to do the things they love. Others are going through life-changing moments like becoming a carer, adapting to parenthood, or something else. So, talk to us throughout your application about how we can support you.

This role requires you to be based in or near Kraków, as you will spend 60% (3 days) of your week collaborating with colleagues at our office locations or local sites and the rest remotely.

Benefits

Tesco is a diverse and exciting employer, dedicated to being #aplacetogeton, providing career-defining opportunities to all of our colleagues. If you choose to join our business, we will provide you with (for all):

  • Permanent contract from the get-go – as a sign of our trust in your abilities
  • MacBook as your tool for work
  • Learning opportunities - certified technical training and learning platforms like Udemy
  • Referral Bonus
  • Sports activities with a personal trainer in the office

Benefits for colleagues on an employment contract only:

  • Additional 4 days of paid leave to support your well-being and family life
  • Up to 20% yearly salary bonus – based on both individual and business performance
  • Private healthcare (LuxMed)
  • Cafeteria & Multisport
  • Supporting those who are not yet eligible for full holiday entitlement by expanding their pool from 20 to 25 days
  • Relocation Help
  • IP Tax Deductible Costs

If that sounds exciting, then we'd love to hear from you.

Tesco is committed to celebrating diversity and everyone is welcome at Tesco. As a Disability Confident Employer, we're committed to providing a fully inclusive and accessible recruitment process, allowing candidates the opportunity to thrive and inform us of any reasonable adjustments they may require.

Responsibilities

  • Create and maintain optimal data pipeline architecture
  • Assemble large, complex data sets that meet functional and non-functional business requirements
  • Identify, design, and implement internal process improvements: automating manual processes, optimising data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
  • Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs
  • Keep our data separated and secure
  • Create data tools for analytics and data science team members that assist them in building and optimising our product into an innovative industry leader
  • Work with data and analytics experts to strive for greater functionality in our data systems

Requirements: Hadoop, Scala, Kafka, Kubernetes, Spark, Spark Streaming, Hive, Functional programming

Additionally: International projects, Private healthcare, Small teams, Sport subscription, Free coffee, Canteen, Bike parking, Playroom, Mobile phone, Free parking, Modern office, No dress code, Shower.
