Data Engineer

10 hours ago


Kraków, Poland Mindbox S.A. Full time

Technologies / Tech Stack:

  • Data Platform: GCP BigQuery, Cloud Storage, Dataflow, Composer (Apache Airflow), Compute Engine (VMs), IAM, etc.
  • Data Processing: SQL, Python, Prophecy, Spark.
  • DevSecOps: Ansible, Jenkins, Terraform, gcloud & YAML, Python, GCP API calls
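
For illustration only (not part of the job description): a minimal sketch of how this stack typically fits together – a Cloud Composer (Airflow) DAG that loads a daily CSV drop from Cloud Storage into BigQuery. The project, bucket, object paths, and table names below are hypothetical placeholders.

    # Hypothetical Composer (Airflow) DAG: daily load of CSV files from GCS into BigQuery.
    # Bucket, object paths, and the destination table are illustrative placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
        GCSToBigQueryOperator,
    )

    with DAG(
        dag_id="gcs_to_bigquery_daily_load",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        load_events = GCSToBigQueryOperator(
            task_id="load_events_csv",
            bucket="example-landing-bucket",               # placeholder bucket
            source_objects=["events/{{ ds }}/*.csv"],      # one folder per execution date
            destination_project_dataset_table="example-project.analytics.events",
            source_format="CSV",
            skip_leading_rows=1,
            write_disposition="WRITE_APPEND",              # append each daily batch
        )

On Cloud Composer such a DAG file is simply placed in the environment's dags/ bucket, and credentials come from the environment's service account via IAM rather than from keys in code.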

    Requirements
  • Strong programming skills in Python (libraries, API, tokens).
  • Experience preparing and presenting architecture artefacts.
  • Knowledge of peer code review practices and architecture patterns.
  • Hands-on experience with GCP services (BigQuery, Cloud Composer, Dataflow, Storage, Pub/Sub, Service Accounts).
  • Broad technical knowledge enabling innovative solution design.
  • Proven track record of end-to-end solution delivery in complex environments.
  • Strong communication skills with business, IT teams, and vendors.
  • Familiarity with API design, microservices, SDLC, Agile, and DevOps.
  • Ability to actively contribute to architecture groups and forums.
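
As a rough sketch only (project, topic, and key-file names are hypothetical): the kind of small Python task the API, token, Pub/Sub, and Service Account requirements above point at – authenticating with a service account key and publishing a message to a Pub/Sub topic.

    # Hypothetical example: authenticate with a service account key file and
    # publish one message to a Pub/Sub topic. Names and paths are placeholders.
    from google.cloud import pubsub_v1
    from google.oauth2 import service_account

    credentials = service_account.Credentials.from_service_account_file(
        "sa-key.json"  # placeholder path to a service account key
    )
    publisher = pubsub_v1.PublisherClient(credentials=credentials)
    topic_path = publisher.topic_path("example-project", "ingest-events")

    # publish() returns a future; result() blocks until the message is accepted.
    future = publisher.publish(topic_path, data=b'{"event": "signup"}', source="web")
    print("Published message ID:", future.result())

In practice the key file would usually be replaced by the runtime service account or workload identity, with access governed through IAM.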

Creating an inspiring place where talented people thrive, we use their expertise and courage to bring the technology of the future into your business. This is the foundation of Mindbox and the goal of our business and technology journey. We operate and develop in four areas:

🤖 Autonomous Enterprise - automation of business processes using RPA, OCR, and AI.

🌐 Business Management Systems (ERP) - we implement, adapt, optimize, and maintain flexible, secure, and open ERP systems for production and distribution companies worldwide.

🤝 Talent Network - we provide access to the best specialists.

☁️ Modern Architecture - we build integrated, sustainable, and open CI/CD environments based on containers, enabling safe and more frequent delivery of proven changes to application code.

We treat technology as a tool to achieve a goal. Thanks to our consultants' reliability and proactive approach, initial projects usually grow into long-term cooperation. For over 16 years, Mindbox has provided services that support clients in digital transformation.

We offer:

  • We are open to the form of employment that suits your preferences
  • Work with an experienced, engaged team that is willing to learn, shares knowledge, and is open to growth and new ideas
  • Hybrid working model – 6 days per month from the office in Kraków
  • Mindbox is a dynamically growing IT company, but still not a large one – everybody can have a real impact on where we are going next
  • We invest in developing our employees' skills and abilities
  • We offer attractive benefits and provide all the tools required for work, e.g. a computer
  • Interpolska Health Care, Multisport, Warta Insurance, training platform (Sages)
Responsibilities:

  • Work for the Data Platform Tech Manager and manage all technical aspects.
  • Define and maintain the technology stack and roadmap.
  • Provide key decisions in terms of stack, design, and code quality.
  • Ensure the solution is aligned with the Client's standards in terms of architecture, controls, security, scalability, and performance.
  • Resolve technical issues with the help of the technical development team.
  • Perform code reviews to ensure best practices and quality.
  • Collaborate, under the direction of the Data Platform Manager and Product Owner, with the technical development team, Scrum Master, Architect, and Business Analysts (providing data requirements).

Requirements: GCP, BigQuery, Cloud Storage, Apache Airflow, IAM, SQL, Python, Spark, Ansible, Jenkins, Terraform, YAML, API, Cloud Composer, Pub/Sub, communication skills, microservices, SDLC, DevOps

Additionally: sport subscription, private healthcare, international projects, free coffee.

  • Data Engineer @ ABB

    2 weeks ago


    Kraków, Lesser Poland, Poland ABB Full time

    Advanced degree in Computer Science, Engineering, Data Science, or a related field (Master's preferred). Proven experience (preferably 3+ years) as a Data Engineer with demonstrated expertise in building production-grade data pipelines and hands-on experience with Microsoft Fabric (Data Factory, Lakehouse, Dataflows). Strong knowledge of ETL/ELT concepts, data...


  • Kraków, Lesser Poland, Poland HSBC Technology Poland Full time

    Strong experience with database technologies (SQL, NoSQL), data warehousing solutions, and big data technologies (Hadoop, Spark). Proficiency in programming languages such as Python, Java, or Scala. Experience with cloud platforms (AWS, Azure, Google Cloud) and their data services. Certifications in cloud platforms (AWS Certified Data Analytics, Google...


  • Kraków, Poland ITDS Full time

    You're ideal for this role if you have: strong experience in PySpark, Scala, or similar data engineering languages; hands-on experience building production data pipelines using Hadoop, Spark, and Hive; knowledge of cloud platforms and migrating on-premise solutions to the cloud; experience with scheduling tools such as Airflow and workflow...


  • Kraków, Poland Antal Full time

    Must-have qualifications: minimum 5 years of experience as a Data Engineer / Big Data Engineer; hands-on expertise in Hadoop, Hive, HDFS, Apache Spark, Scala, SQL; solid experience with GCP and services like BigQuery, Dataflow, DataProc, Pub/Sub, Composer (Airflow); experience with CI/CD processes and DevOps tools: Jenkins, GitHub, Ansible; strong data...

  • Data Engineer @ ABB

    5 days ago


    Kraków, Poland ABB Full time

    Advanced degree in Computer Science, Engineering, Data Science, or a related field (Master's preferred). Proven experience (preferably 3+ years) as a Data Engineer with demonstrated expertise in building production-grade data pipelines and hands-on experience with Microsoft Fabric (Data Factory, Lakehouse, Dataflows). Strong knowledge of ETL/ELT concepts,...


  • Kraków, Lesser Poland, Poland beBeeData Full time 900,000 - 1,200,000

    Key Job Responsibilities: Design, build, and automate Hadoop clusters and related services to drive business value. Work across the technology stack – from backend containerized services and APIs to front-end UI solutions – delivering automation tools and system improvements. Collaborate with solution architects, engineers, and stakeholders to create robust,...


  • Kraków, Poland Antal Full time

    5+ years of IT experience, with 2+ years in software development using Big Data technologies, microservices, and event-driven cloud architectures. Hands-on experience with Apache NiFi, Kafka, Spark, Hive, HDFS, Oozie, SQL, Python, and Linux Shell scripting. Strong database skills: at least one SQL database (Oracle, PostgreSQL, MySQL, etc.) and one NoSQL...

  • Data Engineer @ HSBC Technology Poland

    2 weeks ago


    Kraków, Lesser Poland, Poland HSBC Technology Poland Full time

    Experience with most of the following technologies (Apache Hadoop, Scala, Apache Spark, Spark Streaming, YARN, Kafka, Hive, Python, ETL frameworks, MapReduce, SQL, RESTful services). Sound knowledge of working on Unix/Linux platforms. Hands-on experience building data pipelines using Hadoop components - Hive, Spark, Spark SQL. Experience with industry...


  • Kraków, Lesser Poland, Poland beBeeDataEngineer Full time €90,000 - €110,000

    Job Overview: This is a hybrid work opportunity that involves developing robust systems for millions of users. As a key member of the engineering team, you will have hands-on experience building production data pipelines using Hadoop, Spark, and Hive. The ideal candidate will be responsible for designing, developing, and maintaining end-to-end data pipelines...


  • Kraków, Poland HSBC Technology Poland Full time

    Strong experience with database technologies (SQL, NoSQL), data warehousing solutions, and big data technologies (Hadoop, Spark). Proficiency in programming languages such as Python, Java, or Scala. Experience with cloud platforms (AWS, Azure, Google Cloud) and their data services. Certifications in cloud platforms (AWS Certified Data Analytics, Google...