Senior Data Engineer

2 days ago


Kraków, Warszawa, Wrocław, Poznań, Łódź, Czech Republic GFT Poland Full time

Requirements:
- Minimum 5 years of overall IT experience and 2+ years in software development with Big Data technologies, microservices design and development, and event-based architectures in the cloud
- Required technology skillset: Apache NiFi, Apache Kafka, Apache Spark, Apache Hive, Apache HDFS, Apache Oozie, SQL, Python, Google SDK, REST API, Linux shell scripting
- Strong database skills: at least one SQL database (Oracle, Teradata, MySQL, PostgreSQL, MS SQL Server) and at least one NoSQL database (MongoDB, Apache HBase)
- Experience with one of the cloud platform providers (AWS, GCP, Azure); GCP preferred
- Strong understanding of the Hadoop (Hortonworks/Cloudera) ecosystem, such as HDFS and Hive, for data storage and analysis
- Good understanding of RDBMS, star schema, slowly changing dimensions, and CDC
- Experience in dataflow automation using Apache NiFi and in creating near real-time data pipelines using Apache Kafka
- Basic knowledge of Python and SQL, and the ability to navigate databases, especially Hive
- Data pipeline building technologies
- DRY, KISS, SOLID, Clean Code, self-documenting code
- Optionally, knowledge and experience of BigQuery, Dataproc, or Java Spring Boot
- Working in English on a daily basis
- Positive, proactive, can-do attitude

Why join GFT?
- You will work with and learn from top IT experts. You will join a crew of experienced engineers: 60% of our employees are senior level.
- Interested in the cloud? You will enjoy our full support in developing your skills: training programs, certifications, and our internal community of experts. We have strong partnerships with top cloud providers: Google, Amazon, and Microsoft; we are number one in Poland in the number of GCP certificates. Apart from GCP, you can also develop in AWS or Azure.
- We are focused on development and knowledge sharing. Internal expert communities provide a comfortable environment where you can develop your skillset in areas such as blockchain, Big Data, cloud computing, or artificial intelligence.
You will work in a stable company (32 years on the market) on demanding and challenging projects for the biggest financial institutions in the world.

We offer you:
- Hybrid work model: 2 office days per week
- Contract of employment or B2B contract
- Working in a highly experienced and dedicated team
- Benefit package that can be tailored to your personal needs (private medical coverage, sport & recreation package, lunch subsidy, life insurance, etc.)
- Online training and certifications fit for your career path
- Access to an e-learning platform
- Mindgram: a holistic mental health and wellbeing platform
- Work From Anywhere (WFA): the temporary option to work remotely outside of Poland for up to 140 days per year (including Italy, Spain, the UK, Germany, Portugal, and Bulgaria)
- Social events

Responsibilities:
- Openness to work in a hybrid model (2 days from the office per week)
- Openness to visiting the client's office in Cracow once every two months (for 3 days)
- Design, develop, and maintain scalable data pipelines and solutions using Big Data technologies (e.g., Apache NiFi, Kafka, Spark, Hive, HDFS)
- Implement event-driven and microservices-based architectures in cloud environments, preferably Google Cloud Platform (GCP)
- Automate data flows and build near real-time data pipelines using Apache NiFi and Kafka
- Work with large-scale datasets from structured and unstructured sources, utilizing the Hadoop ecosystem (HDFS, Hive) for data storage and analysis
- Leverage RDBMS and NoSQL databases to support data ingestion, transformation, and querying
- Write clean, maintainable, and efficient code following best practices (DRY, KISS, SOLID, Clean Code)
- Use Python, SQL, and shell scripting for data processing and orchestration
- Collaborate with cross-functional teams to deliver secure and reliable software solutions aligned with business needs
- Contribute to the development of enterprise-scale data solutions in a fast-paced, agile environment
- Engage in daily communication in English within a global team

Requirements: Spark, Hadoop, Apache Kafka, Apache NiFi, Apache Spark, Apache Hive, Apache HDFS, Apache Oozie, SQL, Python, Google SDK, REST API, Linux, GCP, BigQuery, Dataproc, Java, Spring Boot

Additionally: Home office, knowledge sharing, life insurance, sport subscription, training budget, private healthcare, international projects, integration events, English lessons, Mindgram platform, free coffee, playroom, free snacks, free beverages, in-house trainings, in-house hack days, modern office, free fruit.



  • Łódź, Czech Republic CommerzBank Full time

    Which technology & skills are important for us? 👌 Very good knowledge of data pipeline orchestration (design scalable, cloud-native data pipelines for data transformation and aggregation based on business use cases). Very good knowledge of GCP (or other Cloud) and creating Cloud based architecture (BigQuery, Dataproc/PySpark, Cloud Composer/Apache...


  • Poznan, Warszawa, Wrocław, Łódź, Kraków, Czech Republic GFT Poland Full time

    Openness to work in a hybrid model (2 days from the office per week) At least 2 years of commercial experience as a Data Engineer Strong SQL and Python and Spark/PySpark Good understanding of Data Warehousing concepts Experience with Data Modelling Understanding of key concepts around Data Warehousing, Data Lakes and Data Lakehouses Experience with Cloud...


  • Kraków, Czech Republic Ocado Technology Full time

    A proven track record of delivering scalable, reliable data engineering solutions using modern cloud-native tools and workflows (e.g. dbt, Airflow, Terraform). Strong communication and stakeholder collaboration skills, with the ability to work across engineering, product, and data functions. A pragmatic, delivery-focused mindset with a deep appreciation for...


  • Kraków, Czech Republic ELEKS Full time

    REQUIREMENTS 5+ years of commercial experience as a Data Engineer Strong expertise in AWS (particularly within the stack mentioned above) Focus on data processing Upper-Intermediate or higher level of English ELEKS Software Engineering & Development Office is looking for a Senior Data Engineer (AWS) in Poland or Ukraine. ABOUT PROJECT The platform replaces...

  • Senior Data Engineer

    2 weeks ago


    Warszawa, Czech Republic Link Group Full time

    5+ years of software development experience, strong in Python and SQL. Experience with web technologies (HTML, JavaScript, APIs) and Linux. Familiarity with web scraping tools (Selenium, Scrapy, Postman, XPath). Knowledge of containerization (Docker) and cloud platforms (AWS or Azure preferred). Strong problem-solving skills and ability to work...


  • Warszawa, Czech Republic Bayer Full time

    5+ years of experience in data engineering on AWS. Strong knowledge of the AWS data stack: Glue, Athena, Lake Formation, S3, Step Functions, Lambda, RDS, etc. Proficient in Python for scripting, automation, and data manipulation tasks. Experience with PySpark for building scalable, distributed ETL/ELT pipelines...


  • Warszawa, Poznan, Wrocław, Łódź, Kraków, Czech Republic GFT Poland Full time

    Expertise in KDB+/q with production systems: GW/RDB/HDB design, sym/partition strategies, attributes, asof/aj/uj, IPC patterns Strong skills in q/KDB+; working proficiency in Python/PyKX and ideally Java for integration services Solid Linux/UNIX fundamentals (networking, OS tuning) and familiarity with TCP/IP, UDP, Multicast; knowledge of FIX/OUCH/ITCH...


  • Kraków, Czech Republic HSBC Technology Poland Full time

    Strong experience with database technologies (SQL, NoSQL), data warehousing solutions, and big data technologies (Hadoop, Spark). Proficiency in programming languages such as Python, Java, or Scala. Experience with cloud platforms (AWS, Azure, Google Cloud) and their data services. Certifications in cloud platforms (AWS Certified Data Analytics, Google...


  • Kraków, Czech Republic HSBC Technology Poland Full time

    What you need to have to succeed in this role: Excellent experience in the Data Engineering Lifecycle. You will have created data pipelines which take data through all layers, from generation and ingestion to transformation and serving. Senior stakeholder management skills. Experience of modern Software Engineering principles and experience of creating well...


  • Wrocław, Kraków, Czech Republic Ocado Technology Full time

    ESSENTIAL Significant experience in data analytics, ideally in fast-paced or high-growth environments. Advanced proficiency in SQL and Python, with experience building scalable analytical workflows. Exposure to cloud data platforms (e.g. BigQuery, Snowflake, Redshift) and modern data stack tools. Strong command of experimental design, causal inference, and...