Current jobs related to Senior Data Engineer - Kraków, Lesser Poland - HSBC Technology Poland


  • Kraków, Lesser Poland, Poland HSBC Technology Poland Full time

    Strong experience with database technologies (SQL, NoSQL), data warehousing solutions, and big data technologies (Hadoop, Spark). Proficiency in programming languages such as Python, Java, or Scala. Experience with cloud platforms (AWS, Azure, Google Cloud) and their data services. Certifications in cloud platforms (AWS Certified Data Analytics, Google...

  • Senior Data

    2 weeks ago


    Kraków, Lesser Poland, Poland Ocado Technology Full time

    Proven experience and Subject Matter Expertise in Supply Chain (ideally within the Retail/Grocery space). Advanced analytics and modelling experience. Proven track record of informing business strategies and driving business change from actionable insight and data models or simulations. Hands-on experience using SQL / BigQuery & Excel / Google Sheets. Ability to...

  • Senior Data Analyst

    1 week ago


    Kraków, Lesser Poland, Poland beBeeData Full time 22,000 - 31,000

    Job Opportunity: Data Analytics Engineer. We are seeking a skilled Senior Data Analyst to design and implement robust data architectures that support AI initiatives. The ideal candidate will develop and manage cloud-based data infrastructure, ETL workflows, and orchestrate data pipelines. About the Role: This position offers the opportunity to work with...


  • Kraków, Lesser Poland, Poland beBeeDataEngineer Full time €68,000 - €98,000

    Job Title: Senior Data Engineer. We are seeking an experienced Senior Data Engineer to lead the development of our data infrastructure and pipeline processes. The ideal candidate will have a strong background in data engineering, excellent leadership skills, and a thorough understanding of the banking industry's data requirements.


  • Kraków, Lesser Poland, Poland Nordic Semiconductor Poland Sp. z o.o. Full time

    Bachelor's or Master's degree in Computer Science, Data Engineering, Electronics, or a related field. Strong programming skills in Python. Hands-on experience with big data technologies. Expertise in building data pipelines and ETL workflows. Experience with cloud-based data services. Solid knowledge of SQL and NoSQL databases. Familiarity with CI/CD practices for...


  • Kraków, Lesser Poland, Poland beBeeDataEngineer Full time €90,000 - €110,000

    Job Overview: This is a hybrid work opportunity that involves developing robust systems for millions of users. As a key member of the engineering team, you will have hands-on experience building production data pipelines using Hadoop, Spark, and Hive. The ideal candidate will be responsible for designing, developing, and maintaining end-to-end data pipelines...

  • Data Engineer @

    2 weeks ago


    Kraków, Lesser Poland, Poland ABB Full time

    Advanced degree in Computer Science, Engineering, Data Science, or a related field (Master's preferred). Proven experience (preferably 3+ years) as a Data Engineer with demonstrated expertise in building production-grade data pipelines and hands-on experience with Microsoft Fabric (Data Factory, Lakehouse, Dataflows). Strong knowledge of ETL/ELT concepts, data...


  • Kraków, Lesser Poland, Poland beBeeDataEngineer Full time 5,300,000 - 7,100,000

    Job Title: Data Engineer. We are seeking a skilled and detail-oriented data engineer to design and implement robust data infrastructure solutions that enable advanced analytics and AI-driven insights for industrial asset management.


  • Kraków, Lesser Poland, Poland beBeeData Full time 900,000 - 1,200,000

    Key Job Responsibilities: Design, build, and automate Hadoop clusters and related services to drive business value. Work across the technology stack – from backend containerized services and APIs to front-end UI solutions – delivering automation tools and system improvements. Collaborate with solution architects, engineers, and stakeholders to create robust,...


  • Kraków, Lesser Poland, Poland beBeeSoftware Full time 70,000 - 105,000

    Software Development Expert. The MI IT team provides data-driven insights and automated document generation to the Compliance business areas, including Group Risk Assurance, Fraud, Financial Crime Risk, and Financial Crime Threat Mitigation. The user base is global, covering differing needs of business customers from interactive management information,...

Senior Data Engineer @

2 weeks ago


Kraków, Lesser Poland, Poland HSBC Technology Poland Full time

What you need to have to succeed in this role

  • Proven (3+ years) hands-on experience in SQL querying and optimization of complex queries/transformations in BigQuery, with a focus on cost- and time-effective SQL coding and on concurrency/data integrity
  • Proven (3+ years) hands-on experience in developing, testing, and implementing SQL data transformation/ETL/ELT pipelines, ideally in GCP Data Fusion
  • Proven experience in Data Vault modelling and usage
  • Hands-on experience with Cloud Composer/Airflow, Cloud Run, and Pub/Sub
  • Hands-on development in Python and Terraform
  • Proficiency in Git usage for version control and collaboration.
  • Proficiency in designing, creating, and maintaining CI/CD processes/pipelines in DevOps tools like Ansible/Jenkins etc. for cloud-based applications (ideally GCP)
  • Experience working in a DataOps model
  • Experience working in an Agile environment and toolset
  • Strong problem-solving and analytical skills
  • Enthusiastic willingness to rapidly and independently learn and develop technical and soft skills as needs require
  • Strong organisational and multi-tasking skills.
  • Good team player who embraces teamwork and mutual support.
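
The Data Vault requirement above centres on modelling hubs, links, and satellites keyed by deterministic hashes of business keys. As an illustrative sketch only (the column names, the `||` delimiter, and the choice of MD5 are assumptions, not taken from this posting), hub and link hash keys might be derived like this:

```python
import hashlib

def hub_hash_key(*business_keys: str) -> str:
    """Deterministic hash key for a Data Vault hub.

    Business keys are trimmed, upper-cased, and joined with a
    delimiter before hashing, so the same logical key always yields
    the same hash regardless of source-system formatting.
    """
    normalized = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# A link hash key is built the same way, from the parent hubs' keys.
customer_hk = hub_hash_key(" cust-001 ")   # hypothetical customer key
order_hk = hub_hash_key("ORD-42")          # hypothetical order key
link_hk = hub_hash_key(customer_hk, order_hk)
```

The normalization step is the point of the sketch: " cust-001 " and "CUST-001" intentionally hash to the same key, so records for one logical entity converge on one hub row.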

Nice to Have

  • Experience designing, testing, and implementing data ingestion pipelines in GCP Data Fusion, CDAP, or similar tools, including ingestion, parsing, and wrangling of CSV-, JSON-, and XML-formatted data from RESTful and SOAP APIs, SFTP servers, etc.
  • Understanding of modern data contract best practices, with experience independently directing, negotiating, and documenting best-in-class data contracts
  • Java development, testing and deployment skills (ideally custom plugins for Data Fusion)
  • Proficiency in working with Continuous Integration (CI), Continuous Delivery (CD) and continuous testing tools, ideally for Cloud based Data solutions.
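
The ingestion items above amount to normalizing heterogeneous payloads (CSV, JSON, etc.) into a common record shape before loading. A minimal standard-library sketch, with hypothetical field names and no Data Fusion/CDAP specifics:

```python
import csv
import io
import json

def parse_payload(payload: str, fmt: str) -> list[dict]:
    """Normalize a CSV or JSON payload into a list of dict records."""
    if fmt == "csv":
        # DictReader uses the header row as the record keys.
        return list(csv.DictReader(io.StringIO(payload)))
    if fmt == "json":
        data = json.loads(payload)
        # Accept either a single object or an array of objects.
        return data if isinstance(data, list) else [data]
    raise ValueError(f"unsupported format: {fmt}")

csv_rows = parse_payload("id,amount\n1,9.99\n2,5.00", "csv")
json_rows = parse_payload('[{"id": "1", "amount": "9.99"}]', "json")
```

Both branches yield the same `list[dict]` shape, so downstream transformation and loading code stays format-agnostic.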

Some careers shine brighter than others.

If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

Your career opportunity

HSBC Markets and Securities is an emerging markets-led and financing-focused investment banking and trading business that provides tailored financial solutions to major government, corporate and institutional clients worldwide.

In IT we provide HSBC with a genuine competitive advantage across the globe. Global Business Insights (GBI) provide critical metrics and reports to Markets and Securities Services Operations to enable them to monitor the health of their business and make data-driven decisions.

The GBI Transformation is a large and complex data integration program spanning all of MSS Ops globally. We serve a diverse audience of users and data visualisation requirements from Exco down, and over 80 data sources in multiple time-zones across Middle Office, Post-Trade and Securities Services IT and elsewhere. We are a critical enabler for the Rubix 2025 Strategy and the MSS control agenda, providing operational KPI and KRI metrics which allow senior management to measure the success of their BAU and CTB investment dollars.

We are looking for a GCP developer who can design, develop, test and deploy ETL/SQL pipelines connected to a variety of on-prem and Cloud data sources - both data stores and files. We will be using mainly GCP technologies like Cloud Store, BigQuery, and Data Fusion.

You will also need to work with our devops tooling to deliver continuous integration/deployment capabilities, automated testing, security, and IT compliance.

The role will be responsible for providing subject matter expertise to support the Enterprise Risk Management (ERM) Leadership Team (LT) and ERM Assurance teams in discharging their responsibilities in relation to operational risk and resilience risk steward delivery across all service areas, the delivery of assurance activities, the embedding of assurance practices, and the embedding of stewardship activities and the service catalogue in the respective GB/GF/Specialist teams.

If your CV meets our criteria, you should expect the following steps in the recruitment process:

  • Online behavioural assessment
  • Telephone screen
  • Job interview with the hiring manager
Your responsibilities

  • Design, build, test and deploy Google Cloud data models and transformations in a BigQuery environment (e.g. SQL, stored procedures, indexes, clusters, partitions, triggers, etc.)
  • Create and manage ETL/ELT data pipelines to model raw/unstructured data into the Data Vault universal model, enriching, transforming, and optimizing raw data into a form suitable for end consumers
  • Review, refine, interpret, and implement business and technical requirements
  • Deliver non-functional requirements, IT standards, and developer and support tools to ensure our applications are secure, compliant, scalable, reliable, and cost effective
  • Monitor data pipelines for failures or performance issues and implement fixes or improvements as needed
  • Optimize ETL/ELT processes for performance and scalability, ensuring they can handle large volumes of data efficiently
  • Integrate data from multiple sources, ensuring consistency and accuracy
  • Manage code artefacts and CI/CD using tools like Git, Jenkins, Google Secrets Manager, etc.
  • Fix defects and provide enhancements during the development period, and hand over knowledge, expertise, code, and support responsibilities to the support team

Requirements: SQL, BigQuery, ETL, Testing, GCP, Data Vault, Cloud Composer, Cloud, Pub/Sub, Python, Terraform, Git, DevOps, Ansible, Jenkins, SOAP, CSV, JSON, REST API, XML

Additionally: Training budget, Private healthcare, Flat structure, International projects, Multisport card, Monthly remote work subsidy, Psychological support, Conferences, PPK option, Annual performance based bonus, Integration budget, International environment, Small teams, Employee referral bonus, Mentoring, Workstation reimbursement, Company share purchase plan, Childcare support programme, Bike parking, Playroom, Shower, Canteen, Free coffee, Free beverages, Free parking, In-house trainings, In-house hack days, No dress code, Modern office, Knowledge sharing, Garden, Massage chairs, Kitchen.
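
One responsibility listed for the role is monitoring data pipelines for failures and implementing fixes. A common building block for that, sketched here in plain Python with a hypothetical flaky load step (the function names and retry counts are illustrative, not from the posting), is retrying transient failures with exponential backoff:

```python
import time

def run_with_retries(step, max_attempts: int = 3, base_delay: float = 1.0):
    """Run one pipeline step, retrying transient failures with
    exponential backoff; re-raise once attempts are exhausted."""
    for attempt in range(1, max_attempts + 1):
        try:
            return step()
        except Exception:
            if attempt == max_attempts:
                raise
            # 1x, 2x, 4x, ... the base delay between attempts.
            time.sleep(base_delay * 2 ** (attempt - 1))

# Hypothetical flaky step: fails twice, then succeeds.
calls = {"n": 0}
def flaky_load():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient load failure")
    return "loaded"

result = run_with_retries(flaky_load, max_attempts=3, base_delay=0.01)
```

In a real Cloud Composer/Airflow deployment this concern is usually handled by the orchestrator's own task-retry settings rather than hand-rolled loops; the sketch just shows the underlying idea.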