Data Engineer with Databricks @ Data Hiro sp. z o.o.

3 days ago


Wrocław, Poland | Data Hiro sp. z o.o. | Full time

• 5+ years of hands-on experience in implementing, developing, or maintaining data systems in commercial environments.
• Solid understanding of data modeling and data architecture principles.
• Strong Python skills, with a focus on writing clean, maintainable code and applying solid object-oriented design principles.
• Advanced SQL expertise, including performance tuning and query optimization.
• Deep knowledge of modern technologies such as Apache Airflow, Databricks, Spark, and DBT.
• Experience deploying and managing solutions in cloud environments, ideally AWS.
• Proven experience in designing and implementing data governance and data management frameworks.
• Prior consulting experience, with the ability to advise clients on architecture design, technology selection, and best practices.
• Aptitude for bridging the gap between business needs and technical specifications, showcasing a well-rounded understanding of both dimensions.
• Proactive and independent work style, with strong ownership of deliverables and project outcomes.

Who are we looking for?

We are currently seeking a Data Engineer to join our team. The new team member will have the exciting opportunity to work on a greenfield project for our new banking customer. We are looking for an experienced Data Engineer with high proficiency in AWS and Databricks, ready to work directly with clients to devise new technical solutions.

Location?

The role is remote; however, occasional business trips might be needed. Everything will be agreed in advance. Working from our office in Wrocław is also an option.

You will be responsible for:

You will join a team responsible for creating, developing, and maintaining a new Data Platform for a banking customer. Your tasks will include technical analysis, direct contact with the client, development of new technical solutions, translating business requirements into technical ones, and implementing solutions at both the concept and technical levels.

• Architect and deliver data platforms, ensuring high availability, performance, and reliability across environments.
• Design and orchestrate complex data transformations using Databricks, DBT, and Apache Airflow, maintaining data integrity and consistency throughout the process (a minimal orchestration sketch follows this posting).
• Develop and maintain data pipelines that transform and analyze information from multiple sources, ensuring efficiency and scalability.
• Focus on automation and architecture adherence.
• Translate business requirements into robust, scalable, and efficient technical solutions that prioritize performance and data quality.
• Enforce data security, compliance, and governance standards across all data pipelines and environments.

Requirements: Power BI, SQL, AWS, Databricks, Python, Airflow, Oracle

Additionally: Sport subscription, Private healthcare.
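To make the Databricks + DBT + Airflow responsibility concrete, here is a minimal, illustrative Airflow DAG sketching how such transformations are often orchestrated: a Databricks job ingests raw data, then dbt builds and tests the models. This is not part of the offer; the DAG name, connection ID, Databricks job ID, and dbt project path are assumptions invented for the example, and it targets Airflow 2.x with the Databricks provider installed.

```python
# Illustrative sketch only: orchestrating a Databricks ingest step followed by
# dbt transformations and tests. IDs, paths, and names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

with DAG(
    dag_id="banking_data_platform_daily",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Trigger an existing Databricks job that lands raw source data in the lakehouse.
    ingest = DatabricksRunNowOperator(
        task_id="ingest_raw_sources",
        databricks_conn_id="databricks_default",
        job_id=123,  # placeholder Databricks job ID
    )

    # Build dbt models on top of the ingested data; assumes a dbt project on the worker.
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/data_platform --target prod",
    )

    # Data-quality gate: fail the run if dbt tests fail, keeping downstream consumers safe.
    test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/data_platform --target prod",
    )

    ingest >> transform >> test
```

In a real engagement the dbt steps might instead run inside Databricks or through a dedicated provider; the sketch only shows the orchestration pattern the posting names, not the client's actual setup.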




