Junior Data Engineer @ DataFuze

2 weeks ago


Warszawa, Poland DataFuze Full time

Requirements:
- Strong academic background: higher education in Computer Science or a related field (Bachelor's degree minimum, Master's degree preferred)
- Minimum 12 months of commercial experience in a similar role
- Knowledge of database theory
- Experience in creating efficient and optimized SQL queries
- Ability to design and implement ETL processes
- Basic knowledge of Python
- Experience with Git
- Analytical thinking and problem-solving skills
- Proactive approach; eagerness to learn, explore and grow
- Strong sense of ownership and responsibility
- Effective communication and team collaboration skills
- Dedication to excellence in technical skills
- Proficiency in Polish (C1 level) and English (B2 level)

Nice to have:
- Familiarity with any of: Snowflake, Airflow, dbt, Kafka, Oracle, APEX
- Relevant training or certifications in any of the above tools/technologies
- Contributions to free-time projects, blogs and other community involvement

Who are we looking for?
We are currently seeking a candidate for the position of Junior Data Engineer. The ideal candidate should have good SQL skills and a solid knowledge of database theory. Familiarity with any of Snowflake, Airflow, dbt, Kafka, Oracle or APEX will be considered a significant advantage.

Who is DataFuze?
We are a boutique software development company. At DataFuze, we create and deliver data-driven IT solutions that empower our clients' data and provide lasting value for their businesses.

At the forefront of our culture are these core values:
- Technical Mastery: dedication to excellence in technical skills, ensuring a deep understanding of the latest industry trends and best practices, and a commitment to delivering scalable and efficient software solutions.
- Ownership and Accountability: encouraging a strong sense of responsibility among all team members to maintain the highest standards at every stage of the software development lifecycle.
- Client-Centric Agility: prioritizing the needs and satisfaction of clients by staying agile, responsive, and adaptable to evolving project requirements.
- Innovation Excellence: embracing a culture of constant learning, technological exploration and creative problem-solving to deliver cutting-edge solutions.
- Collaborative Synergy: fostering a collaborative and inclusive environment where diverse talents work seamlessly together, sharing knowledge and expertise to achieve success.

Recruitment process
We believe in a simple and transparent recruitment process. No fluff ;) We will reply to your application within 5 days, and we aim to close the entire recruitment process within 3 weeks. That's our promise.

What do we offer?
- Honest Compensation - a career path supported by a competitive and transparent salary structure.
- Flexible Benefits Package - tailor our benefits package to your individual needs.
- Dynamic Professional Growth - opportunities for continuous professional development within a challenging and dynamic IT environment.
- Expert Team Collaboration - everyday teamwork with seasoned experts specializing in database solutions.
- Flexible Work Options - enjoy the flexibility of hybrid or remote work, depending on seniority and project requirements.
- Work-Life Balance - embrace flexibility in working hours.

Responsibilities:
- Design and implement ETL processes using any of Snowflake, Kafka or Airflow to support scalable data workflows
- Build reliable and governed data pipelines with dbt to accelerate analytics and AI initiatives
- Design, develop and maintain databases
- Create efficient and optimized SQL queries
- Create and utilize functions, packages and procedures in PL/SQL
- Assist in resolving technical bottlenecks to improve overall system efficiency, functionality and performance
- Low-code application development using Oracle Application Express (APEX)

Tech stack: SQL, ETL, Python, Git, Snowflake, Airflow, dbt, Oracle, APEX, Kafka
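For candidates wondering what "design and implement ETL processes" looks like in practice, here is a minimal sketch of an extract-transform-load pass in Python. It uses an in-memory SQLite database purely for illustration; the table names (raw_orders, orders) and the cleaning rules are invented for this example and are not part of the posting or DataFuze's actual stack.

```python
import sqlite3

def run_etl(conn):
    """Minimal ETL pass: extract raw rows, clean them, load the result."""
    cur = conn.cursor()
    # Extract: read raw order rows from the staging table.
    rows = cur.execute("SELECT id, amount, currency FROM raw_orders").fetchall()
    # Transform: drop non-positive amounts, normalize currency codes.
    cleaned = [(i, amt, ccy.upper()) for i, amt, ccy in rows if amt > 0]
    # Load: write the cleaned rows into the target table.
    cur.executemany("INSERT INTO orders VALUES (?, ?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (id INTEGER, amount REAL, currency TEXT);
    CREATE TABLE orders     (id INTEGER, amount REAL, currency TEXT);
    INSERT INTO raw_orders VALUES
        (1, 10.0, 'pln'), (2, -5.0, 'eur'), (3, 7.5, 'usd');
""")
loaded = run_etl(conn)
print(loaded)  # 2
```

In a production setting the same extract/transform/load split would typically be expressed as tasks in an orchestrator such as Airflow, or as staged models in dbt.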
Additionally: Sport subscription, Training budget, Flat structure, Masterclazz training, English lessons with Native Speaker, Small teams, Free coffee, Bike parking, Shower, Startup atmosphere, No dress code, Modern office.



  • Warszawa, Poland DataFuze Full time

    Requirements: Strong academic background - minimum Bachelor’s degree in Computer Science or a related field (Master’s degree preferred). At least 1 year of production experience with: Python (including pandas, numpy, SQLAlchemy, OOP, distributed workflows) in large-scale projects (including package development); SQL (any dialect; Oracle is a plus)...


  • Warszawa, Poland Bayer Full time

    Master’s degree in Statistics, Computer Science, Data Management, Data Science or a related field, or Bachelor’s degree with 1+ years of professional experience. Some experience as a Data Analyst or Data Steward, preferably within the consumer-packaged goods (FMCG), pharmaceutical or healthcare industries, is a plus. Basic knowledge of data management...




  • Warszawa, Poland Bayer Full time

    Bachelor’s/Master’s degree in Computer Science, Engineering, or a related field. 3+ years of working experience in the field of Data & Analytics, preferably in the CPG industry. 3+ years of proficient coding experience with Python for data engineering, including SQL and PySpark (DataFrame API, Spark SQL, MLlib), with hands-on experience in various databases...


  • Warszawa, Poland Bolt Full time

    You have 3–4 years of experience as a software or data engineer, and have hands-on experience building and deploying data pipelines or analytics workloads in cloud environments. You’re proficient in Python and SQL, developing production-ready code for internal libraries, data pipelines, and automation. You have excellent communication skills, are fluent...


  • Warszawa, Poland Bayer Full time

    Bachelor’s/Master’s degree in Computer Science, Engineering, or a related field. 5+ years of working experience in the field of Data & Analytics, preferably in the CPG industry. 5+ years of proficient coding experience with Python for data engineering, including SQL and PySpark (DataFrame API, Spark SQL, MLlib), with hands-on experience in various databases...


  • Warszawa, Poland Bayer Full time

    5+ years of working experience in the field of Data & Analytics, preferably in the CPG industry. 5+ years of proficient coding experience with Python for data engineering, including SQL and PySpark (DataFrame API, Spark SQL, MLlib), with hands-on experience in various databases (SQL/NoSQL), key libraries (e.g., pandas, SQLAlchemy), parallel processing, and...


  • Warszawa, Poland Respect Energy Fuels Sp. z o.o. Full time

    Technical Skills: Strong knowledge of Python, especially for ETL processes and data transformation. Experience with workflow orchestration tools, preferably Apache Airflow. Solid SQL skills and experience designing, querying, and administering relational databases (e.g., MySQL), including user access management and schema design. Hands-on experience working...


  • Warszawa, Poland ITDS Full time

    You’re ideal for this role if you have: Strong proficiency in SQL with experience optimizing complex queries Hands-on experience with Google Cloud Platform (BigQuery, Dataflow, Composer) Solid programming skills in Python and familiarity with Git Experience in building and maintaining cloud-based data solutions Understanding of data governance principles...

  • Data Engineer @ EPIKA

    2 weeks ago


    Warszawa, Wrocław, Łódź, Poland EPIKA Full time

    Data Engineering Foundations: Azure Databricks (PySpark, Spark SQL, Unity Catalog, Workflows); Azure Data Factory, Key Vault, and ADLS/Delta; OAuth, OpenID, SAML, JWT; Medallion Architecture; strong SQL and data modeling in a lakehouse/Delta architecture; Python for data engineering (API integration, utilities, testing). Operations: Running production pipelines...