Data Engineer

8 hours ago


Warsaw, Poland Bayer Full time

WHO YOU ARE:

Required:
• Bachelor’s degree in Computer Science, Data Engineering, Information Technology, or a related field
• Proven experience with data visualization tools, specifically Tableau, QV and Power BI
• Strong analytical skills with a solid understanding of data analysis and interpretation
• Strong skills in SQL and scripting
• Familiarity with ETL processes and tools
• Expertise in data modeling and design
• Experience in designing and building scalable data pipelines
• Familiarity with data visualization best practices and design principles
• Experience with other data visualization tools or programming languages (e.g., Python, R) is a plus

Preferred:
• Relevant certifications (e.g., GCP Certified, AWS Certified, Azure Certified)
• Strong analytical and communication skills
• Ability to work collaboratively in a team environment
• High level of accuracy and attention to detail

At Bayer we’re visionaries, driven to solve the world’s toughest challenges and striving for a world where ‘Health for all, Hunger for none’ is no longer a dream, but a real possibility. We’re doing it with energy, curiosity and sheer dedication, always learning from the unique perspectives of those around us, expanding our thinking, growing our capabilities and redefining ‘impossible’. There are so many reasons to join us. If you’re hungry to build a varied and meaningful career in a community of brilliant and diverse minds and to make a real difference, there’s only one choice.

We are looking for a Data Engineer. A Data Engineer delivers the designs set by more senior members of the data engineering community.
Individuals in this role will:
• Implement data flows to connect operational systems, data for analytics and business intelligence (BI) systems
• Document source-to-target mappings
• Re-engineer manual data flows to enable scaling and repeatable use
• Support the build of data streaming systems
• Write ETL (extract, transform, load) scripts and code to ensure the ETL process performs optimally
• Develop business intelligence reports that can be reused
• Build accessible data for analysis

Specific to this position is a focus on Data Visualization.

WHAT DO WE OFFER:
• A flexible, hybrid work model
• Great workplace in a new modern office in Warsaw
• Career development, 360° Feedback & Mentoring programme
• Wide access to professional development tools, trainings & conferences
• Company Bonus & Reward Structure
• Increased tax-deductible costs for authors of copyrighted works
• VIP Medical Care Package (including Dental & Mental health)
• Holiday allowance (“Wczasy pod gruszą”)
• Life & Travel Insurance
• Pension plan
• Co-financed sport card - FitProfit
• Meals Subsidy in Office
• Budget for Home Office Setup & Maintenance
• Access to Company Game Room equipped with table tennis, soccer table, Sony PlayStation 5 and Xbox Series X consoles with premium game passes, and massage chairs
• Tailor-made support in relocation to Warsaw when needed

Please send your CV in English.

WORK LOCATION: WARSAW, AL. JEROZOLIMSKIE 158

Responsibilities:
• Show an awareness of the need to translate technical concepts into non-technical language
• Understand what communication is required with internal and external stakeholders
• Provide training and support to end-users on dashboard functionalities and data interpretation
• Work closely with data engineering and IT teams to optimize data sources for visualization
• Design, develop, and maintain interactive dashboards and reports using data visualization tools (Tableau, Power BI, …) that cater to various business needs
• Collaborate with cross-functional teams to gather requirements and translate them into effective visual solutions
• Analyze and interpret large datasets to identify trends, insights, and opportunities for improvement
• Implement best practices in data visualization to enhance clarity, usability, and engagement
• Design, build and test data products based on feeds from multiple systems, using a range of different storage technologies, access methods or both
• Create repeatable and reusable products
• Show an awareness of opportunities for innovation with new tools and uses of data
• Explain the concepts and principles of data modeling
• Produce, maintain and update relevant data models for an organization’s specific needs
• Reverse-engineer data models from a live system
• Ensure data quality and integrity through data cleansing and validation processes
• Understand the role of testing and how it works

The focus of the first project will be the responsibility to replicate or reconnect the current infrastructure (ETLs, local data warehouses and data visualization tools) used by the countries in the EMEA region. The target scenario will be a global infrastructure of cloud-based data platforms (AWS, Snowflake, Tableau, …) combining information at two different levels:
• Global pipelines and global data models
• Local pipelines to enrich the global data models, covering the countries’ business needs
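The ETL responsibilities above (extract from a source system, transform, load into a queryable store) can be illustrated with a minimal sketch. This is not Bayer’s actual pipeline: the sample rows, table name and use of an in-memory SQLite database are hypothetical stand-ins for the operational sources and cloud warehouse (e.g. Snowflake) named in the posting.

```python
import sqlite3

# Extract: rows as they might arrive from a source system (hypothetical sample data).
source_rows = [
    {"country": "PL", "revenue": "1200.50"},
    {"country": "DE", "revenue": "980.00"},
    {"country": "PL", "revenue": "300.25"},
]

# Transform: cast the string amounts to numbers and aggregate revenue per country.
totals = {}
for row in source_rows:
    totals[row["country"]] = totals.get(row["country"], 0.0) + float(row["revenue"])

# Load: write the aggregated result into a SQL table (in-memory here for the sketch).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE revenue_by_country (country TEXT PRIMARY KEY, total REAL)")
conn.executemany("INSERT INTO revenue_by_country VALUES (?, ?)", sorted(totals.items()))

print(conn.execute("SELECT country, total FROM revenue_by_country ORDER BY country").fetchall())
# [('DE', 980.0), ('PL', 1500.75)]
```

A production version would replace the in-memory steps with connectors to the real source systems and warehouse, and typically be scheduled by an orchestrator such as the Airflow mentioned in the other listings.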
Requirements: ETL, Tableau, Power BI, SQL, AWS, Python, R

Additionally: Sport subscription, Training budget, Private healthcare, International projects, Annual bonus, Authorship tax relief, Pension plan, PPK, Christmas bonus, Child allowance bonus, Budget for Home Office Setup & Maintenance, Free coffee, Bike parking, Playroom, Free beverages, In-house trainings, In-house hack days, Modern office, Free parking, No dress code.



  • Warsaw, Poland RemoDevs Full time

    3+ years of Python development experience, including Pandas. 5+ years writing complex SQL queries with RDBMSes. 5+ years of experience developing and deploying ETL pipelines using Airflow, Prefect, or similar tools. Experience with cloud-based data warehouses in environments such as RDS, Redshift, or Snowflake. Experience with data warehouse design:...


  • Warsaw, Poland Innowise Full time

    What we expect:
    — Python knowledge: OOP basics, threads and the GIL, working with the pandas library;
    — SQL knowledge: query language (DQL, DDL, DML, TCL), transactions and ACID principles, indexes;
    — Big Data knowledge: basic concepts — OLAP vs. OLTP, Data Warehouse and Data Lake, data normalization and denormalization;
    — Spoken English...


  • Remote, Warsaw, Poland hubQuest Full time

    What we expect 5+ years of professional experience as a Data Engineer or Software Engineer in data-intensive environments Strong Python development skills, with solid understanding of OOP, modular design, and testing (unit/integration) Experience with PySpark and distributed data processing frameworks Hands-on experience with Azure Data ecosystem,...


  • Warsaw, Poland Experis Polska Full time

    Tech Stack Programming: Python, PySpark, SQL, SparkSQL, Bash Azure: Databricks, Data Factory, Delta Lake, Data Vault 2.0 CI/CD: Azure DevOps, GitHub, Jenkins Orchestration: Airflow, Azure Data Factory Databases: SQL Server, Oracle, PostgreSQL, Vertica Cloud: Azure (expert), AWS (intermediate) Tools: FastAPI, REST APIs, Docker, Unity Catalog Preferred...


  • Warsaw, Poland Bayer Full time

    WHO YOU ARE: Required: Cloud & Container Technologies: Strong understanding of cloud platforms (AWS+GCP), Kubernetes, containerization, Linux fundamentals, networking, and related technologies Reliability & Scalability: Strong understanding of distributed systems, system design, error budgets, capacity planning and other patterns that ensure resilient,...


  • Warsaw, Poland Winged IT Full time

    Your skills and experiences: -> 6+ years of experience with SQL, PySpark, Python; -> Framework knowledge: Apache Airflow, AWS Glue, Kafka, Redshift; -> Cloud & DevOps: AWS (S3, Lambda, CloudWatch, SNS/SQS, Kinesis), Terraform; Git; CI/CD; -> Proven ownership of mission-critical data products (batch + streaming); -> Data modeling, schema evolution,...


  • Remote, Warsaw, Poland C&F S.A. Full time

    What you will need:  3+ years of experience in Azure; Hands-on knowledge of the following data services and technologies in Azure (for example Databricks, Data Lake, Synapse, Azure SQL, Azure Data Factory, Azure Data Explorer); Experience with analytics, databases, data systems and working directly with clients; Good knowledge of SQL or Python; Azure...


  • Remote, Warsaw, Poland CreatorIQ Full time

    Who you are and what you’ll need for this position:  5+ years of data engineering experience with production-scale systems Expert-level SQL skills with analytical databases (columnar databases preferred) Strong Python programming with data libraries: pandas, numpy, pyarrow Experience with ETL orchestration tools: Apache Airflow, Prefect, dbt, or similar...


  • Remote, Warsaw, Poland C&F S.A. Full time

    What you will need:  4+ years of experience in Azure and 5+ years of industrial experience in the domain of large-scale data management, visualization and analytics Hands-on knowledge of the following data services and technologies in Azure (for example Databricks, Data Lake, Synapse, Azure SQL, Azure Data Factory, Azure Data Explorer) Experience with...


  • Remote, Warsaw, Poland KMD Poland Full time

    Ideal candidate:   Has 3+ years of commercial experience in implementing, developing, or maintaining data load systems (ETL/ELT).  Is proficient in Python, with a solid understanding of data processing challenges.  Has experience working with Apache Spark and Databricks.  Is familiar with MSSQL databases or other relational databases.  Has some...