Senior/Middle Data Engineer @ ELEKS

5 days ago


Remote, Kraków, Zagreb, Wrocław, Split, Warsaw, Czech Republic · ELEKS · Full time
REQUIREMENTS
  • 3+ years' experience in Data Engineering, SQL, and ETL (data validation, data mapping, exception handling) 
  • 2+ years' hands-on experience with Databricks 
  • Experience with Python 
  • Experience with AWS (e.g. S3, Redshift, Athena, Glue, Lambda, etc.) 
  • Knowledge of the Energy industry (e.g. energy trading, utilities, power systems, etc.) would be a plus 
  • Experience with Geospatial data would be a plus 
  • At least an Upper-Intermediate level of English 

ELEKS Software Engineering and Development Office is looking for a Senior/Middle Data Engineer in Ukraine, Poland, or Croatia.

ABOUT CLIENT

The customer is a British company producing electricity with zero carbon emissions.

RESPONSIBILITIES
  • Building Databases and Pipelines: Developing databases, data lakes, and data ingestion pipelines to deliver datasets for various projects 
  • End-to-End Solutions: Designing, developing, and deploying comprehensive solutions for data and data science models, ensuring usability for both data scientists and non-technical users. This includes following best engineering and data science practices 
  • Scalable Solutions: Developing and maintaining scalable data and machine learning solutions throughout the data lifecycle, supporting the code and infrastructure for databases, data pipelines, metadata, and code management 
  • Stakeholder Engagement: Collaborating with stakeholders across various departments, including data platforms, architecture, development, and operational teams, as well as addressing data security, privacy, and third-party coordination 

Requirements: Data engineering, SQL, Data mapping, Databricks, Python, AWS, AWS S3, Redshift, Athena, Glue, AWS Lambda, Data pipelines

SIMILAR JOBS

  • Remote, Kraków, Zagreb, Wrocław, Split, Warsaw, Czech Republic · beBeeDataEngineer · Full time · €90,000 - €120,000

    Job Opportunity: We are seeking a highly skilled Data Engineer to develop databases, data lakes, and ingestion pipelines. Key Responsibilities: Designing, implementing, and deploying comprehensive solutions for data models, ensuring usability for both technical and non-technical users. Developing and maintaining scalable data solutions throughout the data...


  • Remote, Warsaw, Czech Republic · Square One Resources · Full time

    Bachelor's/Master's degree in Computer Science, Information Systems, or a related field. Strong hands-on experience with Microsoft Fabric components: Data Factory, Lakehouse / OneLake, Synapse Data Engineering, Power BI. Experience with data modeling (star/snowflake) and performance tuning in Power BI. Deep understanding of modern data architecture patterns including...


  • Remote, Kraków, Gdynia, Rzeszów, Wrocław, Warszawa, Czech Republic · ELEKS · Full time

    Requirements: 4+ years' overall experience in automation; knowledge of QA methodology fundamentals; Web Testing, API Testing; JS, TypeScript; test automation framework WebDriverIO; at least an upper-intermediate level of English. Nice to have: performance testing with JMeter. ELEKS Quality Assurance Office is looking for a Middle strong/Senior Automation...


  • Remote, Czech Republic · beBeeDataEngineer · Full time · 900,000 - 1,200,000

    Job Title: Senior Data Engineer. We are seeking a highly skilled Senior Data Engineer to join our team. The ideal candidate will have 2+ years of experience in data pipelines and transformations, with a strong focus on data modeling, ETL concepts, and data warehouse design. A minimum of 1 year of hands-on experience with Python, especially using Pandas, as well...


  • Remote, Warsaw, Kraków, Wrocław, Gdańsk, Czech Republic · beBeeDataEngineering · Full time · 600,000 - 800,000

    Senior Data Engineer Position: This is a senior-level data engineering position where you will be responsible for developing and maintaining data platforms, leading a team of data engineers, and driving competency development. Job Description: As a senior data engineer, you will oversee the entire engineering lifecycle, ensuring successful data platform...


  • Wrocław, Gdańsk, Kraków, Poznan, Warsaw, Czech Republic · Capgemini Polska Sp. z o.o. · Full time

    You have hands-on experience in data engineering and are comfortable working independently on moderately complex tasks. You've worked with Databricks and PySpark in real-world projects. You're strong in Python for data transformation and automation. You've used at least one cloud platform (AWS, Azure, or GCP) in a production environment. You communicate clearly...


  • Remote, Czech Republic · Matrix Global Services · Full time

    At least 2 years' experience with Java. Experience in building, optimizing, and maintaining large-scale big data pipelines using popular open-source frameworks (Kafka, Spark, Hive, Presto, Airflow, etc.). Experience with SQL/NoSQL/key-value DBs. Hands-on experience with Spring, Spring Boot. Experience with AWS cloud services such as EMR, Aurora, Snowflake, S3, Athena,...

  • Middle Node.js Engineer @ N-iX

    5 days ago


    Remote, Czech Republic · N-iX · Full time

    We are seeking a talented and experienced Middle Node.js Engineer to join our dynamic development team. About Us: The client is defining the future of cybersecurity through its XDR platform, which automatically prevents, detects, and responds to threats in real time. Singularity XDR ingests data and leverages patented AI models to deliver autonomous...


  • Kraków, Warszawa, Wrocław, Czech Republic · beBeeData · Full time · €60,000 - €80,000

    Unlock Your Potential as a Data Engineer: As a Data Engineer, you will be responsible for developing and maintaining data processing pipelines using Databricks and PySpark. This role requires collaboration with senior engineers and architects to implement scalable data solutions. Key Responsibilities: Develop and maintain data processing pipelines using...


  • Remote, Wrocław, Gdańsk, Rzeszów, Czech Republic · Xebia sp. z o.o. · Full time

    7+ years in a data engineering role, with hands-on experience in building data processing pipelines; experience in leading the design and implementation of data pipelines and data products; proficiency with GCP services for large-scale data processing and optimization; extensive experience with Apache Airflow, including DAG creation, triggers, and workflow...