Big Data Consultant

2 weeks ago


Remote, Warszawa, Wrocław, Białystok, Kraków, Gdańsk, Czech Republic Addepto Full time

What you'll need to succeed in this role:

  • 5+ years of proven commercial experience in implementing, developing, or maintaining Big Data systems.
  • Strong programming skills in Python or Java/Scala: writing clean code, OOP design.
  • Experience in designing and implementing data governance and data management processes.
  • Familiarity with Big Data technologies like Spark, Cloudera, Airflow, NiFi, Docker, Kubernetes, Iceberg, Trino or Hudi.
  • Proven expertise in implementing and deploying solutions in cloud environments (with a preference for AWS).
  • Excellent understanding of dimensional data and data modeling techniques.
  • Excellent communication skills and consulting experience with direct interaction with clients.
  • Ability to work independently and take ownership of project deliverables.
  • Master's or Ph.D. in Computer Science, Data Science, Mathematics, Physics, or a related field.
  • Fluent English (C1 level) is a must.

Addepto is a leading consulting and technology company specializing in AI and Big Data, helping clients deliver innovative data projects. We partner with top-tier global enterprises and pioneering startups, including Rolls Royce, Continental, Porsche, ABB, and WGU. Our exclusive focus on AI and Big Data has earned us recognition by Forbes as one of the top 10 AI consulting companies.

As a Big Data Consultant, you will have the exciting opportunity to work with a team of technology experts on challenging projects across various industries, leveraging cutting-edge technologies. Here are some of the projects for which we are seeking talented individuals:

  • Design and development of a platform for managing vehicle data for a global automotive company. This project develops a shared platform for processing massive car data streams. It ingests terabytes of data daily, using both streaming and batch pipelines for near real-time insights. The platform transforms raw data for data analysis and Machine Learning, which empowers teams to build real-world applications like digital support and smart infotainment, and unlocks data-driven solutions for car maintenance and anomaly detection across the organization (a minimal streaming sketch follows this project list).
  • Design and development of a universal data platform for global aerospace companies. This Azure- and Databricks-powered initiative combines diverse enterprise and public data sources. The data platform is in the early stages of development, covering the design of architecture and processes and leaving freedom in technology selection.
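
For a concrete flavor of the first project's streaming side, below is a minimal, hypothetical PySpark Structured Streaming sketch: it reads vehicle telemetry from a Kafka topic, computes per-minute aggregates, and appends them to an Iceberg table. The topic name, schema, broker address, table name, and checkpoint path are illustrative assumptions, not details of the actual platform.

    # Hypothetical sketch of a streaming ingestion job (PySpark Structured Streaming).
    # Assumes the spark-sql-kafka and Iceberg runtime packages are on the classpath;
    # all names (topic, table, paths) are placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

    spark = SparkSession.builder.appName("vehicle-telemetry-ingest").getOrCreate()

    schema = StructType([
        StructField("vin", StringType()),
        StructField("signal", StringType()),
        StructField("value", DoubleType()),
        StructField("event_time", TimestampType()),
    ])

    # Read the raw stream from Kafka and parse the JSON payload.
    raw = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")
        .option("subscribe", "vehicle-signals")
        .load()
    )
    parsed = raw.select(F.from_json(F.col("value").cast("string"), schema).alias("r")).select("r.*")

    # Near-real-time insight: average signal value per vehicle per one-minute window.
    per_minute = (
        parsed.withWatermark("event_time", "10 minutes")
        .groupBy(F.window("event_time", "1 minute"), "vin", "signal")
        .agg(F.avg("value").alias("avg_value"))
    )

    # Append results to an Iceberg table for downstream analysis and Machine Learning.
    (
        per_minute.writeStream.format("iceberg")
        .outputMode("append")
        .option("checkpointLocation", "s3://example-bucket/checkpoints/signals")  # placeholder path
        .toTable("lakehouse.telemetry.signals")
        .awaitTermination()
    )

The same parsed stream could feed separate batch pipelines over the Iceberg table, which is how a single ingestion layer can serve both near real-time dashboards and Machine Learning training jobs.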

This role represents a gradual shift away from hands-on coding towards a more strategic focus on system design, business consultation, and creative problem-solving. It offers an opportunity to engage more deeply with architecture-level decisions, collaborate closely with clients, and contribute to building innovative data-driven solutions from a broader perspective.


Discover our perks and benefits:
  • Work in a supportive team of passionate enthusiasts of AI & Big Data.
  • Engage with top-tier global enterprises and cutting-edge startups on international projects.
  • Enjoy flexible work arrangements, allowing you to work remotely or from modern offices and coworking spaces.
  • Accelerate your professional growth through career paths, knowledge-sharing initiatives, language classes, and sponsored training or conferences, including a partnership with Databricks, which offers industry-leading training materials and certifications.
  • Choose from various employment options: B2B, employment contracts, or contracts of mandate.
  • Make use of 20 fully paid days off available for B2B contractors and individuals under contracts of mandate.
  • Participate in team-building events and utilize the integration budget.
  • Celebrate work anniversaries, birthdays, and milestones.
  • Access medical and sports packages, eye care, and well-being support services, including psychotherapy and coaching.
  • Get full work equipment for optimal productivity, including a laptop and other necessary devices.
  • With our backing, you can boost your personal brand by speaking at conferences, writing for our blog, or participating in meetups.
  • Experience a smooth onboarding with a dedicated buddy, and start your journey in our friendly, supportive, and autonomous culture.
Your responsibilities:
  • Design and develop scalable data management architectures, infrastructure, and platform solutions for streaming and batch processing using Big Data technologies like Apache Spark, Hadoop, and Iceberg.
  • Design and implement data management and data governance processes and best practices.
  • Contribute to the development of CI/CD and MLOps processes.
  • Develop applications to aggregate, process, and analyze data from diverse sources.
  • Collaborate with the Data Science team on data analysis and Machine Learning projects, including text/image analysis and predictive model building.
  • Develop and organize data transformations using DBT and Apache Airflow (see the orchestration sketch after this section).
  • Translate business requirements into technical solutions and ensure optimal performance and quality.

Requirements: Python, SQL, Spark, AWS, Airflow, Docker, Kubernetes, Hadoop, Iceberg, NiFi, Java, Scala, Kafka, Trino, Hudi.
Tools: Jira, Confluence, Wiki, GitHub, Agile, Scrum, Kanban.
Additionally: Private healthcare, Multisport card, Referral bonus, MyBenefit cafeteria, International projects, Flat structure, Paid leave, Training budget, Language classes, Team building events, Small teams, Flexible form of employment, Flexible working hours and remote work possibility, Free coffee, Startup atmosphere, No dress code, In-house trainings.
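
To illustrate the DBT and Apache Airflow responsibility above, here is a minimal sketch of how such transformations might be orchestrated. It assumes Airflow 2.4+ with the standard BashOperator, dbt installed on the worker, and a dbt project at the hypothetical path /opt/dbt/analytics; it shows the pattern only, not a project-specific DAG.

    # Hypothetical sketch: an Airflow DAG that runs dbt models, then dbt tests, once a day.
    # Assumes Airflow 2.4+ (for the `schedule` argument); paths below are placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_dbt_transformations",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        # Build the dbt models first.
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
        )
        # Then run dbt tests as a separate task.
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="dbt test --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
        )
        dbt_run >> dbt_test

Keeping the run and test steps as separate tasks makes a failed data-quality check visible in the Airflow UI without hiding the transformation run that produced it.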

  • Remote, Warsaw, Wrocław, Białystok, Kraków, Gdańsk, Czech Republic beBeeData Full time 800,000 - 1,200,000

    Job Overview: We are seeking a skilled Big Data Developer to join our team and contribute to the design, development, and implementation of innovative data solutions. Key Responsibilities: Design and develop large-scale data systems using Big Data technologies such as Databricks, Apache Spark, and Airflow. Develop and maintain high-quality code in Python, ensuring...

  • Big Data Engineer

    2 weeks ago


    Remote, Czech Republic Link Group Full time

    Must-Have Qualifications: At least 3+ years of experience in big data engineering. Proficiency in Scala and experience with Apache Spark. Strong understanding of distributed data processing and frameworks like Hadoop. Experience with message brokers like Kafka. Hands-on experience with SQL/NoSQL databases. Familiarity with version control tools like Git. Solid...

  • Big Data Developer

    2 weeks ago


    Warszawa, Mazovia, Czech Republic ASTEK Polska Full time

    Expectations: 3+ years of experience with functional programming in Scala. Experience building Spark applications in Scala. Familiarity with Hadoop technologies: Hive, Oozie, Kafka. Experience with Docker and Kubernetes. Good communication skills and team collaboration. Proficiency in English (B2+). Nice to have: Experience in data engineering and building ETL/ELT...

  • Big Data Developer

    2 weeks ago


    Gdańsk, Pomerania, Czech Republic ASTEK Polska Full time

    Expectations: 3+ years of experience with functional programming in Scala. Experience building Spark applications in Scala. Familiarity with Hadoop technologies: Hive, Oozie, Kafka. Experience with Docker and Kubernetes. Good communication skills and team collaboration. Proficiency in English (B2+). Nice to have: Experience in data engineering and building ETL/ELT...

  • Big Data Leader

    2 weeks ago


    Remote, Czech Republic beBeeDataEngineer Full time €100,000 - €130,000

    Job Description: We are seeking a seasoned professional with experience in data pipeline creation and batch/streaming processing to lead our team. As a senior big data engineer, you will be responsible for designing, developing, and maintaining large-scale data systems.


  • Warszawa, Mazovia, Czech Republic Netcompany Poland Full time

    We are looking for a skilled and ambitious Data Consultant to develop and deliver data warehouses and other business intelligence solutions. As a Consultant in our Warsaw office, you will work closely with your Polish, Danish, Norwegian, and UK colleagues in developing and delivering exciting data projects. Lots of responsibility right from the...


  • Warszawa, Mazovia, Czech Republic beBeeDataScientist Full time 800,000 - 1,200,000

    Big Data Developer Role: We are seeking a skilled Big Data Developer to join our team. Key Responsibilities: Design and implement scalable, high-performing global reporting solutions using Big Data technologies such as Spark-based applications in Scala. Collaborate with cross-functional teams to build efficient data pipelines that integrate dozens of data...


  • Remote, Warszawa, Wrocław, Białystok, Kraków, Gdańsk, Czech Republic beBeeDataengineeringspecialist Full time €90,000 - €120,000

    Job Overview: This role is for a Data Engineering Specialist who will have the exciting opportunity to work with a team of technology experts on challenging projects across various industries, leveraging cutting-edge technologies. As a key member of our team, you will be responsible for designing and developing scalable data management architectures,...