Senior Data Engineer

Remote, Czech Republic · SoftServe · Full time
IF YOU ARE
  • A skilled professional with 5+ years of hands-on experience developing and optimizing ETL solutions within Google Cloud Platform (GCP)
  • Proficient with GCP services such as BigQuery, Cloud Run Functions, Cloud Run, and Dataform
  • Experienced with SQL and Python for data processing and transformation
  • Knowledgeable in RDBMS, particularly MS SQL Server
  • Familiar with BI & Reporting systems and data modeling concepts, including tools like Power BI and Looker
  • Experienced in working with Data Quality metrics, checks, and reporting to ensure accuracy, reliability, and governance of data solutions
  • Skilled in migrating legacy SQL stored procedures to modern, cloud-native data processing solutions on Google Cloud Platform
  • Adaptable and effective in fast-paced, changing environments
  • A collaborative team member with excellent consulting and interpersonal skills
  • Detail-oriented with strong analytical skills and sound judgment in technical decision-making
  • Familiar with Dataplex, LookML, Looker Studio, and Azure Data Factory (a plus)
WE ARE

SoftServe is a global digital solutions company headquartered in Austin, Texas, founded in 1993. Our associates work on 2,000+ projects with clients across North America, EMEA, APAC, and LATAM. We are about people who create bold things, make a difference, have fun, and love their work.

Big Data & Analytics is the data consulting and data engineering branch of our Center of Excellence. Today, hundreds of data engineers and architects build end-to-end data & analytics solutions, from strategy through technical design and proofs of concept to full-scale implementation. We serve customers in the healthcare, finance, manufacturing, retail, and energy domains.

We hold top-level partnership statuses with all major cloud providers and collaborate with technology partners such as AWS, GCP, Microsoft, Databricks, Snowflake, Confluent, and others.

TOGETHER WE WILL
  • Address different business and technology challenges, engage in impactful projects, use top-notch technologies, and drive multiple initiatives as a part of the Center of Excellence
  • Support your technical and personal growth — we have a dedicated career plan for all roles in our company
  • Investigate new technologies, build internal prototypes, and share knowledge with the SoftServe data community
  • Upskill with full access to Udemy learning courses
  • Earn professional certifications, encouraged and covered by the company
  • Adopt best practices from experts while working in a team of top-notch engineers and architects
  • Collaborate with world-leading companies and attend professional events

SoftServe is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment regardless of race, color, religion, age, sex, nationality, disability, sexual orientation, gender identity and expression, veteran status, and other protected characteristics under applicable law. Let’s put your talents and experience in motion with SoftServe.

YOU WILL
  • Be part of a data-focused engineering team contributing to modern data transformation and analytics initiatives, migrating large-scale systems from Azure to GCP
  • Collaborate closely with data engineers, BI developers, and business stakeholders to design and implement robust, high-quality data pipelines and models that drive strategic decision-making
  • Participate in the entire project lifecycle: from discovery and PoCs to MVPs and full production rollout
  • Engage with customers ranging from global enterprises to innovative startups
  • Continuously learn, share knowledge, and explore new cloud services
  • Contribute to building a data platform that integrates batch, streaming, and real-time components
  • Work in an environment that values technical ownership, code quality, and clean design

Requirements: GCP, BigQuery, ETL

Additionally: Sport subscription, Training budget, Private healthcare, International projects, Flat structure, Small teams, Free coffee, Canteen, Bike parking, Free snacks, Free parking, In-house trainings, Modern office, No dress code.
