
Senior Data Engineer
1 day ago
- A skilled professional with 5+ years of hands-on experience developing and optimizing ETL solutions within Google Cloud Platform (GCP)
- Proficient with GCP services such as BigQuery, Cloud Run Functions, Cloud Run, and Dataform
- Experienced with SQL and Python for data processing and transformation
- Knowledgeable in RDBMS, particularly MS SQL Server
- Familiar with BI & Reporting systems and data modeling concepts, including tools like Power BI and Looker
- Experienced in working with Data Quality metrics, checks, and reporting to ensure accuracy, reliability, and governance of data solutions
- Skilled in migrating legacy SQL stored procedures to modern, cloud-native data processing solutions on Google Cloud Platform
- Adaptable and effective in fast-paced, changing environments
- A collaborative team member with excellent consulting and interpersonal skills
- Detail-oriented with strong analytical skills and sound judgment in technical decision-making
- Familiarity with Dataplex, LookML, Looker Studio, and Azure Data Factory is a plus
SoftServe is a global digital solutions company headquartered in Austin, Texas, founded in 1993. Our associates work on 2,000+ projects with clients across North America, EMEA, APAC, and LATAM. We are about people who create bold things, make a difference, have fun, and love their work.
Big Data & Analytics is the data consulting and data engineering branch of our Center of Excellence. Hundreds of data engineers and architects build end-to-end data & analytics solutions, from strategy through technical design and proofs of concept to full-scale implementation. We have customers in the healthcare, finance, manufacturing, retail, and energy domains.
We hold top-level partnership statuses with all the major cloud providers and collaborate with many technology partners like AWS, GCP, Microsoft, Databricks, Snowflake, Confluent, and others.
TOGETHER WE WILL
- Address different business and technology challenges, engage in impactful projects, use top-notch technologies, and drive multiple initiatives as a part of the Center of Excellence
- Support your technical and personal growth — we have a dedicated career plan for all roles in our company
- Investigate new technologies, build internal prototypes, and share knowledge with the SoftServe data community
- Upskill with full access to Udemy learning courses
- Pass professional certifications, encouraged and covered by the company
- Adopt best practices from experts while working in a team of top-notch engineers and architects
- Collaborate with world-leading companies and attend professional events
SoftServe is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment regardless of race, color, religion, age, sex, nationality, disability, sexual orientation, gender identity and expression, veteran status, or other characteristics protected under applicable law. Let's put your talents and experience in motion with SoftServe.
- Be part of a data-focused engineering team contributing to modern data transformation and analytics initiatives, migrating large-scale systems from Azure to GCP
- Collaborate closely with data engineers, BI developers, and business stakeholders to design and implement robust, high-quality data pipelines and models that drive strategic decision-making
- Participate in the entire project lifecycle: from discovery and PoCs to MVPs and full production rollout
- Engage with customers ranging from global enterprises to innovative startups
- Continuously learn, share knowledge, and explore new cloud services
- Contribute to building a data platform that integrates batch, streaming, and real-time components
- Work in an environment that values technical ownership, code quality, and clean design
Requirements: GCP, BigQuery, ETL
Additionally: Sport subscription, Training budget, Private healthcare, International projects, Flat structure, Small teams, Free coffee, Canteen, Bike parking, Free snacks, Free parking, In-house trainings, Modern office, No dress code.
-
Senior Data Engineer
17 hours ago
Remote, Czech Republic, beBeeDataEngineer, Full time, 450,000 - 850,000
Job Title: Senior Data Engineer - Data Streaming Specialist. We are seeking an experienced Senior Data Engineer to join our team and lead the development of high-quality data streaming pipelines. Key Responsibilities: Design, develop, and maintain scalable data streaming solutions using Confluent and Spark Structured Streaming. Collaborate with cross-functional...
-
Senior Data Engineer @ Dogtronic Solutions
1 day ago
Remote, Czech Republic, Dogtronic Solutions, Full time
Requirements: Strong Python and SQL skills. Proven experience delivering Databricks PySpark pipelines into production. Solid hands-on experience with Databricks. Experience with cloud platforms (Azure, AWS, or GCP). Comfortable working with large datasets and building robust data pipelines. Clear communication skills – able to explain technical work...
-
Data Engineering Specialist
7 days ago
Remote, Czech Republic, beBeeDataEngineer, Full time, €90,000 - €120,000
Job Description: We are seeking a highly skilled Data Engineer to join our team. As a Senior Data Engineer, you will be responsible for designing, developing, and maintaining large-scale data processing systems. Your primary focus will be on leveraging Trino (Starburst or Apache) to deliver high-performance data engineering solutions. You will work closely with...
-
Senior Data Engineer
1 week ago
Remote, Warszawa, Czech Republic, Crestt, Full time
Technologies & Tools: Snowflake (enterprise data warehouse), dbt Cloud (ETL/ELT development – licenses provided), Confluence (documentation), Azure DevOps / Jira (task planning and tracking). We are seeking an experienced Senior Data Engineer with strong expertise in Snowflake to support a short-term data engineering initiative (September–December 2025,...
-
Remote, Czech Republic, beBeeCloudEngineer, Full time, 1,000,000 - 1,200,000
About the Role: We are seeking a senior platform and infrastructure engineer to join our team in building and evolving a robust data analytics platform using Google Cloud Platform (GCP).
-
Azure Data Engineering Specialist
6 days ago
Remote, Czech Republic, beBeeDataEngineer, Full time, 900,000 - 1,200,000
Senior Azure Data Engineer with Databricks. This is a challenging role that requires a strong foundation in data engineering and experience with cloud-based data platforms. Requirements: A minimum of 3 years' experience with Azure Data Factory and Databricks, along with at least 5 years' experience in data engineering or backend software development. Strong SQL...
-
Senior GCP Data Engineer @ Xebia sp. z o.o.
1 week ago
Remote, Wrocław, Gdańsk, Rzeszów, Czech Republic, Xebia sp. z o.o., Full time
7+ years in a data engineering role, with hands-on experience in building data processing pipelines; experience in leading the design and implementation of data pipelines and data products; proficiency with GCP services for large-scale data processing and optimization; extensive experience with Apache Airflow, including DAG creation, triggers, and workflow...
-
Chief Data Engineer
1 week ago
Remote, Czech Republic, beBeeData, Full time, €65,000 - €80,000
We are seeking a talented Senior Data Scientist to join our organization and contribute to shaping the future of finance teams. Key Responsibilities: Design, build, and deploy machine learning models using frameworks like LangChain. Own the full technical stack from data pipelines to deployment. Collaborate closely with engineers and product teams in...
-
Remote, Wrocław, Gdańsk, Rzeszów, Czech Republic, beBeeDataEngineer, Full time, €85,000 - €110,000
Senior Data Engineer. Job Description: We are looking for a senior data engineer to join our team. The ideal candidate will have 3+ years' experience with Azure, specifically in the areas of Data Factory, SQL, Data Lake, Power BI, DevOps, Delta Lake, and CosmosDB. Key Responsibilities: Design and develop scalable data processing architectures using distributed...
-
Senior Data Engineer @ Link Group
4 days ago
Remote, Kraków, Czech Republic, Link Group, Full time
Experience with Python, SQL, and object-oriented programming. Hands-on knowledge of Kubernetes, Docker, and containerization best practices. Familiarity with cloud technologies (Azure or AWS). Experience with DevOps practices for logging, monitoring, testing, and alerting in data pipelines. Database administration skills: performance tuning, monitoring, data...