Data Engineer @ Ework Group
2 weeks ago
• Azure Databricks (PySpark, Spark SQL; Unity Catalog; Jobs/Workflows)
• Azure data services: Azure Data Factory, Azure Key Vault, storage (ADLS), fundamentals of networking/identities
• Python for data engineering (APIs, utilities, tests)
• Azure DevOps (Repos, Pipelines, YAML) and Git-based workflows
• Experience operating production pipelines (monitoring, alerting, incident handling, cost control)

💻 Ework Group - founded in 2000, listed on Nasdaq Stockholm, with around 13,000 independent professionals on assignment - we are the total talent solutions provider that partners with clients, in both the private and public sector, and with professionals to create sustainable talent supply chains. With a focus on IT/OT, R&D, Engineering and Business Development, we deliver sustainable value through a holistic and independent approach to total talent management. By providing comprehensive talent solutions, combined with vast industry experience and excellence in execution, we form successful collaborations. We bridge clients, partners and professionals throughout the talent supply chain, for the benefit of individuals, organizations and society.

🔹 For our Client we are looking for a Senior Data Engineer 🔹
Preferred candidates are based in Wrocław, but applicants from other locations in Poland will also be considered.

✔️ Main assignment: Maintain and evolve the data flows used by the Picto application: Azure + Databricks pipelines (ADF + notebooks) that ingest data from APIs using the Ingestion Framework, transform it (PySpark/Spark SQL), and deliver trusted datasets (see the sketch after this listing).

✔️ Tech stack you'll meet: Azure, Databricks (PySpark/Spark SQL, Unity Catalog, Workflows), ADF, ADLS/Delta, Key Vault, Azure DevOps (Repos/Pipelines, YAML), Python, SQL

✔️ Responsibilities:
• Own day-to-day operations of Picto data pipelines (ingest → transform → publish), ensuring reliability, performance and cost efficiency.
• Develop and maintain Databricks notebooks (PySpark/Spark SQL) and ADF pipelines/triggers; manage Jobs/Workflows and CI/CD.
• Implement data quality checks, monitoring & alerting (SLA/SLO), troubleshoot incidents, and perform root-cause analysis.
• Secure pipelines (Key Vault, identities, secrets) and follow platform standards (Unity Catalog, environments, branching).
• Collaborate with BI Analysts and Architects to align data models and outputs with business needs.
• Document datasets, flows and runbooks; contribute to continuous improvement of the Ingestion Framework.

Requirements: Azure, Databricks, PySpark, Azure Data Factory, Azure DevOps, AI
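For orientation, a minimal sketch of the ingest → transform → publish pattern described above, written as a Databricks PySpark notebook cell. The ADLS path, table name, column names and the 1% quality threshold are illustrative assumptions, not details of the Picto application or of Ework's Ingestion Framework.

```python
# Illustrative sketch only: paths, table names, columns and thresholds are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

RAW_PATH = "abfss://raw@examplelake.dfs.core.windows.net/picto/orders/"  # hypothetical ADLS location
PUBLISH_TABLE = "main.picto.orders_curated"  # hypothetical Unity Catalog table

# 1. Ingest: read the JSON files landed by the ingestion job (ADF / API extract).
raw_df = spark.read.json(RAW_PATH)

# 2. Transform: type columns, drop duplicates, add a load timestamp.
curated_df = (
    raw_df
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .dropDuplicates(["order_id"])
    .withColumn("_loaded_at", F.current_timestamp())
)

# 3. Data quality gate: fail the run (and so trigger alerting) if too many rows
#    are missing the business key. The 1% threshold is an assumption.
total = curated_df.count()
missing_keys = curated_df.filter(F.col("order_id").isNull()).count()
if total == 0 or missing_keys / total > 0.01:
    raise ValueError(f"DQ check failed: {missing_keys}/{total} rows without order_id")

# 4. Publish: write a Delta table governed by Unity Catalog.
curated_df.write.format("delta").mode("overwrite").saveAsTable(PUBLISH_TABLE)
```

In a setup like the one described, a notebook of this shape would typically run as a Databricks Job/Workflow or be invoked from an ADF pipeline, with credentials resolved through a Key Vault-backed secret scope rather than hard-coded values.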
-
Databricks Data Engineer @ Link Group
1 week ago
Remote, Czech Republic | Link Group | Full time
Requirements:
• 5+ years of experience in Data Engineering
• 2+ years of hands-on experience with Databricks
• Strong skills in SQL, PySpark, and Python
• Solid background in data warehousing, ETL, distributed data processing, and data modeling
• Excellent analytical and problem-solving skills in big data environments
• Experience with structured, semi-structured, and...
-
Senior Data Engineer @ Link Group
6 days ago
Remote, Czech Republic | Link Group | Full time
Required Skills & Experience:
• 5–8 years of hands-on experience in data engineering or similar roles.
• Strong knowledge of AWS services such as S3, IAM, Redshift, SageMaker, Glue, Lambda, Step Functions, and CloudWatch.
• Practical experience with Databricks or similar platforms (e.g., Dataiku).
• Proficiency in Python or Java, SQL (preferably Redshift), Jenkins,...
-
Senior Azure Data Engineer @ Link Group
2 weeks ago
Remote, Czech Republic | Link Group | Full time
• 4+ years of experience in Azure and 5+ years of industrial experience in the domain of large-scale data management, visualization and analytics;
• Hands-on knowledge of the following data services and technologies in Azure (for example Databricks, Data Lake, Synapse, Azure SQL, Azure Data Factory, Azure Data Explorer);
• Good understanding of Azure as a platform;...
-
Senior Data Engineer @ Link Group
5 days ago
Remote, Czech Republic | Link Group | Full time
Desired Background:
• 4+ years of experience as a Data Engineer in cloud-focused projects.
• Proven success in designing and maintaining large-scale data infrastructure in GCP.
• Advanced English communication skills, both written and spoken.
Nice to Have:
• Professional certifications in GCP or big data technologies.
• Practical experience with BI platforms (e.g.,...
-
AI Engineer @ Link Group
1 week ago
Remote, Warszawa, Czech Republic | Link Group | Full time
• 4+ years of experience in AI, Machine Learning, or Data Engineering.
• Strong programming skills in Python and experience with libraries for AI/ML.
• Hands-on experience with LangChain or similar frameworks for LLM applications.
• Solid understanding of cloud technologies (Azure, AWS, or GCP).
• Experience integrating APIs and deploying production-grade AI systems...
-
Data Engineer with Snowflake @ Link Group
2 weeks ago
Remote, Warszawa, Czech Republic | Link Group | Full time
Requirements:
• 3+ years of experience in data platforms/warehousing/lakes: source integration, pipeline development, and analytics.
• Strong experience with Python (e.g. pandas) for scripting and data processing.
• Excellent SQL skills, preferably Postgres and SQL Server (T-SQL).
• Familiarity with AWS services (S3, EC2, Glue).
• Experience with iPaaS tools (SnapLogic...
-
Senior Python Engineer @ Link Group
1 day ago
Remote, Czech Republic | Link Group | Full time
• Strong background in Python and Django.
• Experience with AWS services and cloud-based development.
• Proficiency with Docker, Kubernetes, and CI/CD tools.
• Knowledge of PostgreSQL, DynamoDB, and solid data modeling practices.
• Strong problem-solving abilities and communication skills.
• Curiosity and adaptability with a willingness to learn new technologies.
(Nice...
-
Senior AI Engineer @ Link Group
1 day ago
Remote, Czech Republic | Link Group | Full time
• 5+ years of hands-on Python experience, with strong skills in system design and optimization.
• Deep knowledge of back-end frameworks (e.g., Django, FastAPI, Flask).
• Practical experience with generative AI, testing AI agents, and flexibility across models/frameworks.
• AWS expertise, particularly with Amazon Connect and Amazon Lex.
• Strong background in designing...
-
Data Engineer @ Experis Polska
3 days ago
Remote, Warszawa, Czech Republic | Experis Polska | Full time
• Proven experience with traditional data warehouse and ETL systems (Informatica is a strong plus)
• Hands-on expertise with PySpark and Python
• Experience working with Databricks
• Proficiency in cloud platforms, preferably Microsoft Azure
• Experience in large-scale data migration projects
• Familiarity with CI/CD practices in data engineering
Start Date: ASAP /...
-
SAS Platform Engineer @ Link Group
1 week ago
Remote, Czech Republic | Link Group | Full time
• Experience with SAS DI Studio, SAS Management Console, and SAS Viya (Studio, CAS, Visual Analytics)
• Ability to design and maintain ETL processes
• Practical knowledge of administering and configuring SAS environments
• Hands-on experience with CI/CD (GitLab CI/CD), including automation and version control
• Knowledge of Docker and Kubernetes for containerization...