Data Platform DevOps Engineer

7 days ago


Warszawa, Poland T-Mobile Polska Full time

• 3+ years of experience as a DevOps Engineer with strong hands-on expertise in data management on GCP, Kubernetes, and Spark
• Prior experience supporting data infrastructure or analytics platforms
• Experience with Infrastructure as Code tools
• Skilled in scripting languages for automation tasks
• Familiarity with cloud monitoring tools
• Strong understanding of networking, security, and cloud infrastructure best practices
• Excellent problem-solving skills, a proactive mindset, and strong communication abilities

Join our data infrastructure team as a DevOps Engineer. In this role, you will not only design, implement, and support cloud and on-prem infrastructure, but also ensure our operations meet all relevant privacy and security requirements. You will be key in providing the documentation and implementing the controls needed for compliance across cloud services, data pipelines, and analytics systems.

Responsibilities:
• Collaborate with data engineering, analytics, and operations teams to streamline data applications, including big data and operational workflows.
• Monitor infrastructure health, performance, and security, and resolve issues promptly.
• Implement and enforce privacy and security requirements in line with organizational and regulatory standards.
• Provide thorough documentation of infrastructure, processes, and compliance controls.
• Lead the technical implementation of access controls, encryption, data retention, and security monitoring.
• Conduct regular reviews and audits of systems to ensure ongoing compliance, and drive remediation as needed.
• Automate and document recurring operational and compliance procedures to ensure reliability and transparency.

Requirements: DevOps, Data management, Kubernetes, Spark, Infrastructure as Code, Cloud, Networking, Security, GCP
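As a loose illustration of the kind of recurring compliance automation the posting describes (encryption, data retention, documentation of findings), here is a minimal, hypothetical sketch. It assumes the google-cloud-storage Python client and application-default credentials; the project ID, the 30-day retention threshold, and the specific checks are illustrative assumptions, not requirements taken from the listing.

    from google.cloud import storage

    # Illustrative policy threshold: buckets should retain objects for at least 30 days.
    REQUIRED_RETENTION_SECONDS = 30 * 24 * 3600

    def audit_buckets(project_id: str) -> list[str]:
        """Return human-readable findings for buckets missing CMEK or the assumed retention policy."""
        client = storage.Client(project=project_id)
        findings = []
        for bucket in client.list_buckets():
            if not bucket.default_kms_key_name:
                findings.append(f"{bucket.name}: no customer-managed encryption key configured")
            if (bucket.retention_period or 0) < REQUIRED_RETENTION_SECONDS:
                findings.append(f"{bucket.name}: retention period below the assumed 30-day policy")
        return findings

    if __name__ == "__main__":
        # "my-data-platform-project" is a placeholder project ID.
        for finding in audit_buckets("my-data-platform-project"):
            print(finding)

In practice, a check like this would typically run on a schedule (for example, a Kubernetes CronJob or Cloud Scheduler job) and feed its findings into the team's monitoring or ticketing system rather than printing to stdout.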



  • Warszawa, Poland ITDS Full time

    You’re ideal for this role if you have: Proven experience in large-scale data platform migration and modernization projects Deep expertise in Azure Databricks, Delta Lake, and Azure Data Factory Strong programming skills in Python, PySpark, and SQL Hands-on experience with CI/CD pipelines using Azure DevOps, GitHub, or Jenkins Solid understanding of...

  • Data Engineer @ EPIKA

    2 weeks ago


    Warszawa, Wrocław, Łódź, Poland EPIKA Full time

    Data Engineering Foundations: Azure Databricks - PySpark, Spark SQL, Unity Catalog, Workflows Azure Data Factory, Key Vault, and ADLS/Delta OAuth, OpenID, SAML, JWT Medallion Architecture Strong SQL and data modeling in a lakehouse/Delta architecture Python for data engineering (API integration, utilities, testing) Operations: Running production pipelines...


  • Remote, Warszawa, Poland Trust Sourcing Full time

    What You Will Bring: 5+ years of hands-on DevOps/SRE experience in high-growth SaaS startups. Proven chops with AWS, ECS/EKS, Terraform, GitHub Actions, Jenkins, and modern observability stacks. Track record of platform engineering that boosts developer productivity and SRE rigor that keeps SLAs honest. Comfort automating the boring stuff—bash, Python,...

  • DevOps Engineer

    5 days ago


    Remote, Warszawa, Kraków, Gdańsk, Katowice, Wrocław, Poznań, Łódź, Poland KPMG Full time

    4+ years of experience in DevOps roles, preferably with Azure Deep knowledge of Azure DevOps, GitHub Actions, or similar CI/CD tools Experience with infrastructure-as-code (ARM, Bicep, Terraform) Familiarity with deploying and managing Spark jobs on Azure (Synapse, Data Factory, or Databricks) Knowledge of monitoring tools (Azure Monitor, Log Analytics,...


  • Warszawa, Poland ITDS Full time

    You're ideal for this role if you have: Minimum 1 year of experience working as a DevOps Engineer with Terraform or in a similar position Proficiency in scripting for automation with languages such as Python, Bash or Go Strong understanding of cloud solution design standards focused on availability and maintainability Hands-on experience with Microsoft Azure...


  • Warszawa, Poland Bayer Full time

    Bachelor's/Master's degree in Computer Science, Engineering, or a related field. 3+ years of working experience in the field of Data & Analytics, preferably in the CPG industry 3+ years of proficient coding experience with Python for data engineering, including SQL and PySpark (DataFrame API, Spark SQL, MLlib), with hands-on experience in various databases...


  • Remote, Warszawa, Poland Experis Polska Full time

    Proven experience with traditional data warehouse and ETL systems (Informatica is a strong plus) Hands-on expertise with PySpark and Python Experience working with Databricks Proficiency in cloud platforms, preferably Microsoft Azure Experience in large-scale data migration projects Familiarity with CI/CD practices in data engineering Start Date: ASAP /...


  • Remote, Warszawa, Poland Link Group Full time

    Proven experience as a Product Manager or Product Owner in cloud or DevOps-related environments. Strong track record of leading cloud-native product initiatives (AWS, Azure, GCP, or hybrid). Excellent understanding of Agile practices, sprint planning, and quarterly delivery cycles. Advanced skills in JIRA and Confluence for backlog management and stakeholder...

  • Engineering Manager

    5 days ago


    Remote, Wrocław, Warszawa, Kraków, Poland Shelf Full time

    Note: This position does not involve designing system architecture or day-to-day coding. 4+ years of Engineering Management experience leading cross-functional, backend-centric teams (10+ engineers) in product companies operating multi-tenant SaaS systems. 5+ years as a Senior Software Engineer, Tech Lead, or Staff Engineer with deep Node.js or Python...


  • Warszawa, Poland Bayer Full time

    Bachelor's/Master's degree in Computer Science, Engineering, or a related field. 5+ years of working experience in the field of Data & Analytics, preferably in the CPG industry 5+ years of proficient coding experience with Python for data engineering, including SQL and PySpark (DataFrame API, Spark SQL, MLlib), with hands-on experience in various databases...