DataOps @ Adaptiq

Remote, Czech Republic | Adaptiq | Full time

Requirements:

  • 3+ years of experience working with Python and Airflow
  • 1+ years of experience writing queries for SQL databases and data warehouses such as MySQL, Presto, and Athena

Nice to have:

  • Experience as a NOC or DevOps engineer
  • Experience with pySpark
  • Experience with NoSQL databases and data warehouses (Couchbase, MongoDB)
  • At least a Bachelor’s degree in Computer Science, Math, Physics, Engineering, Statistics or other technical field

Who we are:

Adaptiq is a technology hub specializing in building, scaling, and supporting R&D teams for high-end, fast-growing product companies in a wide range of industries. 

About the Product:

The product our client develops is focused on tackling one of the biggest challenges in mobile app growth by leveraging Machine Learning and Big Data technology. Its platform is designed to accurately predict which apps users will love and to seamlessly connect users with those apps in engaging ways.

Operating at massive scale, the system processes over 50 TB of raw data daily, handles over 4 million requests per second, and reaches over a billion unique users each week. This cutting-edge platform has proven strong product-market fit, reflected in extraordinary growth and virtually zero customer churn, and it powers some of the most successful mobile brands globally.

About the Role: 

We’re seeking a data-focused, quality-driven analyst with a passion for ensuring robust data processes and observability. This role is ideal for someone dedicated to building and supporting high-quality data products and processes, as well as handling production data and ad-hoc data requests.

As a DataOps Analyst, you will be responsible for maintaining the quality of our data services and knowledge platform across all product data processes. You’ll work closely with stakeholders, playing a key role in driving business success by enhancing the quality, stability, and lifecycle of data. This role empowers our Operations teams to impact daily business outcomes effectively and efficiently.

Responsibilities:

  • Process Monitoring: Oversee and monitor daily data processes to ensure seamless operations. This includes troubleshooting server and process issues, escalating bugs as needed, and documenting any data-related problems.
  • Ad-hoc Operation Configuration: Act as a bridge between operational needs and data processes. Use tools like Airflow, Python scripting, and SQL to extract specific client-relevant data points and adjust configurations as necessary. This is often the role’s most dynamic and time-intensive aspect, taking up about 30% of your time, though it can increase to 80% during larger projects. (See the Airflow sketch after this list.)
  • Data Quality Automation: Develop and maintain automated data quality tests and validations using Python and testing frameworks. (See the pytest sketch after this list.)
  • Metadata Store Ownership: Take charge of the metadata store by creating and managing a system that holds essential metadata for tables, columns, calculations, and data lineage. You’ll contribute to the design and development of the knowledge base metadata store and its UX, becoming the key point of reference for questions about data sources, purposes, and calculation methods. (See the metadata record sketch after this list.)
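
To make the day-to-day concrete, here is a minimal sketch of the kind of monitored daily process the role owns: an Airflow DAG that runs a SQL volume check and fails loudly so the issue can be escalated. The DAG id, connection id, table name, and schedule below are illustrative assumptions, not the client’s actual configuration.

```python
# Hypothetical sketch: dag_id, mysql_conn_id, and the table are
# assumptions for illustration, not a real production setup.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.mysql.hooks.mysql import MySqlHook

ROW_COUNT_SQL = "SELECT COUNT(*) FROM daily_events WHERE event_date = %(ds)s"

def check_daily_volume(ds: str, **_) -> None:
    """Fail the task (and trigger escalation) if the daily load looks empty."""
    hook = MySqlHook(mysql_conn_id="analytics_db")  # assumed connection id
    (count,) = hook.get_first(ROW_COUNT_SQL, parameters={"ds": ds})
    if count == 0:
        raise ValueError(f"No rows landed for {ds}; escalate per runbook")

with DAG(
    dag_id="daily_events_quality",  # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
):
    PythonOperator(task_id="check_daily_volume", python_callable=check_daily_volume)
```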
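
The data quality automation work can look like the pytest sketch below. The row shape, field names, and rules are invented for illustration; real checks would read from Presto or Athena rather than an in-memory stub.

```python
# Hypothetical data quality checks; in production the rows would be
# fetched from Presto/Athena instead of this in-memory stub.
import pytest

SAMPLE_ROWS = [
    {"user_id": 1, "country": "CZ", "requests": 120},
    {"user_id": 2, "country": "DE", "requests": 87},
]

@pytest.mark.parametrize("row", SAMPLE_ROWS)
def test_primary_key_present(row):
    # Every record must carry a user_id; a null suggests a broken upstream join.
    assert row["user_id"] is not None

@pytest.mark.parametrize("row", SAMPLE_ROWS)
def test_country_is_iso2(row):
    # Country codes are expected as two-letter ISO codes.
    assert isinstance(row["country"], str) and len(row["country"]) == 2

@pytest.mark.parametrize("row", SAMPLE_ROWS)
def test_request_counts_non_negative(row):
    assert row["requests"] >= 0
```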
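
Finally, a minimal sketch of the kind of record a metadata store like the one described might hold for tables, columns, calculations, and lineage. Field names and example values are assumptions for illustration.

```python
# Hypothetical metadata records; all field names and values are illustrative.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ColumnMeta:
    name: str
    dtype: str
    description: str
    calculation: Optional[str] = None  # documented formula for derived columns

@dataclass
class TableMeta:
    name: str
    source: str                        # owning DAG or upstream system
    columns: list = field(default_factory=list)
    lineage: list = field(default_factory=list)  # names of upstream tables

# Example entry answering "where does this table come from, and what does
# this column mean?"; the questions the role is the point of reference for.
revenue = TableMeta(
    name="daily_revenue",
    source="daily_events_quality",     # assumed producing DAG
    columns=[ColumnMeta("ecpm", "double", "Effective cost per mille",
                        calculation="revenue / impressions * 1000")],
    lineage=["daily_events", "fx_rates"],
)
```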