 
Data Modeler 2 weeks ago
Bachelor’s degree in Computer Science, Information Systems, or a related field; Master’s degree preferred. Proven experience in data modeling and data warehousing, with at least 2 years of experience in a similar role, preferably within the insurance industry. Strong understanding of data modeling techniques, including normalization, denormalization, star schema, and snowflake schema. Experience with data modeling tools such as ERwin, Sparx Enterprise Architect, or similar. Proficiency in SQL and database management systems (e.g., SQL Server, Oracle). Familiarity with ETL processes and tools (e.g., Azure Data Factory, Informatica, Talend). Knowledge of data governance and data quality principles. Excellent analytical and problem-solving skills, with the ability to perform system analysis to identify and extract relevant data. Experience in developing and executing test plans to validate data models. Strong communication skills, with the ability to explain complex data concepts to non-technical stakeholders.
The Data Modeler will be responsible for designing, implementing, and maintaining data models that support data warehousing solutions within the insurance domain. The team works on Azure, but experience with other cloud platforms is also acceptable. Responsibilities:
- Design and develop conceptual, logical, and physical data models to support data warehousing initiatives specific to insurance operations.
- Collaborate with business analysts, architects, and IT teams to understand insurance business requirements and translate them into data models.
- Perform system analysis to identify, analyze, and extract the right data from source systems, ensuring the data models are populated with accurate and relevant data.
- Optimize data models for performance, scalability, and maintainability, focusing on insurance data such as policy, claims, underwriting, and customer data.
- Implement data modeling standards and best practices tailored to the insurance industry.
- Conduct data analysis and profiling to understand data quality and integrity, particularly in the context of insurance data.
- Conduct testing to validate data models, ensuring they meet business requirements and performance criteria, including verifying data accuracy, consistency, and integrity within the data warehousing environment.
- Work with ETL developers to ensure the accurate implementation of data models in data warehousing solutions.
- Maintain and update data models as insurance business requirements evolve.
- Document data models and metadata for reference and training purposes.
- Participate in data governance and data management activities to ensure compliance with organizational policies and industry regulations.
Requirements: SQL, Azure Data Factory, Informatica, Talend, Tableau, Power BI. Tools: Agile.
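Note on the star-schema requirement above: the snippet below is a minimal, hypothetical sketch (not part of the posting) of an insurance-flavored star schema, built with Python's standard-library sqlite3 module. The table and column names (fact_claims, dim_policy, dim_customer, dim_date) and the sample values are illustrative assumptions only.

import sqlite3

# Minimal star schema: one central fact table (claims) joined to
# denormalized dimension tables (policy, customer, date).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_policy (
    policy_key    INTEGER PRIMARY KEY,
    policy_number TEXT,
    product_line  TEXT            -- e.g. 'motor', 'property'
);
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    full_name    TEXT,
    segment      TEXT             -- e.g. 'retail', 'commercial'
);
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY, -- YYYYMMDD surrogate key
    year     INTEGER,
    month    INTEGER
);
CREATE TABLE fact_claims (
    claim_id     INTEGER PRIMARY KEY,
    policy_key   INTEGER REFERENCES dim_policy(policy_key),
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    claim_amount REAL
);
""")

# One illustrative row per table.
cur.execute("INSERT INTO dim_policy VALUES (1, 'POL-001', 'motor')")
cur.execute("INSERT INTO dim_customer VALUES (1, 'Jane Doe', 'retail')")
cur.execute("INSERT INTO dim_date VALUES (20240115, 2024, 1)")
cur.execute("INSERT INTO fact_claims VALUES (1, 1, 1, 20240115, 1250.0)")

# Typical star-schema query: aggregate the fact, slice by dimensions.
rows = cur.execute("""
    SELECT p.product_line, d.year, SUM(f.claim_amount)
    FROM fact_claims f
    JOIN dim_policy p ON p.policy_key = f.policy_key
    JOIN dim_date   d ON d.date_key   = f.date_key
    GROUP BY p.product_line, d.year
""").fetchall()
print(rows)  # [('motor', 2024, 1250.0)]

The design choice is the usual one for reporting workloads: measures and foreign keys live in the fact table, while wide, denormalized dimensions keep slicing (by product line, segment, or period) down to simple joins; a snowflake schema would further normalize those dimensions into sub-tables.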
- 
Senior Data Scientist @ hubQuest 7 days ago
Remote, Warsaw, Czech Republic hubQuest Full time. What we expect: 5+ years of professional experience in Data Science or ML Engineering, including production deployments; MSc or PhD in Computer Science, Statistics, Mathematics, Physics, or related technical field; Strong Python programming skills, including software engineering practices (OOP, modular code design, testing); Solid experience with ML frameworks...
- 
Azure Data Engineer @ C&F S.A. 7 days ago
Remote, Warsaw, Czech Republic C&F S.A. Full time. What you will need: 3+ years of experience in Azure; Hands-on knowledge of the following data services and technologies in Azure (for example Databricks, Data Lake, Synapse, Azure SQL, Azure Data Factory, Azure Data Explorer); Experience with analytics, databases, data systems and working directly with clients; Good knowledge of SQL or Python; Azure...
- 
Warsaw, Czech Republic Innowise Full time. What we expect: Python knowledge: OOP basics, threads and GIL, working with the pandas library; SQL knowledge: query language (DQL, DDL, DML, TCL), transactions and ACID principles, indexes; Big Data knowledge: basic concepts such as OLAP vs. OLTP, Data Warehouse and Data Lake, data normalization and denormalization; Spoken English...
- 
Senior Azure Data Engineer @ C&F S.A. 7 days ago
Remote, Warsaw, Czech Republic C&F S.A. Full time. What you will need: 4+ years of experience in Azure and 5+ years of industrial experience in the domain of large-scale data management, visualization and analytics; Hands-on knowledge of the following data services and technologies in Azure (for example Databricks, Data Lake, Synapse, Azure SQL, Azure Data Factory, Azure Data Explorer); Experience with...
- 
Senior Data Engineer @ RemoDevs 2 weeks ago
Warsaw, Czech Republic RemoDevs Full time. 3+ years of Python development experience, including Pandas. 5+ years writing complex SQL queries with RDBMSes. 5+ years of experience developing and deploying ETL pipelines using Airflow, Prefect, or similar tools. Experience with cloud-based data warehouses in environments such as RDS, Redshift, or Snowflake. Experience with data warehouse design:...
- 
GCP Data Lake Developer @ SNI 7 days ago
Remote, Czech Republic SNI Full time. Strong hands-on experience with Google Cloud Platform (BigQuery, Vertex AI). Expertise in BigQuery data modeling, large dataset management, and performance optimization. Proficiency with Oracle databases, including writing complex queries, data migration, and tuning. Solid knowledge of Databricks for collaborative data science and machine learning...
- 
Data Engineer 7 days ago
Remote, Czech Republic SNI Full time. Strong SQL expertise with proven ability to write, debug, and optimize complex queries. Expert-level Oracle PL/SQL development skills (procedures, functions, triggers, advanced features). Hands-on experience with Oracle Database and Oracle Exadata platforms. In-depth knowledge of database architecture, administration, and performance tuning techniques...
- 
Remote, Wrocław, Czech Republic Data Hiro sp. z o.o. Full time. 6+ years of experience in IT; 4+ years of hands-on experience with the Microsoft Power Platform, including: Working with Solutions; Managing multiple environments (SDLC); Deployment Pipelines; PowerApps (model-driven and canvas); Power BI (Dataverse connectivity); Power Automate (cloud flows, web services, Azure Functions); Power Pages. Strong familiarity with Azure...
- 
Databricks Data Engineer @ Link Group 1 week ago
Remote, Czech Republic Link Group Full time. Requirements: 5+ years of experience in Data Engineering; 2+ years of hands-on experience with Databricks; Strong skills in SQL, PySpark, and Python; Solid background in data warehousing, ETL, distributed data processing, and data modeling; Excellent analytical and problem-solving skills in big data environments; Experience with structured, semi-structured, and...
- 
Senior Data Engineer 1 week ago
Remote, Czech Republic N-iX Full time. Requirements: 4+ years of experience showcasing technical expertise and critical thinking in data engineering. Hands-on experience with DBT and strong Python programming skills. Proficiency in Snowflake and expertise in data modeling are essential. Demonstrated experience in building consumer data lakes and developing consumer analytics...