Data Engineer

City: Kyiv
Company: Datumo
Salary:
Found: a day ago

Description

Datumo specializes in providing Big Data and Cloud consulting services to clients from all over the world, primarily in Western Europe, Poland and the USA. Core industries we support include e-commerce, telecommunications and life science. Our team consists of exceptional people whose commitment allows us to deliver highly demanding projects. Our team members tend to stick around for more than 3 years, and when a project wraps up, we don't let them go - we embark on a journey to discover exciting new challenges for them. It's not just a workplace; it's a community that grows together!

What we expect:

Must-have:
- 2 years of commercial experience in Big Data
- proven record with a selected cloud provider (GCP, Azure or AWS)
- good knowledge of Python or a JVM language (Scala preferred, Java also acceptable)
- strong understanding of Spark or a similar distributed data processing framework
- experience with BigQuery, Snowflake, Hive or a similar distributed datastore
- designing and implementing Big Data systems following best practices
- ensuring solution quality through automated tests, CI/CD and code review
- proven collaboration with business stakeholders
- English proficiency at B2 level and communicative Polish

Nice to have:
- experience with the Snowflake or Databricks platform
- familiarity with Airflow or a similar pipeline orchestrator
- knowledge of Apache Kafka, Docker and Kubernetes
- experience in Machine Learning projects
- willingness to share knowledge (conferences, articles, open-source projects)

What's on offer:
- 100% remote work, with workation opportunity
- 20 free days
- onboarding with a dedicated mentor
- project switching possible after a certain period
- individual budget for training and conferences
- benefits: Medicover private medical care, co-financing of the Medicover Sport card
- opportunity to learn English with a native speaker
- regular company trips and informal get-togethers

Development opportunities in Datumo:
- participation in industry conferences
- establishing Datumo's online brand presence
- support in obtaining certifications (e.g. GCP, Azure, Snowflake)
- involvement in internal initiatives, like building technological roadmaps
- training budget
- access to internal technological training repositories

Discover our exemplary projects:

IoT data ingestion to cloud
The project integrates data from edge devices into the cloud using Azure services. The platform supports data streaming either via the IoT Edge environment with Java or Python modules, or via a direct connection to Event Hubs using the Kafka protocol. It also facilitates batch data transmission to ADLS. Transformation from raw telemetry to structured tables is done through Spark jobs in Databricks, or through data connections and update policies in Azure Data Explorer.

Petabyte-scale data platform migration to Google Cloud
The goal of the project is to improve the scalability and performance of the data platform by transitioning over a thousand active pipelines to GCP. The main focus is on rearchitecting existing Spark applications to either Cloud Dataproc or BigQuery SQL, depending on the client's requirements, and automating them with Cloud Composer.
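To give a flavour of the orchestration pattern used in such a migration, here is a minimal sketch, assuming Airflow 2.x running on Cloud Composer with the Google provider installed; the project, region, cluster, bucket and DAG names are placeholders for illustration, not details of the actual project.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

# Placeholder identifiers - the real project uses its own naming.
PROJECT_ID = "example-project"
REGION = "europe-west1"
CLUSTER_NAME = "example-dataproc-cluster"

# A rearchitected Spark application packaged as a PySpark job stored on GCS.
PYSPARK_JOB = {
    "reference": {"project_id": PROJECT_ID},
    "placement": {"cluster_name": CLUSTER_NAME},
    "pyspark_job": {"main_python_file_uri": "gs://example-bucket/jobs/transform.py"},
}

with DAG(
    dag_id="example_migrated_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Cloud Composer triggers the Spark job on Dataproc once per day.
    run_transform = DataprocSubmitJobOperator(
        task_id="run_spark_transform",
        project_id=PROJECT_ID,
        region=REGION,
        job=PYSPARK_JOB,
    )

Pipelines targeted at BigQuery SQL rather than Dataproc would follow the same scheduling pattern, with a BigQuery operator in place of the Dataproc one.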
Data analytics platform for investing company
The project centers on developing and overseeing a data platform for an asset management company focused on ESG investing, with Databricks as the central component. The platform, built on the Azure cloud, integrates various Azure services. The primary task involves implementing and extending complex ETL processes that enrich investment data, using Spark jobs written in Scala. Integrations with external data providers, as well as solutions for improving data quality and optimizing cloud resources, have also been implemented.

Realtime Consumer Data Platform
The initiative involves constructing a consumer data platform (CDP) for a major Polish retail company. Datumo has participated since the project's start, contributing to the planning of the platform's architecture. The CDP is built on Google Cloud Platform (GCP), using services such as Pub/Sub, Dataflow and BigQuery, while open-source tools - a Kubernetes cluster running Apache Kafka, Apache Airflow and Apache Flink - are used to meet specific requirements, giving the platform considerable flexibility. A minimal sketch of the streaming path is shown after the recruitment details below.

Recruitment process:
- Quiz - 15 minutes
- Soft skills interview - 30 minutes
- Technical interview - 60 minutes

Find out more by visiting our website - https://www.datumo.io
If you like what we do and you dream about creating this world with us - don't wait, apply now!
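As referenced in the Realtime Consumer Data Platform description above, a minimal sketch of the Pub/Sub-to-BigQuery streaming path might look like the following, assuming the Apache Beam Python SDK; the project, topic, table and schema names are illustrative placeholders only.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Streaming mode so the pipeline keeps consuming events from Pub/Sub.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/consumer-events")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:cdp.events",  # placeholder dataset and table
                schema="user_id:STRING,event_type:STRING,event_ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()

Running it on Dataflow would additionally require selecting the DataflowRunner and supplying a GCP project and region in the pipeline options.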
