Data Engineer @ COCUS

Description

At COCUS we work at the critical intersection of IT and business. True to our name COCUS – Company for Customers – we are proud to develop tailored solutions focused on the Internet of Things, Blockchain, Data Analytics and Information Security. Our customers are world leaders in their respective industries – telecommunications, tourism, media, automotive, transport and logistics – impacting the lives of millions across the world.

To help our customers shape the future, we need the brightest minds today. This is a fantastic opportunity for someone with the passion to explore and the right experience to apply that passion and knowledge to the solutions we offer our customers, and to experience a truly international, fun and productive working environment.


What you will be doing:

  • Supporting the configuration of the Operational Data Store and ingesting data into the platform tool
  • Maintaining and optimising pipelines on an ongoing basis as new data sources are added and new models and attributes are incorporated and updated
  • Driving data into the platform, which is used to trigger, personalise, segment and track campaigns
  • Supporting the build and integration of data science models and segmentation to ensure they are fit for purpose for users
  • Building endpoints and queries within the systems to collect and request data, such as images, inventory and recommendations, to dynamically populate messages at time of sending
  • Building scalable APIs and connectors from these source systems to ensure that the platform picks up and propagates changes to all relevant fields
  • Modelling and preparing data, including applying suppression and other rules before ingestion
  • Supporting the ingestion of data and the build of any Analytics & BI solutions.


What we are looking for:

  • Advanced programming with any of Python, Go, Java or Scala
  • Experience with SQL databases - setting up, loading, manipulating and extracting complex data
  • Good understanding of NoSQL databases
  • Setting up and running Spark or Hadoop clusters
  • Experience with open source Big Data tools like Avro and Presto
  • Knowledge of ingesting with streaming utilising Kafka or a similar tool
  • Experience in consuming RESTful or SOAP APIs
  • Working with Git tools for code management
  • Strong communication and presentation skills
  • Fluent in written and spoken English
  • Bachelor’s Degree in Computer Engineering or similar.


What will be a plus:

  • ELT experience with tools like Matillion
  • Experience with data wrangling solutions (Alteryx, Datameer)
  • Experience with cloud services (AWS, Azure or Google Cloud).


What we can offer you:

  • The opportunity to work on innovative projects for global customers in a fast-paced environment where you can have a direct impact on the application
  • Informal and friendly culture that rewards innovation and teamwork
  • Salary according to experience
  • Permanent Contract
  • Annual performance bonus
  • Gym Membership
  • Meal allowance
  • Continuous Development and Training
  • Health Insurance
  • Flexible schedules and remote work.


Interested? Apply through this link: https://talent.cake.hr/jobs/681ea7b0-561b-4b31-8e70-990239c8d30a