Data Engineer @ COCUS

Description

At COCUS we are working at the critical intersection of IT and business. True to our name COCUS – Company for Customers – we are proud to develop tailored solutions focused on the Internet of Things, Blockchain, Data Analytics and Information Security. Our customers are world leaders in their respective industries – telecommunications, tourism, media, automotive, transport and logistics – impacting the lives of millions across the world.

To help our customers shape the future, we need the brightest minds today. This is a fantastic opportunity for someone with the passion to explore and the right experience to apply that passion and knowledge to the solutions we offer our customers, and to experience a truly international, fun and productive working environment.

What you will be doing:

  • Design, build and maintain AWS Data Pipelines that take data from multiple sources and format it into an appropriate form for the next processing steps
  • Manipulate and wrangle data at each stage of a pipeline
  • Design and build the Data Management architecture utilizing Cloud and Open Source components
  • Extract data from data warehouses into a pipeline
  • Support Data Scientists and Data Analysts by providing the appropriate data sets and tools

What we are looking for:

  • Advanced programming skills in any of Python, Go, Java or Scala
  • Experience with Relational and NoSQL databases - setting up, loading, manipulating and extracting complex data
  • Experience in writing automated unit tests for data pipelines, covering integration and data transformation code
  • Experience in consuming and developing RESTful APIs
  • ELT/ETL experience with tools dedicated to that purpose
  • Experience with designing and implementing data models both for operational and analytical purposes
  • Experience with Data Warehousing standards and tools like Data Vault, DBT and Snowflake
  • Scheduling pipeline processes with Airflow
  • Data modelling across the different layers of an enterprise data warehouse (EDWH)
  • Experience with open source Big Data tools like Avro and Presto
  • Knowledge of streaming ingestion using Kafka or a similar tool
  • Able to build and deploy code into a secured AWS Cloud environment
  • Working with Git tools for code management
  • Strong communication skills and the ability to present your work
  • Fluent in written and spoken English
  • Bachelor’s Degree in Computer Engineering or similar. 

What will be a plus:

  • Experience in setting up and running Spark or EMR clusters in AWS
  • AWS Certifications

What we can offer you:

  • The opportunity to work on innovative, global projects in a fast-paced environment, having a direct impact on the solution/application
  • Informal and friendly culture that rewards innovation and teamwork
  • Permanent work contract and salary according to experience
  • Annual performance bonus
  • FlexOffice - you choose where in Portugal to work from and receive an individual budget to set up your workstation
  • 24 vacation days + 1 day per year of tenure + your birthday
  • Continuous development and training + internal knowledge sharing
  • Pet-friendly office in Matosinhos-Sul, a 5-minute walk from the beach, with public transportation nearby
  • 3K referral program - invite a friend to join the team!
  • Co-payment for monthly gym subscription or public transport pass
  • Other standard perks (Coverflex meal ticket, health insurance for you and your family…)
  • Flexible schedules and higher wage liquidity using Tickets® “Infância”, “Educação” and “Ensino”

Interested? Please apply through: https://talent.sage.hr/jobs/0293a88b-7d90-4fac-9762-ec9bbeb96598