Data Engineer (Middle/Senior)
NeoGames is a leader in the iLottery and iGaming space, offering solutions spanning game studios, game aggregation, lotteries, online casino, sportsbook, bingo, and managed services delivered through an industry-leading core platform.
The Data & BI team owns the group’s Data & Analytics platforms, spanning Data Engineering, Analytical Engineering and Business Intelligence, to lead the group’s data-driven modernisation both internally and for its clients.
The Data Engineer will play a vital role as part of a cross-functional team, developing data pipelines to ingest, transform, distribute and expose data from the group’s Core Data Lake for integration, reporting, analytics, and automation.
The chosen candidate needs to be passionate about building scalable data models and architecture for consumption by other teams, with the aim of making it easy for BI, Analytics, Product and other data consumers to build data-driven solutions, features and insights.
Responsibilities:
- Create data pipelines, both batch and real-time, to ingest data from dissimilar sources
- Collaborate with other teams to address data sourcing and provision requirements
- Design and monitor robust, recoverable data pipelines following best practices with an eye out for performance, reliability, and monitoring
- Innovation drives us: carry out research and development and work on PoCs to propose, trial and adopt new processes and technologies
- Coordinate with the Product & Technology teams to ensure all platforms collect and provide appropriate data
- Liaise with the other teams to ensure reporting and analytics needs can be addressed by the central data lake
- Support the Data Quality and Security initiatives by building into the architecture the necessary data access, integrity, and accuracy controls
Requirements:
- 3+ years of experience in Data Engineering
- Degree in Computer Science, Software Development or Engineering
- Proficient in Python. Past exposure to Java will be considered an asset
- Understanding of RDBMS, columnar and NoSQL engines and their performance characteristics
- Experience with cloud architecture and tools: Microsoft Azure, AWS or GCP
- Experience with orchestration tools such as Apache Airflow and dbt
- Prior exposure to the Snowflake ecosystem will be considered an asset
- Familiarity with Docker/Kubernetes and containerisation
- Strong background in stream data processing technologies such as NiFi, Kinesis, Kafka
- A grasp of DevOps concepts and tools, including Terraform and Ansible, is an advantage
- Understanding of distributed logging platforms, ideally the ELK stack
Skills:
- Fluency in spoken and written English is essential
- Passionate about data and always on the lookout for opportunities to optimise
- Passionate about technology and eager to trial and recommend new tools or platforms
We offer:
- Competitive compensation with regular performance-based salary and career development reviews
- The opportunity to work in a large, successful company
- PE accounting and support
- Medical insurance (health), employee assistance program
- Paid vacation, holidays and sick leave
- Sport compensation
- English classes with native speakers, training, and conference participation
- Referral program
- Team-building and corporate events