Data Engineer
Python (advanced), Data Integration (advanced), Spark (advanced), ETL tools (regular), AWS (regular), Hadoop (regular), Kafka (regular), NiFi (regular), Java (junior)
We are Addepto, where you can feel a startup atmosphere! We believe that the only constant in life is change, so we keep developing and improving to become better at what we do every day. We act outside the box, creating and delivering the best solutions in the area of Big Data, AI, and Business Intelligence. Our team, based in Warsaw and working remotely, is looking for a Data Engineer to focus mainly on designing and building data processing architecture. We are open to candidates at different expertise levels who want to further develop their skills and experience in this role.

Some of our recent Big Data projects:
- Data lakes that store terabytes of data and run machine learning tasks for a large telecom company
- Streaming applications that serve data analytics in real time for manufacturing companies
- Systems that support the decision-making process and help analyze data in a unified format for controlling and operations departments
- Support for real-time machine learning prediction on massive datasets, which prevents losses for pharmaceutical companies
- And more!

What we offer:
- Work in a well-coordinated team of passionate enthusiasts of Big Data & Artificial Intelligence
- Fast career path and the opportunity to develop your qualifications thanks to sponsored training, conferences, and many other development possibilities in various areas
- Challenging international projects for global clients and innovative start-ups
- Friendly atmosphere, outstanding people and great culture – autonomy and a supportive work environment are crucial for us
- Flexible working hours – you can adjust your schedule to better fit your daily routine
- Work-life balance – we respect your private life, so you don’t have to work overtime or on weekends
- Possibility of both remote and office-based work – modern office space available in Warsaw, Cracow, Wroclaw and Bialystok, or a coworking space anywhere in Poland if needed
- Any form of employment – we offer B2B, an employment contract or a contract of mandate
- Paid vacation – 20 fully paid days off if you choose B2B or a contract of mandate
- Other benefits – e.g. team-building events, language classes, trainings & workshops, knowledge-sharing sessions, a medical & sports package, and others

Responsibilities:
- Design and construction of scalable data processing architecture
- Design, build and deploy effective data ingestion pipelines/streams in StreamSets Data Collector or Kafka
- Build applications that aggregate, process, and analyze data from various sources
- Cooperate with the Data Science department on Machine Learning projects (including text/image analysis and building predictive models)
- Use Big Data and BI technologies (e.g. Spark, Kafka, Hadoop, SQL)
- Manage distributed database systems such as ClickHouse, BQ, Teradata, Oracle Exadata, and PostgreSQL + Citus
- Data modelling with Star and Snowflake schemas
- Develop and organize data transformations in DBT and Apache Airflow
- Translate business requirements into technical code
- Ensure the best possible performance and quality of the delivered packages
- Manage business users’ expectations

Requirements:
- Higher education in a technical or mathematical field (or being in the last year of studies)
- Commercial experience in the implementation, development, or maintenance of Business Intelligence or Big Data systems
- Knowledge of Python (or Java/Scala)
- Experience in SQL
- Hands-on experience with Big Data technologies (Spark, Hadoop, Databricks)
- Good command of the English language (min. B2+)
- Experience with cloud services (AWS, Azure or GCP)
- Independence and responsibility for delivering a solution
- Excellent knowledge of dimensional data modelling
- Good communication and soft skills
- Ability to lead discussions and requirement sessions, and to comprehend, summarize and finalize requirements
- Familiarity with NiFi, Docker, Kafka, Airflow, Splunk, DBT

Are you interested in Addepto and would like to join us? Get in touch! We are looking forward to receiving your application. Would you like to know more about us? Visit our website (career page) and social media (Facebook, LinkedIn).