Regular Data Engineer
As a leading provider of Human Resources consulting services in Transylvania, we deliver regional coverage and specialized expertise across four core areas: recruitment and selection, personnel leasing, assessment centers and HR consultancy. With a strong and consistent presence on the Romanian market, we continue to consolidate our position through a strategic commitment to continuous improvement and alignment with evolving business needs.
Our success is founded on the professionalism of our services, the multidisciplinary capabilities of our consulting team, and the long-standing partnerships we maintain with clients who rely on our support in navigating complex HR challenges.
We collaborate with organizations across a broad range of industries, including IT&C, automotive, outsourcing, pharmaceutical, banking, FMCG and others, building sustainable, long-term relationships that contribute to their organizational growth.
Guided by the principles of client orientation, teamwork, flexibility, excellence, dedication, and responsibility, we remain focused on delivering measurable value and consistently high-quality services to our partners.
The role involves sourcing data from multiple systems, optimizing workflows, and collaborating with architects and analysts to deliver clean, well-structured datasets.
Required Skills & Experience:
• 3+ years of experience in data engineering across hybrid environments (on-premise and cloud).
• Proficiency in SQL and Python or Java/Scala.
• Hands-on experience with ETL/ELT tools and frameworks.
• Good understanding of GCP data services: BigQuery, Dataproc, Dataflow, Cloud Storage.
• Familiarity with data modeling, schema design, and metadata management.
• Knowledge of data governance, security, and compliance best practices.
Nice To Have:
• GCP certification (e.g., Professional Data Engineer) is a major plus.
• Experience with Big Data technologies.
Key Responsibilities:
• Solution Design: Architect data pipelines down to the field-level elements, ensuring clarity and precision in implementation.
• Data Sourcing: Extract data from diverse repositories including relational databases (Oracle, PostgreSQL), NoSQL stores, file systems, and other structured/unstructured sources.
• Data Transformation: Design and implement ETL/ELT workflows to standardize and cleanse data using best practices in data engineering.
• Pipeline Development: Build scalable, fault-tolerant data pipelines that support batch and streaming use cases.
• Cloud Data Processing: Load transformed data into GCP destinations such as BigQuery or Cloud Storage using tools like Dataproc, Dataflow, and other GCP-native services.
• Workflow Orchestration: Design and manage workflows using orchestration tools such as Apache Airflow or Cloud Composer (an illustrative sketch follows this list).
• Data Format Expertise: Work with various data formats including JSON, AVRO, Parquet, CSV, and others.
• Optimization & Monitoring: Ensure performance, reliability, and cost-efficiency of data pipelines through continuous monitoring and tuning.
• Collaboration: Work closely with data architects, analysts, and business stakeholders to understand data requirements and deliver high-quality solutions.
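For illustration only, not part of the posting itself: a minimal Apache Airflow / Cloud Composer sketch of the kind of batch pipeline described above, loading Parquet files from Cloud Storage into a BigQuery staging table and then cleansing them with SQL. It assumes the apache-airflow-providers-google package; the bucket, dataset, table, project, and column names (raw-events, analytics, events_staging, events, my-project, event_id, user_id, event_ts) are hypothetical placeholders.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="raw_events_to_bigquery",  # hypothetical pipeline name
    schedule="@daily",
    start_date=datetime(2025, 1, 1),
    catchup=False,
) as dag:
    # Load Parquet files landed in Cloud Storage into a BigQuery staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw",
        bucket="raw-events",  # hypothetical bucket
        source_objects=["events/{{ ds }}/*.parquet"],
        destination_project_dataset_table="analytics.events_staging",
        source_format="PARQUET",
        write_disposition="WRITE_TRUNCATE",
    )

    # Standardize and deduplicate the staged rows with SQL, publishing to a curated table.
    transform = BigQueryInsertJobOperator(
        task_id="transform",
        configuration={
            "query": {
                "query": """
                    SELECT DISTINCT event_id, user_id,
                           TIMESTAMP_TRUNC(event_ts, SECOND) AS event_ts
                    FROM `analytics.events_staging`
                    WHERE event_id IS NOT NULL
                """,
                "destinationTable": {
                    "projectId": "my-project",  # hypothetical project id
                    "datasetId": "analytics",
                    "tableId": "events",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform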
Detailed information about the job offer
Company: Sales Consulting
Location: Bucharest, Romania
Posted: 11.12.2025