- Architect and develop scalable, resilient, high-quality data solutions to support analytics, reporting, and machine learning workloads.
- Design, implement, and maintain reliable ETL/ELT pipelines that integrate, cleanse, and consolidate data from diverse internal and external sources, including APIs and third-party systems.
- Ensure data is accurately ingested, transformed, stored, and made accessible through robust testing frameworks and clearly defined business rules.
- Translate business requirements into scalable data models and transformation logic, working with large, multidimensional datasets to surface insights, trends, and opportunities.
- Build and maintain the BI reporting codebase, ensuring high standards of data integrity, availability, and performance across the organization.
- Create and maintain backend infrastructure that supports analytics platforms, dashboards, and machine learning pipelines.
- Develop and execute unit and integration tests for data pipelines and transformation scripts to ensure reliability and consistency.
- Proactively identify inefficiencies and bottlenecks in data flows and propose scalable, forward-looking solutions.
- Partner closely with product managers, engineers, commercial teams, and business stakeholders to understand strategy and data needs, aligning delivery with the BI roadmap.
- Contribute to team growth through code reviews, design discussions, and knowledge sharing, while staying current with industry best practices.
- Take ownership of the quality, integrity, and consistency of the BI data codebase and associated datasets.
- Ensure ETL efficiency and data availability to meet stakeholder requirements and business SLAs.
- Adhere to engineering best practices, including documentation, version control, and maintainable code standards.
- Support and deliver against team OKRs and KPIs, contributing to overall team and business success.
Requirements
- 4+ years of back-end and/or data engineering experience, delivering production-grade data solutions.
- Demonstrated experience working in a global or international environment, collaborating across regions and time zones.
- Strong SQL expertise with hands-on Python scripting experience for data extraction, transformation, and analysis.
- Proven experience ingesting and processing data from external APIs and third-party data sources.
- Strong experience working with AWS cloud infrastructure, including EKS and ECS, and with modern data warehousing solutions (ideally BigQuery).
- Solid understanding of BI data architecture, with experience using version control systems such as GitHub or GitLab.
- Demonstrated ability to understand business logic behind datasets, ensuring high standards of data quality, consistency, and trust.
- Highly analytical and detail-oriented, with a focus on building reliable, scalable data solutions.
- Comfortable working with unstructured or imperfect data, transforming it into actionable insights.
- Strong collaboration skills, with experience working effectively in global, multicultural teams.
- Excellent communication skills, with the ability to translate complex technical concepts into clear insights for non-technical stakeholders.
Nice to have
- Experience with BI and visualization tools such as Looker, Tableau, or similar platforms.
- Working knowledge of PHP.
What we offer
- Fully remote opportunity
- Competitive compensation
- Freedom to innovate
- Multicultural environment
Ildikó Mező-Mészáros - ildiko.mezo-meszaros@randstad.hu
Boglárka Tóth - boglarka.eva.toth@gmail.com
...