Our Partner provides customised property management services and has a leading role in the market. We are looking for a Lead Data Engineer to join their expanding international team, with the following tasks:
Job description
- Support the Solution and Data Architect(s) to design, build, deliver, and maintain the technical data infrastructure that is the foundation of the BI platform, and support global data strategy initiatives
- Provide technical leadership in data engineering best practice; drive the design patterns and practices that will achieve a scalable data integration platform that emphasises re-usability and accelerates delivery, with a focus on automation and continuous improvement
- Assist the project delivery team to scope, plan and deliver new and enhanced data integrations
- Use Azure Databricks for complex data cleansing, preparation, and analysis to meet the functional and non-functional needs of the business, and for re-usable data processing modules
- Develop, test, and maintain Azure Data Factory pipelines and data flows to manage the extract, transform, and load of data from a variety of data sources to the data lake and data warehouse
- Use T-SQL to build, maintain and enhance database objects (tables, views, procedures, etc.) in Azure SQL Database, helping to develop and enhance the Triana data warehouse as part of planned and ad-hoc development activities
- Establish a mindset of innovation, constant improvement for best practices and business standards in data analytics and development, helping to turn raw data into actionable insights
- Generate high-/low-level designs and technical specifications of existing and new data integration processes to the required business and technical standard
- Support data science and other experimental initiatives by using the appropriate tool set, including Databricks, ADF, Python and SQL, to wrangle and prepare large and/or complex structured and semi-structured data
- Work with Product Owners, Analysts and Account Teams to assist with data-related technical issues and propose suitable solutions, both short and long term
- Provide leadership and support within the C&W BI Analyst Super User community in order to bring in best practices and innovation from other parts of the C&W organization.
- Promote effective and consistent communication – both within your team and across departments
- Work closely with a multi-disciplinary team in an Agile / Scrum environment, ensuring quality deliverables at the end of each sprint.
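As a flavour of the "re-usable data processing modules" mentioned above, here is a toy cleansing step in plain Python (illustrative only — on Databricks this would typically be a PySpark transform; the function and field names are hypothetical):

```python
def clean_records(rows, key="id"):
    """Trim string fields, map blanks to None, and drop rows lacking the key field."""
    cleaned = []
    for row in rows:
        normalized = {
            k: (v.strip() or None) if isinstance(v, str) else v
            for k, v in row.items()
        }
        if normalized.get(key) is not None:
            cleaned.append(normalized)
    return cleaned

raw = [
    {"id": "101 ", "city": " Budapest"},
    {"id": "", "city": "London"},   # dropped: blank id
    {"id": "102", "city": "  "},    # blank city becomes None
]
print(clean_records(raw))
# [{'id': '101', 'city': 'Budapest'}, {'id': '102', 'city': None}]
```

Packaging small, pure steps like this is what makes a pipeline re-usable and testable across integrations.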
Requirements
- Extensive experience (3+ years) developing data integrations against different sources, including flat file, API and database, using one or more industry-standard toolsets, e.g. Databricks, Azure Data Factory, Azure Synapse, Google Dataflow, Apache Beam, Talend, Spark, Jupyter Notebooks, SSIS, Informatica, etc.
- Recent, hands-on commercial experience (1+ year) building data integration processes using Databricks and Azure Data Factory v2 or Azure Synapse
- Solid coding experience in at least one language commonly used in data cleansing, wrangling and statistical analysis (Python, Scala, R, etc.)
- Above average knowledge of using SQL to query data, troubleshoot common data issues, and summarise or aggregate data to derive new insights, extracting value from raw data
- Some exposure to building and deploying cloud-native data solutions in Azure including Data Lake, Azure Key Vault, Function Apps and Azure SQL Database
- Broad understanding of the principles of data warehouse design and data integration processes to extract, transform, cleanse, and prepare data for BI reporting and advanced analytics
- Routine use of source control (Git, SVN, TFS, etc.) as part of regular development activities, including branching & merging and pull requests
- Experience gathering, analysing, and documenting requirements, scoping, planning, and managing the delivery of data integration projects
- Proven experience troubleshooting and resolving data, integration, and code issues. Debugging, identifying bugs, and driving them to closure by working closely with the rest of the team
- Strong communication skills, demonstrated by documenting business processes in the form of process maps, step-by-step user manuals, data analysis, definitions of workarounds, etc.
- Familiarity with test management (UAT, unit testing, system integration testing and release to live processes), system testing with defined entry and exit criteria
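The SQL requirement above (querying, summarising and aggregating data) can be sketched with Python's built-in sqlite3 module — the table and figures below are made up for illustration:

```python
import sqlite3

# Hypothetical lease data, aggregated per city: the kind of SQL
# summarisation this role calls for (illustrative sketch only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE lease (city TEXT, annual_rent REAL)")
conn.executemany(
    "INSERT INTO lease VALUES (?, ?)",
    [("Budapest", 120.0), ("Budapest", 80.0), ("London", 300.0)],
)
rows = conn.execute(
    "SELECT city, COUNT(*) AS n, SUM(annual_rent) AS total "
    "FROM lease GROUP BY city ORDER BY city"
).fetchall()
print(rows)
# [('Budapest', 2, 200.0), ('London', 1, 300.0)]
conn.close()
```

The same GROUP BY pattern carries over directly to T-SQL in Azure SQL Database.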
Nice to have
- Experience with Delta Lake technologies including Delta Tables, Unity Catalog, Delta Live, Spark Structured Streaming, Photon, etc.
- Exposure to DevOps/Deployment Automation/Continuous Delivery e.g. Azure DevOps, Octopus Deploy, Jenkins, PowerShell, ARM Templates, Terraform, Ansible, Chef, Puppet, TeamCity, etc.
- Test-driven development using common testing frameworks (e.g. pytest, nUnit, tSQLt, nBI, etc.)
- Exposure to real-time integrations or IoT solutions e.g. Event Hubs, Service Bus, IoT and Event Grid in Azure; IoT Core, PubSub in Google Cloud; etc.
- Knowledge of or exposure to MPP data storage including Azure Synapse, Google BigQuery, Snowflake, etc.
- An understanding of software engineering principles including design patterns; proper use of version control; branching & merging; partially or fully automated release builds, continuous integration/delivery etc.
- Background in delivering data-driven solutions for commercial real estate applications
- Degree in IT, Engineering, natural sciences, business IT, or any other quantitative discipline
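The test-driven development item above can be sketched minimally: the test is written against a small utility first, then the implementation satisfies it. pytest would discover `test_slugify` automatically; here it is called directly (the helper and its behaviour are hypothetical):

```python
def slugify(name):
    """Lower-case a name and join its words with hyphens (hypothetical helper)."""
    return "-".join(part for part in name.lower().split() if part)

def test_slugify():
    # Written first, TDD-style, to pin down the expected behaviour.
    assert slugify("Azure Data Factory") == "azure-data-factory"
    assert slugify("  Delta   Lake ") == "delta-lake"

test_slugify()
```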
Skills & Personal Qualities:
- Excellent communication in English (verbal and written)
- Curious, analytical mind, driven to solve sometimes complex problems
- Committed, able to work to deadlines, prioritise, and juggle multiple tasks
- Thoughtful, collaborative, and proactive; ready to engage with stakeholders across the globe
- High attention to detail and extremely organized, diligent, and focused
- Able to communicate effectively at all levels, including the ability to negotiate and resolve conflict; build & maintain effective working relationships with internal and external personnel
- Commitment to quality, strong sense of ownership and a thorough approach to work.
- Flexible and able to deliver; a good team player who can also work independently
- Ability to understand the mechanics and dependencies of complex technical systems and architectures.
What we offer
- International environment
- Flexible (hybrid/remote) work structure, with the opportunity to work in a pleasant atmosphere in the company's new office
- Attractive compensation package
- Continuously expanding professional team
Mező-Mészáros Ildikó
Polákovits Petra
...