Job Description: Data Engineer
Location: Auckland, New Zealand
# of openings: 3
Accountabilities:
- Design, develop, and optimize end-to-end data pipelines using Snowflake, DBT, and other relevant technologies; Databricks experience is good to have.
- Implement and follow the Kimball methodology for data warehousing and dimensional modelling, ensuring the creation of robust and scalable data models.
- Work within and support an agile team.
- Follow SDLC principles around code management, version control, and release pipelines via a CI/CD process.

Key Performance Indicators/Outcomes:
- Data assets/pipelines/products delivered on time while maintaining data quality controls.
- High data reliability and pipeline performance.
- Highly effective optimization of the infrastructure layer (resource utilization/SaaS costs).
- Compliance with Data & AI governance controls.
- Exceptional collaboration and communication/documentation.

Core Competencies, Knowledge, and Experience:
- At least 3–5 years of relevant work experience.
- Strong knowledge of data engineering principles in both SQL and programming environments.
- Development experience working with CI/CD pipelines and languages including Python.
- Experience working with large volumes of information covering relational, dimensional, and unstructured data in complex enterprise environments.
- Understanding of Kimball methodologies.
- Experience in an Agile environment.
- Experience in on-premises and cloud environments, ideally AWS and/or Azure and Snowflake.

Security and Privacy Commitment:
- Knowing how my role contributes to the security and privacy of our customers' information.
- Following all relevant Customer policies and guidance on security and privacy.
- Reporting a possible breach of privacy or security to the Privacy team as soon as I become aware of it.
- Taking time to act with caution if I am undertaking a task that could put customer or employee information at risk of breach or misuse.

Behaviour:
- Agile working.
- Collaboration within and across teams.
- Good communication skills.