Connecting Specialists to Pioneering Projects

Filters

Country

  • Australia (72)
  • Papua New Guinea (4)
  • New Zealand (1)

Branch

  • Infrastructure (29)
  • Mining (21)
  • Information Technology (11)

Level of Education

  • Academic Bachelor (42)
  • Secondary School (16)
  • Vocational School (12)

Total Years of Experience

  • 2-4 Years (21)
  • Not applicable (17)
  • 4-6 Years (15)

Showing 77 search results

Senior Data Engineer

Australia, Perth CBD

Academic Bachelor

6-9 Years

The Data Engineer will contribute to the technical development and operational support of our Data, Integration and Analytics (DIA) Platform, which will power operational and corporate data reporting, analytics, and AI-driven insights. The Data Engineer will support project teams, collaborate with cross-functional teams, and contribute to data standards, policies, and guidelines to ensure the platform meets the evolving needs of the business. The role is essential to efficient data processing, modern data pipelines, real-time analytics, and data-driven decision-making across the organisation.

Deliverables - Technical & Operational

  • Platform Development & Optimisation: Demonstrated experience contributing to the development and optimisation of data platforms and implementing automation for data pipelines, with growing exposure to AI/ML workflows. Familiarity with Databricks is essential, along with Fivetran or similar technologies.

  • Data Pipelines: Design, implement, and maintain scalable and resilient data pipelines within the DIA Platform across the Databricks and Microsoft Fabric environments. Leverage Microsoft DevOps to establish robust CI/CD workflows for Databricks pipeline deployment, ensuring version control, automated testing, and seamless promotion across development, staging, and production environments. Focus on transforming ingested data into curated, analytics-ready datasets across the Bronze, Silver, and Gold layers of the lakehouse (see the first sketch after this listing). Automate pipeline orchestration for both batch and streaming Kafka workloads (see the second sketch), applying best practices in data quality, lineage, and performance optimisation. Implement Infrastructure as Code principles and collaborative development practices to ensure data pipelines are modular, reusable, and consistently deployable.

  • Data Integration: Develop integration pipelines using modern data integration techniques for API-based, event-driven, and batch ETL/ELT data ingestion. Maintain a high level of data integrity for master and reference data to effectively process transactional events across ERP, IoT, and SaaS systems. Design integrations that are secure and resilient, aligned with enterprise architecture principles and platform performance requirements. Work closely with the infrastructure teams to ensure system performance and maintain a high level of cyber security, and with the data architect and data owners to extend the enterprise data model and ensure timely, accurate data delivery for operational and analytical consumption.

  • Reporting & Analytics Enablement: Collaborate with business stakeholders and data stewards to understand their data and reporting needs, and translate those needs into actionable data models that support the development of dashboards and visualisations. Continuously optimise analytics workflows to improve business outcomes.

  • AI/ML Pipeline Development: Exposure to building and maintaining data pipelines that support AI/ML models in production environments, with enthusiasm for enabling data-driven decision-making and predictive analytics.
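For context on the Bronze/Silver/Gold pattern mentioned above, here is a minimal PySpark sketch of a Bronze-to-Silver curation step on Databricks. The table names (`bronze.equipment_events`, `silver.equipment_events`), the `event_id` and `event_ts` columns, and the quality rules are illustrative assumptions, not details from the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a SparkSession is provided; getOrCreate() also works locally.
spark = SparkSession.builder.getOrCreate()

# Read raw ingested records from the Bronze layer (hypothetical table name).
bronze = spark.read.table("bronze.equipment_events")

# Curate: deduplicate, enforce timestamp typing, and drop rows missing a key.
silver = (
    bronze
    .dropDuplicates(["event_id"])
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .filter(F.col("event_id").isNotNull())
)

# Publish the analytics-ready dataset to the Silver layer as a Delta table.
(
    silver.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("silver.equipment_events")
)
```

In practice the same step would usually be versioned in a repo and promoted through dev/staging/prod via the CI/CD workflow the posting describes, rather than run ad hoc.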
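And a second hedged sketch, this time of streaming Kafka ingestion into a Bronze Delta table with Spark Structured Streaming (the connector is bundled on Databricks). The broker address, topic name, checkpoint path, and table name are placeholders.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Subscribe to a Kafka topic (broker and topic are placeholders).
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "equipment-events")
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers key/value as binary; cast the payload for downstream parsing.
events = raw.selectExpr("CAST(value AS STRING) AS payload", "timestamp")

# Append raw records to a Bronze Delta table; the checkpoint lets the
# stream recover from failures without reprocessing delivered records.
query = (
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/bronze_equipment_events")
    .outputMode("append")
    .toTable("bronze.equipment_events")
)
```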
