Seeking a Data Engineer to work within a Data & Analytics Program aimed at improving how we analyse and use data, expanding the use of data and analytics to make the best use of staff effort through pre-emptive, seamless delivery of information.
The project needs to ingest, analyse, and ultimately match and link information drawn from unstructured data sources.
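By way of illustration only, matching and linking in Spark often reduces to normalising a key and joining on it. A minimal Scala sketch, in which the input paths and the "name" field are hypothetical assumptions, not project code:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, lower, trim}

object LinkSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("link-sketch").getOrCreate()

    // Two hypothetical extracts, each with a free-text "name" field to link on.
    val a = spark.read.json("s3://example-bucket/extract-a/")
    val b = spark.read.json("s3://example-bucket/extract-b/")

    // Normalise the key (lower-case, trimmed) before joining.
    def norm(c: String) = trim(lower(col(c)))
    val left  = a.select(norm("name").as("k"), col("name").as("name_a"))
    val right = b.select(norm("name").as("k"), col("name").as("name_b"))

    // Link the two extracts on the normalised key.
    val linked = left.join(right, Seq("k"))

    linked.write.mode("overwrite").parquet("s3://example-bucket/linked/")
    spark.stop()
  }
}
```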
Role specifics:
• Australian citizenship required.
• Ability to obtain Baseline clearance.
• Melbourne CBD, Box Hill, Brisbane (Wharf St), Sydney CBD, Newcastle, Hobart, Adelaide, Perth or Canberra.
• WFH 2 days per week.
Duties:
• Ensure technical build artefacts comply with build standards and patterns, incorporating security standards.
• Lead unit testing activities, including test data management and resolution of test failures.
• Optimise automated test and deployment processes.
• Migrate complex EDH pipelines and data processing applications to a cloud-hosted environment using AWS cloud-native technologies, retaining existing logic and functionality (a minimal sketch follows below).
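As a rough illustration only, this migration pattern typically keeps the Spark transformation logic intact while the storage layer moves from HDFS to S3. A minimal Scala sketch, where the bucket, paths, and filter are all assumptions:

```scala
import org.apache.spark.sql.SparkSession

object MigratedPipelineSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("edh-migration-sketch").getOrCreate()

    // Before migration this might have been spark.read.parquet("hdfs:///edh/source").
    val source = spark.read.parquet("s3://example-bucket/edh/source/")

    // Existing business logic carries over unchanged (illustrative filter only).
    val result = source.filter("status = 'ACTIVE'")

    // Output likewise moves from HDFS to S3.
    result.write.mode("overwrite").parquet("s3://example-bucket/edh/output/")
    spark.stop()
  }
}
```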
Demonstrated experience:
• Working in large organisations, delivering enterprise big data and cloud solutions including services engineering and integration.
• Data analytics programming skills, including extensive Scala programming experience with Spark.
• Hands-on experience with Hadoop and Spark (Java and Scala).
• Elasticsearch experience (schema design, data ingestion, Kibana, etc.); a short ingestion sketch follows this list.
• AWS experience with Lambda, Glue, CDK, CloudFormation, S3 and Step Functions.
• Experience with CI/CD activities: automating builds, tests and deployments.
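As a hedged sketch of the Elasticsearch ingestion path from Spark, here is a minimal example using the elasticsearch-hadoop (es-spark) connector's saveToEs helper; the node address, input path, and index name are assumptions:

```scala
import org.apache.spark.sql.SparkSession
import org.elasticsearch.spark.sql._ // elasticsearch-hadoop connector (adds saveToEs)

object EsIngestSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder
      .appName("es-ingest-sketch")
      .config("es.nodes", "localhost") // cluster address is an assumption
      .getOrCreate()

    // Hypothetical input: the linked records produced upstream.
    val docs = spark.read.parquet("s3://example-bucket/linked/")

    // Write the DataFrame to a (hypothetical) Elasticsearch index.
    docs.saveToEs("linked-records")
    spark.stop()
  }
}
```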
For more information or a confidential discussion, please contact Ebony Henderson on 02 6113 7534, quoting reference number 369205; alternatively, please APPLY NOW to be considered for this role.