- 3 roles
- Roles located in Canberra
- Applications close on Monday 22/2/2021
- Commencing in March: 22/3/2021
- 12-month contract, with two 12-month extension options available
- A Baseline security clearance, or the ability to obtain one, is required
Several Data Engineers are required to develop and apply scalable data science workflows. The roles involve creating data pipelines and processing techniques that work in both supercomputer and cloud computing environments. There will be opportunities to work with internal and external stakeholders, including collaborations on exciting projects around the world.
You will be supported in maintaining, improving and fostering collaboration on open source software, including the Open Data Cube project. This is a great chance to work in an inclusive team and to engage with peers from around the world on open data and open source software.
This role will involve working with data scientists to deliver real-world impact and insight into water use across Australia’s east coast. In this role you will:
- Work with scientists and software engineers to create and maintain operational data processing pipelines and robust scientific workflows;
- Provide technical advice to software engineers, data scientists and executive stakeholders via Slack, GitHub, email and formal documentation;
- Work with a diverse team to continue to develop and maintain the Open Data Cube;
- Develop maintainable and well documented code; and
- Work with cloud engineers to help build and run cloud native solutions.
- Experience with Linux system administration and relevant scripting languages.
- Experience with Python and/or similar programming languages.
- Experience in delivering and maintaining operational software.
- Experience with source control management and related processes.
- Ability to communicate effectively with technical and non-technical stakeholders.
- Working knowledge of AWS and infrastructure as code.
- Knowledge of Geographic Information Systems (GIS), and experience with geospatial data and OGC web services.
- Experience working with PostgreSQL and PostGIS.
- Experience using Docker and Kubernetes.
- Experience with the Flask Python framework or an equivalent.
- Experience using distributed computing tools such as Dask.