Massive GPU Cluster for Earth Observation (MAGEO)
To facilitate the application of Deep Learning to Earth Observation (EO) data, NEODAAS is in the process of setting up a large Graphical Processing Unit (GPU) cluster comprising five NVIDIA DGX-1 MaxQ nodes connected to 0.5 PB of dedicated storage. NEODAAS are able to offer compute time and development assistance for applying the latest Deep Learning techniques to EO data.
The Massive Graphical Processing Unit Cluster for Earth Observation (MAGEO) was funded through a NERC transformational capital bid in 2019 to provide NEODAAS with the capability to apply Deep Learning, and other algorithms benefiting from a large number of GPU cores, to EO data. The cluster will be operated as a service where researchers will be able to make use of the compute power offered by MAGEO and the expertise of NEODAAS staff.
The hardware is currently being purchased and set up according to the timeline in the Gantt chart below; this page will be updated as progress is made. Please contact us if you are interested in using MAGEO.
The cluster will be built around five NVIDIA DGX-1 MaxQ nodes, providing a total of 204,800 CUDA cores. The special-edition NVIDIA DGX-1 MaxQ is a more energy-efficient version that uses 50% of the power of the standard DGX-1 while still delivering 80% of the capacity. Each DGX-1 has 40,960 CUDA cores, 40 Intel Xeon 2.2 GHz cores, and 512 GB of system memory.
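As a quick sanity check, the cluster-wide totals follow directly from the per-node figures quoted above (a minimal sketch; the variable names are illustrative):

```python
# Per-node figures for the NVIDIA DGX-1 MaxQ, as quoted above.
nodes = 5
cuda_cores_per_node = 40_960
cpu_cores_per_node = 40
memory_gb_per_node = 512

# Aggregate figures for the five-node cluster.
total_cuda_cores = nodes * cuda_cores_per_node  # 204,800 CUDA cores
total_cpu_cores = nodes * cpu_cores_per_node    # 200 CPU cores
total_memory_gb = nodes * memory_gb_per_node    # 2,560 GB system memory

print(total_cuda_cores, total_cpu_cores, total_memory_gb)
```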
The cluster will have 500 TB of dedicated storage and will also be able to access satellite data stored on the existing file system at PML, which has over 3 PB of storage.
Singularity containers will be used to allow researchers to fully customise their environment and use their preferred libraries. For interactive use, a Jupyter Lab frontend will be provided. More details on software will be added later.
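As an illustration of the container workflow described above, a researcher might build an environment from an existing Docker image and run it with GPU access (a hedged sketch; the image name, output filename, and script are hypothetical, and the exact commands available on MAGEO may differ):

```shell
# Pull a Docker image (here a hypothetical GPU-enabled TensorFlow image)
# and convert it to a Singularity image file (SIF).
singularity pull tensorflow.sif docker://tensorflow/tensorflow:latest-gpu

# Run a training script inside the container; --nv exposes the host's
# NVIDIA GPUs and driver libraries to the container.
singularity exec --nv tensorflow.sif python train.py
```

The `--nv` flag is how Singularity makes the host's NVIDIA devices visible inside the container, so the containerised libraries can use the DGX-1 GPUs without being installed on the host.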
An NVIDIA-certified training programme on Deep Learning for computer vision will be held on 12th March 2020 at PML. A MAGEO ‘hackathon’ is planned for later in the year. Please contact us if you are interested in these or other training opportunities.