Lookup NU author(s): Professor Nick Holliman, Professor Philip James
Full text for this publication is not currently held within this repository. Alternative links are provided below where available.
Background: Photo-realistic terapixel visualization is computationally intensive, and to date there have been no such visualizations of urban digital twins; the few terapixel visualizations that exist have looked towards space rather than Earth. Objective: Our aims are: to create a scalable cloud supercomputer software architecture for visualization; to produce a photo-realistic terapixel 3D visualization of urban IoT data supporting daily updates; and to rigorously evaluate cloud supercomputing for our application. Method: We migrated the Blender Cycles path tracer to the public cloud within a new software framework designed to scale to petaFLOP performance. Results: We demonstrate that we can compute a terapixel visualization in under one hour, with the system scaling at 98% efficiency to use 1024 public cloud GPU nodes delivering 14 petaFLOPS. The resulting terapixel image supports interactive browsing of the city and its data at a wide range of sensing scales. Conclusion: The GPU compute resource available in the cloud is greater than anything available on our national supercomputers, providing access to globally competitive resources. The direct financial cost of access, compared to procuring and running such systems, was low. The indirect cost, in overcoming teething issues with cloud software development, should reduce significantly over time.
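The 98% figure quoted in the abstract is a parallel scaling efficiency, i.e. speedup divided by node count. A minimal sketch of that calculation follows; the timing numbers here are illustrative placeholders, not the paper's measured data:

```python
def parallel_efficiency(serial_node_hours, parallel_hours, n_nodes):
    """Parallel efficiency = speedup / number of nodes.

    serial_node_hours: total work expressed as single-node hours
    parallel_hours:    wall-clock time on n_nodes nodes
    """
    speedup = serial_node_hours / parallel_hours
    return speedup / n_nodes

# Illustrative only: if the render amounts to ~1003.5 single-node hours
# of work and 1024 nodes finish it in 1 hour of wall-clock time,
# the efficiency works out to about 98%.
eff = parallel_efficiency(serial_node_hours=1003.5, parallel_hours=1.0, n_nodes=1024)
print(f"{eff:.2%}")  # → 98.00%
```

Perfect (100%) efficiency would mean 1024 nodes finish exactly 1024x faster than one node; values below that reflect coordination and communication overhead.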
Author(s): Holliman NS, Antony M, Charlton J, Dowsland S, James P, Turner M
Publication type: Online Publication
Publication status: Published
Series Title: arXiv
Access Year: 2019
Acceptance date: 11/02/2019
Place Published: Ithaca, New York
Access Date: 18 February