JAEA R&D Navigator

Publication Date: February 12, 2026


Remote VR Visualization Technology for Multi-Site Collaboration

Fig. 1 Architecture of the Multi-Site Collaborative VR Visualization System

The area to the left of the gray line denotes the supercomputer; the right side shows three users. User A acts as the host leading the analysis, while Users B and C participate as guests. Particle data generated concurrently with the simulation are written to storage and asynchronously transferred by each user's communication program to their own PC, so that identical visualization results can be shared across multiple sites. The host's analysis operations are recorded in a visualization-parameter file that is synchronized to all guests.
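The host–guest synchronization in Fig. 1 can be pictured as a small shared parameter file that the host writes and the guests poll. The sketch below illustrates this idea in Python; the file path, field names, and polling interval are illustrative assumptions and do not reflect the actual IS-PBVR file format.

```python
import json
import time
import pathlib

# Hypothetical location of the visualization-parameter file on shared storage.
PARAM_FILE = pathlib.Path("shared_storage/vis_params.json")

def host_publish(viewpoint, color_map, annotations):
    """Host (User A) writes the current analysis state as a small parameter file."""
    params = {
        "viewpoint": viewpoint,        # e.g. camera position and orientation
        "color_map": color_map,        # e.g. transfer-function control points
        "annotations": annotations,    # e.g. text labels placed in the 3-D field
        "timestamp": time.time(),
    }
    PARAM_FILE.write_text(json.dumps(params))

def guest_follow(apply_params, poll_interval=0.1):
    """Guests (Users B and C) poll the parameter file and apply any new host update."""
    last_seen = 0.0
    while True:
        if PARAM_FILE.exists():
            params = json.loads(PARAM_FILE.read_text())
            # Re-render locally only when the host has changed something.
            if params["timestamp"] > last_seen:
                apply_params(params)
                last_seen = params["timestamp"]
        time.sleep(poll_interval)
```

Because only this small file is exchanged for analysis state, the heavy particle data can be transferred independently and asynchronously, as described above.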

In large-scale thermal–hydraulic and fluid simulations in the nuclear engineering field, simulation outputs often reach the petabyte scale, making it difficult to transfer and share raw data. Conventional visualization methods convert the entire computational mesh into polygon data, resulting in visualization datasets as large as the original simulation data. This has made it difficult to achieve the drawing speed of 60 fps (frames per second) required for interactive analysis and VR (virtual reality) visualization, especially in remote collaborative environments.

To address this issue, we have developed IS-PBVR (In-Situ Particle-Based Volume Rendering), a visualization application based on the particle-based visualization technique PBVR. PBVR compresses simulation results into visualization particle data of only tens of megabytes, enabling interactive rendering at frame rates exceeding 60 fps.
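The essence of PBVR is that a scalar field is reduced to a sparse cloud of colored particles whose local density follows the opacity transfer function, so only the particle set, rather than the full mesh, has to be stored, transferred, and rendered. The following Python sketch illustrates that particle-generation idea; the transfer functions, particle count, and array shapes are assumptions for illustration, not the actual IS-PBVR implementation.

```python
import numpy as np

def generate_particles(volume, opacity_tf, color_tf, n_particles=200_000, rng=None):
    """volume: 3-D scalar field; opacity_tf/color_tf: callables on scalar values."""
    rng = rng or np.random.default_rng()
    values = volume.ravel()
    # Sampling probability proportional to opacity: opaque regions receive more particles.
    weights = opacity_tf(values)
    weights = weights / weights.sum()
    idx = rng.choice(values.size, size=n_particles, p=weights)
    # Convert flat indices back to (i, j, k) grid coordinates and jitter inside each cell.
    coords = np.column_stack(np.unravel_index(idx, volume.shape)).astype(float)
    coords += rng.random(coords.shape)
    colors = color_tf(values[idx])            # per-particle RGB
    return coords, colors                     # megabytes of particles instead of the full mesh

# Usage with a synthetic field: the particle count, not the mesh size,
# now determines the rendering and transfer cost.
vol = np.random.rand(64, 64, 64)
pts, rgb = generate_particles(
    vol,
    opacity_tf=lambda v: v ** 2,
    color_tf=lambda v: np.stack([v, 1 - v, 0 * v], axis=-1),
)
```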

In this study, we further extend IS-PBVR into a multi-site collaborative VR visualization system that allows researchers at multiple locations to share the same three-dimensional field in VR space. Users wearing head-mounted displays (HMDs) automatically share synchronized viewpoints, color mappings, and annotations. By streaming particle data incrementally, the system minimizes communication overhead and reduces interaction latency.
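Incremental streaming can be pictured as the particle set being delivered in small fixed-size chunks that the receiving PC renders as they arrive, instead of waiting for a complete time step. The sketch below assumes a plain socket transport and a hypothetical chunk size; the actual communication program may work differently.

```python
import socket
import numpy as np

CHUNK_PARTICLES = 10_000      # particles per message (assumed tuning parameter)
RECORD_BYTES = 6 * 4          # x, y, z, r, g, b stored as float32

def stream_particles(conn: socket.socket, coords: np.ndarray, colors: np.ndarray):
    """Sender side: interleave position and color, then push fixed-size chunks."""
    records = np.hstack([coords, colors]).astype(np.float32)
    for start in range(0, len(records), CHUNK_PARTICLES):
        conn.sendall(records[start:start + CHUNK_PARTICLES].tobytes())

def receive_particles(conn: socket.socket, on_chunk):
    """Receiver side: hand each decoded chunk to the renderer as soon as it arrives."""
    buffer = b""
    while chunk := conn.recv(65536):
        buffer += chunk
        usable = len(buffer) - len(buffer) % RECORD_BYTES
        if usable:
            records = np.frombuffer(buffer[:usable], dtype=np.float32).reshape(-1, 6)
            on_chunk(records[:, :3], records[:, 3:])   # positions, colors
            buffer = buffer[usable:]
```

Sending small chunks keeps the interaction responsive because the renderer never has to stall on a single large transfer.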

This system enables experts at remote sites to observe and discuss exactly the same visualization in real time, facilitating rapid decision-making in applications such as source term estimation in pollutant dispersion analyses. We applied the proposed system to a thermal–hydraulic simulation running on the JAEA supercomputer SGI8600 and demonstrated real-time sharing of three-dimensional visualization results among multiple PCs.

Acknowledgements
This study was supported by JSPS KAKENHI Grant-in-Aid for Scientific Research (C) Grant Number JP20K11844, "Interactive In-Situ Visualization of Large-scale Distributed GPGPU Simulations."
Reference
Kawamura, T. et al., Extension of Particle-Based in-Situ Visualization for Multipoint VR Visualization, EPJ Web Conf., vol. 302, 11002, 2024, 8p.
Paper URL: https://doi.org/10.1051/epjconf/202430211002
