Primal Space Systems-3D Graphics, Focusing On Visibility, Prefetching and Navigation (DEAL 611)

This portfolio generally relates to a media streaming system for VR/AR, surgical, and gaming applications that predicts the movement of a user's viewpoint and estimates a maximal view frustum, providing a smoother user experience through pre-fetched graphics, reduced rendering latency, and improved responsiveness in dynamic and interactive environments. The system calculates viewpoint movement from a motion path or user input and identifies the polygons that will become visible from the new region but are currently hidden. Also disclosed are techniques for visibility event-based navigation: receiving visibility event packets from a server, sensing real-world surfaces, and calculating position by matching the sensed data against a 3D geospatial model. The technology may be implemented in VR/AR streaming systems, remote 3D visualization for medical imaging, defense simulation and training systems, metaverse platforms, autonomous navigation in mapped spaces, etc.
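To make the prefetching idea concrete, the sketch below illustrates one plausible reading of the claimed mechanism: extrapolate the viewpoint along its motion path, then test candidate geometry against an enlarged ("maximal") view volume so that soon-to-be-visible content can be fetched early. All names, the linear motion model, and the use of a widened view cone in place of a full frustum are illustrative assumptions, not details taken from the patents themselves.

```python
import math

def predict_viewpoint(position, velocity, dt):
    """Extrapolate the viewpoint along its motion path (simple linear model)."""
    return tuple(p + v * dt for p, v in zip(position, velocity))

def in_maximal_frustum(point, eye, forward, half_angle_deg, margin_deg):
    """Test whether a point lies inside a view cone widened by a prefetch margin.

    A cone around the (unit) forward vector is used here as a simplified
    stand-in for a full six-plane view frustum.
    """
    limit = math.radians(half_angle_deg + margin_deg)
    d = tuple(p - e for p, e in zip(point, eye))
    norm = math.sqrt(sum(c * c for c in d))
    if norm == 0:
        return True  # point coincides with the eye
    cos_angle = sum(a * b for a, b in zip(d, forward)) / norm
    return cos_angle >= math.cos(limit)

# Predict where the user will be in 0.5 s, then select geometry visible
# from that region under the enlarged frustum for prefetching.
eye = predict_viewpoint((0.0, 0.0, 0.0), (2.0, 0.0, 0.0), 0.5)
candidates = {"wall": (5.0, 0.5, 0.0), "behind": (-3.0, 0.0, 0.0)}
prefetch = [name for name, p in candidates.items()
            if in_maximal_frustum(p, eye, (1.0, 0.0, 0.0), 45.0, 15.0)]
# prefetch -> ["wall"]: geometry ahead of the predicted viewpoint is fetched,
# geometry behind it is skipped.
```

In a production system the linear extrapolation would typically be replaced by a filtered motion estimate, and the cone test by plane-based frustum culling against a spatial index, but the prefetch decision has the same shape: predict, widen, test, fetch.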