Virtual reality makes real life even better

Innovative technologies improve business operations and enhance workplace safety

By Antoinette Price

Imagine being able to spot problems in large-scale construction projects before building is complete and to collaborate with engineers and architects to keep on top of changes, or to observe a city’s infrastructure in real time and improve the performance of its services.

Doctors can prepare for surgery using VR (Photo: Arch Virtual)

These and many more scenarios are already possible thanks to a combination of virtual reality (VR) and artificial intelligence (AI) technologies.

Seeing things in a new light

According to a report by Zion Research, the global VR market is expected to reach USD 26.89 billion by the end of 2022.

The education, healthcare, tourism, and smart manufacturing industries are among those to embrace this technology, which is changing how we work, learn, train and enjoy entertainment.

For instance, some software platforms enable business meetings to be held in VR, replacing traditional, abstract chart and graph presentations. Participants wear VR headsets and immerse themselves in a 3D duplicate, or digital twin, of their business, which, combined with data, allows improved risk and performance monitoring. The virtual presentation offers more accurate ways of looking at business processes and components, for instance in a factory, from an overview level down to the minutest detail. These insights give a much clearer understanding of the issues faced and the overall state of operations.

So how does it work?

Virtual reality creates a life-like situation in which people experience and interact with a 3D world that isn't real. This is achieved using a VR headset, which blocks out the real world. The technology comprises software that drives components such as displays, sensors, images, maps and tracking technology, which in turn link to the hardware (headsets, smart glasses or helmets).

Displays include monitors and handheld devices such as smartphones and tablets. These contain optical sensors, accelerometers, gyroscopes, GPS and cameras for tracking movement.

360-degree VR, an audio-visual simulation of an altered environment, enables users to look in all directions, as they would in real life. It includes live, real-time or pre-recorded footage. When it is combined with tablet and smartphone apps, users can change their perspective by tilting and rotating the device or by touching the screen, which becomes their eyes.
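As a rough sketch of the tracking principle, the short Python snippet below (purely illustrative, not drawn from any particular VR platform) converts hypothetical yaw and pitch readings from a device's orientation sensors into the viewing direction used to sample a 360-degree scene.

    import math

    def view_direction(yaw_deg, pitch_deg):
        """Convert device yaw (rotation) and pitch (tilt), in degrees,
        into a unit view-direction vector for a 360-degree scene."""
        yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
        x = math.cos(pitch) * math.sin(yaw)   # left/right
        y = math.sin(pitch)                   # up/down
        z = math.cos(pitch) * math.cos(yaw)   # forward
        return (x, y, z)

    # Rotating the device 90 degrees to the right and tilting it up by 30 degrees
    # points the virtual camera to the right and slightly upward.
    print(view_direction(90, 30))

In a real headset or smartphone app, these angles would come from fused accelerometer and gyroscope readings rather than fixed values.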

Interview with Myeong Won Lee, Chair of SC 24, the subcommittee for computer graphics, image processing and environmental data representation

e-tech caught up on the latest developments in VR standardization with Myeong Won Lee, who leads the work of Subcommittee 24 (SC 24) of the joint IEC and ISO technical committee (ISO/IEC JTC 1), which covers computer graphics, image processing and environmental data representation.

Some of the areas covered include:

  • Intelligence and information systems which use high resolution imagery formats to support a variety of applications, including modelling and simulation environments, displays and 3D printers and scanners. For instance, a 3D printed heart can be used in preparation for complex surgeries.
  • Web and document graphics technologies that utilize 2D and 3D imagery files for presentation and exchange of 3D environments. These incorporate imagery, content concepts and interaction with virtual or synthetic environment applications in modelling and simulation, for example, to reconstruct historic buildings.
  • Visual applications in which data captured from the real world is combined with virtual data to produce mixed and augmented reality. For example, during the recent fire at Notre-Dame Cathedral in Paris, one news agency enhanced its coverage by using an interactive 3D experience in its app, which displayed the building as it appeared before it caught fire.
  • Visualization technology and architecture for developing applications in systems integration areas such as smart cities (for planning purposes), virtual training to prepare first responders for different emergency situations, wearable devices and 3D representation related to healthcare services.

What are some of the key SC 24 standards and where are they used?

The standards can be used in all areas where 3D visualization is necessary. For example, virtual military training requires the implementation of 3D scene simulation. For sports, we need 3D simulation of a human character. For education purposes, we need a standardized 3D representation model that can be used interchangeably and a standardized 3D data format. We also see many 3D applications in entertainment.

ISO/IEC 19775-1, Extensible 3D (X3D), is a 3D file format that can be used for generating and exchanging 3D scenes in heterogeneous computing environments for VR and AR applications. Because X3D is defined in XML, it is well suited to transferring 3D objects and scenes between VR and AR applications. X3D can be used for smart city visualization and navigation.
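To give a flavour of the format, the sketch below embeds a minimal X3D scene (a single red box) in a short Python script and checks that it is well-formed XML; the element names follow common X3D usage, but the example is indicative rather than normative.

    import xml.etree.ElementTree as ET

    # A minimal, illustrative X3D scene: a single red box.
    X3D_SCENE = """<X3D profile='Interchange' version='3.3'>
      <Scene>
        <Shape>
          <Appearance>
            <Material diffuseColor='1 0 0'/>
          </Appearance>
          <Box size='2 2 2'/>
        </Shape>
      </Scene>
    </X3D>"""

    # Because X3D has an XML encoding, any XML toolchain can parse such scenes
    # and exchange them between VR and AR applications.
    root = ET.fromstring(X3D_SCENE)
    print(root.tag, [node.tag for node in root.iter()])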

ISO/IEC 19774-1, Humanoid animation (H-Anim) architecture, defines the hierarchical structure of a human body with joint and segment relationships. It also includes different levels of detail of the human body, such as articulation. This can be applied in many medical areas (such as dentistry and orthopaedics modelling) and in health information.

ISO/IEC 19774-2, Humanoid animation (H-Anim) motion data animation, defines a method of generating animation using motion capture data. This model can be used in diverse computing environments.
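As a simplified illustration of the hierarchical joint-and-segment idea behind these two H-Anim parts (not the standard's actual object model; the joint names and frame format here are assumptions made for the sketch), the following Python snippet builds a tiny skeleton and applies one frame of motion-capture-style data to it.

    from dataclasses import dataclass, field

    @dataclass
    class Joint:
        """One node in a skeleton: a joint owns a rotation and its child joints."""
        name: str
        rotation: tuple = (0.0, 0.0, 0.0)            # Euler angles in degrees
        children: list = field(default_factory=list)

    def apply_frame(joint, frame):
        """Apply one frame of motion data (joint name -> rotation) to the hierarchy."""
        if joint.name in frame:
            joint.rotation = frame[joint.name]
        for child in joint.children:
            apply_frame(child, frame)

    # A tiny leg: hip -> knee -> ankle (names are indicative only).
    skeleton = Joint("l_hip", children=[Joint("l_knee", children=[Joint("l_ankle")])])

    # One captured frame bends the knee 45 degrees about the x axis.
    apply_frame(skeleton, {"l_knee": (45.0, 0.0, 0.0)})
    print(skeleton.children[0].rotation)   # -> (45.0, 0.0, 0.0)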

ISO/IEC 18023 Synthetic Environment Data Representation and Interchange Specification (SEDRIS) can be used to generate 3D environments with semantic information. SEDRIS defines all environmental data necessary in a 3D environment when implementing VR and AR applications.
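The key idea of pairing geometry with semantic information can be illustrated with a short, purely hypothetical Python sketch; SEDRIS itself defines a far richer data representation model than this.

    from dataclasses import dataclass

    @dataclass
    class EnvironmentObject:
        """An environmental feature carrying both geometry and meaning."""
        classification: str   # what the object is (semantic information)
        position: tuple       # where it sits in the 3D environment (x, y, z)
        attributes: dict      # further semantic attributes

    scene = [
        EnvironmentObject("road", (0.0, 0.0, 0.0), {"surface": "asphalt", "lanes": 2}),
        EnvironmentObject("building", (12.5, 4.0, 0.0), {"height_m": 30, "use": "hospital"}),
    ]

    # A VR or AR application can then query the environment by meaning, not just by shape.
    hospitals = [obj for obj in scene if obj.attributes.get("use") == "hospital"]
    print(len(hospitals))   # -> 1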

What are some of the key projects in 2019?

This year we are completing a White Paper with guidelines for developing virtual education and training systems. It defines three basic concepts: information modelling architecture, standards-based functional components, and implementation components. This systematic approach provides a method of developing standards-based systems.

Additionally, our work in the area of systems integration visualization (SIV) is important because it can apply to many areas including education, smart cities, and healthcare. We would like to extend the scope to 3D visualization for systems integration areas. The guidelines we have produced for virtual education and training could be applied in other areas, each of which will need its own guidelines. We plan to go forward with VR and AR-based ICT integration systems.

Finally, JTC 1 has established an advisory group on use cases for VR and AR-based ICT integration systems. Detailed use cases will be developed for education and training in areas such as schools, medicine, health, and heavy equipment. We plan to propose several new work items for standards on 3D smart cities, 3D virtual training systems, and 3D health information.

Looking ahead

Lee concludes that it is important to work with other JTC 1 subcommittees developing standards related to security; the communication, transmission and exchange of information across diverse computing environments and systems; learning, education and training; artificial intelligence; and health informatics, and to get them involved in the implementation of integrated systems.

Gallery
Myeong Won Lee, Chair, ISO/IEC JTC 1/SC 24
Doctors can prepare for surgery using VR (Photo: Arch Virtual)
VR programmes are used increasingly for education (Photo: https://www.digitalbodies.net/vr-ar-education-focus-strategic-vision-implementation)