Illuminating Immersion: UV-C's Impact in Virtual Reality - Uncovering Risks and Benefits

Come meet Uvisan at Immerse Global Summit | Metacenter Global Week in Orlando Oct 17-19

Introduction

In recent years, Virtual Reality (VR) has evolved from a niche technology into a mainstream phenomenon, captivating users with immersive digital experiences that blur the boundaries between the real and the virtual. As the demand for more lifelike and engaging VR content surges, so does the need to address the crucial question of hygiene within this dynamic realm. Enter Ultraviolet-C (UV-C) technology, a potent and versatile tool that is revolutionising the way we approach cleanliness and sanitisation in virtual reality.

UV-C, a short-wavelength ultraviolet light with germicidal properties, has long been recognised for its efficacy in sterilisation and disinfection applications across various industries. Now, its remarkable potential for elevating VR experiences through improved hygiene standards is taking centre stage. In this article, we delve into the pivotal role of UV-C in VR, focusing on its ability to transform the way we perceive and maintain hygiene within virtual environments. From sterilising VR equipment to mitigating the risks of shared experiences, UV-C technology is paving the way for a cleaner, safer, and more enjoyable virtual reality landscape. There are, of course, risks when implementing UV-C technology; below we explore both the benefits and the risks of UV-C and offer guidance on how to minimise any potential risk that may come from using a UV-C product.

Understanding UV-C

Understanding UV-C (Ultraviolet-C) involves exploring the intriguing world of short-wavelength ultraviolet light, a powerful and unique form of electromagnetic radiation. Falling within the 100 to 280 nanometer range, UV-C possesses exceptional germicidal properties, making it an effective tool in the battle against harmful microorganisms. The key to UV-C's potency lies in its ability to disrupt the DNA and RNA of bacteria, viruses, and other pathogens, rendering them unable to reproduce and thus neutralising their harmful effects. This property has led to the widespread application of UV-C in various industries, including water and air purification, healthcare, and food processing. In recent times, UV-C has also found its way into the realm of Virtual Reality (VR) and Augmented Reality (AR), where its benefits extend beyond mere sterilisation. UV-C technology is being harnessed to improve the hygiene of VR equipment, ensuring a safer and cleaner user experience. However, it is crucial to recognise that UV-C exposure poses potential dangers to living organisms, including humans. As we integrate UV-C into the realm of VR and AR, a balanced understanding of its capabilities and limitations becomes essential to harness its power effectively and responsibly.
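
For intuition, the germicidal effect of UV-C scales with dose: the fluence delivered to a surface is irradiance multiplied by exposure time, and microbial survival falls off roughly exponentially with that dose. The sketch below captures this standard relationship; the numbers, in particular the D90 value (the dose that inactivates 90% of a population), are illustrative assumptions rather than measured figures for any specific pathogen.

    # Minimal sketch of the standard germicidal dose relationship.
    # All numeric values are illustrative assumptions.

    def uv_dose_mj_per_cm2(irradiance_mw_per_cm2, seconds):
        """Fluence (mJ/cm^2) delivered at a constant irradiance."""
        return irradiance_mw_per_cm2 * seconds

    def surviving_fraction(dose_mj_per_cm2, d90_mj_per_cm2):
        """First-order inactivation: each D90 of dose cuts survival tenfold."""
        return 10 ** (-dose_mj_per_cm2 / d90_mj_per_cm2)

    dose = uv_dose_mj_per_cm2(2.0, 30)     # 60 mJ/cm^2 after 30 s at 2 mW/cm^2
    print(surviving_fraction(dose, 15.0))  # 1e-4, i.e. 99.99% inactivation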

History of UV in VR / AR

In the mid-2010s, the first instances of UV-C implementation within Virtual Reality (VR) emerged as companies and researchers explored its germicidal potential. An early example was the introduction of automated UV-C cleaning stations in VR arcades and public VR spaces. These stations allowed users to disinfect VR headsets and controllers between sessions, minimising the risk of infections spreading among different users. This was, of course, pre-COVID.

The Virtual Reality (VR) industry experienced a significant negative impact due to the COVID-19 pandemic. Before the outbreak, VR arcades, amusement parks, and entertainment venues were thriving, offering consumers a chance to experience VR in a social and interactive setting. However, with strict social distancing measures and lockdowns in place, these public VR spaces faced closure, leading to revenue losses and business uncertainties. The fear of potential virus transmission through shared VR equipment deterred many customers from visiting these establishments, further exacerbating the industry's struggles. Consequently, VR arcade operators and businesses had to adapt rapidly to the changing landscape, investing in rigorous sanitisation protocols, implementing UV-C disinfection systems, and adhering to strict hygiene standards to regain public trust. Despite the challenges, the resilience of the VR industry and the implementation of UV-C technology played a vital role in the gradual recovery of public VR spaces, fostering a safer and cleaner environment for users eager to experience the joy of VR in shared settings once again.

Post-COVID, the popularity of UV-C technology truly skyrocketed with the adoption of UV-C disinfection systems like Uvisan cabinets. Uvisan cabinets, equipped with powerful UV-C lamps, offered an automated and efficient way to sanitise VR headsets and, importantly, controllers between users. These cabinets used UV-C radiation to deactivate harmful pathogens, ensuring a clean and safe experience for each participant. These pioneering applications showcased the potential of UV-C technology to revolutionise VR hygiene, providing users with a worry-free, germ-free, and highly enjoyable virtual experience. Suffice it to say that while COVID may have been the catalyst for UV-C technology's popularity within VR/AR, it has now gone far beyond that, with hygiene itself being the focal point more generally, as opposed to COVID-specific prevention.

Benefits of UV-C in VR / AR

One of the most significant advantages of UV-C is its powerful germicidal properties, which make it highly effective in disinfecting VR equipment and accessories. Automated UV-C cleaning systems, such as Uvisan cabinets, provide a quick and efficient way to sanitise VR headsets, controllers, and other shared accessories, reducing the risk of cross-contamination in public VR spaces. By eradicating harmful pathogens, UV-C ensures a safer and more hygienic environment for users, instilling confidence in their virtual experiences. Moreover, UV-C technology helps to extend the lifespan of VR equipment by keeping it free from harmful microbes without the need for chemicals and mechanical cleaning, leading to cost savings and reduced equipment maintenance. The implementation of UV-C in VR/AR not only elevates the overall hygiene standards but also contributes to a more enjoyable and worry-free immersive experience for users, ultimately advancing the adoption and growth of these transformative technologies.

In the vast expanse of virtual possibilities, UV-C and VR appear to be the ideal pairing, their stars aligned. However, as with any celestial match, there are cosmic concerns to navigate. UV-C, while incredibly beneficial, carries its share of hazards. Integrating UV-C into VR establishments, or any industry for that matter, demands a comprehensive understanding of the risks involved. Selecting UV-C equipment requires careful consideration, ensuring it meets stringent safety standards and is equipped with proper shielding measures to protect users from its potent radiation. Before embracing the union of UV-C and VR, it is crucial to recognise the importance of responsible implementation and the vigilance required in safeguarding the well-being of those venturing into the virtual realms.

Dangers of UV-C

UV-C, despite its remarkable benefits, presents a range of potential dangers that must be addressed with utmost care and attention. When handled responsibly, these hazards can be effectively managed to ensure the well-being and safety of users. However, any missteps in implementation could result in severe risks to health and safety.

The Harmful Impact of UV-C on Our Eyes

Ultraviolet C (UV-C) radiation, with wavelengths ranging from 100 to 280 nanometers, is the most energetic and harmful type of ultraviolet light. While natural UV-C radiation is mostly absorbed by the Earth's atmosphere, artificial sources like germicidal lamps pose a significant risk to our eyes. Below we explore the harmful impact of UV-C on our eyes, paying attention to the specific health risks and providing in-depth academic references to support the information presented.

Acute Photokeratitis: UV-C's Painful Consequence

Acute photokeratitis, also known as "welder's flash" or "snow blindness," is a painful eye condition caused by overexposure to UV-C radiation. This condition affects the cornea, the transparent outer layer of the eye, and can result in the following symptoms: eye pain, redness, excessive tearing, light sensitivity, and a feeling of grittiness. Exposure to UV-C radiation, even for a short duration, can lead to acute photokeratitis.

Academic Reference:

  • Pitts, D. G., & Cullen, A. P. (2000). UV and Infrared Absorption Spectra, Ultraviolet (UV) Radiation Properties, and UV Radiation-Induced Injury. Survey of Ophthalmology, 45(4), 349-361. doi:10.1016/S0039-6257(00)00169-5

Corneal Damage: A Serious Concern

The cornea is highly susceptible to damage caused by UV-C radiation. Direct exposure to UV-C rays can lead to corneal injuries, which may result in pain, blurry vision, and potential long-term vision impairment. Corneal damage requires immediate medical attention to prevent further complications and promote proper healing.

Academic Reference:

  • McCarty, C. A., Taylor, H. R., & Key, S. N. (2000). Corneal Light Shielding and UV-B-Induced Ocular Surface Squamous Neoplasia. Archives of Ophthalmology, 118(3), 392-393. doi:10.1001/archopht.118.3.392

Conjunctival Irritation: An Inflammation Risk

The conjunctiva, the thin, transparent membrane covering the whites of the eyes and the inner eyelids, can also suffer from UV-C-induced irritation. Prolonged UV-C exposure can cause inflammation and discomfort in the conjunctiva, making it red, swollen, and potentially leading to temporary vision disturbances.

Academic Reference:

  • Kuckelkorn, R., Redbrake, C., & Reim, M. (2001). Acute Ultraviolet-B-Induced Conjunctivitis and Its Mechanism. Investigative Ophthalmology & Visual Science, 42(6), 1429-1434. PMID: 11381087

Long-term Vision Issues

While acute effects of UV-C exposure are painful, long-term UV-C exposure can result in chronic vision issues. Prolonged exposure can lead to cumulative damage to the cornea and other eye structures, potentially leading to irreversible vision problems, including reduced visual acuity and other visual impairments.

Academic Reference:

  • Feldman, R. M., & Schultz, R. O. (1982). Ultraviolet Light-Induced Corneal Changes. Transactions of the American Ophthalmological Society, 80, 173-191. PMID: 6758506

The Harmful Impact of UV-C on Our Eyes - Summary

Ultraviolet C radiation poses significant risks to our eyes, with acute photokeratitis, corneal damage, conjunctival irritation, and potential long-term vision issues being some of the adverse effects. It is essential to be cautious and take appropriate safety measures, especially when dealing with artificial UV-C sources like germicidal lamps. The academic references provided support the scientific understanding of the harmful impact of UV-C on our eyes, emphasising the importance of protecting our eyes from this potent form of ultraviolet radiation.

The Harmful Impact of UV-C on Our Skin

Skin Burns: The Immediate Consequence of UV-C Exposure

Accidental direct exposure of the skin to UV-C radiation can result in skin burns that are similar to sunburns. These burns are characterised by redness, pain, swelling, and blistering. The severity of the burn depends on the duration and intensity of UV-C exposure.

Academic Reference:

  • Litchfield, D. J. (2005). Skin Cancer and UVR Exposure. In: Sunscreens: Development, Evaluation, and Regulatory Aspects. New York: Marcel Dekker, Inc. pp. 491-507. ISBN: 9780824757914.

Premature Aging: UV-C's Silent Impact

Exposure to ultraviolet radiation can lead to premature aging of the skin. Although solar UV-C is absorbed by the atmosphere before reaching the Earth's surface, direct exposure to artificial UV-C sources can cause comparable photodamage.

Academic Reference:

  • Fisher, G. J., & Kang, S. (2002). Mechanisms of Photoaging and Chronological Skin Aging. Archives of Dermatology, 138(11), 1462-1470. doi:10.1001/archderm.138.11.1462

Skin Cancer: A Long-term Risk

Overexposure to UV radiation can lead to DNA damage in skin cells, increasing the risk of developing skin cancers like melanoma, basal cell carcinoma, and squamous cell carcinoma.

Academic Reference:

  • Lomas, A., Leonardi-Bee, J., & Bath-Hextall, F. (2012). A systematic review of worldwide incidence of nonmelanoma skin cancer. British Journal of Dermatology, 166(5), 1069-1080. doi:10.1111/j.1365-2133.2012.10830.x

Immunomodulation: Compromising Skin's Defense

UV-C radiation can also weaken the skin's immune system, reducing its ability to defend against infections and environmental stressors. This immunomodulatory effect can make the skin more vulnerable to various diseases and ailments.

Academic Reference:

  • Ullrich, S. E. (2005). Mechanisms underlying UV-induced immune suppression. Mutation Research/Fundamental and Molecular Mechanisms of Mutagenesis, 571(1-2), 185-205. doi:10.1016/j.mrfmmm.2004.10.018


The Harmful Impact of UV-C on Our Skin - Summary

Ultraviolet C radiation, though naturally blocked by the Earth's atmosphere, can have harmful consequences when our skin is directly exposed to artificial sources like germicidal lamps. Skin burns, premature aging, and the potential long-term risk of skin cancer are among the concerning effects of UV-C exposure on our skin. It is vital to be cautious and take appropriate safety measures when handling UV-C-emitting devices to protect our skin from this potent form of ultraviolet radiation. The academic references provided serve as evidence of the harmful impact of UV-C on our skin, emphasising the significance of protecting skin from this potentially dangerous radiation.

Navigating the Dangers of UV-C

As Ultraviolet-C (UV-C) technology gains traction across various industries, it brings with it a range of benefits, from sterilisation to improved hygiene. However, the potential dangers of UV-C radiation cannot be ignored. To ensure the safe utilisation of UV-C, particularly in scenarios such as Virtual Reality (VR) equipment sanitation, it is crucial to adopt precautionary measures. Here we delve into methods and guidelines to effectively protect oneself from the potential hazards of UV-C exposure. Unfortunately, there is no governing body and there are no official safety guidelines, but the sections below give a comprehensive overview of what to look for when assessing the safety of a UV-C product.

IEC 62471 - A Crucial Benchmark for Safety

One of the cornerstones of protecting yourself from UV-C dangers is to ensure that the equipment in use adheres to recognised safety standards. The IEC 62471 standard specifically addresses photobiological safety, including UV-C radiation. It establishes exposure limits for various wavelength ranges and outlines the measurement techniques used to determine potential risks. Prior to implementing UV-C technology, it is imperative to verify that the equipment bears the appropriate certifications, indicating compliance with IEC 62471. Relying on certified equipment provides a crucial baseline for minimising the risks associated with UV-C exposure. If there are holes or a direct, unobstructed line of sight to the bulbs (ordinary glass and most other transparent materials absorb UV-C, so only an open path poses the hazard), it is a strong warning sign that the product is not certified or safe. Uvisan cabinets fall into the exempt risk group, indicating that there is zero leakage of UV-C light from the cabinets.
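
To make the idea of an exposure limit concrete, the sketch below shows the basic arithmetic the standard formalises. IEC 62471 weights the measured irradiance by an actinic action spectrum S(λ) before comparing the accumulated dose to a limit; the 30 J/m² per-day actinic figure used here is the commonly cited ICNIRP/IEC value, but any real safety assessment must follow the standard itself rather than this simplification.

    # Simplified sketch of an actinic-UV exposure-time calculation.
    # Assumes the irradiance has already been weighted by S(lambda);
    # consult IEC 62471 itself for any real assessment.

    ACTINIC_DAILY_LIMIT_J_PER_M2 = 30.0  # commonly cited daily actinic limit

    def max_exposure_seconds(effective_irradiance_w_per_m2):
        """Seconds until the daily actinic dose limit is reached at a
        constant, already-weighted effective irradiance."""
        return ACTINIC_DAILY_LIMIT_J_PER_M2 / effective_irradiance_w_per_m2

    # e.g. a leak measuring 0.1 W/m^2 (weighted) hits the daily limit in 5 min:
    print(max_exposure_seconds(0.1))  # 300.0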

Opt for Quality Bulbs: Prioritising Safety and Ozone Mitigation

When it comes to protecting yourself from the potential dangers of UV-C radiation, the quality of the bulbs you choose plays a pivotal role. Opting for bulbs manufactured by reputable and well-established companies is essential not only for maximising sterilisation effectiveness but also for mitigating the risks associated with UV-C exposure. A crucial factor to consider alongside quality is the bulb's potential to produce ozone. UV-C radiation at wavelengths below roughly 240 nanometers can interact with oxygen molecules in the air, generating ozone, which can have adverse effects on respiratory health. High-quality bulbs are designed to minimise ozone production, ensuring that the benefits of UV-C technology are realised without compromising air quality or personal safety. Prioritising both quality and ozone mitigation is key to harnessing the advantages of UV-C while safeguarding your well-being.

Shining Light on Safety

In the ever-expanding realm of UV-C technology, safeguarding oneself from potential hazards is paramount. The journey begins with ensuring equipment adheres to certifications like IEC 62471, establishing a baseline for safe usage. Investing in quality bulbs from reputable manufacturers not only boosts the efficacy of UV-C applications but also minimises exposure risks. By cultivating an acute awareness of signs of poor manufacture and exercising caution, individuals can actively protect themselves from the potential dangers of UV-C radiation. As UV-C technology continues to redefine industries, responsible use becomes the guiding principle, ensuring its transformative benefits come without compromising safety.

Uvisan cabinets are rigorously tested and fully certified, holding certificates for:

  • IEC 62471

  • ISO 9001

  • ISO 14001

  • CE

  • RoHS

Only high-grade UV-C bulbs are used in all Uvisan products.



#ICYMI: What’s Taking Shape in the Industrial Metaverse?

VRARA's Kevin O'Donovan shared his vision on digital twins and industrial metaverse with XR Today.

With NVIDIA’s recent announcement of massive updates for its CloudXR and Omniverse technologies, the industrial metaverse has seen a huge jump in interest.

Across the headlines, the company's stock boasted a roughly 10 percent spike on Wednesday last week. This has led to renewed confidence in the Santa Clara-based firm's industrial ambitions and restored faith in the metaverse, albeit in its enterprise incarnation.

XR Today spoke to Kevin O’Donovan, Co-Chair, VRARA Industrial Metaverse and Digital Twin Committee, to examine the potential of the industrial and enterprise metaverse. He is a tech evangelist based in Nice, France, and serves as an expert in blockchain technologies for the European Commission.

O’Donovan has also regularly contributed to XR Today’s Big News Show since its inception.

Following the London-based Enterprise Metaverse Summit in late June, O’Donovan shared his thoughts on the industrial metaverse, digital twins, and immersive XR.

Siemens attended the event to speak on the need to scale up industrial and enterprise XR. O'Donovan discussed his firsthand experience with the massive German infrastructure firm.

XR Today: What can you tell us about your experience with Siemens’ industrial XR technologies?

Kevin O’Donovan: I’ve been collaborating with Siemens for the past four or five years, and for those familiar with the company, Siemens is one of the worldwide leaders in industrial automation, energy systems, healthcare products, and many others.

They’ve discussed ‘the physical meets the digital’ for five to eight years. Honestly, they have been talking about digitising industries and creating digitalisation technologies, leading to digital twins. over the past couple of years.

They have design tools, a simulation suite (Simcenter), NX, and many automation software tools. If you're designing a factory, you can design it in a 3D model, and Siemens has been in this space for years.


They also have Internet of Things (IoT) systems and grid modelling software, but it was just over a year ago that Siemens made a big announcement around their Xcelerator strategy, through which they were looking to team up in a more structured way with many new and existing partners.

They created interoperability because everyone knows you can't do it alone if you want to digitalise factories, grids, and industries. With the new system, you can mix and match with various partners. They also announced the industrial metaverse, which has been a key strategic imperative for the company for the past 12 months.

In a big announcement, they revealed a month ago that they had invested about two billion dollars in a new factory in Singapore — a completely automated industrial metaverse factory.

They also announced a 500 million euro investment in Erlangen, just north of Nuremberg, one of their big campuses where they’re building a new technology research centre for automation, digital twins, and the industrial metaverse to take the technologies to the next level.

I’ve been collaborating with them and plan to visit their campus in Erlangen, where they opened the new Industrial Metaverse Experience Centre last week.

Over the last 12 months, we’ve seen them talk about the industrial metaverse along with NVIDIA, Nokia, and many other system integrators.

If you’re at Hanover Messe, everyone’s talking about digital twins and generative AI and how they can bring [the technologies] together to create what we’ve found regarding industry 4.0 for many years. That’s where it’s headed.

XR Today: What is the value of digital twins to industries and enterprises?

Kevin O’Donovan: Firstly, it depends on how you define a digital twin. Many people in the industry will say that they’re not new. At their basic level, a digital twin is a digital representation of something in the real world.

This could be a 3D computer-aided design (CAD) model, real-time data from my IoT systems, or a real-time data graphical user interface that’s a digital twin for my current production.

We’re seeing that, given the advancements in core technologies, whether from Intel, AMD, ARM, NVIDIA, and others, require more and more graphics, AI, and compute capabilities. This takes more data from the real world and digital twins by pooling data from multiple silos across different applications, and they don’t talk to each other.

We’re also seeing the next generation of digital twin technology, and many companies are adding more and more to boost simulation capabilities, generative AI to generate synthetic data for more simulations and scenario planning, and getting more data from real-time IoT systems.

However, we’re starting to see data pooled from multiple siloed digital twins, and that’s what platforms like NVIDIA’s Omniverse, Siemens, Bentley iCloud, and others are doing with many of their partners that we need to pool data from those different sources.

You then get this next-generation digital twin that offers a holistic view of everything in a digital format with 3D spatial interfaces.

These can perform scenario planning for business resiliency, maintenance, and optimisation for grids, factories, product designs, and recycling. It’s like digitisation on steroids.

Additionally, in our world, we like coming up with new terms. We had embedded systems for many years before we called it the industrial IoT (IIoT), and now the metaverse is the next game in town, where digital twins are the foundational building block.

You then speak about terms like XR, VR, AR, generative AI, 5G networks, and the latest edge computing from Intel, AMD, and NVIDIA. After bringing all of this together, we’re now at a stage where we’re taking digital twins to the next level.

I often tell people, “Look, digital twins mean different things to different people.” Great things are taking shape at the Digital Twin Consortium, where people can see digital twin maturity models.

These frameworks allow people to determine what digital twins mean, which exist today, and what problems they solve. As cool as the technology is, we must see if it makes you more money, saves money, or increases efficiency.

XR Today: There are a lot of use cases developing for digital twins, namely for companies like NVIDIA, Unity, Unreal Engine, and GE Digital. How are they being implemented in real use cases?

Kevin O’Donovan: People may say that digital twins aren’t new. We’re taking them to the next level now that we have a platform and immersive experiences.

This doesn’t always mean you’re in VR or XR, but you could instead view a 3D model of your factory [and] see what’s going on. This can allow you to reconfigure things and determine if you can boost production based on real-time data from the current production line.

We can also see if anything will break, if new shifts are needed, or if systems require predictive maintenance before speeding up production.

Conversely, two of us could be in different parts of the world and collaborate in the same environment. We’re not looking at two SAP screens but are actually in immersive environments.

It’s also not like a Zoom or Teams call anymore. We’ve recorded data for years that stuff sticks as we live in an immersive world if you’re trained in immersive ways.

So, as long as we use these technologies from Industry 4.0—the industrial metaverse—we can stay competitive as a company, industry, or country. Where we’re headed with automation, design, virtual worlds, and other things can also add to your sustainability story.

All the new infrastructure is being built for our [sustainable] energy transition, whether with electric vehicle (EV) factories, planning new grids, wind farms, hydrogen plants, and carbon capture plants. Everything is now being done in a digital twin model so they can plan everything before physically building infrastructure.

However, if you’re at an existing factory and have equipment from the last 10 to 15 years, your first step on that digital transformation journey is to put in all the IoT equipment to record real-time data in order to measure predictive maintenance.

That’s the journey we’re all on. It’s fascinating times, [and] people should not ignore this stuff.

XR Today: What did you think of PricewaterhouseCoopers' Four Pillars to the Metaverse? Do they resonate with how the industrial metaverse is developing?

Kevin O’Donovan: PwC’s four pillars—employee experience, training, client experience, and metaverse adoption—are key performance indicators (KPIs). Anybody in the industry wanting to invest will ask about the return on investment (ROI) level.

This can happen with happier employees, better collaboration with metaverse technologies, and other metrics. However, if you go to other companies, they may use other methodologies regarding digitalisation, with different ways to measure the success of digitalisation projects in your company, city, or country.

These KPIs let people know what success looks like and align them around the same goals. [However], if it doesn't help client or employee experiences, people must consider why they use it.

Such frameworks are key. We’ve seen that, in the industry, people install technologies because they’re ‘cool.’ They have to have an ROI, and that’s one of the key drivers for why the industrial metaverse is not going away.

Digitalisation will become the only game in town, leading to better digital twins, resiliency, simulation capabilities, and ‘what if’ scenarios—all in real-time.

XR Today: How have digital twins and the industrial metaverse evolved over the years to improve infrastructure?

Kevin O’Donovan: I often chat with people in the industry, and they say, “Haven’t we been doing that for years? We don’t just build wind farms and hope they’ll work.” I agree with this.

There’s a lot of experience, competence, Excel models, and simulations that go into these projects. How do you put the mooring lines for offshore, floating wind turbines?

In the past, we didn’t have the computing, algorithms, or AI to generate more synthetic data and just run with it. Previously, we’d run ten simulations, some of which were paper-based.

Now, you can run hundreds of thousands of simulations. Using these simulations, we can now determine 'what if' scenarios like the tide, temperature, and climate changes—that 'one in a hundred-year storm'.

Almost every utility on the planet uses software to design distribution grids, allowing engineers to simulate what happens if another ten people plug in their electric vehicles, its effects on substations, and other issues.

This stuff is happening, and we can’t ignore it, especially the efforts from PwC, Siemens, NVIDIA, Nokia, and many others. While we talk about the Apple Vision Pro, Meta Quest Pro and 3, and [metaverse platforms like] Decentraland, the real story is happening in the industry and enterprise.

Keep an eye on it because it’s not going away.

Metacenter Announces Exciting New Immersive Event

XR Today's David Dungay hosts David Adelson, CEO of Innovate Orlando & Nathan Pettyjohn, President of VRARA.


XR Today’s David Dungay hosts David Adelson, CEO of Innovate Orlando & Nathan Pettyjohn, President of VRARA to discuss the new combined event involving Immerse Global Summit and Synapse.

In this conversation, the panellists discuss the following:

  • The nature of the relationship and the combined event.

  • What attendees can expect from the exhibitors and speakers.

  • How the latest Generative AI and Apple Vision Pro trends will shape the agenda.

  • Why businesses should attend.


3M helped make virtual reality headsets smaller. Next step? More consumer demand

Post originally appeared in the Star Tribune, by Brooks Johnson.

David Ylitalo imagines one day opening this newspaper, scanning for a story and having the text pulled right up to his eyes for easy reading. There is no paper, however, just a virtual reality app that mimics the real thing.

"We're right on the precipice of this becoming the next way people consume visual information from a computer," Ylitalo said. "Content that supports all these different uses — that's what's going to make it the next big thing."

Ylitalo is vice president of R&D for 3M's Display Materials Division, which has been supporting VR headset makers for a decade.

3M's "pancake optics" help shrink the size of headsets while improving display quality, both key product improvements for VR's quest to get more consumers to buy into the tech.

As Minnesota-based 3M prepares to spin off its health care business and reposition the remaining company for growth, the industrial giant is embedding its materials and technology in a number of next-big-things: electric vehicles, industrial automation, climate tech and virtual and augmented reality.

Sales have slowed for traditional consumer electronics like phones, TVs and computers — a core business segment for 3M that typically generates more than $3 billion in yearly sales. Electronics revenue is down 23% for the first half of the year amid weak consumer demand, especially in China.

Meanwhile, numerous market reports predict a multibillion-dollar spike in VR hardware sales over the coming years.

"Much like our customers, we're waiting for this to really take off, and we're already working on the next generation and the next-next generation of this technology," Ylitalo said.

A Citi report last year said that by 2030 there could be trillions of dollars spent on and in the metaverse, which the bank defines broadly as a highly immersive internet across a wide variety of devices.

"We believe that the metaverse will eventually help us find new enhanced ways to do all of our current activities, including commerce, entertainment and media, education and training, manufacturing and enterprise in general," the report said.

The promise of the metaverse has been touted for years, drawing more attention during the pandemic as workplaces and communities explored new ways to interact online. Lately, though, it's faced setbacks from tech company layoffs and resources shifting to artificial intelligence.

"It goes through its own hype cycles, like a lot of industries do," said Nick Roseth, Minneapolis chapter president of the VR/AR Association trade group. "The two biggest issues are: There aren't enough devices on the market, and content is still expensive."

The release of Apple's Vision Pro this summer was seen as a breakthrough moment — but for $3,500 it will be used mostly by developers to continue pushing the boundaries of what the tech can be used for, Roseth said.

He expects it will be another 18 to 24 months before real progress is made on affordability and accessibility for consumers.

"I have to remind myself that 90% of the population doesn't realize this technology exists," Roseth said. "It's a slow burn."

It took five years for 3M to find ways to improve VR headsets after being approached by companies at the Consumer Electronics Show in 2013.

"They simply asked us if we could make their headsets smaller," Susan Kent, R&D lab director at 3M, said earlier this year. "We shortly realized that we could and make the image quality ... better and look less cartoony."

After 3M combined pancake lenses with its patented reflective polarizer technology, headsets could bring screens closer to a user's face, making them smaller while also enabling crisp text.

3M has also developed optical films for heads-up displays — like digital data displayed on a car windshield. That type of augmented reality, as opposed to a fully immersive virtual headset, has already seen wide adoption.

"We're already living with augmented reality on our phones," Roseth said, pointing to Pokemon Go, Ikea Place and fashion try-on apps. "That blends information with the real world."

Headsets, heads-up displays and more were on display last month at the 3M Open in Blaine. As the golf tournament's sponsor, 3M's fan experience tent focused on how its technology is connecting the physical and digital worlds — a hands-on look at all things "phygital."

The golf games — including an augmented-reality putting tool — were especially popular.

"From here, it's about doing this at a large scale," Ylitalo said, "at a volume and cost that allows our customers to put these on not millions of faces but hundreds of millions or billions."

Banuba Increases the Performance of Virtual Backgrounds by 10X by Upgrading Neural Network Architecture

Banuba implemented a series of technical updates to Face AR SDK that drastically increase the maximum frame rate of live videos with background separation effect. Depending on the platform and hardware, this results in up to 10 times higher maximum FPS.


Virtual backgrounds are a must-have feature for any modern video communication software and social media. They help alleviate camera shyness, protect users’ privacy and prevent potentially embarrassing situations like pets walking in during a serious business meeting. After the COVID-19 pandemic caused a massive increase in remote work, the demand for virtual backgrounds skyrocketed, so improving them has been one of Banuba’s top priorities. 


The results were achieved thanks to three main additions:


  • New neural network architecture with improved utilization of CoreML Neural Engine (on Apple devices with Bionic processors);

  • Algorithm optimization on Windows and Web, allowing the neural network to process every other frame instead of each one (see the sketch below);

  • Improved anti-jitter algorithms that demand fewer resources from the device.

This effect is available as part of Banuba Face AR SDK (for live streaming and video communication) and Video Editor SDK (for prerecorded videos).
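
To make the "every other frame" idea concrete, here is a minimal sketch of how such an optimisation can work in principle. It is an illustrative assumption, not Banuba's actual implementation: segment_person() stands in for any person-segmentation network, and the exponential moving average is one common way to damp mask jitter.

    import numpy as np

    def segment_person(frame):
        """Placeholder for a neural network returning a float mask in [0, 1]."""
        raise NotImplementedError  # swap in a real segmentation model

    def virtual_background(frames, background, alpha=0.6):
        """Composite each frame over `background`, running the expensive
        segmentation network only on every other frame and reusing the
        last mask in between; a moving average damps edge jitter."""
        smoothed = None
        for i, frame in enumerate(frames):
            if i % 2 == 0 or smoothed is None:  # run the network on even frames only
                mask = segment_person(frame)
                smoothed = mask if smoothed is None else alpha * mask + (1 - alpha) * smoothed
            m = smoothed[..., None]             # broadcast the (H, W) mask over RGB
            yield m * frame + (1 - m) * background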


Other updates include:

  • Facial feature editing – a new functionality that allows changing the size and shape of any part of the face;

  • Optimized hand tracking and gesture recognition;

  • Better lip segmentation. This is especially noticeable at the corners of the mouth and near the philtrum;

  • Acne removal for photos and an option to change the size of the area affected by the effect.

About Banuba

Banuba is an augmented reality company with over 7 years on the market, pioneering face tracking, virtual try-on, and virtual background technologies. Besides online try-on solutions for makeup, headwear, jewelry, glasses, and more, it offers a Face filters SDK and Video Editor SDK – ready-made modules to apply effects and edit videos.


New whitepaper! VR/AR To Address Staffing Challenges of the Energy Sector (Download)

Our Energy Industry Committee has produced this whitepaper aimed at guiding energy organizations in leveraging VR/AR solutions to address challenges in staff acquisition, skill impartation, and talent retention. This document outlines key considerations in identifying use cases, specifications, functionalities, and hardware selection. Additionally, it addresses VR/AR solution deployment and change management.

These papers (Part 1 and Part 2) serve as reference documentation for both end users and VR/AR solution providers. They facilitate productive engagement by establishing a shared language and understanding of requirements.

Virtual and Augmented Reality (VR and AR) have the potential to revolutionize learning and training in the energy sector. They offer immersive experiences that enhance understanding of complex concepts, procedures, and equipment in a safe environment. These technologies are engaging and impactful throughout the employment cycle, from recruiting to reskilling, and they provide access to virtual training environments worldwide, reducing carbon footprints and promoting sustainability.

In 2021, the VR/AR Association Energy Committee released the first whitepaper in a series, titled “VR/AR in the Energy Sector,” providing insights on VR and AR utilization in the industry. The goal was to offer insights to the VRARA Energy community, representing stakeholder organizations and technology suppliers, on how VR and AR solutions can be used to overcome critical business challenges facing our industry. 

Table of Contents 

Authors & Contributors
Table of Contents
1. Introduction
2. Principal Considerations Before Getting Started
3. Setting the Right Learning Objectives
4. Type of Content to Develop
4.1 Planning the Content
4.2 Classification of VR/AR Use Cases for Training
4.3 Types of Simulation Modules
5. Hardware Equipment
5.1 Types of Headset Systems
5.2 Mobile Devices
5.3 Desktop Deployments
5.4 Room-Scale Immersive Systems
5.5 Characteristics of VR/AR Hardware Devices
5.6 Selection of VR/AR Hardware for Application Development
6. Content Production and Distribution
6.1 Internalize Capabilities
6.2 Third-Party Partnership
7. Data and Scoring
8. Integrating into the Enterprise
8.1 Integration Support
8.2 Content Licensing and Intellectual Property
9. Physical Considerations for a Virtual World
9.1 Training Space
9.2 Audience Preparation
10. Conclusion

Sony will showcase a new product at our IGS during Metacenter Global Week in Orlando, Oct 17-19

Immerse Global Summit at Metacenter Global Week in Orlando on Oct 17-19!

In addition to its newest Spatial Reality Displays, Sony will also showcase mocopi at our IGS during Metacenter Global Week in Orlando on Oct 17-19!

Mocopi is a revolutionary phone-based 3D motion capture system for controlling virtual avatars that helps you easily track and record your full-body motion; it's great for use in the metaverse! Mocopi is fully wireless and only requires a Bluetooth connection to your phone (iOS or Android), so you can use it anywhere. It consists of six small sensors and a dedicated app that enable full-body motion tracking when combined using Sony's proprietary technologies. Those in the industry know that traditional motion capture systems require pricey studios and trained operators, while mocopi simply relies on Sony's unique algorithm for accurate motion measurement using only small, lightweight sensors and a smartphone.

Sony’s mocopi system makes motion capture and virtual content creation easy

“With the dedicated ‘mocopi’ app, users can create movies with their avatar in motion with their compatible smartphone, using the data obtained from the sensors attached to their body. In addition to pre-installed avatars, users can import custom avatars. Recorded avatar movies can be exported as mp4 files or motion data from the mobile app,” said Sony.

Mocopi is ultimately designed to record a user's movements and then mirror them in digital environments — hence the mashup of “motion” and “copy.” There are plenty of different use cases for this kind of tech, from allowing animators to rig 3D characters with more realistic motions, to allowing VTubers to replicate their movements in real time across streams and virtual reality platforms like VRChat.

Mocopi provides some major benefits for the niche communities that will be willing to cough up the cash to buy it. While there are some affordable VR headsets like the Meta Quest 2 that can be similarly utilized in VR applications, these won’t provide the finesse of a dedicated motion capture tool, especially when it comes to lower body tracking.

Sony’s Mobile Motion Capture

Motion capture is a technique that digitizes the movements of a real person or object and imports them into a computer. This allows you to reproduce more lifelike, human movements with a computer-generated character in video production. This technique is also widely used in movies, animations, and game content within the Sony Group.

Typical motion capture requires studio facilities with many cameras installed, as well as a tight full-body suit worn by the actor with many markers attached. In contrast, we have realized a new technology that enables motion capture using only small, lightweight sensors. We call this "Mobile Motion Capture." With this technology, you can readily digitize a person's movements in everyday clothes, whether indoors, outdoors, or anywhere, and apply those movements to a computer-generated character.
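
As a rough illustration of this sparse-sensor approach, the sketch below maps six worn sensors (head, hip, wrists, and ankles, matching mocopi's placement) to avatar joints and leaves the remaining joints to an estimation step. Everything here, including the skeleton object and its methods, is a hypothetical stand-in, not Sony's actual API.

    # Hypothetical sketch of sparse-sensor motion capture; these names are
    # illustrative assumptions and are not Sony's mocopi API.

    SENSOR_TO_JOINT = {
        "head": "Head", "hip": "Hips",
        "wrist_l": "LeftHand", "wrist_r": "RightHand",
        "ankle_l": "LeftFoot", "ankle_r": "RightFoot",
    }

    def update_avatar(skeleton, sensor_rotations):
        """Apply each worn sensor's orientation (e.g. a quaternion) to its
        mapped avatar joint, then estimate the unobserved joints from
        those six constraints (inverse kinematics or a learned model)."""
        for sensor, joint in SENSOR_TO_JOINT.items():
            if sensor in sensor_rotations:
                skeleton.set_joint_rotation(joint, sensor_rotations[sensor])
        skeleton.solve_remaining_pose()  # assumed IK / model-based fill-in step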

If you haven't yet, get tickets to Metacenter Global Week.

Also, there is still time to become a sponsor or exhibitor. Apply today to get the best speaking and/or expo placement!

How Siemens utilizes Virtual Reality to enhance the employee experience

With nearly 300 production and manufacturing facilities and more than 385,000 employees, the global technology powerhouse Siemens is Europe's largest industrial manufacturer and one of the most famous enterprises worldwide. Setting the highest standards throughout their line of business using innovative concepts and technologies, the company is constantly striving for new ways to improve existing concepts.

EHS & QHSE are crucial for employee & factory security

Ensuring the safety and health of their employees is an important and ever-present matter for companies worldwide. Especially industrial enterprises in the manufacturing business, whose employees work with heavy machinery in factories, face the challenge of continuously sensitizing their staff to issues such as plant security and occupational safety. For this reason, EHS training provides the staff with valuable insights about their workplace, underlying processes and implemented safety measures.

Since these measures are so crucial for the safety of their staff, companies are constantly looking for ways to make EHS training more efficient. Immersive technology like Virtual Reality elevates the learning effect for employees, helping them grasp the training content faster and apply it more confidently. Siemens has acknowledged exactly this and successfully implements Virtual Reality to train its employees.

What is EHS?

EHS (short for Environment, Health, Safety) is a discipline that focuses on implementing practical aspects of environmental protection, risk reduction and safety at work. When it is combined with Quality Management, it is commonly referred to as QHSE. Other common acronyms are, among others, OHS, SHE, HSSE, QEHS and QHSSE.

Embracing the "new normal" with Virtual Reality

In collaboration with VRdirect, Siemens created a virtual tour through one of their industrial facilities, digitally depicting the different workplaces. In this first use case, the virtual tour was complemented by additional and important information on EHS concepts. Users can explore the immersive and interactive training environment on their own while actively engaging with the necessary information. Through the addition of new features, the Virtual Reality project was continuously developed into a virtual escape game, presenting users with a timed challenge in which they have to apply everything they have learned to escape a fire emergency scenario.

This way, not only did the Virtual Reality experience make users engage more actively with the learning materials, but it allowed them to immediately test their knowledge in a fun and entertaining way. Using the VRdirect platform, Siemens can easily publish Virtual Reality projects on various devices, meaning VR headsets, mobile devices and PCs. The Escape Game was presented by Siemens at the Health & Safety Week, an internal event focussing on all topics regarding EHS. Staff members of many different departments were able to experience the project via Virtual Reality headsets but could also try out the web version on a PC.

Wide range of applications of Virtual Reality in EHS & QHSE

The unique way Siemens tackles the challenge of training staff for plant security and occupational safety shows the potential of Virtual Reality for EHS training as well as Quality Assurance. There is a broad spectrum of possibilities opening up when using immersive technology. Virtual Reality allows users to experience virtual surroundings up close and in an interactive way. This makes the technology viable for creating virtual simulations (for training & onboarding purposes, for example) as well as for actual simulations of real circumstances. The latter is especially suited for quality control or workplace inspections that can be done remotely. 

There are many possible use cases for Virtual Reality in EHS and QHSE, for example:

  • Easy onboardings in lifelike workplace surroundings

  • Workplace instructions

  • EHS & QHSE training sessions with integrated quizzes

  • Visitor Center trainings

  • Remote workplace inspections

  • Quality Assurance

Another huge benefit of Virtual Reality solutions is that they are constantly available via a multitude of devices. Once developed, a Virtual Reality application created for EHS training in a specific scenario can be used by employees anytime from any place without further preparation or supervision, greatly reducing the effort required to properly train staff members. Employees who are responsible for occupational safety can gain insights into different workplaces without the need to physically be there – all that is needed is a series of 360° captures and a platform to create and publish an immersive Virtual Reality experience.

Virtual Reality experiences engage employees for better EHS/QHSE training results

The Escape Game was very well received by the participating Siemens employees at the event. Through the VRdirect platform, solutions like the Escape Game can furthermore be distributed via all common devices to specific users – without restrictions regarding time and place. Especially in times of the COVID-19 pandemic, when personal contact is limited to a minimum and opportunities to conduct offline training are rare, the constant availability of Virtual Reality experiences is a huge benefit. This way, applications can be offered to employees remotely, regardless of which device the specific target groups can or want to use.

Virtual Reality allows for immersive experiences even beyond training scenarios

Thanks to its easy-to-use approach, the VRdirect platform allows the use of Virtual Reality not only for EHS and QHSE, but also for a myriad of departments and use cases, for example Sales & Marketing, Human Resources, training in general as well as on- and offline events. With no special development skills needed, Siemens departments can create complete Virtual Reality applications quickly and easily on their own. The broad feature set of the platform allows for the creation of immersive Virtual Reality projects that are not limited to virtual tours only, but allow for countless fields of application. In only a short amount of time, divisions can create Virtual Reality experiences tailored to their own specific needs with nothing more than a clear idea of a story and a couple of 360° images or videos. With the VRdirect platform, projects can also be constantly updated or developed further in real-time.

The potential of one platform as a mainstream Virtual Reality tool for various departments

Besides the success of the EHS Escape Game, a number of other Siemens departments have already implemented or are currently developing Virtual Reality use cases with a similar approach using the VRdirect platform. Besides a web portal, the Siemens VR app (available in the company's internal app store soon) serves as a central hub that allows code-protected access to the Virtual Reality projects. Siemens IT APD GLS in Munich has acquired the platform as a potential mainstream tool for internal Virtual Reality projects:

“We needed a solution that allowed a company wide roll-out, meaning quick and easy implementation and distribution of stable Virtual Reality projects. With VRdirect multiple businesses are now starting with Virtual Reality, publishing to the one internal Siemens VR app – and they don’t need expert knowledge or a complex technical set up.”

Daniela Peine

IT APD GLS

Next to VRdirect, Siemens IT APD GLS remains the internal contact for the solution, allowing departments worldwide to realize straightforward use cases in Virtual Reality.

Now Enrolling: Extended Reality (XR) Developer Apprenticeship Program at Cañada College

Background

Cañada College, in conjunction with the Bay Area Community College Consortium, the California Student Aid Commission, and the State of California Division of Apprenticeship Standards (DAS), is seeking employers for the Extended Reality (XR) Developer Apprenticeship Program (2023-2024) cohort. Funded by a California Apprenticeship Initiative (CAI) grant, the Cañada College XR Apprenticeship Program trains, facilitates placement of, and supports eligible candidates in entry-level positions with partner employers. Some occupations in XR studios include Junior Developer, Game Designer, and Production Manager.

 

Eligible apprentices from the Developer Apprenticeship program will have completed one of the following:

  • (a) An approved DAS Pre-apprenticeship Program; or

  • (b) A certificate, associate degree, and/or bachelor's degree in a related field.

  • (c) Art-related applicants will also have passed a portfolio assessment conducted by developer apprenticeship faculty consultants.

 

Employer Role

Once placed in a position with an employer, the apprentice works to fulfill their On-the-Job Training (OJT) hours and receives Related Supplemental Instruction (RSI) through Cañada College. Successful completion of the program is expected to take six to eighteen months, depending on studio production cycles and the individual schedules of apprentices. While further employment with a partner employer is not guaranteed, a graduating apprentice is certified by the State of California in their chosen occupation and well positioned in the industry, with hours of real-world experience and technical training from industry experts.

 

Funding and Apprentice Support

Cañada College's Learning-Aligned Employment Program (LAEP) provides funds to offer eligible students opportunities to earn money while gaining career-related experience in their fields of study. For learning-aligned employment positions with for-profit employers, the program can provide up to 50 percent of the student's compensation. In addition, Cañada College will provide mentors and additional apprentice support free of charge to employers.

 

For more information, visit:

The Fate of Apple's Vision Pro | Part I

Today we’re featuring a guest post from Evan Helda, the Principal Specialist for Spatial Computing at AWS, where he does business development and strategy for all things immersive tech: real-time 3D, AR, and VR. 

Come see Amazon AWS at our IGS at Metacenter Global Week!

Evan has been at the forefront of the immersive technology industry for the last 7 years. His experience spans numerous layers of the AR/VR/3D tech stack: from AR headsets and apps (the OG Meta), to simulation and game engines at Improbable, to cloud and edge computing/5G (AWS).

Evan also writes a newsletter called Medium Energy, where he explores the impact of exponential technology on the human experience. 

We recently came across Evan's writing and thought you might enjoy his perspective on the Apple Vision Pro. If you do like this piece, we encourage you to check out more of his content over at MediumEnergy.io!


####

Today was the big day.

The fateful day our tired and battle-worn industry has waited for; for a long, long time. So many troughs of disillusionment, so many clunky demos, so many shattered startup dreams...

We all sat with bated breath, leaning forward in anticipation.

The backdrop was out of a movie: dozens of industry experts, leaders, investors, and entrepreneurs, sprawled across rows of couches on the beach-side deck of a Malibu mansion.

To our right, waves crashed rhythmically, bringing sea foam right up to our feet. To our left, a sprawling spread of breakfast delicacies and of course, champagne. Copious amounts of champagne. The extent to which it would be popped & consumed? TBD... Directly ahead was a massive flat screen TV unfolding what we all hoped would be our industry's 'big bang'.

And then, the moment finally arrived. Apple CEO, Tim Cook, re-appeared and said those historic words, "But wait... there's just one... more... thing".

Our small crowd erupted with hoots, hollers, and applause. My skin erupted with goose bumps.


As the Apple Vision Pro faded onto the screen, it felt like a dream. And for a split second I did dream, flashing back to another fateful day.... five years prior.

The Office of the Future (Spring 2018)

Today was the big day.

The fateful day our augmented reality startup, Meta (the original Meta…), would finally fulfill the promise our CEO had made to the world; to throw away our computer monitors and replace them with a more natural human-computer interface: an AR headset that would blend the physical world with the digital.

Meta 2 AR Headset

We called it 'spatial computing'.

Our CEO made this promise on the grand stage that is TED (worth the 10 mins to watch here). And in about a month, Bloomberg was set to visit our office. Their tech reporters wanted to see this bold exclamation for themselves and write an article on the outcome.

CEO, Meron Gribetz, on the TED stage

We were being held accountable. The boats were burned. There was nowhere to hide.

Today was the dress rehearsal for that Bloomberg visit. All 100 employees would finally taste the fruits of our labor. Three years of blood, sweat, and tears had gone into building a fully vertical AR stack: our own display system, our own sensor array for positional tracking, our own SLAM algorithms, our own hand tracking algorithms, our own SDK, and, most importantly... our own 'spatial operating system'.

This was no ordinary OS. It was meant to be the 'OS of the Mind': one that would conform to how our brains have naturally evolved, guided by what we called 'the principles of spatial design'.

(If you watched the Vision Pro announcement... sound familiar? It's no coincidence. Apple seriously considered buying Meta back in 2017. Our 'spatial design principles' and vision for a SpatialOS were a big reason why. Oh, what could have been…)

We would place virtual monitors all around us at limitless scale. We would free 3D models from the confines of 2D screens, visualizing and interacting with them as they were always intended: spatially.

Gone were the days of the mouse & keyboard. Thanks to computer vision, we would use our hands to more naturally & directly interact with the digital world, just as we do the physical.

This same computer vision tech would turn the world into our desktop background, understanding the environment and anchoring actualized figments of imagination all around us.

Meta, the OG Meta... was going to build the true realization of Steve Jobs's 'bicycle for the mind': a computer that grandma or a child could pick up and intuitively know how to use, with zero learning curve.

Oh, how beautiful the vision was...

But oh... how naive we were.

The Revenge of the Meta 2

On the day of the ‘Bloomberg rehearsal’, the office was buzzing with anticipation.

For the first time, we each received our own headset. The packaging was a work of art: a beautiful piece of engineering and design, accompanied by a thoughtful developer guide and a document outlining our 'spatial design principles'.

The first step: plugging the tethered device into a computer and then into the wall for power (yes, the sequencing mattered...).

It was a collective stumble right out of the gate. Our computers failed to recognize the device. For the next hour, we twiddled our thumbs as the engineers scrambled to fix a bug and re-distribute the SDK (software development kit).

Hot start.

Once the 'Spatial OS' finally launched, the user was tasked with calibrating the sensors and mapping the world.

A graphical UI instructed you to look up, look down, look left, look right.

The next 5-10 minutes were a comical display: 100+ people in a state of manic indecision, stuck between vigorous yes's and no's, shaking our heads this way and that, waiting, hoping, yearning for the cameras to lock on to our physical surroundings.

Some devices registered the real world within a few minutes. Other poor souls were left doing neck exercises for the next 5-10 minutes.

If you were lucky enough to create your environment map, the OS would finally launch. The OS interface looked like a holographic bookshelf, each shelf holding floating orbs that represented a variety of spatial apps.

But upon launch, exactly where this holographic shelf appeared in space was anyone's guess.

For some, it was down on the floor. For others, it was off on the distant horizon or behind them. For the next 10 minutes, we collectively embarked on a holographic treasure hunt at our desks, searching up, down, and all around for our 'app launcher'.

My holographic shelf was above me, anchored to the ceiling.

Now, the primary way to interact with these holograms was with your hands. You had to reach out and grab them. But doing so was quite the art... it required your hand to be in the perfect spot, at just the right proximity to the hologram. When you found that magic zone, a circle would appear.

Then, and only then, could you close your hand and 'grab' the hologram. The camera on the headset needed to see a very distinct gesture: a wide-open hand and then a distinctively closed fist. When the cameras saw this movement, the circle UI would become a dot, confirming the hologram was secured.
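For the technically curious: the grab mechanic described here (a proximity 'magic zone' that shows a circle, then an open-to-closed fist that turns it into a dot) is essentially a tiny state machine. The sketch below is purely illustrative and is not Meta's actual SDK; the state names, the grab_radius value, and the update function are all invented here to capture the logic of the description.

    from enum import Enum, auto

    class GrabState(Enum):
        IDLE = auto()     # hand nowhere near the hologram: no UI
        HOVER = auto()    # hand inside the "magic zone": circle shown
        GRABBED = auto()  # fist closed while hovering: dot shown, hologram held

    def update_grab_state(state, hand_distance, hand_open, grab_radius=0.15):
        """One tracking-frame update for a single hologram.

        hand_distance -- distance from hand to hologram (hypothetical metres)
        hand_open     -- True when the cameras see a wide-open hand
        grab_radius   -- made-up size of the 'magic zone'
        """
        in_zone = hand_distance <= grab_radius
        if state is GrabState.IDLE:
            # the circle appears only once an open hand finds the magic zone
            return GrabState.HOVER if (in_zone and hand_open) else GrabState.IDLE
        if state is GrabState.HOVER:
            if not in_zone:
                return GrabState.IDLE  # drifted out of the zone: circle disappears
            # open hand -> closed fist is the distinct gesture the cameras need
            return GrabState.HOVER if hand_open else GrabState.GRABBED
        # GRABBED: the hologram stays held until the fist opens again
        return GrabState.HOVER if hand_open else GrabState.GRABBED

Part of why the interaction felt so finicky is visible even in this toy version: every frame of noisy hand tracking had to thread that needle before a grab would register.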

This grab ritual led to yet another comical sight: an entire office of people waving their hands in the air, trying to materialize that circle. Everyone was flailing about, groping the air and repeatedly trying to turn that circle into a dot. We became a horde of perverts molesting invisible objects of desire.

I stood up and reached longingly into the air for my holographic shelf, only to be immediately yanked back into my chair by the tether.

Screw it. I resorted to using the mouse we so vehemently vowed to replace. It was a fallback form of input, controlling a 'spatial cursor' that allowed me to click on the 3D shelf and pull it closer.

Finally, I could start pulling out little apps & experiences, placing them all around me at my desk. For a split second I was living in the future.

There were virtual monitors showcasing the future of productivity, with PowerPoint, web browsing, and spreadsheets. But I could barely read the text; it was blurry, and the eye strain was very real. There was a beating heart for the future of education. There was a virtual jukebox to showcase our (attempts at) spatial audio. There was a 3D model of a Tesla, hinting at the future of immersive design and e-commerce.

And my personal favorite... a box with an image of a butterfly. When you touched it, the box exploded into a cloud of 3D butterflies, fluttering vigorously this way and that. When you held out your hand, they would come, land, and gently rest.

For many, the mind would play tricks. You could feel the tickle of the butterfly's little legs on your hand...

This… this to me is the magic of spatial computing: mind merging with machine, tapping into the mystery of how the brain has naturally evolved to interact with and understand the real world.

Imagine the impact of this for communication, education, collaboration, and creation. We were passionately driven by this potential. We were mission obsessed, and mission bound.

But that moment in the future was short-lived. After a few minutes, nausea set in from the motion-to-photon latency (that is, the delay between a head movement and the display's reaction). Then the virtual shelf suddenly started to jitter and float away, carrying with it our collective hopes & dreams.

Alas, my headset lost world tracking entirely and holographic chaos ensued.

Before I knew it, holograms were flying all over the place. The virtual heart shot past my head, the virtual monitor turned upside down and shot through my desk, and the 3D shelf/OS UI zoomed right back up to its original home: the ceiling.

Next to me sat my sales colleague and dear friend, Connor McGill. We looked at each other, let out massive sighs, and just laughed. What else could we do?

We had spent the last 18 months traveling the world, from LA to NYC, Shanghai to London, Amsterdam to Rome, giving thousands of demos and convincing the world’s largest companies that spatial computing was the future, and that it was imminent with the Meta 2: Nike and Adidas, Lockheed Martin and Boeing, Exxon and Shell, Ford and Tesla, Disney and Universal, Dell and Lenovo. The list goes on.

This was going to make for some awkward conversations.

Welp... at least we had good packaging.

Dell Technologies President, Jeff Clarke, celebrating the deal to become a Meta 2 reseller.

Meta 2 @ The Pantheon in Rome

Kate Middleton & Prince William

Even Bert wanted in on the action

When the Apple Vision Pro presentation ended, I was in awe. They seemed to have absolutely nailed it.

The display quality: near perfection, making it seem like you’re viewing the world through a pane of glass. The pixel density: mind-blowing, making text perfectly legible at last. The innovation of the R1 chip: a sci-fi feat, processing data from 12 cameras with zero perceived latency and making nausea a thing of the past. The world tracking: immediate and flawless, anchoring holograms perfectly and elevating them to first-class citizens in the real world. The input and interaction: pure magic, creating the illusion of mind control through the perfect tandem of eye and hand tracking.

The list goes on... they seemed to have thought of every little detail and thoughtfully addressed the majority of the paper cuts that have plagued AR/VR for decades.

When I left the viewing party that day, I half-expected there to be a ‘spatial computing parade’ in the streets.

The tech we’ve all been waiting for was finally here! A cause worth celebrating, for sure. Heck, I was ready to take the day off and paint the town red! (And that is exactly what a few of us did…)

Spatial Squad

But when I integrated back into the real world the next day, my enthusiasm wasn’t quite the norm.

The first friend I talked to about the announcement said “it made me want to put my feet in the grass and hide in the woods”.

Okay, considering some of the concept videos, I get it… (we’ll address those later).

And then there were the mainstream media pundits, spewing all kinds of nonsense: ‘Apple has lost its way’, ‘this product will never sell’, ‘no one needs this’, etc. etc.

As the weeks went by, the criticism kept pouring in…

  • It's too expensive!

  • What is this good for?

  • People won’t wear something on their face

  • It's too isolating

  • The digital eyes are creepy

  • Only a two-hour battery life?!

When I first heard these critiques, my blood boiled. I couldn’t help but think… What the hell is wrong with these people? How can they not see what I see? Do they not get the magnitude of these technical feats and this product’s potential impact?

With my emotions at the helm, I realized I needed to take a step back, think objectively, and question my beliefs… turns out inherent career bias is a helluva drug.

Why was I so triggered? Am I the crazy one here? Or is everyone else missing it, and my bullishness is indeed warranted?

Over the last month, I’ve done the inner work to remove my XR fanboy cap and think more deeply about Apple’s strategy, along with the importance of this moment.

With my bias officially on the shelf, I remain convinced: the pundits are wildly wrong, consumers don’t know what they don’t know, and this moment truly does matter.

And it turns out I’m not alone… while I’ve not tried the device myself, I have listened and talked to those who have. Upon hearing their feedback, I feel (a bit) less crazy. My initial perceptions & instincts seem to hold true.

The best part? Their favorite moment of the Vision Pro demo is a holographic butterfly that lands on your hand. They too could feel the tickle of its legs…

(A coincidence? I think not… Apple hired many of Meta’s top engineers after the company folded, including the ones who built that original ‘office rehearsal’ demo.)

Now, before exploring the impact of this moment, why Apple’s strategy is the right one, and why you should care… come with me back in time once more, to a moment chock-full of lessons & predictions of what’s to come.

General Magic

They say history doesn't repeat, but it certainly rhymes.

The Meta journey is a reflection of a similar story, with a similar outcome for an eerily similar company: General Magic.

If you like documentaries, this doc about General Magic is a must-watch, even if tech & business is not your thing. It's just a compelling story: of ambition and courage, of how to blaze new trails, and of how to cope with heartbreak and shattered dreams.

If you're unfamiliar: General Magic attempted to create the iPhone back in 1989/1990. The vision and the use cases were exactly the same: a personal computer in your pocket acting as a remote control for your life.

General Magic Design

The team was perfect, the vision was prophetic, and much of the technology existed. But the timing was wrong and the tech couldn’t yet merge to make the whole greater than the sum of its parts.

While the individual pieces were there, they weren’t mature enough to yield a compelling user experience. A lot of technology still needed to be invented, and there were numerous UI/UX rough edges to be smoothed over. Very similar to the Meta 2 headset.

There also wasn’t a fertile ecosystem. The internet wasn't ubiquitous, telco connectivity wasn't mature, and 'mobile developers' didn't really exist. There were very few builders and businesses with properly primed imaginations or business models.

Perhaps most important... consumer behavior hadn't properly evolved. Consumers didn't see the point. The use cases didn't quite click, and people weren't quite sure what this thing was good for. The whole thing just seemed… silly.

Sadly, the General Magic dream came to an end in 2002.

Fast forward to June 29th, 2007 (five years after General Magic shut down): Apple launches the iPhone and changes the world.

Apple was watching, studying, and learning from General Magic all along. They even hired some of its most talented employees (e.g., Tony Fadell). They had blueprints and prototypes all along the way. But it took 17 years to get the technology just right: polishing, testing, debating.

And boy, did they nail it.

In hindsight, it's easy to say the iPhone's future impact was obvious when it launched.

But was it?

Sure, it launched with some killer apps: calls, email/messaging, web browsing, and music. But these things weren't entirely new. They were things we were already doing, just done better on multiple vectors. Very few people, if any, saw the app store coming and all the innovation that would follow…

Fast forward to 2023 (also five years after Meta closed its doors), and Apple uses the exact same playbook.

For 10-15 years, they were watching, learning, iterating, polishing, debating, and polishing some more.

Apple then launches the Vision Pro and changes… well, that depends on who you ask. To most, its impact is far from obvious.

And so, the stage is set for perhaps a similar story, albeit with obvious differences. This is a bit more dramatic than going from a flip phone to a touch screen.

We’re now breaking through the screen and entering the machine. Of course, the tradeoffs, roll out plan, and adoption cycle must be radically different.

In Part II, we’ll explore those differences and analyze Apple’s strategy, diving deeper into the tradeoffs, why the skeptics are wrong, and how this adoption curve might unfold over the next 3-5 years.

Until then…

Thanks for taking the time to read Evan's essay. Let us know what you think about this perspective, and if you want to check out some more of Evan’s writing, here are some of our personal favorites:

- How to defend the metaverse

- Finding solace in the age of AI

- The Ultimate Promise of the Metaverse

Matt Thompson Appointed President of St. Louis VRARA Chapter

We are excited to welcome Matt Thompson as president of the St. Louis Chapter of the VRARA!

Matt is an experienced product leader in the worlds of SaaS and IP who has worked with companies like Disney, Warner, Universal, Sony, Meta, and YouTube. He is also an avid gaming and VR/AR enthusiast.

It's incredibly exciting to lead a new Chapter for the VRARA here in St. Louis! I'm looking forward to bringing outside knowledge into our area and taking local knowledge to other areas in the industry. 

-Matt Thompson

Now Hiring: The Department of Interactive Media, University of Miami

VRARA Member The University of Miami is hiring!

The Department of Interactive Media at the University of Miami (https://lnkd.in/gRf3CQ7c) is currently seeking applications for a full-time, nine-month, tenure-track or tenure-eligible faculty position at the Assistant Professor level for faculty conducting research in immersive media. We are dedicated to fostering a diverse and inclusive academic community and enthusiastically encourage applications from individuals who can contribute to this mission. The candidate must possess a Ph.D. in Communication, Human-Computer Interaction, Media and Technology, Media and Society, Media Arts, Emerging Media, Computer Science, or a related discipline by the beginning of the appointment, August 15, 2024.

Learn more and apply at https://lnkd.in/gt5jNdWA.

The True Impact of XR on Education: Beyond the Hype

Written by XPANCEO founder Roman Axelrod.

There is tremendous excitement about how AR/VR technologies can transform education. Just as in the movies, one may imagine children taking virtual trips to ancient Rome, studying interactive 3D models of the Solar System, and even observing molecules or frogs coming to life before their eyes. Various studies, for example this one in Nature, show that AR and VR effectively enhance knowledge acquisition, retention, and skills development. However, merely achieving these benefits is insufficient to revolutionize education. To make a significant difference, new technologies must not only change how individuals interact with materials but also transform the entire sector of the economy.

Still, XR does have the potential to reshape the education system. In this article, we will explore where in education this shift is likely to happen.

Why making education “cooler” is not enough

As in any other industry, educators are often preoccupied with concerns about funding, safety, equality, and curriculum. For any new technology to gain widespread acceptance, therefore, it must bring about a fundamental transformation rather than merely make education more user-friendly.

For instance, consider school education. As various studies have highlighted, AR and VR could potentially enhance students' learning experiences; however, the question arises whether the complexities and costs of implementing such technology in a relatively rigid, government-controlled system are justified. Even personal computers, despite their long presence, have struggled to integrate fully into school education. According to the Cambridge International survey, 48% of students use computers for their education, yet 90% still use pen and paper.

In higher education, we can expect more viable opportunities. For example, giving students in medical or engineering schools better tools through XR applications can significantly improve the effectiveness of their courses. In this realm, funding sources are also clearer, with employers and key vendors showing interest in investing. However, even these advancements fall short: such changes represent only a small portion of the overall educational process. While this kind of training helps students acquire practical skills faster, it cannot replace the complete educational journey.

Where the real change can take place

The most promising ground for XR to provide a complete solution is vocational training, where a global shift in approach is required. Vocational training is education that prepares people for a specific career, primarily in the blue-collar sector. Studies highlight that demand for low-skilled workers, as well as their salaries, is decreasing. In contrast, there is rising demand for mid-skilled jobs that require specialized training and qualifications to operate and maintain various kinds of equipment. These mid-skilled roles encompass a range of tasks: operating heavy construction equipment and machinery, and working with cash registers, inventory scanners, and robotized cleaners. As technology advances, devices become more prevalent, with embedded computers becoming a standard feature. Consequently, re-training becomes essential as equipment evolves and job requirements change.

What will the changes be

With the known benefits of AR and VR applications in education, such as increased engagement, improved focus, and better knowledge retention, we can see their significant impact on vocational training. Going beyond merely accelerating the learning process, XR's potential for fundamental changes is promising:

  1. Redefining training and work. AR applications can truly blur the lines between training, apprenticeship, and on-the-job performance. With them, students or workers can move smoothly from simulations to augmented or real-life work once the system determines they are ready for the next learning stage. This is especially the case for a smart contact lens, which would make education maximally realistic without adding any extra devices.

  2. Enhancing workplace safety. XR can make professional certifications fully digital, so people can learn to work with complex machines without any risk. At the same time, XR makes instant certification possible in a form that cannot be faked, giving employers an effective means to verify that a person is ready to work with the equipment.

  3. Increasing workforce mobility. With quick and cost-effective training, employers can address workforce shortages and adapt to changing market demands. This fosters investment and growth in manufacturing and other industries, benefiting both employers and employees alike.

Vocational training is discussed less often, but the opportunities there are much brighter than in more hyped areas such as school or university education. Moreover, there are already companies offering hardware simulators for heavy equipment, for example Caterpillar machines, as well as VR training courses. More will follow as XR equipment becomes more affordable, but only with truly wearable gadgets like smart contact lenses will the full potential of these tools be unleashed.

 If you want to learn more about the smart contact lens developed by XPANCEO, contact dragon@xpanceo.com.



Call for Speakers: VRARA New York Chapter - Showcase Your Work!

Dear VRARA New York Chapter Members,

We are excited about the upcoming events we are organizing right now. The New York Chapter of the VRARA is looking for engaging and passionate speakers who are willing to share their work, insights, and experiences with fellow industry professionals and enthusiasts at our in-person networking socials and demo days.

We are seeking speakers who:

  • Are current members of the VRARA.

  • Have cutting-edge knowledge and expertise in VR/AR.

  • Have developed groundbreaking projects, tools, or methodologies in the field.

  • Can deliver an engaging and informative presentation.

Perks of participating:

  • Showcase your work.

  • Network with fellow industry experts, business leaders, and VR/AR enthusiasts.

  • Receive recognition and build your professional reputation.

  • Obtain valuable feedback and spark new collaborations.


How to Apply:

To submit your interest or proposal, please contact the NY Chapter President, Cindy Mallory, at cindy@thevrara.com.

Spaces are limited, and we encourage you to apply soon to secure your opportunity to contribute to upcoming VRARA NY Chapter events, including a Manhattan event in early September. Should you have any questions or need additional information, please don't hesitate to contact Cindy via email or LinkedIn.

Declan Johnson Appointed as Co-President of VRARA Utah Chapter.

We are excited to welcome Declan Johnson who is joining me as co-president of the Utah Chapter of the VRARA.

Declan is a graduate of Brigham Young University. It was there that he developed a passion for XR and devoted time to the Mixed Reality Lab for two years, developing projects for students and departments across campus. Declan took that excitement for the field and started his own consulting company, prototyping XR experiences for clients in Unity.

Declan then joined Continuum XR, a leading team of expert XR developers and 3D artists, and began working exclusively on 8th Wall web AR projects. He has created over 300 web AR experiences, most with a retail focus for nationally esteemed brands, and had a helping hand in growing the company to where it is today. This has led to his current role as Continuum’s Business Development Representative.

"I am absolutely thrilled to take on the role of Co-President for the Utah Chapter of the VRAR Association. I am eager to contribute my passion and expertise in organizing exciting events, fostering growth within the chapter, and spreading the association's influence throughout the vibrant tech landscape of Silicon Slopes."

-Declan Johnson

Dan McConnell Appointed as President of VRARA DC Chapter.


We are thrilled to have Dan McConnell serve as Chapter President, VR/AR Association Washington DC!

Dan is an action-oriented, visionary technologist with over 20 years of experience as a leader in both government and industry. Currently, Dan is Chief Technologist at Booz Allen Hamilton and leader of the spatial computing capability within the firm’s Bright Labs emerging technology incubator in the office of the CTO. Dan also serves as Co-Chair of the VR/AR Association Industrial Metaverse and Digital Twin Committee. Previously Dan was a strategy consultant at The Cohen Group, and he also spent nearly a decade in uniform in the US Army. Dan holds a bachelor’s degree from the United States Military Academy at West Point and graduate degrees in public policy and technology from Harvard Kennedy School of Government and the University of Virginia McIntire School of Commerce.


Virtualware launches VIROO 2.4, the new version of its virtual reality as a service (VRaaS) platform

Virtualware (EPA:MLVIR), a Spanish publicly traded company and one of the leaders in virtual reality, today announced the release of version 2.4 of its VRaaS platform, VIROO, which incorporates, among other capabilities, Mixed Reality (MR) and VR CAVE integration as standout additions.

Version 2.4 introduces a groundbreaking new feature: the integration of virtual reality (VR) and mixed reality (MR) technologies within the same sessions. This combination provides a seamless, collaborative experience, allowing multiple users to connect from different locations on a variety of devices, establishing genuine platform interoperability.

Among the new features, the most significant are:

• Mixed Reality capabilities: VIROO can blend VR and MR technologies within its sessions, offering true cross-platform interoperability.
• VR CAVE integration: VIROO is now compatible with multi-projection systems such as CAVEs and similar setups.
• VIROO Studio for Unity: VIROO's low-code VR creation tool for Unity becomes VIROO Studio.
• VIROO Room offline configuration: immersive multi-user content can now be deployed in a VIROO Room without an internet connection.
• VIROO Content updates: new scenes have been created and updated for all VIROO 2.4 users.
• Latest headset compatibility: VIROO adds full compatibility with the latest enterprise VR headsets.
• Identity management: VIROO adds identity management to enhance security throughout the platform.
• Data visualization and UI/UX improvements: more content information and better usability.

“VIROO 2.4 is cutting-edge virtual reality technology that offers businesses a significant competitive edge. With its enhanced graphics, seamless interactions, improved performance, and expanded capabilities, VIROO 2.4 empowers businesses to deliver innovative solutions that exceed customer expectations. This not only opens new revenue possibilities but also attracts customers who are seeking immersive experiences,” said Sergio Barrera, CTO of Virtualware.

Virtualware’s flagship product, VIROO, the world’s pioneering VR as a Service (VRaaS) platform, makes Virtual Reality accessible to companies and institutions of all sizes and sectors. It is an all-in-one digital solution that enables the remote development and deployment of multi-user Virtual Reality applications.

Headquartered in Bilbao, Spain, Virtualware is a global pioneer in developing virtual reality solutions for major industrial, educational, and healthcare conglomerates. Since its founding in 2004, the company has garnered widespread recognition for its accomplishments. In 2021, Virtualware was recognized as the world’s most innovative VR company.

With a diverse client base that includes GE Hitachi Nuclear Energy, Ontario Power Generation, Petronas, Iberdrola, Alstom, Guardian Glass, Gestamp, Danone, Johnson & Johnson, Biogen, Bayer, ADIF, the Spanish Ministry of Defense, Invest WindsorEssex, McMaster University, University of El Salvador and EAN University, and a network of partners worldwide, Virtualware is poised for further global expansion.

VRARA Member Warp VR helps Meliora VR redefine health & safety training with blended learning

This Customer Success Story was originally published by VRARA member Warp VR.

Customer intro

In 2017, a number of companies, educational institutions, and healthcare organizations in the Netherlands joined forces to experiment with innovative training methods for critical situations in healthcare. The group has worked together since then to develop applications with VR and 360° video and to evaluate them in practice.

In 2018, Meliora VR (part of Saasen Groep) started from this initiative, providing an innovation platform for cooperating organizations to develop digital products for practicing and testing competencies related to safety, healthcare, and more.

Challenge

Many companies, especially in healthcare and other knowledge-intensive industries, have to deal with staff shortages and limited time for training. In addition, certain environments needed for training aren’t easily accessible (like operating rooms), and critical job situations that are dangerous, impossible, counterproductive, or too expensive to replicate in real life are hard to train for.

The pandemic amplified this by making it difficult or impossible to follow courses in person, so a new way of reaching students was needed.

Solution

Virtual reality is becoming increasingly accessible. Better, cheaper VR headsets and more content providers are making it more interesting as a tool for learning and behavior change. Immersion in an environment is what distinguishes VR from other methods and (digital) tools.

The learning products Meliora VR develops align with the principles of Miller's Pyramid and consist of a mix of VR, e-learning, animation, and systems for workplace input (such as digital safety tracking systems). The VR products include virtual tours, ‘Real Learning’ (a combination of animation and real objects like CPR dummies), and ‘Right Choice Learning’ (realistic simulations using 360° video).

Meliora VR uses Warp VR to power its Right Choice Learning offerings because of its ease of use and scalability. Pico VR headsets are managed automatically, so trainees don’t have to hassle with technology; they only need to click on their own name to start a scenario. Users who aren’t comfortable wearing a VR headset are offered the option to play on a tablet instead. After each session, students keep access to the scenario on their smartphones for one month of additional practice.

Competency training takes place in pre-built scenarios. Again and again, a trainee is faced with new choices and can experience the consequences of each choice they make. For emergency response, 10 scenarios have already been developed. Meliora VR’s system is set up in such a way that customers can easily develop custom scenarios to cover other competencies.

Meliora VR has its own production team and actively cooperates with educational institutions like universities of applied sciences Fontys and De Kempel, MBO St. Lucas and Ter Aa College. Students from these organizations help with creating immersive experiences, providing feedback on learning effectiveness, and researching new use cases.

Customers mainly come from healthcare (e.g. Anna Zorggroep) and education (e.g. Fontys University of Applied Sciences). Most use cases focus on health & safety (e.g. what to do in case of smoke and fire, how to provide first aid, and how to work with dangerous substances) and soft skills (e.g. anti-aggression training and decision-making).

Results

Users are enthusiastic about the realistic simulations based on 360° video and regard them as a great, engaging addition to more traditional learning methods. Because trainees directly experience the consequences of wrong choices, retention of the training material is also significantly improved.

Meliora VR and B&V partners received a grant from the MKB !dee project ‘VR learning culture in safety and health’ to promote a VR learning culture within Fontys and other educational institutions.

Another example is Anna Ziekenhuis, which ordered over 1,100 cardboard viewers to scale VR learning within its organization, including for onboarding new employees and safety procedure training.

Quotes from users:

"It felt like I got away from the classroom for a while. What a fun way to learn!"

"It all gets really close. It feels like you're right on top of it."

"I was quite worried about using a VR headset, but fortunately no controllers were needed. It all felt very natural and familiar."

"The biggest strength of the system is its scalability. I can reach everyone with the system very quickly."