What to know about Niantic's new SDK for the most amazing AR experiences

Come see Niantic at our Immerse Global Summit during Metacenter Global Week in Orlando Oct 17-19

Ahead of the general availability of Niantic Lightship 3.0, Justin Sneddon, Group Product Manager on Niantic Lightship, will present how this SDK can help you build the most amazing AR experiences at our Immerse Global Summit during Metacenter Global Week in Orlando.

Here’s a sneak peek!

Beyond the AR Horizon

Lightship ARDK 3.0 takes what ARKit and ARCore offer in Unity via ARFoundation and cranks it up a notch. But that’s just the beginning. Lightship’s tools are designed to fill in the gaps and push the boundaries of computer vision technology. Buckle up, because we’re about to take you through some game-changing features: Depth, Meshing, Semantics, Navigation, Shared AR (Multiplayer), Visual Positioning (VPS), and Debugging Tools (Playback and Mocking).

Depth - The Foundation of AR Awesomeness

Depth is the secret sauce behind every AR experience. It’s what helps us figure out where to place objects and how they should interact with the real world. Lightship’s depth is something truly special. Why, you ask? Well, it all comes down to our passion for getting people outdoors.

Lightship’s depth is trained on vast outdoor environments, which means it can provide incredibly accurate depth from a single camera. Plus, it’s not limited to a short range like LiDAR on iPhone Pros. Lightship’s depth can reach a whopping 40+ meters, and it works on all AR-capable phones—yes, that includes all iPhones and most Androids!

And why does that extended range matter? Imagine summoning a massive dragon into your AR world—this creature has a wingspan that far exceeds the 5-meter limit. With Lightship’s long-range depth, you can place it 10 to 20 meters away from your camera and capture every breathtaking detail.

What else can you do with this supercharged depth? Let me break it down for you:

  • Placement: Convert a screen point to a real-world position and place digital objects there.

  • Measurement: Know the distance to anything on your screen.

  • Occlusion: Use depth information to seamlessly blend digital objects into the real world.
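To make the first two concrete, here’s a rough Python sketch of the math behind placement and measurement: back-projecting a tapped screen pixel through the camera intrinsics using a sampled depth value. The pinhole model, intrinsics, and names here are illustrative assumptions, not Lightship’s actual API (the SDK exposes its own helpers for this).

```python
import numpy as np

def unproject(u, v, depth_m, fx, fy, cx, cy):
    """Back-project screen pixel (u, v) with a metric depth sample into a
    3D point in camera space, using a simple pinhole camera model."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

# Hypothetical camera intrinsics and a tapped pixel
fx = fy = 500.0          # focal lengths in pixels
cx, cy = 320.0, 240.0    # principal point

# Placement: where in 3D space does the tapped pixel land?
point = unproject(400.0, 300.0, depth_m=6.0, fx=fx, fy=fy, cx=cx, cy=cy)

# Measurement: straight-line distance to whatever is under that pixel
distance = float(np.linalg.norm(point))
```

Placement would then spawn your digital object at `point`; measurement is just the length of that vector.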

But wait, there’s more! When you combine depth with semantics (stay tuned for that!), the possibilities become endless. Visual effects like pulse effects, depth of field, toon filters, edge detection, and more come to life. I’ll walk you through how to create the first two experiences.
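As a taste of what depth plus semantics unlocks, here’s a toy sketch of a pulse-effect mask in Python: light up the pixels whose semantic channel says “ground” and whose depth sits on an expanding ring. The buffers and values are invented for illustration; in a real app, the depth and semantics textures come from the SDK every frame (and the effect itself would live in a shader).

```python
import numpy as np

# Invented per-pixel buffers; a real app gets these from the SDK each frame.
h, w = 4, 4
depth = np.linspace(1.0, 8.0, h * w).reshape(h, w)   # meters per pixel
ground = np.zeros((h, w), dtype=bool)                # semantic "ground" channel
ground[2:, :] = True                                 # bottom half is ground

def pulse_mask(depth, ground, radius, thickness=0.5):
    """Pixels on the expanding pulse ring: ground pixels whose depth
    falls within `thickness` meters of the ring's current radius."""
    ring = np.abs(depth - radius) < thickness
    return ring & ground

mask = pulse_mask(depth, ground, radius=5.0)   # highlight these pixels
```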

And, there you have it, folks! Niantic Lightship is all about taking your AR game to new heights. If you’re as excited as I am, you can dig deeper into these features with my upcoming blog posts, complete with examples and source code.

Register now!

Roxana Nagy Appointed as Co-Chair for VRAR Association’s Generative AI Committee

We are thrilled to have Roxana Nagy help lead our community for Generative AI.

Roxana is an experienced Creative Technologist and Senior Mobile Engineer with a passion for Immersive Technologies. In her role as Director of Immersive Technologies at Monstarlab, Roxana helps top companies around the world create strategies and build innovative projects using metaverse, AR, and VR technologies.

With a love for giving back to the community, Roxana co-leads Women in Tech Dubai, a group focused on creating a safe space and community for women working in the tech field, and is a member of the Murdoch University Dubai Industry Advisory Panel, created to bring critical perspectives on industry employers' expectations in relation to graduate outcomes.

She is a thought leader on the topic of Augmented Reality, from both a technical and business perspective, writing specialized articles and speaking at high-profile international conferences and on podcasts.

 

“I'm excited to step into the role of co-chair for the VRAR Association’s Generative AI committee. This committee stands at the intersection of Generative AI and XR – two domains that, when combined, can dramatically revolutionize our digital interactions, workflows, and experiences.

 By seamlessly integrating advanced natural language processing and generation capabilities, we will witness a new era in which virtual worlds become increasingly engaging, interactive, and personalized.

 I look forward to bringing my expertise to the table, collaborating with industry pioneers, and helping shape the next chapter of XR.”

- Roxana Nagy


Augmented Reality at the stadium - redefining fan engagement with Immersal

Come see Immersal at our Immerse Global Summit during Metacenter Global Week in Orlando Oct 17-19

How do you better engage fans at the stadium? How do you get the audience to spend more time and money at the event? Immersal, a visual positioning company, shared its know-how from AR stadium experiences in the US and Japan: an app for baseball fans at an All-Star event and a community AR experience for a rock band's fans in Japan, both built on Immersal VPS. To learn more, visit the Immersal booth at the IGS, and do not miss Immersal's presentation at the conference, which explores the potential of AI and VPS working together, with a simple demo to make that potential more real!

With Immersal's visual positioning system (VPS) you can create centimeter-accurate, large-scale indoor and outdoor AR experiences. Navigation, entertainment, industrial applications like asset management and maintenance assistance, and AR information systems are just a few of the many applications VPS can power.
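Under the hood, a VPS localization amounts to handing your app the camera's pose in a persistent map frame; placing map-anchored content is then just a rigid-body change of coordinates. A minimal sketch with made-up numbers (not Immersal's actual API):

```python
import numpy as np

# Hypothetical VPS result: camera pose expressed in the map frame.
R_map_cam = np.eye(3)                    # camera axes aligned with the map
t_map_cam = np.array([2.0, 0.0, 1.0])    # camera position in the map (meters)

# Content anchored at a fixed map coordinate (e.g. above a stadium seat)
anchor_map = np.array([5.0, 0.0, 1.0])

# Express the anchor in camera coordinates: p_cam = R^T (p_map - t)
anchor_cam = R_map_cam.T @ (anchor_map - t_map_cam)
```

Once `anchor_cam` is known, rendering the content at that camera-space position keeps it pinned to the same physical spot, frame after frame.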

You can read more about AR stadium experiences powered by Immersal here.

If you want to start testing the Immersal SDK and create your own AR experiences, download the free SDK here.

Campfire is coming to Meta Quest 3!

Campfire for Quest uses passthrough to make technical communications for design reviews and training easier than ever. 

 

With full-color passthrough on Meta Quest 3, you’ll feel like you are interacting with content in front of you and team members around you. Whether you are presenting concepts in a design review or explaining complex assembly procedures, this is the next best thing to being there! 

Not only is the Campfire app easy to learn for first-time Quest users, but it also comes with a virtual assistant named Spark. Spark will guide you through an interactive tutorial, combining humor and how-to instruction to make your experience even more enjoyable! 

Campfire for Quest will be available in the Meta Quest Store on Nov 1 for Quest 3, Quest Pro, and Quest 2. The app is also available for PC, Mac, and iPad.  To learn more, schedule a demo here.

 

How Does Orlando Compare to Other National Tech Hubs?

From tourism to lifestyle, Orlando is a destination for the millions around the globe who travel here each year to experience our world-class entertainment, and for the thousands moving here weekly to make this amazing metro area their home.

Next month, a significant event will be held here for the first time, bringing local businesses together with mega global brands, such as Meta, Amazon, and more.

What’s bringing them here? Immersive technology.

Kyle Morrand is the CEO of the gaming technology company, 302 Interactive.

Growing up in Miami, Kyle found his way to Orlando when he chose to attend UCF. After experiencing life in Orlando, he chose to stay.

Kyle and his team at 302 Interactive are getting ready for MetaCenter Global Week, a three-day event that puts Orlando front and center with immersive technology brands from all over.

We met up with Kyle at Creative Village in downtown to have a conversation about the event, his thoughts on the future of immersive technology and what separates Orlando in comparison to other national tech hubs.

For the full interview click here


source: The Orlando Life

Adam Kornuth Announced as Co-Chair for VRAR Association’s Generative AI Committee

We are delighted to have Adam Kornuth help lead our community for Generative AI. 

Adam brings a wealth of knowledge and experience to his role as Co-Chair of the VRARA Generative AI Committee. He has an extensive track record of collaborating with world-class brands in the retail, media, and technology sectors, including renowned companies such as Coca Cola, AT&T, Toyota, IHG, and TEGNA, along with innovative agencies and emerging technology labs. His multifaceted career has encompassed roles in Strategy, Business Development, Marketing, Account Leadership, and Advisory positions. Kornuth is well-equipped to guide the committee towards innovative solutions that harness the potential of Generative AI for industry leaders, researchers, technology enthusiasts, and organizations worldwide.

“I’m honored to take on the role of Co-Chair for the VRARA Generative AI Committee. Virtual and Augmented reality are on the cusp of a transformative era, and the integration of Generative AI will play a pivotal role in shaping the future of these as well as so many other industries. I look forward to collaborating with experts in the field to drive innovation and foster cross-industry strategic partnerships.”

– Adam Kornuth



Lenovo and F1® Team Up to Virtually Put Fans in the Driver’s Seat with the ThinkReality VRX

Post originally appearing on Lenovo Storyhub by Vishal Shah.

Get Ready Race Fans

Lenovo and Formula 1® are working together to constantly improve the fan experience: from content production to live broadcasting, including the use of cutting-edge technologies like augmented reality (AR) and virtual reality (VR). The Lenovo ThinkReality VRX F1 project is a further step in this direction, allowing F1 fans in the Paddock Club to test themselves on the track.

Using the new ThinkReality VRX all-in-one headset, race fans at the upcoming FORMULA 1 LENOVO JAPANESE GRAND PRIX 2023 in Suzuka and FORMULA 1 LENOVO UNITED STATES GRAND PRIX 2023 in Austin will be able to play an exclusive VR F1 mini-game.

Based on the iconic slot car racing games many of us played as kids, users drive an F1 car around a replica of the Suzuka International Racing Course or the Circuit of the Americas, controlling its speed with the ThinkReality VRX controllers’ buttons. While playing, the user can view the track from different perspectives, as if observing a toy car track placed on a table. The experience is just like traditional electric slot car tracks: drivers can drift around corners, but if they accelerate too much in a curve, the car can leave the track, and even fall off the table!

The goal is to achieve the best lap time: the scoreboard displays the results and can be projected onto an external screen.

Throughout the production of the ThinkReality F1 experience, generative AI was used to design elements of the racecars and track, as well as for voiceovers and programming assistance. 

The Lenovo ThinkReality VRX F1 game will also be available at Lenovo’s Tech World 2023, taking place on October 23-24 in Austin, Texas.

It’s Not Just Fun and Games

The racing game is an opportunity to showcase the power and value of VR experiences and the ThinkReality VRX, Lenovo’s new all-in-one virtual reality (VR) headset engineered for the enterprise.

From employee training and virtual collaboration to 3D design and engineering, XR technologies are becoming more important than ever for businesses and organizations, enabling people to do more, faster, and with less cost.

The ThinkReality VRX offers the market something truly unique, an end-to-end XR solution for the enterprise. It not only includes cutting-edge hardware, but also the software and services to make enterprise XR deployments easier, as well as quicker in achieving ROI.

The ThinkReality ISV ecosystem is tailored to the core use cases that show real results and ROI at scale. Hard skills training to create muscle memory and support employees to learn by doing, and to fail safely. Soft skills training to help workers communicate better, grow their potential, and learn about their organization’s values. Collaboration tools to enhance team meetings, review digital twins, and to hold special events. Spatial computing applications like virtual monitors and AI-supported workflow applications that help expand workspaces and supercharge productivity. Even wellness platforms to help employees reset, and recenter, both physically and mentally.

The ThinkReality VRX is supported by a broad portfolio of professional services. This includes flexible device management with the ThinkReality cloud software platform, and ThinkReality xR Services from consulting and content creation to deployment support. Similar to many enterprise solutions from Lenovo, the ThinkReality VRX is also supported by Lenovo’s Integrated Solution Support (LISS) for around-the-clock global customer service, as well as Device as a Service (DaaS) financing through Lenovo TruScale.

Lenovo believes smarter technology can revolutionize the way people train, work, and communicate. Once again, our partnership with Formula 1 helps us to showcase the capabilities of Lenovo’s technologies, services and solutions on a global stage.

Orlando Mayor Dyer: Metacenter Global Week to showcase Orlando to the world

Originally appearing on orlandonews.com by Marco Santana.

Orlando Mayor Buddy Dyer has watched the city’s economy undergo multiple transformations since his first election in 2003.

Sometimes, it’s a matter of necessity.

As the coronavirus battered Orlando’s tourism industry over the previous three years, he turned his attention to economic diversification.

At the same time, he noticed the tech industry thriving, even as COVID-19 completely hamstrung the city’s 500-plus hotels.

So, among other things, he leaned into the Orlando tech industry.

He famously held his 2022 State of Downtown address in virtual reality in December.

Then, in May, he threw his support in early to announce MetaCenter Global Week during his annual State of the City.

As that weeklong celebration and showcase of Orlando’s tech community approaches – as the major tech event in Orlando next month – Orlando Tech News caught up with him to get his thoughts on the industry and the upcoming event.

What is it about MetaCenter Global Week that has you excited?

It’s a great opportunity to raise the profile of Orlando’s reputation as a tech community. We have a thriving tech ecosystem here with both big companies and small ones but I don’t know that the world is necessarily aware of it. The whole notion of the MetaCenter (Global Week is) having all the people come to Orlando and expose them to what we have to offer.

What could it mean to Orlando’s tech community?

It gives our companies and entities the opportunity to meet with people they might eventually collaborate with, perhaps. It lets the outside world know about the educational and various industries and how well we have parlayed our industry clusters. These include military, simulation and training, Creative Village, Lake Nona and some of these tech-focused businesses at incubators like Starter Studio.

Can you talk about the significance of Innovate Orlando becoming its own thing recently?

The whole notion of having an entity like Innovate Orlando break out of the Orlando Economic Partnership and stand on its own is certainly significant in terms of demonstrating where we stand as a tech community.  We have gained notoriety around the country in terms of what we have to offer here. That’s continuing to get exposure by having a week focused on Orlando’s innovation and tech offerings.

The industry we are known for is actually one of the original tech industries.

If you think about this in terms of our tourism industry, some of the high-tech aspects of the theme parks absolutely go hand-in-hand. These are some of the forebears of these technology innovations. Modeling, simulation and training and the live experiences offered at theme parks go hand-in-hand in terms of the type of people that would have that expertise and it’s transferable between industries.

Global Week is a combo of offerings. What could the future of the event bring?

Combining Synapse and the Immerse Global Summit into one week was a big deal. What we need to do is look around us. I think this will grow. It might be reminiscent of South by Southwest, which didn’t really know what they would become in the early days. I am hoping in 2043 we can say, ‘Gosh, remember what this was like in 2023?’ 

Can you talk a little more about Innovate Orlando’s presence now?

I think it’s a big deal that it happened. It’s not unlike the fact that Visit Orlando was once a part of the chamber a long time ago and then came out to stand on its own. In some sense, this is a similar move. It’s cool to see. Visit Orlando was there to help a growing economy and has since become a huge part of our economy. I believe Innovate Orlando could serve that same purpose.

How big was it that Orlando had a thriving tech community during the pandemic while COVID hammered tourism?

The growth in our tech community continued during the pandemic as if it weren’t a pandemic. We always talk about diversifying the economy and having a segment that can continue to grow and thrive while other pieces are impacted. Having that is important to the overall health of the community.

PIXO VR Announces Strategic Partnership with Verse Foresight as its Middle East Affiliate

PIXO VR is delighted to announce a strategic partnership with Verse Foresight as its Middle East Affiliate.

This groundbreaking affiliation combines PIXO’s expertise in VR Training with Verse Foresight's profound knowledge of learning needs and clients across the Middle East, and extends the outreach of PIXO VR to the region.

"We are excited to partner with Verse Foresight as our Middle East affiliate," said Sean Hurwitz, CEO of Pixo VR. "The Middle East region's growing appetite for innovation aligns seamlessly with our mission to make work safer and more enriching through VR Training. Together with Verse Foresight, we aim to offer unmatched enterprise-grade immersive learning solutions to businesses in the Middle East.”

Mostafa Nassef, CEO of Verse Foresight, expressed equal enthusiasm: "Our partnership with PIXO VR as their affiliate in the Middle East is a significant step towards enhancing immersive learning experiences in the region. Leveraging PIXO’s exceptional VR content creation expertise and enterprise-grade platform, we are poised to empower businesses in the Middle East to engage their audiences in profoundly impactful ways."

As PIXO VR’s Middle East affiliate, Verse Foresight will focus on providing PIXO’s solutions to a diverse range of industries, including energy, construction, education, healthcare, and more. Clients in the Middle East can anticipate transformative immersive learning experiences that transcend cultural and geographical boundaries.

Stay tuned for updates as PIXO VR and Verse Foresight embark on this transformative journey together, dedicated to bringing the power of VR to the Middle East through their partnership.

Rooom receives significant investment of 17 million euros!

From rooom.com:

We are absolutely thrilled as this exciting development marks an important milestone and will undoubtedly strengthen our ability to continue to provide innovative solutions for our customers.

 

The investment received comes from the Munich-based financial investor Marondo Capital, TGFS Technologiegründerfonds Sachsen, as well as our long-time supporters bm|t beteiligungsmanagement thüringen GmbH and other existing investors.

We are grateful that they have our back to keep pushing our ideas and innovations. This is how we realize our mission to create groundbreaking 3D visualizations and metaverse solutions. Trust us when we say we have big plans and are eager to create even more unique solutions for you and your customers.


SynergyXR Announces the Launch of Version 2.5: A Game-Changer in Immersive Training & Collaboration

Aarhus, Denmark – 18 September 2023 - SynergyXR, a pioneering leader in the augmented and virtual reality sector for business, is thrilled to announce the release of SynergyXR 2.5. This latest version is set to redefine the standards of virtual collaboration and training, making extended realities like VR and AR more accessible and user-centric for businesses. 

What's New in SynergyXR 2.5? 

  • LMS-Integration: A groundbreaking feature that revolutionizes training management. Users can now seamlessly export VR content in the coveted SCORM 1.2 format, ensuring effortless integration with leading Learning Management Systems. 

  • iOS Quick-AR: Designed for on-the-go professionals, this feature eliminates the need for 3D scanning first, allowing users to dive straight into augmented reality. 

  • QR-Code Support: Enhancing user experience, this feature ensures swift access to Spaces. A simple scan is all it takes to dive into a desired virtual space. 

  • Mac Support: In a move towards inclusivity, SynergyXR 2.5 now offers full support for Mac platforms, ensuring a seamless immersive experience for all users. 

 

Sune Wolff, CTO and co-founder of SynergyXR, commented on the release, "SynergyXR 2.5 isn't just another update; it's a leap. By incorporating invaluable feedback and industry insights, we've crafted an XR experience that's not only more powerful and versatile but also centered around the user." 

 

About SynergyXR: 

Based in Aarhus, Denmark, SynergyXR is at the forefront of making augmented and virtual reality tools accessible for modern businesses. With a strong foundation in the manufacturing and energy sectors, SynergyXR understands the challenges that contemporary businesses face. Their commitment to people-first solutions ensures that extended realities like VR and AR become everyday tools for businesses. 

 

For more information about SynergyXR 2.5 and its groundbreaking features, visit SynergyXR 2.5 Release

 

Press Contact: 

Andy Grantham 

Head of Marketing, SynergyXR 

Email: ag@synergyxr.com 

Phone: +45 31239681 

Meet Obsess in Orlando, Oct 17-19 at Metacenter Global Week! Launched over 250 virtual stores for Ralph Lauren, Charlotte Tilbury, J.Crew, Revlon, Maybelline, J&J, NBC Universal, Mattel

Obsess is a Sponsor and Exhibitor at our Immerse Global Summit at Metacenter Global Week on Oct 17-19 in Orlando, the metacenter for the building of the metaverse.


Obsess is the leading experiential e-commerce platform that enables brands and retailers to create immersive, branded, discovery-driven virtual stores on their websites. The mission of the company is to create the next-generation online shopping interface that transforms the traditional e-commerce thumbnail grid into a 3D, interactive, social and highly engaging experience. Obsess has launched over 250 virtual stores and experiences for brands such as Ralph Lauren, Charlotte Tilbury, J.Crew, Revlon, Maybelline, Johnson & Johnson, NBC Universal, Mattel, and more—driving consumer engagement, brand loyalty and conversion.


The Fate of Apple's Vision Pro: Part II

Today, we’re featuring Part II of Evan Helda’s series, ‘The Fate of Apple’s Vision Pro’.


Evan Helda is the Principal Specialist for Spatial Computing at AWS, where he does business development and strategy for all things immersive tech: real-time 3D, AR, and VR. Evan also writes a newsletter called Medium Energy, where he explores the impact of exponential technology on the human experience. 

If you've not read Part I, we suggest doing so for complete context & appreciation. Here’s the link. 

In Part I Evan reflects on his time at the original Meta, an AR startup that was building an Apple Vision Pro competitor, albeit 5 years prior. 

Here in Part II, Evan dives into the critiques from Vision Pro skeptics and dissects why they might be wrong.

If you enjoy this piece, we encourage you to check out more of his content over at MediumEnergy.io! You can also follow Evan on X/Twitter: @EvanHelda

----

Have you ever cried in a business setting?

I have.

It was the first time I ever experienced really good AR. And I mean... really, really good. Like, take-your-breath-away, blow-your-mind good.

Needless to say, they were tears of joy.

The sad part? That experience was 5 years ago.

To this day, I still haven't seen anything that comes close. And I've tried it all: just about every headset and every top application.

Of course, this experience was just a demo. An absolute Frankenstein of a demo at that. To create something this good, we had to duct tape together the best-in-class components of the AR tech stack. It was kludgy as hell, but it accomplished our goal: to showcase the art-of-the-possible if we could get everything right.

We used the best display system (with the largest field-of-view and highest resolution, aka: the Meta 2); the best hand tracking (a Leap Motion sensor); the best positional tracking (a Vive lighthouse rig, with a Vive controller hot-glued to the top of the headset to track head movement... yeah... like I said, kludgy...); an intuitive interface and custom software via Unity that allowed you to move between 2D creation (using a Dell Canvas to draw a shoe) and 3D consumption (software to 'pull' the 2D shoe out into the world as a 3D object for a design review).

Meta 2 AR headset

Leap Motion hand tracking sensor

Vive Lighthouse Tracking Gear

Dell Canvas for design


But it wasn't just these tech components. We also had some of the world's most talented developers, 3D artists, and UI/UX designers build that demo. And that's the other thing we've been missing as an industry: the world's brightest minds. They just haven't entered this space yet en masse because they know... AR/VR isn’t yet worthy of their talent.

We made this demo for Nike in partnership with Dell (who was reselling our headset). It was a re-creation of the CGI from this two-minute concept video: with the exact same 3D assets, same workflow, and same UI/UX.

Nike designer using Meta 2

Virtual prototype in AR

This Nike video still drives me to this day.

When I put that duct-taped contraption on my head, the kludginess disappeared. I was captivated by the future. Or rather, my childhood fantasy. Like a wizard at Hogwarts, I was using my voice to summon different design iterations. I was grabbing orbs out of the air, each one representing a different color or texture. These orbs could be dragged with your fingers and dropped onto the 3D model, changing the aesthetic by tossing invisible objects onto other invisible objects.

POV shot of the demo/video

Except they weren't invisible. Not to me. You could see every detail: the stitching, the fabric, the glow of the materials.

I could explain it in more detail, but just watch the video and then... imagine.

Imagine this type of workflow and collaboration, between both humans and AI, allowing us to move from imagination to reality in the blink of an eye.

The video's script poetically says it all:

"It starts with a question, followed by an idea. On how to make things simpler.

Better.

Or more beautiful.

But it’s not just about what it looks like. It’s how it works.

Which means trying... and failing... and trying again.

To be a designer (or creator) means not being bound by the limits of your tools. But instead, being inspired by them; so that you can focus on what only YOU can do; being creative, being curious, and being critical; exploring the union between function and form, until suddenly...

You know.

And when you're ready to share your work, make sure everyone can see... that the world is a little simpler, better, and more beautiful."

I've seen this video over a hundred times. But those words never fail to stir my soul. And with the advent of generative AI, combined with the promise of spatial computing, they’re more poignant than ever.

People sometimes ask me... when will this tech be here? When will we know it’s arrived?

I answer by showing the Nike video. When that experience exists, in the form of a real product, with a real application, in a real production setting... that's when we've arrived.

Now, I've yet to try the Apple Vision Pro (AVP). But that's why I'm so excited, and why I think you, dear reader, should be as well. Because the AVP seems to be the industry's FIRST device that will yield an experience of such magical magnitude.

Something that suspends disbelief and blows your mind. Something that compels the world’s top talent to experiment and re-invent human computer interaction.

Apple Vision Pro

But, as I mentioned in Part I, not everyone has the same level of optimism about AVP.

Immediately after the announcement, the Luddites grabbed their pitchforks and the skeptics had a field day.

Listen, I get it... the notion of being inside the computer is strange and some of Apple's portrayals had Black Mirror vibes (which we'll address later). Such objections are nothing new. Every major tech epoch faced similar doubt in droves.

But when you put the primary objections under a microscope, they just don't hold up. Especially over the fullness of time.

As I also said in Part I, I'm obviously biased. But I've done the work to produce an objective lens. And upon analyzing the major objections, I remain convinced: most haters/pundits are woefully wrong, lacking the right perspective, foresight, and an understanding/appreciation for the nuances of Apple's strategy, timing, and approach.

That's not meant to be a knock. Most pundits’ perspective is just limited, lacking exposure to the tech, the impactful use cases, and the problems they'll address.

In my opinion, everything Apple is doing makes perfect sense: the form factor, the timing, the positioning, the use cases; all of it has been meticulously ruminated, debated, and patiently executed upon.

So before explaining why, let's do a quick recap of the most common objections/concerns:

  1. It's too expensive! (Price)

  2. I don't want to wear that thing on my face (UI/UX)

  3. What’s the point? What is this good for? (Use Cases)

  4. This is going to ruin humanity! (Societal Impact)

While not an exhaustive list, I view these as the ‘big rocks’ in the proverbial jar of eventual truth (aka: Evan’s optimistic opinions). Let’s dive in.

It's too expensive! (Price)

News flash: the Apple Vision Pro is not going to be a commercial success. Everyone knows this. Including Apple. Regardless, Wall Street is going to be disappointed, the critics will say I told you so, and they will all be missing the forest for the trees.

Making money isn't Apple's goal. Nor is it their metric of success.

Their goal is twofold. First, to attack the most challenging barrier: consumer behavior and imagination. Second, to get into the market, learn, and iterate; all in the wake of consumer inspiration and rising sentiment/demand due to a premium/mind blowing UI/UX.

Apple's strategy can be summarized by a tweet Palmer Luckey wrote in 2016: 

“Before VR can become something that everyone can afford, it must become something that everyone wants”.

Towards that end... Apple had a choice. They could have waited until the price point was perfect, along with the form factor, the battery power, etc. But are these things their biggest challenge?

No. These things: price, battery, weight, size, etc... they're all bound to be solved by the natural progress of technology. You'd rather be deficient on these vectors, as they will naturally take care of themselves.

Where you can't afford to be deficient is usability, utility, and delight. In other words, it's better to go high end and be super compelling, than low end as another 'me too' device competing in a red ocean, doomed to gather dust (just like every other affordable device). That device is just not worth making, and as Luckey alluded, it won't make AR/VR something everyone wants.

Thus, Apple chose to reach 5 years into the future, spare no expense, and pull next-gen technology into the present. Hence, $3,499.

There's also a simpler argument: the price is irrelevant.

Apple is targeting very early adopters: bougie prosumers and power users with very low price sensitivity. These are folks who would pay $5,000 - $10,000 for the AVP. They just want the latest/greatest.

That said, even at this price point, the complaints remain overblown. Especially on a relative basis (both historically and currently).

Case in point is Apple's origins: the Lisa, one of the world's first personal computers with a graphical user interface.

Like the AVP today, the Lisa was the first time most consumers saw innovations like the modern GUI and the mouse; innovations that would shape the future of computing. Innovations that also begged similar questions… why does the average home need this? At the time, most consumers had no idea.

As a result, the Lisa was an abject commercial failure. But it paved the way for Apple's success with the Macintosh. It also awakened the world to the potential of personal computing.

The Lisa cost $10k in 1983, about $29,400 in today's prices. Not to mention, the Macintosh, Apple's most iconic breakthrough, was $2,495 in 1984. That's roughly $7,000 in today's prices…

From a more local & relative perspective: the Magic Leap 2 AR headset is $3,299; the Microsoft HoloLens 2 is $3,500. The Varjo, the most direct comparison as a mixed reality passthrough device... it's $7,100!

The AVP is right in the ballpark at $3,499, and vastly superior on just about every dimension.

It's also worth considering what the AVP strives to replace: powerful workstations, laptops, and high-end displays. People in their target market spend $2k - $5k on nice workstations/laptops, and up to $2k - $3k on high end displays; all without blinking an eye.

The AVP can replace these products, and then do SO much more...

And so, I rhetorically ask... is the AVP really that expensive?

I don't want to wear that thing on my face (UI/UX)

My response will seem trite, but I think it will prove true.

This is a classic case of "don't knock it till you try it."

I know, I know. I haven't even tried the AVP myself. But I've spoken with people who have: from grounded analysts to XR skeptics. They’ve all had a similar response, falling somewhere along the lines of...

  • “Holy shit”

  • "I felt like I had super powers"

  • "It was remarkable and exceeded my wildest expectations"

I'll have to circle back on this after I try it, but here’s my bet: the user experience is going to be so compelling that it trumps the awkwardness/friction of wearing something on your face.

At least in the contexts they've optimized for: productivity & visualization.

The input modality is said to be the most mesmerizing part, i.e., eye and hand tracking in lieu of a mouse, keyboard, or screen taps.

With the AVP, you just follow your instincts, using your eyes and subtle hand motions to control virtual objects, as if they are actually in the real world. It feels like you have magical powers and it just works. Your intuition is the controller.


Sure, the jury remains out until it ships. But this seems like the first AR/VR product that is just... buttery.

What the hell do I mean by buttery?

It comes from Nick Grossman’s 'butter thesis' (Nick is a partner at Union Square Ventures). The thesis describes product interactions & experiences that just absolutely nail it. What 'it' is exactly is hard to describe… but you know it when you see it. It's just frictionless: intuitive, smooth, and delightful.

AR/VR today is cool and novel. But I don’t think anyone would call it buttery. It’s plagued with all kinds of UI/UX paper cuts that make it very hard to do real work or consume content for hours on end.

As much as I love AR/VR, I'm still painfully aware of the brick on my face, constantly sliding off, noticeably heavy, hot, with non-intuitive controllers, etc.

Now, I'm sure the AVP will have its rough edges. All V1.0 products do. But fortunately... this is Apple we're talking about.

Unlike other players in this space, most people will give Apple the benefit of the doubt.

More so than perhaps any other company, Apple knows how to make things desirable. Which is a key pillar of their strategy: social engineering. They’re going to make this thing cool and they have a plan to do so. One example is hyper personalization.

Apple is going to ensure your AVP fits like a glove, while also offering the opportunity for self-expression and style via custom aesthetics.

At first, you’ll have to make an appointment at an Apple retail store to buy an AVP. They're carving out entire sections of the stores for headset demos and sizing, allowing associates to select and customize the right accessories for the buyer. This will ensure a snug-fitting headband (of which there will be many styles), the perfect light seal (informed by a facial scan), and the right prescription lenses.


Considering the amount of inventory, the number of variations, the in-store logistics, the demos, etc… this will be the most complex retail rollout in Apple's history. A feat few beyond Apple could pull off, and a compelling storyline to monitor through 2024.

What’s the point? What is this good for? (Use Cases)

Steve Jobs famously said, "You've got to start with the customer experience and work backwards to the technology. You can't start with technology and then try to figure out where to sell it."

Many people think the AVP flies in the face of this wisdom. They think this is fancy tech looking for a problem. I think the AVP strategy falls somewhere in the middle, largely because it has to; the form factor and UI/UX are just too new for anyone to have all the answers.

In Part I, I said the following: "In hindsight, it's easy to say the iPhone's impact was obvious at launch. But was it? Sure, it launched with what became killer apps: calls, email/messaging, browsing, and music. But, similar to the VisionPro's focus use cases, these things weren't entirely new. It was things we were already doing, just better on multiple vectors."

Similar to the iPhone strategy, Apple is starting with a simple and practical use case: screen replacement, aka: doing things some people are already doing, just better.

The use cases for 'infinite display' will be compelling for a lot of people: remote workers, digital nomads, software programmers, finance traders, data analysts, 3D artists/designers, gamers, movie buffs, the list goes on.

Virtual Displays

The total addressable market for these folks alone is in the tens of millions, if not hundreds of millions. Upon realizing they can become Tom Cruise from Minority Report, these people are going to line up in droves to buy the AVP.

Now, this use case doesn’t come without its haters, garnering comments along the lines of "ugh, but it’s so isolating." But this response feels silly to me. Isolation is the point. Many of these customers work remotely from home, alone, and often on the road. What they do requires 'deep work' that is inherently isolating. If anything, the AVP is a device that could help close us off from endless distraction & interruption, allowing us to more easily tap into states of flow & ideal work conditions.

But if 'isolation' is your concern, know that collaboration will be the killer feature of spatial computing’s killer apps.

To be sure, it was odd that Apple barely showed 'multiplayer' use cases in their keynote. Quite odd. Collaboration is where the true magic happens. Particularly in AR, when participants have a completely shared context across both the physical and digital realms, bonded over a shared hallucination.

These 'shared hallucinations' are going to be most impactful within work settings.

Whether Apple likes it or not (because they don’t really care about enterprise), the enterprise will be their biggest/best opportunity short term, i.e., corporations buying the device for use cases like training, design, and sales & marketing.

Across these use cases today, even the world's most advanced companies are stuck in the 90s.

CAD designers create 3D things with 3D design tools, but go right back to 2D pictures in PowerPoint when it comes time to share/present.

Training departments use laughable videos, PowerPoints, and text-filled PDFs with static pictures to explain complex, and sometimes dangerous, procedures. And they wonder why they can’t recruit, inspire, and retain digitally native, 'experience craving' millennials and Gen Z…

Sales reps all too often take a similar approach. Their customers are better off just reading the same content online or getting a product analysis from ChatGPT.

Across all of these examples, people are fundamentally trying to transfer knowledge by conveying an 'experience' in a woefully non-experiential way. This ‘knowledge transfer’ problem becomes increasingly acute in the face of an aging workforce, worker displacement (as AI eats more jobs), and within an era of customization/personalization in product design & sales.

This is why I prefer to call spatial computing, 'experiential computing'.


Within endless scenarios (be it work, education, or play) the goal is to capture, understand, or convey an 'experience' of some kind: what it's like to wear a shoe, what it's like to navigate a factory floor, what it's like to sit behind the wheel of a car.

We can try to use a bevy of words, images, and videos to spark imagination. And maybe imagination will get you 10-50% of the way there.

But what if we could turn imagination into reality? What if we can directly experience the thing itself, in its entirety? What if we could transfer knowledge at 80-100% levels of fidelity, without information loss?

Speaking of ‘direct experience’… Apple’s other focus use case might also be enough to sell out the AVP in year 1, and that’s immersive sports & live entertainment (e.g. any kind of live show/performance, music, plays, comedy acts, etc.)

They’re investing heavily in this area, with their own camera hardware for 360/volumetric capture, their own file format for this media type, and their own streaming platform via the acquisition of NextVR.

NextVR 360 camera


I thought NextVR was by far the most compelling consumer VR app to date. It was also a major driver of Oculus Quest sales, putting users courtside at NBA games, on the sidelines of NFL games, or in the front row of a Taylor Swift concert (don’t judge).

These ‘real-world’ tickets cost anywhere from $1,000 to $10,000. Taylor rocks, and so does live sports, but I’m not paying that. Neither are 9 out of 10 people.

If you told me I could be courtside, alongside friends from around the world, week in, week out, for $3,499 and a small monthly subscription? I’m all over it. And I think many, many other people will feel the same way.

So as far as use cases are concerned, ‘infinite display’, collaboration + knowledge transfer, and live sports/entertainment alone will be enough to drive demand and establish product market fit. 

But this is the tip of the iceberg. Just like no one predicted the App Store, and the ensuing explosion of new apps, we can’t predict all the innovation that’s brewing amidst the long tail of Apple developers who are already diving into the AVP SDK & developer docs.

Out of the millions of apps in the App Store, the teams behind a healthy chunk are brainstorming as we speak about what their apps could look and feel like in a spatial world. And I can’t wait to see the results…

This is going to ruin humanity! (Societal Impact)

I beg to differ.

To the contrary, spatial/experiential computing just might be a key ingredient to humanity’s salvation, especially with the advent of AI.

There’s a variety of philosophical and practical reasons why. I’ll just hit my two favorites.

Philosophically, consider all the complex and daunting problems we face in the world. Most of them lack answers, and in our search for solutions, it’s hard to say where to start.

But one place that is hard to refute, and that will certainly help us find the right answers/solutions, is better communication & collaboration; between employees, executives, & scientists. Between countries, companies, and local governments. Between political groups, their leaders, and their polarized constituents.

Poor communication & collaboration sits at the heart of all our issues, causing a lack of empathy, understanding, and ultimately, poor decision making, low alignment, and very little progress.

To illustrate the power of spatial computing for communication & collaboration, I fall back to a section from my essay, ‘How to Defend the Metaverse’.

It quotes one of the cyberspace/metaverse OG's: Terence McKenna.

McKenna says, "Imagine if we could see what people actually meant when they spoke. It would be a form of telepathy. What kind of impact would this have on the world?"

McKenna goes on to describe language in a simple but eye-opening way, reflecting on how primitive language really is.

He says, "Language today is just small mouth noises, moving through space. Mere acoustical signals that require the consulting of a learned dictionary. This is not a very wideband form of communication. But with virtual/augmented realities, we'll have a true mirror of the mind. A form of telepathy that could dissolve boundaries, disagreement, conflict, and a lack of empathy in the world."

This form of ‘telepathy’… i.e. a higher bandwidth, more visual form of communication, i.e. the ability to more directly see or experience an idea, an action, a potential future… this will not just benefit human-to-human communication, but also human-to-machine. Which brings us to my practical response du jour.

Practically, we need to consider how humans evolve and keep up in the age of AI.

We’re briskly moving from the age of information to the age of intelligence. But intelligence for whom?

Machines are inhaling all of human knowledge. As a result, every person and every company will have the ultimate companion; capable of producing all the answers, all the options, and all the insights…

How do we compete and remain relevant? Or perhaps better said… How do we become a valuable companion to AI in return?

Just like machines leveled up via transformers and neural nets, we too need better ways to consume, analyze, and ‘experience’ information. Especially the information AIs produce, which will come in droves and a myriad of formats.

AI is going to produce answers, insights, and truth for all kinds of things: new ideas, products, stories, moments in time, scenarios, plans, and my personal favorite; all things that remain abstract and unseen by most; space, stars, planets, the deep sea, the deep forest, the inner workings of the human body & mind, the list goes on.

AI is going to reveal things previously mysterious, complex, and otherwise impossible to fully grasp.

As it does so… how can AI best communicate its findings back to humans? And how can we fully grok, parse through, and become fully empowered to act?

More often than not, our answer back to the AI is going to be, “don’t tell me, dammit, show me”.

Spatial computing will be the ultimate tool for helping AIs ‘show’, and helping humans ‘know’, ushering in an age of ‘experience’ in tandem with the age of ‘intelligence’.

As a result, humans will be empowered to remain in the loop as the final decision makers, able to add the human touch and tweak the final output in a way that only humans know how: through feeling, intuition, and empathy, a la the essay ‘How to find solace in the age of AI: don’t think, feel’.

Tech vs. Tech

In closing, there is one more common concern within this realm that I admittedly don’t have the best answer to. At least not yet. And that is… once we’re ‘in the loop’ with AI, and spending more time ‘in the machine’ with spatial computing… how do we retain the best parts of humanity that are obviously negatively impacted by technology?

Things like our attention and mental health, or our physical movement, social skills, and time in nature.

My prediction is that we’re going to get increasingly good at using tech to combat tech.

Meaning… there are apps and tools that we can build to shape our relationship with tech, negate its afflictions, and build better habits & social connections.

Apple is already doing this today, and I thought it was one of the more compelling parts of the WWDC presentation. They showcased apps for journaling to aid with emotional awareness. Meditation for mindfulness. Fitness & outdoor hobbies of all types, with unique ways to measure, gamify, and socialize/connect with others, boosting motivation & consistency along the way. 

I think this trend is going to accelerate over the coming years. It’s already a cottage industry, with startups such as TrippVR for meditation & mental health, and FitXR for VR fitness.

The AVP’s arrival is going to enhance and legitimize these use cases, and over time, shift people’s relationship with technology while reducing the afflictions born of abstracted, ‘flat computing’. Or the afflictions born of boxes tethered to walls and TVs (aka an Xbox or PS5). These current form factors keep kids stuck inside, isolated, and socially inept.

In contrast… AR, in its ultimate form, will free kids from the confines of a screen and a living room with an outlet, thrusting them back into nature, back into face-to-face contact, and back into a world longed for by prior generations. A world of scratched knees from a treasure hunt in the park, of youthful pride from a fort forged in the woods, or of confidence from winning an argument while playing make-believe in the backyard.

Except this time, the treasure becomes real, the forts become labyrinths, and the figments of make-believe become not so make-believe…

Thanks for taking the time to read Evan's essay. Let us know what you think about this perspective. And if you enjoyed this piece, don’t forget to check out more of his essays and subscribe over at MediumEnergy.io. Here are some of our personal favorites:

- How to defend the metaverse

- Finding solace in the age of AI

- The Ultimate Promise of the Metaverse


SpectreXR Joins the VRARA

The hand-tracking firm joined the VR/AR Association (VRARA) in a bid to push the boundaries of XR technology and scale the delivery of its hand-tracking solutions.

The firm noted in a social media post that joining the VRARA “aligns perfectly” with the firm’s mission to contribute to the “growth and development of the XR industry,” with a focus on innovating how XR users interact with digital objects and environments.

As the XR space is growing at a rapid pace, it's important for us to be a part of amazing XR communities! We are thrilled to announce that SpectreXR is now a proud member of the VR/AR Association (VRARA)!  At SpectreXR, we're committed to pushing the boundaries of XR technology and delivering innovative solutions. Joining the VRARA aligns perfectly with our mission to contribute to the growth and development of the XR industry putting our focus on how we interact with digital objects and environments. As a member, we look forward to collaborating with fellow industry leaders and experts, sharing insights, and shaping the future of XR. Together, we aim to drive innovation, foster new partnerships, and create exceptional experiences for users across various sectors. Stay tuned for more updates as we embark on this exciting journey as a VR/AR Association member! 

The news follows a partnership between SpectreXR and HTC Vive earlier this year to drive innovation, immersion, and realistic interaction. SpectreXR is now part of HTC VIVE’s Developer Partner Program to promote “synergy” between the firms and improve the usability of HTC VIVE devices.

The move also sees OctoXR, SpectreXR’s hand-tracking SDK, benefit from the HTC VIVE partnership, as the deal provides the headset vendor with OctoXR hand tracking for its software development kit (SDK), further spreading OctoXR’s capabilities to more headsets alongside the recently adopted Pico portfolio.

SpectreXR CEO on the Future of Input

In a recent roundtable with XR Today, Ivan Rajković noted that his firm dedicated over 17,000 hours of R&D to hand-tracking technology, aiming “to provide the most realistic and intuitive hand interactions inside VR and AR environments.”

Rajković also said that hand-tracking technology can bring value to various industries by providing a “natural and intuitive way of interacting with digital interfaces,” which can lead to improved productivity, accuracy, and efficiency for workers.

The CEO also added:

Tracking solutions have become an essential component of modern digital interfaces, providing a more engaging and intuitive experience for end-users. Enterprise adopters can benefit greatly from incorporating tracking solutions into their products, creating a user-friendly and inclusive experience that enhances the overall value proposition of their offerings.

Moreover, the CEO noted that a “big change is happening on the industry side,” with Rajković explaining how many companies are “recognizing the importance of hand-tracking technology.”

Rajković noted:

With the continued advancements in hardware, hand tracking could become, and we believe that it will become the default way how we are interacting in virtual and augmented reality.

The CEO continued, explaining how SpectreXR is “excited” about the future of hand tracking and how the technology will affect AR/VR/MR applications: “we will continue to invest in research and development to ensure that our technology remains at the forefront of this rapidly evolving field,” Rajković remarked.

Lenovo announces new 3D display for immersive visualization coming February 2024

Post originally appearing on auganix.com by Sam Sprigg.

Lenovo has recently unveiled its new ThinkVision 27 3D Monitor, along with other technology, software and accessories that the company stated are all designed to help boost the capabilities of remote and hybrid workforces, as well as address challenges faced by businesses as they digitize operations across departments.

Aimed at professional content creators who require immersive 3D visualization, the new ThinkVision 27 3D Monitor is a 27-inch, glasses-free, 2D/3D compatible monitor that offers seamless 3D effects and real-time eye-tracking for more immersive creation, connection, and collaboration.

Lenovo noted that as work and personal lives move more into the digital sphere, with it comes a growing need among content creators and professionals for more lifelike remote collaborations and a more streamlined process for creating 3D content, from 3D graphic design and 3D games to 3D videos. As a result, the company is introducing the ThinkVision 27 3D monitor.

“In an increasingly dynamic hybrid and remote work landscape, the notion of ‘one-size-fits-all’ tech solutions is no longer viable,” said Johnson Jia, Senior Vice President of Intelligent Devices Group’s Global Innovation Center at Lenovo. “The new devices and software that we’re unveiling today are integral elements of a unified ecosystem that harnesses potent processing power, immersive visualization, and adaptive software. Together they open up the possibilities of what tech can do for better work and a better team experience.”

The screen merges immersive, life-like experiences with ease of use. With built-in real-time eye tracking for fluid motion, the device’s switchable lenticular lens offers a more natural 3D viewing experience, providing users with a consistent experience across both 2D and 3D modes, according to Lenovo.

The 3D monitor projects two independent images to the user’s eyes, so that each eye sees the subject from a slightly different angle, which Lenovo stated ultimately delivers stereo vision and depth perception in a natural and efficient stereoscopic visualization. The ThinkVision 27 3D Monitor also comes with built-in speakers and connectivity options, including USB-C docking and modular camera support.
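What Lenovo describes is classic binocular stereo: each eye is shown the scene from a slightly offset viewpoint, and the horizontal offset (disparity) between the two projected images is what the brain reads as depth. A minimal sketch of that geometry, assuming a simple pinhole projection and typical constants (this is illustrative only, not Lenovo's implementation):

```python
# Illustrative sketch: how a stereoscopic display encodes depth by
# projecting a 3D point separately for each eye onto the screen plane.
# Constants are assumptions (typical adult IPD, desk viewing distance).
IPD = 0.063        # interpupillary distance, meters
SCREEN_Z = 0.6     # distance from the eyes to the screen plane, meters

def project(eye_x, px, pz):
    """Project a point at (px, pz) onto the screen plane for an eye at eye_x."""
    return eye_x + (px - eye_x) * SCREEN_Z / pz

def disparity(px, pz):
    """Horizontal offset between the left- and right-eye images of a point."""
    left = project(-IPD / 2, px, pz)
    right = project(+IPD / 2, px, pz)
    return left - right

# A point lying on the screen plane has zero disparity (appears flat)...
print(disparity(0.0, SCREEN_Z))  # 0.0
# ...while a point behind the screen plane separates the two images,
# which is the cue the viewer perceives as depth.
print(disparity(0.0, 2 * SCREEN_Z))
```

The lenticular lens then steers each of the two rendered images toward the correct eye, with eye tracking updating the steering as the viewer moves, so no glasses are needed.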

Lenovo highlighted that with its 3D Explorer software, the new 3D monitor forms a comprehensive ecosystem for 3D creation and consumption, and offers a platform for users to access all their 3D apps. The 3D Explorer platform arms creators with useful tools such as a 3D player for viewing videos and files with 3D effects, support for design and productivity software, and an SDK for developers to build 3D applications.

The company added that new features will continue to upgrade over time to enhance and extend the 3D experience, which Lenovo hopes will push the boundaries of 3D digital interaction while “providing a gateway into a vibrant and expanding 3D ecosystem.”

Lenovo ThinkVision 27 3D Monitor pricing and availability

The Lenovo ThinkVision 27 3D Monitor will start at EUR €2,999 (VAT included) and is expected to be available in select markets starting February 2024, according to Lenovo. For more information on the new monitor, as well as the other technology solutions that Lenovo has recently announced, click here.

How Hotelschool The Hague uses VR to improve hospitality education

We are proud to feature VRARA Member Warp VR's latest success story!

Customer intro

Hotelschool The Hague was founded and funded in 1929 by the hospitality industry to create a hub where students can develop, conduct research and share their hospitality knowledge and skills in a realistic setting and hone their leadership talent for a successful career in the industry. Since then, it has become one of the top 10 hospitality management schools worldwide.

Hotelschool The Hague has several programmes on offer, including a four-year bachelor of arts in hospitality management, a fast-track bachelor programme, and master’s programmes in international hospitality management and hotel transformation.

Challenge

Higher education faces many financial, political, social, and technological challenges, and increasing competition. To attract and engage students, the education sector has a long history of adopting emerging technologies to supplement traditional pedagogical methods. From smartboards to laptops and even the internet itself, there have been many examples of technologies that have profoundly altered the way educators and students teach and learn.

Virtual reality helps to transform education by delivering meaningful teaching and learning experiences that enhance engagement and retention, and promote inclusivity.

Virtual reality lets students learn through experience. With VR, learning goes beyond textbooks and lectures, and students become active participants in their own education. By providing learners with engaging, memorable and impactful experiences, it helps them learn more effectively, build important skills such as empathy and collaboration, and retain what they learned long after they leave the classroom.

As an educational technology, VR using 360° video is great for placing learners in a different environment outside of the classroom allowing learners to experience a context and visualize concepts and situations in an immersive way. Perhaps most importantly, 360° video allows the learner to experience a situation or environment in the first person, allowing for delivery of emotion and encouraging agency as well as personal, real and active learning.

This medium is ideal for transferring emotion and allowing a learner to feel the emotion in a scenario as well as develop empathy by taking the perspective of someone different to themselves. 360° video allows for the learner to be placed in a new simulated environment that may be inaccessible, unsafe or expensive to experience in real-life, allowing for contextual learning and visualization of concepts and context.

As a result it is a natural fit for safety training as well as soft skills training such as leadership and guest relations to experience difficult situations and build emotional intelligence through exposure. 360° video can also be used effectively for developing customer experience prototypes of new environments as well as visualizing standard operating procedures. Furthermore, 360° video can be used to create immersive case studies that position the learner within a first-person perspective and present them with a problem to be resolved.

In hospitality, VR can enhance event planning, training, customer experience prototyping and marketing. When using 360° video, VR can recreate induction experiences enabling familiarity with the work environment, allowing new employees to practice standard operating procedures prior to their first day. For educators and creators of educational content, 360° video is relatively easy to learn and a nice entry point into VR and immersive technology.

Solution

Hotelschool The Hague works together with other international hospitality universities to prepare students and professionals for a changing profession through immersive real-life learning experiences.

In one of the first projects, as a result of the pandemic, the Future of Work minor experimented with the use of a 360° VR training scenario on dealing with difficult conversations. By making decisions to deal with a challenging team member, one can build leadership skills through trial and error in a safe virtual environment and gain leadership experience in a VR world. This can inform future situations one may find oneself in as a leader within the future of work.

To get up and running quickly, they sourced the scenario from an external content provider and provided Google Cardboard fold-out viewers to students to play it on their own mobile phones. Learners described the experience as being stimulating and impactful, with many feeling like they were actually present in the room.

In a subsequent project, students could create their own scenarios around unconscious bias and diversity and inclusion. In a course on change management, learners get to experience resistance to change in the first person. Within the context of a fictional coffee company, learners are placed in the role of a change consultant who is meeting with a difficult branch manager. The company aims to enhance customer experience through training for frontline staff, which conflicts with the manager’s profit motive resulting in continuous resistance. Learners face the challenge of taking decisions to influence the branch manager to get onboard with the change.

The school selected Warp VR to create and distribute realistic, 360° video-based experiences that support story branching and are easy to play on both VR headsets and smartphones. After the experience, the process of reflection through experiential learning was facilitated to make the link to change management theory, while building competencies in soft skills such as decision making, critical thinking and perspective taking.
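Story branching of this kind is, at its core, a decision graph: each 360° clip is a node, and each learner choice is an edge to the next clip. A minimal sketch of that structure (node names and layout are hypothetical for illustration, not Warp VR's actual data model):

```python
# Hypothetical branched scenario: a difficult conversation with a guest.
# Each node holds a 360° clip and the choices leading to the next node.
SCENARIO = {
    "greeting": {
        "video": "lobby_complaint_360.mp4",
        "choices": {"apologize": "recovery", "argue": "escalation"},
    },
    "recovery": {"video": "guest_calms_down_360.mp4", "choices": {}},
    "escalation": {"video": "guest_leaves_angry_360.mp4", "choices": {}},
}

def play(scenario, start, decisions):
    """Walk the graph from `start`, applying each decision in order,
    and return the sequence of nodes (clips) the learner experienced."""
    node, path = start, [start]
    for choice in decisions:
        next_node = scenario[node]["choices"].get(choice)
        if next_node is None:
            break  # no such branch from this node; the scenario ends
        node = next_node
        path.append(node)
    return path

print(play(SCENARIO, "greeting", ["apologize"]))  # ['greeting', 'recovery']
```

The recorded path is also what makes the post-experience reflection concrete: learners can compare the branch they took against the alternatives they did not.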

A related project feeding into the VR/AR Project is the Transforming Hospitality Education through Tech Abilities (THETA) Erasmus+ research project. For this project, Hotelschool The Hague is working together with four partner institutions. The THETA project aspires to enable real-life learning through immersive technologies on mobile phones to enhance hospitality education. Furthermore, the project intends to develop guidelines for educators and learners on how to create their own low-tech AR and VR experiences which are easy to use to enrich teaching and learning.

The THETA project is developing four prototypes using immersive technologies, which are refined using student feedback. One of these is Branched Storytelling, which uses 360° video for soft-skills training giving students the chance to experience the emotion of a difficult conversation with a guest while testing decision-making and critical-thinking skills. Another prototype, The Outlets, allows students to feel present in hospitality outlets such as the kitchen and front desk without having to be there physically while providing an introduction to equipment and processes.

“It has been an absolute pleasure working with the Warp VR team over the past two years who are always willing to help and provided support when needed. The platform is easy to use, looks professional, is reliable and works well even when distributing to large classes (over 120 students).” - Che Govender, Lecturer VR/AR in Education at Hotelschool The Hague

Results

Hotelschool The Hague has introduced immersive technologies into eight courses of its curriculum thus far. After the delivery of an immersive experience in the classroom, the didactic process shifts to structured reflection through an experiential learning model, relating and applying classroom theory to the problem faced in the VR/AR scenario. The experience is turned into a learning moment through critical in-class reflection and group coaching approaches.

The school uses 10 PICO headsets that can be screencast to digiboards for live group reflection. Further adoption by other departments within the school is encouraged through informal workshops, dissemination presentations, collaboration with industry partners on the development of immersive learning experiences, and the development of manuals to promote both student- and educator-created content.

Initial research results indicated that students preferred highly interactive experiences with gamification elements. In addition, for AR experiences, rather than viewing 2D video on cutting techniques within an immersive platform, students wanted to be able to view 3D objects (e.g., the knife) from different angles in order to maximize the benefits of the medium. Furthermore, the 360° video on dealing with a difficult guest was perceived as highly engaging.

To learn more about using VR for education, watch our webinars How to use VR in education and How to use VR for healthcare education, or read our blog post VR training in education: a game-changer for learning.

Download customer story

Click here to download this customer story in PDF format (no registration required).

Highlights from our annual Retail Forum: 800 people from 62 countries. See top speakers, countries, sessions.

Our annual Retail Forum brought together the best minds in VR/AR, shopping, and metaverse commerce. We had about 800 attendees and 20+ speakers from Herman Miller, Snap, Sony, Target, Meta, Google, Microsoft, Bridgestone, Amazon, Marxent, Lenovo, VR Direct, Magic Leap, Walmart, Verizon, Accenture, Morgan Stanley, Obsess, Banuba, WestRock, Niantic, and Philips, among others.

Access video recordings here.

Do you want to participate in our upcoming Online Meets? We have weekly online meetings with presentations, live discussions, and networking to help you grow your knowledge, connections, and your business. If you would like to attend, email info@thevrara.com to be added to the meeting invites. If you have something interesting to present, let us know and we will add you to the schedule!

Join our Immerse Global Summit at Metacenter Global Week in Orlando on Oct 17-19. We expect 3000+ people, 200+ exhibitors, and speakers from Amazon, Meta, Obsess, and many others!