Augmented Reality gives Real Estate an Edge

By incorporating innovative technologies such as Augmented Reality, real estate companies can streamline the property search for potential customers and offer them a unique experience when choosing and purchasing a property.

This is happening as new technologies replace the traditional showroom or face-to-face visit with hyper-realistic immersive experiences that help customers quickly envision the property they want to buy without having to travel there. Not only does the client win; it also saves the broker the time of organizing redundant tours.

Benefits of Augmented Reality: Use of 3D Models


By incorporating 3D models, the client can test out a range of products, from a piece of pottery, a floor, or works of art to larger spaces such as rooms, facilities, or offices. They can even tour a house or apartment directly. The appeal lies in being able to visualize and try different styles and designs using only a cell phone.

Interceramic, a Mexican company that is a leader in manufacturing floors and tiles, decided to build its product catalog with Augmented Reality. This way, its customers can scan a QR code to access spaces that show the latest product trends and visualize how they will look in real settings. The company recreated the Santa Monica line for the terrace and kitchen and the Satori line for the living room and kitchen.

Another example is the Uruguayan company Multiconteiner, which also chose to evolve by exhibiting its solutions with this technology. By entering the 360° experience, its clients can find information about the different models of containers, houses and modules on offer and see a gallery of images for each one. In addition, Augmented Reality lets clients visualize the houses at real scale and at model size, walk through them and examine them in detail, as if they were there.

The advantage of this technology is that it helps to reduce the time and resources that customers devote to searching for products and properties, saving in-person visits for those of greater interest.

These solutions have been shown to increase the likelihood of a sale by 30% and shorten the purchase process by 50%. Tools of this kind therefore help generate a competitive edge and make the difference needed to boost business and the real estate industry.


Contact:

Name: Florencia Moltini

Email Address: florencia@camonapp.com

Website URL: http://camonapp.com/en/augmented-reality-real-estate/

RENK Group Increases Stake in Immersive Digital Twin Provider, Modest Tree

August 24th, 2021 - Globally active RENK Group (“RENK”) has successfully completed an additional share subscription and increased its stake in Modest Tree Media Inc. (“Modest Tree”). Modest Tree, based in Halifax, Canada, provides a range of digital tools and services to manufacturers, including immersive digital twins and the associated digital tools and extended reality solutions, to enable the visualization of products and processes through integration of enterprise datasets.

By increasing its investment in Modest Tree, RENK strengthens its position as a leading adopter of digital solutions, a growing trend in the global manufacturing industry. This allows RENK to provide innovative products and digital-based services to its customers.

“Working with global manufacturers in aerospace, defence, and automotive, we have seen the accelerated focus on digital solutions, unleashing new business models for training, digital twins, and digital in-service support,” states Modest Tree founder and CEO Sam Sannandeji. “Having RENK as a partner helps us to understand the unique and evolving needs of leading digital-focused industrial firms and to provide leading-edge digital solutions to our global industrial clients.”

Contact:

Name: Laura Bohnert

Email Address: lbohnert@modesttree.com

Website URL: https://www.modesttree.com/

MIAT presented the first International Immersive Storytelling Masterclass Bootcamp: Training the talents of the future

Award-winning creators and artists, immersive directors, XR creatives and science-techs have transformed Milan into a pool of knowledge for those who will be the talents of the immersive world of the future. 

In July 2021, MIAT (https://www.miat.tech/), the Multiverse Institute Of Art and Technologies, launched the first SOLD-OUT international Immersive Storytelling Masterclass in Milan. 

The Bootcamp, led by international industry professionals, award-winning creators and artists, immersive directors, XR creatives, sound designers and science-techs, was structured over eight days and designed to let participants touch, understand and experience all facets of the immersive world and the entire immersive production process. 

“During the full-immersion week, the participants covered topics ranging from stereoscopic cameras, 360° producing and filming, immersive audio, virtual, augmented and mixed reality, game engines, and volumetric capture, learning how to create immersive experiences using new worlds and languages. Furthermore, they went deeper, learning from a scientific perspective what lies behind these immersive technologies. 

Finally, lectures and sensory, innovative workshops addressed key topics concerning narrative, interactivity and immersion, in order to understand how to build engaging immersive storytelling capable of transmitting emotions, increasing empathy and connecting with the audience,” said Elisabetta Rotolo, CEO and Founder of MIAT and tutor for the masterclass. 

At the end of the course, tutors and international experts used creative and design-thinking techniques to guide the participants through the conceptualisation of their immersive projects, which were afterwards presented to a jury of international commissioners including executive producers, curators, and interactive and immersive media experts such as Liz Rosenthal, Myriam Achard, Dan Tucker, and Carlo Rodomonti. 

The masterclass, designed to meet the needs of the international XR market and to equip the future generation of art-tech talent with artistic, creative and critical thinking, was the first launched by MIAT and will be followed by new courses and opportunities for professionals, students, artists, creatives and corporations. 

More info here

ABOUT MIAT

MIAT is an educational and creative hub for immersive arts and emerging technologies, integrating an XR Academy with hands-on and practitioner-driven industry access training programs, alongside a full-service immersive production centre generating original immersive experiences.

We are an international team of immersive storytellers, filmmakers, producers, XR developers, lead artists, animators and sound engineers, combining high artistic skill with cutting-edge technological expertise to deliver training programs in immersive storytelling, directing, creative producing, filmmaking, VR shooting, XR art and interactive emerging-technology applications. We love to strengthen the talents of the future and to create together high-end original immersive experiences for artistic expression and social impact.

For info and contact: 

Elisabetta Rotolo - CEO & Founder

elisabetta.rotolo@miat.tech


Nextech AR Goes Live with Enhanced 3D Google Ad Functionality With Launch of Web XR

NexTech AR Solutions Corp., an emerging leader in augmented reality for eCommerce, AR learning applications, AR-enhanced video conferencing and virtual events, today announced the launch of its next generation Ad technology.

With this announcement, NexTech’s customers can now tap into Google’s immense network, delivering engaging and streamlined 3D ads that extend to AR at scale without requiring an application download. The upgrade leverages WebXR and enables a robust AR experience with each ad. NexTech Ads also provide rich and robust analytics, giving customers data-driven insights so they can better capture impressions, clicks and interactions and gain overall AR engagement data.

Click to view 3D AR ad sample: view sample.

NexTech CEO Evan Gappelberg commented on the product enhancement: “Digital consumers are looking for engaging immersive experiences. The combination of Nextech AR’s 3D model creation at scale with Nextech’s Ad Network, now on Google Ads, creates an incredibly valuable offering that accelerates our customers’ reach with higher engagement levels. The Google Display Network reaches 90 percent of internet users worldwide, across millions of websites, news pages, blogs and Google sites like Gmail and YouTube.”

Analytics from NexTech’s Vacuum Cleaner Market (VCM) indicate click-through rates (CTR) of approximately 5 percent for 3D ads created with NexTech’s 3D tools, compared to an average Google CTR of 1.55 percent according to Smart Insights. NexTech’s 3D technology, enabled on Google’s expansive ad network, offers a compelling advertising tool. Google estimates that as of 2021 its search engine processes approximately 63,000 search queries every second, translating to 5.6 billion searches per day and approximately 2 trillion global searches per year. According to eMarketer, digital ad spend will be $455.3 billion in 2021, with 55.2% of that going to display advertising.

NexTech 3D advertising customers such as Motif, a leading fashion knowledge and educational hub, are recognizing NexTech’s 3D advertising suite of tools as a game changer. “Partnering with the Nextech AR Ad Network gives us extended reach to new B2B communities in consumer product goods, fashion & apparel and branding agencies. Its 3D advertising capabilities were a key draw for us and enable us to illustrate to prospective students the type of 3D models they can create through our 3D digital fashion course with Roz McNulty,” said Elisabeth Souquet, Marketing Director at Motif. Now, with NexTech’s rollout of 3D Google Ads, customers such as Motif can extend 3D advertising built with NexTech tools to reach a significantly broader audience.

More info

BOHEMIA INTERACTIVE SIMULATIONS JOINS TEAM CESI TO DELIVER U.S. ARMY’S NEXT GENERATION OF VIRTUAL COMBAT TRAINING

Aug 12, 2021

ORLANDO, Fla. (USA) - Bohemia Interactive Simulations (BISim), a global developer of advanced military simulation and training software, has been subcontracted by Cole Engineering Services Inc. (CESI) to deliver significant components of the U.S. Army’s next generation of collective training technology. CESI recently announced it had been awarded the Training Simulation Software/Training Management Tools (TSS/TMT) contract and plans to utilize all three of BISim’s primary products (VBS4, VBS Blue IG and VBS World Server) as part of the overall TSS/TMT solution.

We are excited to bring BISim’s products and technical expertise in support of prime contractor CESI to jointly deliver the U.S. Army’s next generation of virtual combat training technology in 2025.

Arthur Alexion, BISim CEO

The U.S. Army’s Synthetic Training Environment (STE), of which TSS/TMT supplies the central software capabilities, will allow units and leaders to conduct realistic, multi-echelon and multi-domain combined-arms maneuver, mission command and live, collective training anywhere in the world. TSS/TMT will also converge live, virtual and constructive capabilities in a single, easy-to-use interface. Military leaders and instructors will be able to set up complex virtual battles, coordinating with thousands of AI-powered allies and fighting against artificially intelligent or instructor-controlled adversaries, with realistic AI behaviors and at theater-wide scale. Soldiers will be able to repeat these training missions many times over, facing new challenges that will help them to better prepare for live training and enhance their readiness for operations.

Terrain data will draw from STE’s One World Terrain data and soldiers will interact either through PC-based soldier stations or through STE’s Reconfigurable Virtual Collective Trainers. The STE will facilitate quick set up, comprehensive after action review of exercises and an intelligent tutor capability, and will lead to more realistic, more targeted and ultimately more impactful training and mission rehearsal. The delivered system will be the most advanced and most comprehensive training capability in the military world, encapsulating the ambitious vision of the U.S. Army’s STE Cross Functional Team and combining the talents and software of a number of cutting-edge military-focused companies including CESI and BISim.

BISim’s easy-to-use, whole-earth virtual and constructive VBS simulation capabilities will be tightly integrated with TSS/TMT. BISim’s World Server technology will be utilized in the Army’s TSS/TMT to ingest OWT format into the TSS/TMT environment, providing soldiers the ability to train and rehearse on realistic virtual terrains anywhere in the world. VBS Blue IG, BISim’s high-performance, 3D whole-earth image generator (IG), is already providing high-fidelity visuals for RVCTs and usage of BISim’s IG will be expanded to STE through TSS/TMT.

BISim first began working with the U.S. Army on the STE-related programs in 2016 and, as prime contractor working with Cole Engineering Services (CESI) and 4C, developed a prototype for a cloud-enabled, virtual world training capability under the OTA1 contract in 2018. BISim’s VBS Blue IG product has been used to support the STE’s RVCT Air and Ground efforts and BISim’s world server technology is the core of the cloud-enabled STE World Server (STEWS) middleware used in the Army’s One World Terrain (OWT) project. We are humbled and proud that BISim’s involvement with STE has been substantial and ongoing, with the TSS/TMT subcontract being BISim’s 9th STE-related contract award.

“BISim has invested tens of millions of dollars in upgrading and modularizing our technology in anticipation of STE-type requirements, and VBS4 already supports all of the Army’s Games for Training capabilities, representing over one hundred U.S. Army-specific training uses,” said BISim CEO Arthur Alexion. “We are excited to bring BISim’s products and technical expertise in support of prime contractor CESI to jointly deliver the U.S. Army’s next generation of virtual combat training technology in 2025.”

BOHEMIA INTERACTIVE SIMULATIONS

Founded in 2001, Bohemia Interactive Simulations (BISim) is a global software company at the forefront of simulation and training solutions for defense and civilian organizations. BISim utilizes the latest game-based technology and a 200-strong, in-house team of engineers to develop high-fidelity, cost-effective training and simulation software products and components for defense applications.

Globally, more than 500,000 military personnel are trained every year using VBS software products. More than 60 NATO and NATO-friendly countries and over 250 integrators/prime contractors use VBS technology, many making significant funding commitments to extend VBS product capabilities. Customers include the U.S. Army, U.S. Marine Corps, Australian Defence Force, Swedish Armed Forces, French MoD and UK MoD and most major integrators. VBS products have become, by far, the world’s most widely used COTS product range in the military-simulation sector, supporting hundreds of military use cases and vastly greater military exploitation than any comparable products. 

Contact

David Dadurka

david.dadurka@bisimulations.com

Brought to life by MediaCombo, Tracing Paint: The Pollock-Krasner Studio in VR has launched on HTC Viveport

NEW YORK CITY, NEW YORK, August, 2021 — Launching today on HTC Viveport, MediaCombo’s Tracing Paint gives art-lovers a virtual time machine to experience the times and place where Jackson Pollock and Lee Krasner, two of the most brilliant abstract expressionist artists, created their masterpieces. This fascinating VR experience--the winner of a 2021 Telly Award for Immersive & Mixed Reality--transports you into the artists’ studio, where you examine their work and listen to the artists talk about their inspiration and processes.

“MediaCombo is passionate about bringing new life and interactive possibilities to art and cultural experiences,” explains Robin White Owen, Principal at MediaCombo. “For this project, we wanted to make it feel like you were really in the studio talking with the artists. Until now, the experience has been available only at the Pollock-Krasner House and Study Center, their former home and studio on eastern Long Island. Hosting Tracing Paint in HTC Viveport means that even more people will be able to interact with and learn from the artistic works of Jackson Pollock and Lee Krasner.”

The Pollock-Krasner House and Study Center, a National Historic Landmark, was opened in 1988 to interpret the artists’ living and working environment. Their works are all in museums or private collections around the world, so Tracing Paint brings the paintings back to the studio, giving visitors a deeper appreciation of what was created in this historic place. MediaCombo used photogrammetry to scan and create a 1:1 model of the building’s interior. This VR experience also enables people who can’t travel and those with mobility issues to visit the site virtually.

Tracing Paint is available for sale for $4.99. Two versions are now available on Viveport — the PC VR version for headsets like the Vive Pro 2 and the WAVE version for self-contained headsets like the Vive Focus 3.

Contact

Name: Michael Owen

Email Address: michael@mediacombo.net

Website URL: http://www.mediacombo.net/pollock-krasner-studiovr/

Nextech AR to Acquire AR Cloud-3D Mapping Company ‘ARway’ Transforming Into A Metaverse Company

Nextech AR Solutions Corp. (“Nextech”) is pleased to announce that it has signed a definitive agreement under which Nextech will acquire U.K. based spatial computing company ARWAY Ltd. (“ARway”) in an all-stock transaction and hire the key founders Baran Korkmaz and Nikhil Sawlani.


This acquisition provides Nextech with a spatial mapping platform critical to building the Metaverse. Built on Unity, optimized for Google and Apple devices, and using AI to recognize surroundings for hyper-accurate location mapping, ARway provides users with an Augmented Reality Software Development Kit (SDK) to frame the digital world in a few minutes.


Facebook, Epic Games, Microsoft and others have all identified the Metaverse as the future of the internet and computing itself. This acquisition positions Nextech as a first mover in what it’s calling a ‘mini-metaverse’. Evan Gappelberg, CEO of Nextech AR Solutions, comments, “We think that the mini-metaverse business use case is here to stay, and the implications for future growth are significant. Nextech's mini-metaverse offering will be available to brands and companies that want to create mini-metaverses based on a geolocation, like museums, corporate headquarters, theme parks, sports stadiums, university campuses and more. We can scan these spaces with ARway’s technology and drop in AR experiences that are triggered based on geolocation, making for a fully immersive Metaverse experience. The mini-metaverse is the first step toward universal mapping, a concept that, while not a reality today, is a future inevitability. Nextech's mini-metaverse offering will enable people to experience the multiverse as it increasingly becomes a normal part of everyday life.”


The ARway SDK combines robust mapping technologies for location-persistent AR experiences across ARKit, ARCore and Microsoft Azure Spatial Anchors, delivered cross-platform on Unity today and on Unreal, Android Studio and Xcode/Swift soon, to unlock true spatial computing within a single toolkit for iOS, Android and HoloLens. ARway has been creating location-persistent experiences such as indoor navigation, guided tours, treasure hunts and many more with its no-code platform and has developed an ecosystem of over 1,000 developers who have created 3D maps in over 60 countries, with notable customers such as the HCG Hospital, British Telecom, Bosch, AirAsia, the City of London and the Guildhall School of Music and Drama.


ARway’s AR cloud and 3D mapping technologies, combined with Nextech's scalable solutions for AR e-commerce, AR advertising, HoloX Human Holograms and AR Portals, put Nextech in a leadership position in the race to the Metaverse.


Evan Gappelberg, CEO of Nextech AR Solutions, comments, “Creating the metaverse is the most ambitious thing we can accomplish as an augmented reality company, and now with the ARway platform plus Baran and Nikhil joining the Nextech team, we are positioned to do just that. The potential for Nextech to be first to market with mini-metaverses, spatial maps as NFTs in the metaverse, and leveraging our creator platform HoloX to populate the metaverse with content at scale is super exciting to me.” He continues, “With our global sales and marketing machine, our combined AI teams and our existing AR tech and resources as a public company, I'm convinced that we will quickly take a leadership position in the AR metaverse!”


Baran Korkmaz, CEO and Co-founder of ARway, comments:
“I believe this will be a historic moment in the development of the Metaverse, a vision that started over four years ago with ARWAY: unifying human-machine understanding by connecting the digital and physical worlds to empower people to connect and share in deeper, more meaningful ways. This vision is now becoming a reality with our new family at Nextech. As the future of augmented reality is inevitable, in this new age of Web 3.0, mixed reality and wearable cameras, it will be a large effort to map the physical world. And just like today’s web, there will be various use cases, proprietary data, walled gardens and permission layers. So I'm super excited to be joining Nextech, where we will be at the next tech frontier.”

More info

Contact

Name: Judith Planella

Email Address: judith.planella@nextechar.com

Website URL

Contemporary art at the epicenter of digital technology

Contemporary art has found itself at the epicenter of blockchain technology, and at the same time blockchain technologies are at the epicenter of contemporary art: New York artist Jessica Angel presented the Voxel Bridge project at the Vancouver Biennale. The core of the installation is an AR blockchain visualization. The artist has turned Vancouver's Cambie Bridge into a public art object within whose boundaries digital worlds come to life, resembling both space and the inner structure of a computer. The project is based on the Kusama Network’s blockchain and uses AR tools created by Spheroid Universe to visualize it. “Art is a mobilizing force capable of connecting seemingly dissimilar worlds; Voxel Bridge demonstrates this ability. The installation expands the sensory experience of art into a unifying experimental effort that allows blockchain technology, AR and public art to discover new ways to interact with each other.” (Jessica Angel, artist, author of the Voxel Bridge project)


Voxel Bridge is a landmark event in the world of contemporary art. The project is not just graffiti or a piece of booming crypto art. Voxel Bridge exists simultaneously in three worlds: in the real space of Vancouver's Cambie Bridge, in augmented reality visible through the application, and in the digital space of the blockchain. The connection of the real and digital worlds creates a new, unexpected experience for the viewer: the installation includes twenty different interactive AR animations with whose help viewers “live” the history of the creation of the Kusama Network and “penetrate” its digital spaces and processes. Jessica Angel uses the blockchain’s live information fabric as a creative tool and directly projects this information into the visually perceived elements of the artwork. “It is believed that the Vancouver Biennale is always one step ahead and pushes the traditional boundaries of art in public space. Now that the very definition of ‘public’ has been reimagined to mean ‘virtual,’ we are determined to bring the world an art installation of immense artistic and technical value. The Vancouver Biennale is all about using art as a catalyst for learning, so if you really have no idea about the latest technological buzzwords like blockchain, cryptocurrency, augmented and virtual reality, this art installation provides a truly accessible and visually mesmerizing opportunity to experience these concepts.” (Barry Mowatt, President and Artistic Director of the Vancouver Biennale)


In addition, the event combines two significant elements, one technological and one social: the experience of visualizing complex digital technologies and the experience of creating large-scale AR objects for urban spaces. Blockchain technology is designed to change the way we interact with the world and creates unlimited opportunities for all types of business and social practice, including identity, healthcare, banking, logistics, decision making, the arts and education. Voxel Bridge is the first art project to literally show the world this burgeoning revolutionary technology.


The development of the AR solutions for Voxel Bridge was carried out by Spheroid Universe, the world's first platform that creates a unique opportunity to see the workings of blockchain and other digital technologies through art, using augmented reality tools and advanced developments in computer vision and artificial intelligence. The visualization of the Kusama Network blockchain as an AR installation is based on dynamic geolocation, where the key technological problem is the precise positioning of a large-scale interactive AR object. The experience gained from solving this problem will soon be applied in the Polkadot/Kusama Network ecosystem to visualize AR objects of NFT art. In this regard, Voxel Bridge is not only a high-tech exercise in bringing contemporary art into augmented reality space, but also an experiment in visualizing NFT art objects that have appeared and circulate in the Kusama Network. The project is the first test of the Web 3.0 Spheroid.Earth algorithm for positioning objects over large areas.


People can master the social advantages of emerging technologies only through tireless mass education and information: what the advantages of new technologies are, why they are necessary, and how they reach people. Jessica Angel's installation Voxel Bridge at the Vancouver Biennale is a big step in the right direction. “Creative and innovative visualization of information is critical in a global society that increasingly relies on information technology and digital systems. The differences that information accessibility produces are both wide and profound and can lead to tremendous changes in attitudes, and in general we would like to hope that these changes will be in the direction of knowledge and understanding, not ignorance.” (Gavin Wood, Founder of Polkadot, ex-CTO and co-founder of Ethereum)

Contact

Name: Alexander Sysoenko

Email Address: sys@spheroid.eu

Website URL: http://www.spheroiduniverse.io/

Agisoft Metashape Pro and 3D PluraView monitors enable 3D-stereo photogrammetry at its best

3D stereo photogrammetry at the highest level: certified symbiosis of software and hardware with Agisoft Metashape Pro and 3D PluraView stereo monitor enables best viewing comfort in excellent 3D quality.

Where classic surveying reaches its productivity limits today, the 3D photogrammetry application Agisoft Metashape Pro is being used, often in combination with so-called 'drones' or 'Unmanned Aerial Systems' (UAS). As part of the analysis and processing of geospatial data, Metashape Pro creates detailed, fully textured 3D models from digital images with the highest possible level of accuracy and detail. Metashape Pro is not only the ideal tool for working with UAS images, but also for large-format aerial cameras and terrestrial close-range pictures. Tens of thousands of users worldwide already benefit from the level of detail and achievable precision of this application. Together with the 3D PluraView monitors from Schneider Digital, this cutting-edge software technology extends to excellent-quality 3D-stereo visualization. Thanks to their excellent stereoscopic visualization quality, the 3D PluraView monitors from Schneider Digital open up the use of innovative 3D editing and measuring methods with the 3D Stealth Mouse S4-Z in Metashape for the first time. In combination, both high-end technologies result in a perfect 3D-stereo workplace solution for geospatial data acquisition, processing and product generation at the highest level. That is why the compatibility of Agisoft Metashape Pro with the entire 3D PluraView monitor product family has now been officially certified by the manufacturer Schneider Digital.

As a specialist in stereoscopic 3D hardware and visualization solutions with a global reach, Schneider Digital has been in close contact with the software company Agisoft for many years. Agisoft is considered a pioneer in the development of UAS and terrestrial photogrammetry solutions. From 2019 onwards, the company accelerated its development of stereoscopic display and editing functions due to rapidly growing customer demand. The 3D-stereo viewing and editing capability is especially important for UAS service providers, for feature collection and for the creation and processing of dense 3D point clouds and textured building models. 

Since the June 2020 release (1.6.3), Agisoft has been integrating a number of important 3D-stereo functions into Metashape Pro to convert digital images into detailed, completely textured 3D models. Agisoft supports the stereoscopic editing functions of the Stealth 3D Mouse S4-Z in Metashape Pro, with the help of which it is now possible to work particularly precisely and efficiently in (holographic) space. While the standard version of Metashape is sufficient for basic, non-geodetic requirements and smaller projects, the Pro version is very well suited to creating extensive, georeferenced 3D GIS datasets and also enables interactive stereoscopic digitization. With the Pro version, up to 50,000 images of a physical object or geographical area can be integrated and processed in a coherent image block. The rigid 3D bundle block adjustment enables tens of thousands of Metashape Pro users to solve any kind of 3D reconstruction, visualization, surveying and mapping task efficiently and with maximum precision.

Due to its achievable accuracy, the software is used for UAS and aerial image triangulation, as well as for the processing of satellite images. Metashape Pro also handles the creation of elevation models and processes terrestrial laser-scanning data, where objects are directly mapped and reconstructed as dense and precise 3D point clouds. At the heart of Metashape Pro are computationally intensive, precise reconstruction algorithms for the three-dimensional calculation and display of georeferenced objects, originally captured as densely matched surfaces by image sensors or as point clouds originating from professional LiDAR equipment.

The Future of Geospatial Data is 3D-stereo visualization

Metashape Pro is certainly one of the most comprehensive photogrammetric software applications available today. However, only in combination with high-performance hardware components can real-time measurement, visualization and analysis of 3D datasets be performed efficiently and true to reality. Key functional elements are stereoscopic visualization, 3D feature capture and connectivity to very large GIS databases. With the dual-screen 3D PluraView displays, Schneider Digital provides the optimal desktop solution for stereoscopic viewing and data editing. With display sizes up to 28” and 4K resolution per screen, the passive beam-splitter technology provides a daylight-usable, bright and otherwise unequaled, contrast-rich display of high-resolution GIS and BIM datasets.

Agisoft Metashape Pro allows for the reliable and precise processing of very large datasets and has become an indispensable part of the workflows of many geospatial experts. UAS systems today may capture image series of several thousand images at resolutions of 1 cm or better; in terrestrial close-range imaging, pixel sizes of less than 1 mm are achievable. Terabyte-sized datasets can be processed with Metashape Pro and the right processing hardware without any problems. The capability to process large datasets at ultra-high resolution is routinely used in the digital reconstruction of buildings and terrain models; a notable application is archaeology, when recording and reconstructing cultural sites and assets.

With Metashape Pro, 3D models can also be recorded multiple times and displayed as a 4D time series. Due to its ability to realistically display geospatial background data together with highly detailed building models, Metashape Pro is also the ideal software for surveyors and architects. The main functionalities of this software solution, several of which appear in the scripting sketch after the list, are:

  • To load and process diverse geospatial datasets, especially from aerial imaging and LiDAR sensors

  • To positionally adjust, rigidly triangulate thousands of aerial images in photo blocks

  • To geocode digital images, topographic maps and point clouds, and to integrate CAx data

  • To provide a comprehensive tool-set for stereoscopic feature collection and GIS-based attribution

  • To auto-correlate digital surface models (DSM) and provide tools for editing and creating DTM output

  • To adjust image data radiometrically and generate seamless, orthorectified mosaics

  • To convert geospatial 3D data into various projections and height reference models

  • To provide processing workflows for multispectral (thermal) UAS and aerial images

  • To process LiDAR point cloud data and co-register with positionally accurate objects

  • To integrate panchromatic and multispectral satellite images for geocoding and feature collection

  • To generate textured and geocoded 3D models of objects and buildings
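
For readers who automate these steps, the short Python sketch below shows how a typical UAS block might run through Metashape Pro's scripting interface: photo alignment (aerial triangulation), dense matching, and DEM and orthomosaic generation. It is a minimal sketch only, written against the 1.6-era Python API mentioned above; exact method names and parameters vary between Metashape releases, and the file paths are placeholders.

```python
# Minimal sketch of a Metashape Pro processing chain (1.6-era Python API;
# method names and parameters differ slightly between releases).
import glob
import Metashape

doc = Metashape.Document()
chunk = doc.addChunk()

# Load a UAS image series (placeholder path).
chunk.addPhotos(glob.glob("/data/uas_flight/*.JPG"))

# Aerial triangulation: match and rigidly align the photo block.
chunk.matchPhotos(downscale=1, generic_preselection=True, reference_preselection=True)
chunk.alignCameras()

# Dense matching and derived surface products.
chunk.buildDepthMaps(downscale=2)
chunk.buildDenseCloud()
chunk.buildDem(source_data=Metashape.DenseCloudData)
chunk.buildOrthomosaic(surface_data=Metashape.ElevationData)

# Save the project; rasters and models can then be exported in the desired projection.
doc.save("/data/uas_flight/project.psx")
```

A run like this covers the aerial-triangulation, DSM/DTM and orthomosaic items in the list above; stereoscopic feature collection and editing remain interactive steps performed at the 3D PluraView workstation.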

Best viewing comfort with excellent 3D quality

For the past 16 years, the passive beamsplitter technology of the 3D PluraView monitors has been the reference among all stereoscopic display systems, with the highest user satisfaction and acceptance, especially in the geographic information industry.

As the further-developed successor to the beamsplitter monitors from PLANAR, the 3D PluraView can spatially visualize very large amounts of data at the highest resolution, with up to 4K/UHD per stereo channel. Especially on the 4K displays, the image quality is so extraordinary that even finely textured objects are displayed razor-sharp and rich in contrast, down to the smallest detail. At the same time, the passive 3D PluraView monitors are particularly durable and reliable. They deliver the substantial benefit of fatigue-free work in all 3D-stereo applications, even in normal office daylight conditions. Due to the passive 90° linear polarization filter glasses, which require one screen each for the left and right stereo image, the 3D PluraView monitors are absolutely flicker-free and completely independent of the room’s light source.

Maximum Metashape performance due to complete workplace solutions

All 3D PluraView models are ‘plug & play’ compatible with Metashape Pro. These 3D displays support professional software applications with high refresh rates and smooth image motion, combined with high color depth and contrast. The reliable 3D PluraView technology from Schneider Digital is the basis for precise, stereoscopic display capabilities of the highest quality. As a stereo-capable software solution, Agisoft Metashape always utilizes the full performance potential of each workstation: all components, i.e. multi-core CPUs, graphics cards and RAM, are used optimally in order to fully exploit and scale up the performance of Metashape. Schneider Digital is also a specialized manufacturer of powerful high-end workstations: the PULSARON series, based on AMD CPUs, and the CENTURON series, based on Intel CPUs, both equipped with professional graphics cards from NVIDIA or AMD. These custom-built workstations are built to meet the individual requirements of the expert user and optimally support Metashape in all processing tasks. Supplemented by special 3D hand-held controllers, such as the Stealth S4-Z or the 3Dconnexion SpaceMouse, complete processing environments are created by Schneider Digital with matching hardware components for computation, visualization, analysis and interaction with demanding, graphics-intensive 3D projects. Agisoft users benefit from a complete workplace solution by Schneider Digital, which guarantees the highest workstation performance and provides an optimal stereoscopic image visualization and editing environment. This ideal combination and the compatibility of the Agisoft Metashape Pro software with the 3D PluraView monitors has now been officially certified by Schneider Digital.

Certified matching of software and hardware

How the interaction of Metashape Pro and the 3D PluraView monitors works out in practice was successfully tested by Schneider Digital and Agisoft last year. As expected, both hardware and software met the high requirements of international standards in terms of quality and performance, and the solutions have consequently been certified by both partners. For Josef Schneider, CEO of Schneider Digital, this is an important step forward: “We are delighted to be merging geospatial software and hardware into a high-performance combination through our collaboration with Agisoft. With the 3D PluraView monitor technology, we offer Metashape Pro users an ideal, truly spatial 3D visualization environment that enables them to experience Metashape 3D datasets individually or as a team in 3D-stereo, and this without having to wear restricting VR headsets.”

Find out more at www.pluraview.com 

Schneider Digital Contact:

Schneider Digital

Josef J. Schneider e.K.

Maxlrainer Straße 10

D-83714 Miesbach

Tel.: +49 (8025) 99 300

Mail: info@schneider-digital.com   

Schneider Digital – The company:

Schneider Digital is a global full-service solution provider for professional 3D-stereo, 4K/8K and VR/AR hardware. Based on its 25 years of industry and product experience as well as its excellent relationships with leading manufacturers, Schneider Digital offers innovative, sophisticated professional hardware products and customized complete solutions for professional use. Qualified advice and committed after-sales service are the company's own standards.

The Schneider Digital product portfolio includes the right professional hardware solution for the respective requirements in these areas, from high-resolution 4K/8K displays through to multi-display walls. Schneider Digital is the manufacturer of its own powerwall solution smartVR-Wall and the passive stereo monitor 3D PluraView. Performance workstations and professional graphics cards from AMD and NVIDIA as well as innovative hardware peripherals (tracking, input devices, etc.) round off the product range. Many articles are in stock, which guarantees fast delivery and project realization.

Schneider Digital is an authorised service distributor of AMD FirePRO/Radeon Pro, PNY/NVIDIA Quadro, 3Dconnexion, Stealth int., Planar and EIZO. Schneider Digital products are used primarily in graphics-intensive computer applications such as CAD/CAM/CAE, FEM, CFD, simulation, GIS, architecture, medicine and research, film, TV, animation and digital imaging.

Further information is available at www.schneider-digital.com and www.3d-pluraview.com.

Schneider Digital press contact:

LEAD Industrie-Marketing GmbH

André Geßner

Hauptstr. 46

D-83684 Tegernsee

Tel.: +49 80 22 - 91 53 188

E-Mail: agessner@lead-industrie-marketing.de

Internet: www.lead-industrie-marketing.de

Training in Augmented Reality

By Angelica Jasper, Claire Hughes, Kay Stanney, Jennifer Riley, & Cali Fidopiastis of Design Interactive, Inc.

Virtual and Augmented Reality (VR/AR) are experiencing massive growth as platforms to support educational efforts, with an estimated $700M being invested in immersive education applications by 2025. Interestingly, while an estimated 80% of teachers already have ready access to VR/AR head-worn displays, few use them regularly, but most want to uncover how best to incorporate them into the educational process. The desire to adopt VR/AR technology likely stems from its high efficacy: on average, immersive training solutions have been found to be more effective and efficient than traditional training of cognitive, technical, and socio-emotional skills. The learning gains (pre- to post-test) from immersive training average 2.5% more per quarter hour of training for cognitive skills versus traditional training and 2.95% more per hour for technical skills versus traditional training. Further, immersive training realizes ~30% more performance efficiency and correspondingly fewer errors versus traditional training, with increases of ~30% in confidence and self-efficacy, though standard deviations tend to be high. These gains are believed to derive from the immersive nature of VR/AR, which can bolster knowledge acquisition, information retention, engagement, and presence. Users can proactively interact with 3D content in the confines of reality, facilitating enriching learning approaches. Within the military medical training domain, AR has successfully supported a variety of healthcare knowledge and skills including surgical training, anatomy lessons, and heart disease education. AR may be particularly suited to military medical training because it provides innovative image-guided approaches, often with interactive, 3D images that provide a variety of real-time feedback (e.g., haptic feedback) through hands-on learning. Realistic, real-time feedback is crucial within the military medical domain due to the fast-paced nature of the battlefield. 

Despite the documented benefits of AR for learning and training, the AR industry is still working to overcome several developmental obstacles that may negatively impact information retention and performance. For instance, there is some evidence of attentional tunneling and skill-transfer problems that inhibit learning in AR. Further, research on the connection between individual psychological characteristics and learning performance in AR is limited, leaving one question open: what makes someone well suited and responsive to learning in AR? 

Individual Differences in Learning

Individuals differ in their ability to learn and retain information. Successful adult learning has been linked to psychological abilities including processing speed, memory, and general intelligence. These propensities can be further influenced both by an individual’s level of internal motivation and by their belief that they have control over their lives, also known as an internal locus of control. In adult learners, performance is often enhanced by an internal locus of control, which has been significantly connected to self-efficacy. 

Self-efficacy refers to an individual’s personal judgement of how well they can perform at something given the skills they have. Someone with strong self-efficacy is more inclined to view challenges as motivation to overcome them and is driven to learn and grow their skillset. This individual psychological characteristic has been repeatedly linked to more successful learning performance, including higher engagement with the content, and can even predict future learning success. Differences in self-efficacy may be reflected in learning performance in many contexts, including a variety of medical trainings. Thus, the increased self-efficacy associated with VR/AR training is beneficial to learning.

Real-World Application

Recent work from Design Interactive seeks to establish what makes an individual receptive to learning in AR, specifically in the context of Tactical Combat Casualty Care (TCCC), a standardized military medical training curriculum. Design Interactive has developed AUGMED™, an AR-based TCCC trainer designed to teach and assess TCCC skills in an interactive, simulated environment.  

In a recent study, AUGMED™ was used to assess the psychological effects of AR in order to better define “AR Psychological Suitability”: the extent to which a context is receptive to, and a system is capable of producing, learning and desirable psychological and human-performance outcomes in AR environments. Participants experienced a series of training modules that guided them through treatment of massive hemorrhage and battlefield respiratory injuries. Using the AR lessons as a guide, participants were then required to perform procedures and complete skills tests to assess overall learning performance. 

Researchers parsed participants into low, medium, and high-performance groups to evaluate performer differences, including the individual psychological characteristic of self-efficacy. Group difference analyses revealed that high performers consistently had greater levels of self-efficacy associated with AR training, whereas low performers had lower levels of self-efficacy.  In other words, those who believed they could perform the medical tasks given their AR lessons (in combination with any previously held knowledge) were the most successful at retaining the learned information (as measured by a skills test) and applying it in the experimental scenario. These findings are consistent with and reflective of prior self-efficacy and learning research, indicating that the AUGMED™ training suite may produce intended learning outcomes. 
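
As a rough illustration of this kind of grouping (and not Design Interactive's actual analysis pipeline, whose data are not reproduced here), the sketch below splits hypothetical skills-test scores into low, medium and high performance tertiles and compares mean self-efficacy across the groups.

```python
# Illustrative sketch only: hypothetical scores, not the AUGMED(TM) study data.
import pandas as pd

# Each row: one participant's skills-test score and self-efficacy rating (1-5).
df = pd.DataFrame({
    "skills_test_score": [52, 61, 58, 74, 80, 77, 90, 93, 88, 67, 71, 85],
    "self_efficacy":     [2.1, 2.8, 2.5, 3.2, 3.6, 3.4, 4.3, 4.6, 4.1, 3.0, 3.3, 4.0],
})

# Parse participants into low / medium / high performance tertiles.
df["performance_group"] = pd.qcut(
    df["skills_test_score"], q=3, labels=["low", "medium", "high"]
)

# Compare mean self-efficacy between the performance groups.
print(df.groupby("performance_group")["self_efficacy"].mean())
```

In a pattern like the one reported above, the high-performance group's mean self-efficacy would come out higher than that of the low-performance group.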

Suitability for AR Learning 

Results from this study indicate that those individuals with higher self-efficacy exhibited better learning outcomes in an AR environment than those with low self-efficacy. These individuals with higher self-efficacy were more responsive to the AR learning modality whereas those with low self-efficacy may need more support (e.g., additional hints, explanations, guidance) when training in immersive environments. As these types of innovative technologies emerge and become mainstream, it will become even more valuable to identify the role of individual psychological factors that affect AR learning performance.

Contact:

Angelica Jasper

Angelica.Jasper@designinteractive.net

Immersitech launches patented immersive spatial (3D) audio SDKs to complement online communication applications

Immersitech, a leading developer of patented sound processing technologies, announced the commercial launch of its Immersitech Engage™ with easy-to-integrate software development kits (SDKs). Immersitech’s technology addresses sound challenges experienced by collaboration, gaming, and multi-user, virtual event platforms by reducing background noise, improving voice clarity, and introducing immersive spatial audio to provide the most in-person-like experience available.

Immersitech’s new product offering gives service-provider development teams the flexibility to integrate sound-improvement technology into their current platform seamlessly and effortlessly. “Many service providers are overwhelmed trying to address foundational quality issues while also adding new features. We augment those teams with our expertise in audio processing and allow them to deliver higher quality services faster,” notes CEO & Co-founder Jim Poore.

Immersitech’s technology is now available globally by contacting Bill Sweeney, Director – Business Development & Sales at info@immersitech.io

ERIC PRYDZ ANNOUNCES EXCLUSIVE VR CONCERT SERIES WITH SENSORIUM GALAXY

Award-winning producer Eric Prydz is joining the Sensorium Galaxy Metaverse to host a series of exclusive performances in PRISM — the world’s most anticipated virtual destination for extraordinary music events.

Commenting on the new collaboration, Eric Prydz said: “I believe metaverses are taking digital events to the next stage, which is one of the main elements that has drawn me to become part of this project. As live streams fail to deliver the emotions and interactions of real-life concerts, the flexibility of what Sensorium is building offers me a whole new and unique way to connect with my audience and discover an entirely new dimension of music.”

Prydz has been at the forefront of electronic music for more than a decade, thanks to his visceral productions and exhilarating live shows. His bold and ambitious concerts are known for pushing the limits of music and technology to create experiences that get the entire dance world talking, something that aligns perfectly with Sensorium Galaxy’s vision for music experiences.

This hugely influential artist is the first DJ to join Sensorium Galaxy in 2021. Last year, a series of world-acclaimed DJs confirmed their performances on PRISM, including the legendary David Guetta, Armin van Buuren, Carl Cox, Black Coffee, Dimitri Vegas & Like Mike.

Aside from his upcoming virtual shows in PRISM, Eric Prydz is also taking part in Sensorium Galaxy’s international brand reveal campaign ‘The Chosen Ones’. He is the first artist to be featured in a series of short videos conceptualized by Academy Award-winning creative production studio The Mill and renowned creative agencies The Night League and High Scream. These agencies have a track record of delivering outstanding productions, visual effects, and virtual experiences for clients such as Ford, Disney, Nike, Coca-Cola, Burberry, and many other top-grade brands.

This campaign is directed by Rogier Schalken, an award-winning Dutch film director responsible for bringing to life global advertising projects for KLM Music, Samsung DiscoVR, and Adidas’ Leo Messi Road to FIFA. ‘The Chosen Ones’ introduces people to the music-dedicated virtual world by featuring the confirmed artists and incorporating PRISM’s brutalist and futurist vibes and signature design elements.

Sasha Tityanko, Deputy CEO for Sensorium Galaxy: “PRISM is a unique destination for artists. Our virtual setup is constantly mutating based on music. And that’s exactly why this collaboration with Eric is so great. His distinctive progressive style matches perfectly with our goal to create out-of-this-world immersive experiences for users.”

Sensorium Galaxy incorporates advanced social features to enhance user experience in virtual setups. With their eye-catching, AI-powered avatars, users can watch performances from multiple angles, heights, and even see through the DJ’s eyes. Multi-channel communication also enables them to connect and communicate with other users and artists in a surprisingly natural manner.

Yann Pissenem: “I’ve been working together with Eric for many years. All of his shows have delivered high-tech experiences and helped to raise the bar in the entire industry. I am excited to see how he can leverage the endless possibilities provided by virtual reality to deliver fully immersive DJ performances.”

Sensorium Galaxy is expected to launch publicly in H2 2021. Global coverage and cross-platform compatibility ensure that all Prydz fans have the opportunity to enjoy the progressive vibes of his best shows regardless of their physical location.

Contact:

Name: ELENA RUDOVSKAYA

Email Address: elena.rudovskaya@sensoriumxr.com

Website URL: https://sensoriumxr.com/

Nextech to Launch Augmented Reality NFT Hologram Creator Platform

VANCOUVER, B.C., Canada – July 26th, 2021 – Nextech AR Solutions Corp. (“Nextech” or the “Company”) (OTCQB: NEXCF) (NEO: NTAR) (CSE: NTAR) (FSE: N29), an emerging leader in augmented reality for eCommerce, AR learning applications, AR-enhanced video conferencing and virtual events, today revealed plans to launch its NFT hologram creator platform. The creator platform will leverage the Company's human hologram creator platform HoloX, which is expected to launch in the third quarter. Once launched, customers will have the ability to seamlessly experience their digital collectibles in augmented reality.

According to Reuters, “The market for non-fungible tokens (NFTs) surged to new highs in the second quarter, with $2.5 billion in sales so far this year, up from just $13.7 million in the first half of 2020, marketplace data showed. An NFT is a crypto asset, representing an intangible digital item such as an image, video, or in-game item. Owners of NFTs are recorded on blockchain, allowing an NFT to be traded as a stand-in for the digital asset it represents”.

The Company has a two-stage rollout plan: initially, AR human holograms will be purchased through a third-party NFT marketplace, then viewed and experienced outside a digital wallet using Nextech’s HoloX application. The second stage of the rollout includes minting NFTs on Nextech’s platform and being able to buy and sell human holograms on Nextech’s platform.

The Company is also in talks with existing marketplaces to leverage its newly acquired Threedy.AI 3D content creation technology to turn existing NFT artwork into AR NFTs at scale. Through a simple JavaScript tag integration, NFT product photos are automatically onboarded, 3D models are created for each NFT through the power of AI and hosted on Threedy’s cloud, and 3D visualizations are served to client properties using web AR/3D, all within a single integrated platform. This platform will support the production of thousands of 3D models per week, further advancing the reach of the NFT market.

Dawsyn Borland, VP of AR Innovation Labs and Content comments:

“This venture is an exciting step forward for Nextech and a big move for creators and consumers alike. AR experiences are extremely well positioned as NFTs, as they not only display digital content but allow buyers to interact with them.” She continues, “Our current technology stack is perfectly in line with the global adoption of NFTs and we are thrilled to meet this growing demand.”

Evan Gappelberg, CEO of Nextech comments:

“This is an exciting new market opportunity for our Company and when the NFT enabled platform is completed it will squarely put us into the world of blockchain by merging our AR tech with NFT technology.” He continues, “Our AR creation technology leverages AI allowing us to scale up the production of high-quality and personalized content as a hologram NFT, be it your favorite athletes, artist or artwork. We are continuing to build leading edge AR solutions for fast growing industries by coming up with innovative ways of leveraging our tech stack. Our goal has always been to create long term shareholder value and with this new AR NFT innovation I believe we are succeeding at doing just that.”

Human holograms are a force that is driving the digital economy. According to Gartner**, by 2035, the digital human economy will become a $125-billion market. Digital human technologies are growing exponentially across many of today’s industries and use cases, with an eye toward more use cases tomorrow.

**“Maverick Research: Digital Humans Will Drive Digital Transformation”; Gartner Inc., March 31, 2021

Further details about Nextech’s NFT will be announced next month.

See more info here

Contact:

Name: Judith Planella

Email Address: judith.planella@nextechar.com

LBX Immersive Provides Virtual Reality Headset Solutions to the Stanford University School of Medicine

Mare Island, California - July 28, 2021 - In April 2021, LBX Immersive leased virtual reality headsets to the Stanford University School of Medicine to be used in its first-ever Operative Neuroanatomy Course in Virtual Reality. Providing affordable and accessible virtual reality solutions to institutions of higher education is one of the core missions of LBX Immersive. 

The team at LBX Immersive believes that integrating virtual reality into more traditional modes of learning helps students gain a deeper understanding of course material. In addition to this, we believe that it is important to introduce students to the concept of virtual reality as it starts to become an important component in the workflow of many professions. Professional fields currently using virtual reality include surgery, pharmaceuticals, medical research, design, and simulation training. 

Through our strategic partnerships with highly experienced VR/XR content and platform creators, we can offer medical schools and programs the most productive learning experiences. It is possible to step inside a 3D model of the human brain through the ENGAGE platform “360 Room” or perform a cost-effective cadaveric dissection through VictoryXR Cadaver Lab. These immersive experiences and spaces enable students to successfully conceptualize complex lessons and ideas. 

LBX Immersive works directly with department heads, faculty, and students to determine the right solution for their VR/XR goals. Clients tell us what VR/XR education experience they’re trying to create and we provide a complete solution that includes hardware, software, and/or custom content, device management, and training. If clients know they want to adopt VR/XR into their course offerings but don’t know how to, we provide the expertise to consult and guide the process. 

To learn more about our VR/XR solutions, contact us: info@lbximmersive.com


About LBX Immersive:

LBX Immersive is an experienced team providing E3 (Education, Enterprise, Entertainment) virtual reality and mixed reality solutions. We sell complete solution packages that include consulting, hardware, content, custom content, device management, and training. Our goal is to support you from start to finish on your VR/XR journey. To find out more, visit https://lbximmersive.com/ and follow us on LinkedIn, Facebook, Twitter, and Instagram.


Pico chosen as first VR headset provider to support new Audi-backed holoride SDK

holoride announced the launch of its Elastic Software Development Kit (SDK) on the newly launched holoride Creator Space. Created to run on the Unity game engine, the Elastic SDK enables developers to access a powerful toolset to create immersive, in-car game experiences with ease.

The Elastic SDK is a creative reinterpretation of real-world maps and geographic information that is projected into the Unity Scene, which is where creators work with content assets within the Unity platform. Creating an initial build does not require any additional coding or development skills, which lowers the barrier for content creators to get started right away.

Pico Interactive is the first virtual reality headset provider to support the holoride experience through the Creator Space. Developers who use Pico’s VR headsets will be able to take their creations beyond their computers and experience them in combination with holoride’s Development Kit.

holoride Launches Elastic Software Development Kit on Unity’s Real-Time 3D Platform for Developers to Create In-Vehicle XR Content


holoride, the Audi-backed company advancing the future of in-vehicle entertainment, today released its Elastic Software Development Kit (SDK) on the newly launched holoride Creator Space. Created to run on the Unity (NYSE: U) game engine, the world’s leading platform for creating and operating real-time 3D (RT3D) content, the Elastic SDK enables developers to access a powerful toolset to create immersive, in-car game experiences with ease.

holoride is building the world’s first immersive in-vehicle media platform by enabling the processing of motion and location-based data in real time using the Unity game engine. holoride’s content adjusts to the vehicle’s motion and route, syncing perfectly with the passenger’s journey. Through this, holoride creates a new media category made for moving vehicles called Elastic Content, which allows for a novel approach to content creation.
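
holoride’s actual Elastic SDK targets the Unity engine; purely as a conceptual illustration of what “elastic” content means, the TypeScript sketch below maps hypothetical vehicle telemetry onto parameters of a virtual experience. All type and field names are assumptions made for the example, not part of holoride’s toolset.

    // Conceptual sketch only: holoride's real Elastic SDK is a Unity toolset.
    // This illustrates the general idea of content that reacts to live
    // vehicle data; every name here is hypothetical.

    interface VehicleTelemetry {
      speedKmh: number;         // current vehicle speed
      lateralG: number;         // cornering force from steering
      remainingRouteKm: number; // distance left on the planned route
    }

    interface ElasticState {
      cameraSpeed: number;   // virtual camera speed mirrors the car (m/s)
      bankAngleDeg: number;  // virtual world banks with cornering
      sceneLengthKm: number; // experience stretches to fit the trip
    }

    // Map each telemetry sample onto the virtual experience so that what the
    // passenger sees stays in sync with what the passenger feels.
    function updateElasticState(t: VehicleTelemetry): ElasticState {
      return {
        cameraSpeed: t.speedKmh / 3.6,                          // km/h -> m/s
        bankAngleDeg: Math.max(-30, Math.min(30, t.lateralG * 15)),
        sceneLengthKm: t.remainingRouteKm,                      // "elastic" length
      };
    }

    // Example: a tight corner taken at 50 km/h with 12 km left on the route.
    console.log(updateElasticState({ speedKmh: 50, lateralG: 1.2, remainingRouteKm: 12 }));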

“The release of our Elastic SDK marks an important milestone for holoride in preparation for our market launch in 2022,” says Nils Wollny, CEO & co-founder of holoride. “We’re opening a world full of possibilities for content creators to start building contextual, immersive experiences for passengers. By eliminating tech and business complexity, we encourage content creators to unleash their full creative potential and embark on this ride to build great content.”

The Elastic SDK is at the core of this creative development process and is the powerful software that drives the holoride experience. It is a creative reinterpretation of real-world maps and geographic information that is projected into the Unity Scene, which is where creators work with content assets within the Unity platform. Creating an initial build does not require any additional coding or development skills, which lowers the barrier for content creators to get started right away. By adding prefabricated assets like renderers, animations, rigid bodies and audio, content creators can easily populate the worlds they create.

“From next-generation infotainment systems to augmented and virtual reality-powered experiences, Unity is changing how in-vehicle content is created and consumed,” says Timoni West, vice president, augmented & virtual reality at Unity. “holoride is revolutionizing the way that we can interact with moving objects, both in the digital and physical world and is representative of how Unity’s platform is the underlying SDK for the metaverse.”

As people start to make better use of their travel time, in-vehicle Elastic Content becomes an attractive option for passengers – whether for entertainment value or to reduce motion sickness. Giving creators access to more extended reality (XR) and VR technology opens up more avenues for content creation as vehicles become the next big content platform.

To further the content creation process, holoride has built the Creator Space. The Creator Space includes everything users need for their development journey, including the Elastic SDK, documentation, and tutorials curated by holoride and continuously enriched by the developer community. Beyond these tools, it is a platform for developers, artists and enthusiasts alike to collaborate in creating richer content.

To encourage content creators to get started with the Elastic SDK, holoride has built the Development Kit, a prototype that provides all essential data so creators can conduct real-world tests in a car. The Development Kit allows content creators to demo the content they have built using the Elastic SDK and provides them with valuable insights beyond what simulation in the game engine alone can offer.

With support from Audi and Unity, holoride will offer developers the chance to experience Elastic Content first-hand, with original content created by Pittsburgh-based AAA studio Schell Games, in a roadshow from Los Angeles to San Francisco. With holoride experts onsite, developers will be able to discuss the Elastic SDK and its wide range of capabilities.

Pico Interactive will be the first virtual reality headset provider to support the holoride experience through the Creator Space. Developers who use Pico’s VR headsets will be able to take their creations beyond their computers and experience them in combination with holoride’s Development Kit.

Any developer or gaming studio is eligible to apply for one of the limited development kits. To apply, visit holoride’s Creator Space at http://developer.holoride.com/

About holoride
German startup holoride creates an entirely new media category for passengers by connecting Extended Reality (XR) content with data points from the vehicle in real time. These data points include physical feedback, like acceleration and steering, traffic data, as well as travel route and time. holoride technology provides a new type of immersion into any kind of VR content, creating a breathtaking, immersive experience, and significantly reducing motion sickness. The tech startup was founded at the end of 2018 in Munich, Germany by Nils Wollny, Marcus Kuehne, Daniel Profendiner, and Audi, who holds a minority stake in the startup. It was hailed “Best of CES” four times (Las Vegas, January 2019), recognized as one of the 100 Best Inventions of 2019 by TIME Magazine and is part of the global innovation platform “STARTUP AUTOBAHN powered by Plug and Play.” In 2021, holoride went on to win the prestigious SXSW Pitch and was also named Best in Show.

For more information, please visit http://www.holoride.com/

About Unity
Unity (NYSE: U) is the world’s leading platform for creating and operating real-time 3D (RT3D) content. Creators, ranging from game developers to artists, architects, automotive designers, filmmakers, and others, use Unity to make their imaginations come to life. Unity’s platform provides a comprehensive set of software solutions to create, run and monetize interactive, real-time 2D and 3D content for mobile phones, tablets, PCs, consoles, and augmented and virtual reality devices. The company’s 1,800+ person research and development team keeps Unity at the forefront of development by working alongside partners to ensure optimized support for the latest releases and platforms. Apps developed by Unity creators were downloaded more than five billion times per month in 2020.

For more information, please visit http://www.unity.com.

About Pico Interactive
Pico Interactive focuses on innovative VR and AR solutions which enable businesses to create and experience the best in VR and Interactive Computer-Generated Imagery (CGI). With operations in the United States, Europe, China and Japan, Pico Interactive creates VR platforms for any application and is built around the principle of “user first design.”

To learn more, visit http://www.pico-interactive.com.

 

Lenovo and Varjo Help Power Aston Martin’s Use of Mixed Reality

Legendary automaker Aston Martin is known for its commitment to luxury design and the ultimate customer experience. Using Lenovo’s ThinkStation P620 workstation along with Varjo’s XR-3 headset, the company is pushing the boundaries of what’s possible to create a fully immersive experience of its first high-performance SUV, the Aston Martin DBX. Mixed reality technology is allowing Aston Martin not only to iterate on designs faster, but also to enhance dealer training and customer engagement with an ultra-realistic model of the Aston Martin DBX.

You can learn more about Aston Martin’s use of immersive technology in a customer video here. In the video, Pete Freedman, VP and Chief Marketing Officer at Aston Martin, delves into the various XR applications the company is exploring across the business, while spokespeople from both Lenovo and Varjo explain how their respective hardware brings the Aston Martin DBX experience to life, all in human-eye resolution.

LBX Immersive Becomes Reseller of VictoryXR’s World-Leading Virtual Reality Educational Content


Mare Island, California - July 21, 2021 - The team at LBX Immersive is excited to announce that we have signed a reseller partnership agreement with VictoryXR, a world leader in VR and AR content creation for schools and education. This partnership furthers the mission of LBX Immersive to provide quality content and complete VR solutions to our education clients. Through the reseller agreement, LBX Immersive clients can purchase licenses to VictoryXR Academy (the most advanced learning campus for synchronous, multi-student learning with a live teacher or pre-recorded lessons) and/or VXR.Direct (asynchronous VR learning platform with lessons on science, history, language, and more). These licenses can be bundled with LBX Immersive headset leasing solutions. 

At LBX Immersive, we work to provide affordable and accessible virtual reality solutions to high schools and universities. Our complete solutions include consulting, hardware, content, custom content, device management, and training. We strongly believe in the need for high-quality virtual reality experiences suited to the next generation of learning modalities.

We are excited to be connected to such a well-respected name within the field of virtual reality and augmented reality educational product development. The combination of affordable VR headset solutions and high-end educational content makes this a successful partnership.

"VictoryXR is very selective about who we choose as partners. LBX Immersive is the type of premiere partner that we are happy to have on our team because they represent our product well."  - Steve Grubbs, CEO of VictoryXR

To learn more about our VictoryXR bundles, contact us: info@lbximmersive.com


About LBX Immersive:

LBX Immersive is an experienced team providing E3 (Education, Enterprise, Entertainment) virtual reality and mixed reality solutions. We sell complete solution packages that include consulting, hardware, content, custom content, device management, and training. Our goal is to help you from start to finish on your VR/XR journey. To find out more, visit https://lbximmersive.com/ and follow us on LinkedIn, Facebook, Twitter, and Instagram.


Desktop AR Educational Program Launched by Perception for 20,000 Students in the UK


Perception, a deep-tech company, is today releasing its Desktop Augmented Reality (AR) product in the UK, built on its HOLO-SDK technology, alongside a free educational programme reaching over 20,000 students.

The project includes Holo-SDK licences for educators at zero cost, giving students access to innovative technology that brings virtual objects into the real world through AR.

Perception is offering free Holo-SDK licences to educators as part of its ‘Educational program’, an effort to make it easier for teachers to teach with VR/AR in an immersive and accessible way. The program will reach 20,000 students across the UK.

Dr Sirisilp Kongsilp, Founder and CEO of Perception, says, “Desktop AR is an incredible technological achievement in making Augmented Reality accessible and simple to use, without the need for expensive hardware or high-tech AR glasses. Virtual and Augmented reality have started to dominate the tech sphere, but too often they are incredibly expensive, meaning the use of the products are limited to those in a particular financial category.”

Please find more details in the press release below.

PERCEPTION LAUNCHES A FREE EDUCATIONAL PROGRAM ALONGSIDE THE RELEASE OF HOLO-SDK TECHNOLOGY

Deep-tech company Perception is launching its Desktop Augmented Reality (AR) product onto the UK market. The HOLO-SDK technology is free to use and makes augmented reality accessible to most people. Where real-life interactions have been stifled, this timely technological innovation can be used to make the arts more accessible in an educational environment. Perception is offering free Holo-SDK licences to educators as part of its ‘Educational program’, an effort to make it easier for teachers to teach with VR/AR in an immersive and accessible way.

Dr Sirisilp Kongsilp, Founder of Perception, says, “On behalf of Perception, I am extremely proud to introduce HOLO-SDK technology to the UK. This is an incredible technological achievement, in that we are making Augmented Reality accessible and simple to use, without the need for expensive hardware or high-tech AR glasses”.

Perception, a deep-tech company, has today announced the launch of its Holo-SDK software in the UK. Holo-SDK is a Unity plugin which allows users to view holographic Augmented Reality images on their own desktop screen. This system, known as Desktop AR, turns an ordinary 2D monitor into a volumetric display, bringing virtual objects into the real world using a webcam and anaglyph glasses.


The release of HOLO-SDK arrives alongside an educational program in which Holo-SDK licences are given to educators for free, part of Perception’s effort to ensure easy access to the software and to the educational opportunities it provides. The package includes several workshop materials, such as a free one-year educational licence, workshop source code, and a workshop manual.


HOLO-SDK offers a range of features that enhance the Augmented Reality experience without requiring users to spend a fortune. The software tracks the user’s head position using the webcam and adjusts the virtual cameras accordingly, rendering holographic images that match the user’s viewing position and allowing objects to be viewed from every angle.
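
Holo-SDK itself is distributed as a Unity plugin, so the fragment below is only a conceptual TypeScript sketch of the head-coupled perspective idea described above: a webcam-derived head position moves the virtual camera so that rendered objects appear anchored behind the physical screen. The types and function names are assumptions for illustration, not part of the actual SDK.

    // Conceptual sketch of head-coupled perspective, the idea behind Desktop AR.
    // The virtual camera follows the viewer's estimated eye position so the
    // scene appears fixed in space behind the monitor. Names are hypothetical.

    interface HeadPose {
      x: number; // metres right of screen centre (from webcam face tracking)
      y: number; // metres above screen centre
      z: number; // metres in front of the screen
    }

    interface CameraPose {
      position: [number, number, number];
      lookAt: [number, number, number];
    }

    // Place the virtual camera at the viewer's eye position and aim it at the
    // centre of the screen plane, which acts as the window into the 3D scene.
    function headToCamera(head: HeadPose): CameraPose {
      return {
        position: [head.x, head.y, head.z],
        lookAt: [0, 0, 0], // screen centre in scene coordinates
      };
    }

    // Example: viewer leaning 10 cm to the left, 60 cm from the screen.
    console.log(headToCamera({ x: -0.1, y: 0.0, z: 0.6 }));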


Perception is making this form of Augmented Reality readily accessible by keeping the requirements minimal: a desktop webcam and compatible glasses are the only criteria. For home use, Desktop AR is entirely free, and users can make DIY red-and-blue glasses or purchase them online at little cost. This accessibility means the system can serve as an immersive alternative to in-person experiences for many users.


In an educational context, this form of Augmented Reality can help reconnect students with the arts and history, bringing cultural experiences to life. It also fosters a certain intimacy with the arts and history by letting users view pieces up close, with unlimited time to assess and appreciate the content before them. It is a fantastic tool for education as well as for casual exploration by all types of users.


Dr Sirisilp Kongsilp, Founder and CEO of Perception, says, “I am extremely excited to share this HOLO-SDK technology with the UK. Desktop AR is an incredible technological achievement in making Augmented Reality accessible and simple to use, without the need for expensive hardware or high-tech AR glasses. Virtual and Augmented reality have started to dominate the tech sphere, but too often they are incredibly expensive, meaning the use of the products are limited to those in a particular financial category. Perception is set to disrupt this.”




Media Contact

Lucy Johnston / lucy.johnston@mediazoo.tv / 07522 287745


About Perception

Perception, the company behind Holo-SDK, is a deep-tech company which is revolutionising the Augmented Reality sector by making it widely available across the globe, bringing objects to life through Desktop AR technology. Founded in Thailand in 2019, Perception is expanding into the UK with its immersive technology to produce a creative desktop experience for individuals. Perception is set to reshape the arts and culture sector through partnerships with museums, galleries and artists which aim to preserve human history. Its marketplace, The Morpheus Project, the first ever Holo-NFT art community, allows artists and galleries to display their holographic collectables and sell them to supporters. Its purpose-built Desktop AR platform allows any user to view 360-degree holographic images from the comfort of their own home. Perception is increasingly making its way into the UK technology sector by partnering with app developers to bring them on board with this state-of-the-art technology.