Luxexcel launches platform for manufacturing of prescription lenses for smartglasses

VisionPlatform™ 7 integrates a ready-to-use suite of technology solutions to 3D print prescription smart eyewear

Eindhoven, The Netherlands – July 14th, 2021 – Today, Luxexcel announced its next-generation manufacturing platform, Luxexcel VisionPlatform™ 7, which makes it possible for businesses to integrate prescription lenses into the production of smartglasses in their manufacturing facilities. VisionPlatform™ 7 includes new features specifically geared towards manufacturing prescription smart lenses that are lightweight, thin, and can be used in commercial frames similar to traditional eyeglasses. The platform has been developed based on market demand and customer-specific needs for flexible yet high-performing manufacturing systems.

VisionPlatform™ 7 is technology-agnostic and integrates objects such as waveguides, holographic optical elements, and liquid crystal foils during the 3D printing process. The platform includes high-tech hardware, proprietary materials, advanced software, and processes – everything required to create prescription smart eyewear on-demand. With the capabilities to manufacture prescription smart eyewear, technology companies can accelerate their Augmented Reality (AR) eyewear projects and give designers the flexibility to create the smartglasses they want.

Fabio Esposito, Chief Executive Officer at Luxexcel says, “The launch of VisionPlatform™ 7 begins a new era in the manufacturing of prescription smart lenses. To compete in the race to launch consumer-ready smartglasses, eyewear manufacturers need to address prescription in their smartglasses devices. More than 75% of the adult world population today requires prescription lenses. Luxexcel provides a disruptive solution where smart technology is seamlessly combined with a prescription lens, rather than adding prescription power to the smart device as an afterthought. Luxexcel is accelerating the market introduction of true consumer smart glasses by allowing technology companies to manufacture prescription smart eyewear in the comfort of their own manufacturing home.”

This launch follows Luxexcel’s successful product demo earlier this year, which proved that the three vital elements of AR smartglasses – a prescription lens, a waveguide, and a projector in a fashionable device – can be combined into a 3D printed prescription lens. Luxexcel’s comprehensive material knowledge and family of custom materials ensures that the integrated objects bond well with the prescription material. VisionPlatform™ 7 also prints critical features that are difficult or impossible to manufacture otherwise, such as the air gap required for a waveguide and a printed hardcoat, all without birefringence.

Guido Groet, Chief Strategy Officer at Luxexcel says, “To manufacture a device that combines prescription and smart functions requires a combination of skills in technology and optics. It can be difficult to find this combination of skill sets in one team, so we offer this experience to our partners. We provide our customers with a complete solution to manufacture prescription smart lenses so that they can focus on developing the technologies and content for the device. With VisionPlatform™ 7, more innovative features can be added to smartglasses and the only limitation is the imagination of the designer.”

VisionPlatform™ 7 features include:

  • Integration of smart technologies: VisionPlatform™ 7 is technology agnostic and seamlessly integrates a variety of objects during the 3D printing process to create prescription smart eyewear. Integrated objects include smart technologies such as waveguides and holographic optical elements, as well as active or passive foils such as liquid crystals.

  • Optimized printing processes: VisionPlatform™ 7 prints a range of powers, from high to low, on diameters from 65 mm down to below 1 mm. The platform also prints features difficult to produce with traditional lens making, such as air gaps required for waveguides, and micro lenses.

  • Strong adhesion properties: Different adhesion promoters for various objects such as waveguides and holographic optical elements ensure that integrated objects bond well with the prescription material.

  • New materials: Materials for high temperatures ensure that a range of anti-reflective and hydrophobic coatings can be applied to lenses as part of existing ophthalmic post-processing steps.

  • Printed hardcoating: A printable hardcoat is integrated as part of the 3D printing process. Luxexcel delivers an even surface at a low temperature to protect smart technologies inside the lens; the technology can even coat selected areas.

  • Lens design: Custom lens designs can be printed to address users’ most common refractive errors, including single-vision, multifocal progressive, and free-form lenses, as well as prisms.
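
As a minimal sketch of the optics involved (not Luxexcel’s actual design process): for a thin lens, the prescription power in diopters follows from the refractive index and the two surface radii via the lensmaker’s equation, which is why a printer that controls surface curvature also controls prescription power. The refractive index below is an assumed placeholder, not a Luxexcel material value.

```python
def thin_lens_power(n, r1_m, r2_m):
    """Thin-lens (lensmaker's) approximation: power in diopters (1/m)
    from refractive index n and the two surface radii in meters.
    Sign convention: a radius is positive when its center of curvature
    lies on the transmission side of the surface."""
    return (n - 1.0) * (1.0 / r1_m - 1.0 / r2_m)

# Biconvex lens in a hypothetical printable polymer (n = 1.53, assumed):
# front radius +0.25 m, back radius -0.25 m.
power = thin_lens_power(1.53, 0.25, -0.25)
print(round(power, 2))  # 4.24 diopters
```

Flattening one radius or shifting both shifts the power continuously, which is the degree of freedom a free-form printing process exploits.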

About Luxexcel

Luxexcel is the world’s only technology provider for 3D printed commercial lenses. Our mission is to accelerate the adoption of consumer smart eyewear by meeting the needs of more than 75% of the adult world population requiring prescription lenses. We enable this with our patented 3D printing manufacturing platform. Original equipment and device manufacturers use our platform to design and produce custom, lightweight, smart prescription eyewear on-demand and in a fashionable form factor suitable for the mass market. Luxexcel’s head office is based in The Netherlands, with branch offices in Belgium and the United States.



Contact:

Name: Eva Flipse

Email Address: eva.flipse@luxexcel.com

Nextech AR Solutions Acquires 3D AI Modeling for E-commerce Company Threedy.ai Inc

Nextech AR acquires scalable 3D model creation platform powered by artificial intelligence

Self-serve web AR enabled platform to drive mass adoption

VANCOUVER, B.C., Canada – June 22, 2021 – Nextech AR Solutions Corp. (“Nextech” or the “Company”) (OTCQB: NEXCF) (NEO: NTAR) (CSE: NTAR) (FSE: N29), a diversified leading provider of augmented reality (“AR”) experience technologies and services, is pleased to announce that it has acquired the previously announced Threedy.ai, Inc. (“Threedy”), an artificial intelligence (“AI”) company based in Silicon Valley, California, in an all-stock transaction valued at US$9,500,000. A definitive agreement has been signed by all parties with closing expected on or about June 25, 2021, upon satisfaction of customary closing conditions. Nextech's acquisition strategy is focused on creating net new revenue opportunities that scale with the global adoption of AR.


“By combining Threedy’s disruptive AI technology and our leading AR platform, we have just changed the game,” commented Evan Gappelberg, Founder and CEO of Nextech. “With our integrated platform powered by AI, users can quickly transform 2D objects into AR-enabled 3D, while removing the friction that currently exists within the customer value chain. Simply – one will be able to take a photo, convert it to a high-resolution 3D image within minutes, and have that item displayed on a phone or device in augmented reality in webAR. This is a true game-changer for e-commerce businesses and product manufacturers alike, and for the more than 100 million and growing consumers shopping with AR online and in stores today who are driving the mass adoption of augmented reality in everyday life.”


“Retailers including Kohl's, Pier1, and K-Mart Australia are already using Threedy’s technology to offer AR shopping experiences at scale and now together with Nextech we will create a unified, scalable 3D content creation engine for all our AR solutions,” commented Nima Sarshar, CEO of Threedy. “Threedy has created AI-powered tools that take 3D content creation for AR and other applications from a craft-production process to mass-production. It takes several hours for a typical 3D artist to create a 3D replica of a physical product. Content creation has long been the bottleneck for bringing AR to large retailers, many of whom have thousands of SKUs live at any given time. Using our virtual assembly line technology, thousands of models can be created per week, with minimal human intervention, in many categories.”


About Threedy.ai
Using Threedy’s proprietary AI and computer vision innovations, the production of 3D models can be scaled to thousands of 3D models per week. Threedy has built a truly disruptive end-to-end solution around its model creation technology for the AR industry. Through a simple JavaScript tag integration, product photos are automatically onboarded, 3D models are created for each product through the power of AI and hosted on Threedy’s cloud, and 3D visualizations are served to client properties using web AR/3D, all within a single integrated platform.
For further information, please contact:
Judith Planella at Judith.planella@nextechar.com
To learn more, please follow us on Twitter, YouTube, Instagram, LinkedIn, and Facebook, or visit our website: https://www.Nextechar.com

About Nextech AR
Nextech develops and operates augmented reality (“AR”) platforms that transport three-dimensional (“3D”) product visualizations, human holograms and 360° portals to their audiences, altering e-commerce, digital advertising, hybrid virtual events (events held in a digital format blended with in-person attendance) and learning and training experiences.
Nextech focuses on developing AR solutions; however, most of the Company’s revenues are derived from three e-commerce platforms: vacuumcleanermarket.com (“VCM”), infinitepetlife.com (“IPL”) and Trulyfesupplements.com (“TruLyfe”). VCM generates product sales of residential vacuums, supplies and parts, and small home appliances sold on Amazon.

Website: https://www.nextechar.com/press-releases-and-media/nextech-ar-solutions-acquires-3d-ai-modeling-for-e-commerce-company-threedyai-inc

Video: https://youtu.be/itRqTRVi9CQ

DARK SLOPE STUDIOS ANNOUNCES THE LAUNCH OF A VIRTUAL PRODUCTION FELLOWSHIP FOR FACTUAL ENTERTAINMENT - Deadline is July 19!

Dark Slope Studios, the Toronto-based virtual production and metaverse gaming company, today announces the launch of its virtual production fellowship. Announced by CEO Raja Khanna at the Banff World Media Festival, the studio will accept format pitches for the next technology-driven prime-time reality show obsession. Applications will be accepted starting Monday, July 19th, 2021.

Applicants are invited to submit a brief proposal with a series idea that balances innovative production techniques with compelling storytelling; visionary concepts that blur the line between the real world and the digital world. Dark Slope Studios will be looking for concepts that introduce virtual reality experiences into conventional linear television shows, make use of deep fake facial tracking elements, or combine the use of LED screens and virtual production workflows into a unique, highly original new format of factual entertainment.

The company will review all the applications and select one to three finalists. All proposals will be considered and each finalist will be awarded $5000 CAD and a chance to negotiate a development deal with Dark Slope Studios. Development deals will include mentorship sessions, travel to the RealScreen conference, and investments into concept development, concept art, prototypes and more.

The prizes awarded are for the best original ideas as judged by Dark Slope and their evaluators. A minimum of one prize will be awarded. Finalists will retain all rights. Dark Slope Studios will secure an exclusive right of negotiation with each finalist for six months; finalists are under no obligation to work with Dark Slope. Interested applicants are encouraged to sign up here and submit their full application once submissions open on July 19th.

“We are looking for ideas that appeal to a global audience. We’re a nimble, ever-evolving company with strong industry roots and are looking to support the next generation of talent by providing seasoned guidance in an environment that encourages experimentation,” said Dan Fill, President, Dark Slope Studios.

Dark Slope Studios balances technical innovation with world-class storytelling in factual entertainment. The company is on the vanguard of innovative production techniques with its proprietary Flex KS production pipeline that integrates full body and facial motion tracking with the use of LED screens, green screens and real-time rendering to create animation, FX, real time integrated live action, and gaming content.



ABOUT DARK SLOPE STUDIOS

Founded in 2018, Dark Slope Studios pushes the boundaries of multi-sensory entertainment center spectacles, virtual reality experiences, immersive games, and motion capture-driven virtual production for television. From its 8,000-square-foot studio, Dark Slope Studios leverages its warehouse-scale virtual production stage with proprietary pipeline, tools, workflows and game-engine technology to bring world-class creative properties to life. Based in Toronto, and with a global workforce, Dark Slope Studios combines advanced technologies with compelling storytelling to reimagine the future of entertainment. Visit https://darkslopestudios.com/ to learn more.

Esri ArcGIS Pro certified for 3D stereo visualization with 3D PluraView monitors

The Software ArcGIS Pro from Esri meets all requirements for the efficient display and processing of GIS, BIM, LiDAR and photogrammetric geodata in a fully integrated 3D-stereo work environment with an outstanding range of functions. The compatibility of ArcGIS Pro with the entire 3D PluraView monitor product family has now been officially certified by the manufacturer Schneider Digital.


Today, ArcGIS Pro from Esri is the most modern and advanced desktop GIS product and is used worldwide for the capture, processing and analysis of high-resolution 3D data. The constant further development of ArcGIS Pro is increasingly blurring the boundaries between Geographical Information Systems (GIS) and classic digital photogrammetry. Important functionalities from photogrammetry, such as importing aerial images, aerotriangulation, extraction of terrain models, orthorectification and stereoscopic data capture, are now also integrated in ArcGIS Pro. This ensures a continuous GIS workflow, including photogrammetric data capture, directly in the Esri environment, resulting in substantial efficiency gains. The central element here is the complete integration of all existing and newly captured three-dimensional geometries within ArcGIS Pro, and the integrated ability to represent them with full three-dimensionality on stereoscopic output devices.

Be it professional thematic maps, complex GIS analyses, geodata management or GIS data capture - with the 3D-capable software application ArcGIS Pro from Esri, professional users enjoy the benefits of the world’s leading GIS environment. For optimal geodata querying and processing, the capture and analysis of spatially connected and topologically correct features, GIS professionals have a whole range of interactive and automated tools at their fingertips with the Pro version. Locally saved geo-information in 2D and 3D format can for example be compared to or synchronized with Cloud servers and online data services. Topographic terrain features and building models at different levels of detail (LOD) can likewise be extracted and saved to file geodatabases (GDB). ArcGIS Pro also offers a range of powerful tools for the management, editing and analysis of large amounts of data, e.g. large-format aerial images, UAS and satellite images, and LiDAR point-cloud data. Real façade textures can be mapped to 3D building geometries from oblique aerial images and vehicle-based mobile camera systems. Likewise, hybrid 3D terrain models can be generated from orthoimages, combined with synthetic 3D models for vegetation and building objects.

Large data volumes - super-high resolution in 3D-stereo on the 3D PluraView monitors

The prerequisite for the stereoscopic display of 3D geodata is a monitor system with high-resolution, bright and contrast-rich 3D-stereo visualization, to best support ArcGIS Pro users. This visualization solution is provided by the 3D PluraView family of monitors from Schneider Digital, which are latency-free and work ‘plug-and-play’ with professional NVIDIA and AMD graphics cards. As there is no requirement for proprietary graphics drivers, the 3D PluraView displays are a future-proof solution for displaying geospatial data with future versions of Linux and Microsoft Windows operating systems. The extremely large data volumes that ArcGIS Pro can process require not only a professional graphics card and workstation, but also a monitor that can display topographical information and GIS content in top 3D-stereo quality. Only a bright and high-contrast stereoscopic representation of 3D content can ensure professional analysis and reliable interpretation of data. For this, the United States software supplier Esri relies on the outstanding performance of the 3D PluraView, the passive 3D-stereo beam-splitter monitors from Schneider Digital. These dual-screen systems have been the de facto industry standard for stereoscopic software applications for the past 16 years and are certified for all common 3D-stereo-capable photogrammetry and GIS applications. The fold-up beam-splitter mirror also allows the flexible use of the 3D PluraView monitors as standard monitors for monoscopic tasks. With screen diagonals up to 28”, 3D PluraView monitors provide completely flicker-free stereoscopic 3D visualizations. Thanks to one screen per stereo channel, they offer full stereo resolution up to 4K (UHD) and 10-bit color depth with brilliant brightness. Their passive polarization-filter technology provides a fatigue-free, ergonomic working environment, even in daylight office conditions and when viewing stereo together with several colleagues.

Visualize, capture and edit GIS and BIM models in stereo

ArcGIS Pro users value the accurate, pixel-precise, stereoscopic image display on Schneider Digital’s 3D-stereo monitors. The high level of viewing comfort on the flicker-free displays and the intuitive operation facilitate the handling of complex data visualizations, thereby improving overall productivity and accelerating working procedures. In combination with ArcGIS Pro, the technology of 3D PluraView monitors offers the possibility of uploading aerial image stereo pairs to capture 3D points, lines or polygons directly as topologically defined GIS elements. 3D mesh-based geometries, smart BIM models and even 3D CAD models can be edited intuitively in a fully integrated 3D-stereo environment.

ArcGIS Pro users benefit from compatibility with a wide variety of data formats when importing and integrating smart 3D models into existing GIS data sets. For subsequent visualization, the 3Dconnexion SpaceMouse is a well-proven device, allowing simple and intuitive 3D navigation within ArcGIS Pro. The additional use of a Stealth 3D mouse is recommended where the focus is on the precise measurement and capture of objects, elevations and distances. One of the biggest challenges for GIS applications is the loading of large amounts of data (e.g. detailed urban models) as digital twins, and then displaying them seamlessly in stereo. This requires photogrammetry and GIS solutions that are innovative, ergonomic, fast and reliable. The stereo-capable software application ArcGIS Pro and the 3D PluraView monitors by Schneider Digital provide this reliability through the synergy of two high-end products; in combination, they ensure a perfect 3D-stereo experience with excellent display quality. This ‘duo’ becomes the perfect workplace solution when paired with the right 3D input tools and a high-performance Schneider Digital workstation, customized for use with ArcGIS Pro.

Find out more at www.pluraview.com 

Schneider Digital – The company:
Schneider Digital is a global full-service solution provider for professional 3D-stereo, 4K/8K and VR/AR hardware. Based on its 25 years of industry and product experience as well as its excellent relationships with leading manufacturers, Schneider Digital offers innovative, sophisticated professional hardware products and customized complete solutions for professional use. Qualified advice and committed after-sales service are the company's own standards.

The Schneider Digital product portfolio includes the right professional hardware solution for the respective requirements in these areas: from high-resolution 4K/8K displays to multi-display walls. Schneider Digital is the manufacturer of its own powerwall solution, smartVR-Wall, and of the passive stereo monitor 3D PluraView. Performance workstations and professional graphics cards from AMD and NVIDIA, as well as innovative hardware peripherals (tracking, input devices, etc.), round off the product range. Many articles are in stock, guaranteeing fast delivery and project realization.


Schneider Digital is an authorised service distributor of AMD FirePRO/Radeon Pro, PNY/NVIDIA Quadro, 3Dconnexion, Stealth int., Planar and EIZO. Schneider Digital products are used primarily in graphics-intensive computer applications such as CAD/CAM/CAE, FEM, CFD, simulation, GIS, architecture, medicine and research, film, TV, animation and digital imaging.


Further information is available at www.schneider-digital.com and www.3d-pluraview.com.



Schneider Digital press contact:

LEAD Industrie-Marketing GmbH

André Geßner
Tel.: +49 80 22 - 91 53 188
Hauptstr. 46
D-83684 Tegernsee
E-Mail: agessner@lead-industrie-marketing.de
Internet: www.lead-industrie-marketing.de

Holographic Exhibition: a partnership between Perception, Imperial War Museums and the Science Museum Group

PERCEPTION PARTNERS WITH IMPERIAL WAR MUSEUMS AND SCIENCE MUSEUM GROUP TO BRING HOLOGRAPHIC AUGMENTED REALITY EXPERIENCES TO 20,000 STUDENTS WORLDWIDE

Perception, a deep-tech Augmented Reality company, has signed a Memorandum of Agreement with both Imperial War Museums and the Science Museum Group to bring holographic AR cultural experiences to life.

The partnerships will bring highlights of the museums’ collections into the homes and classrooms of audiences across the globe using cutting-edge 3D desktop AR hologram technology.

Dr. Sirisilp Kongsilp, CEO and Founder of Perception, says “These agreements not only display the rate at which Perception is expanding, but also mark an exciting opportunity to work with the culture sector to explore the power of Holographic Desktop Augmented Reality software in sharing collections in inspiring new ways. For the first time ever, this technology can bring Museum objects to audiences in holographic 3D anywhere in the world – reimagining the concept of a digital exhibition.”

Perception, a deep-tech company supported by the UK Government’s Department for International Trade’s Global Entrepreneur Programme (GEP), has today announced the signing of a Memorandum of Agreement with Imperial War Museums and the Science Museum Group to explore new opportunities for Holographic AR across the museum, arts and culture sector.

These agreements outline plans to create holographic exhibitions which bring museum artefacts to more than 20,000 students in the UK and Thailand. The agreements are set to foster a strong relationship between the tech and culture spheres, with the partnerships combining Perception’s state-of-the-art Desktop AR facilities with prized artefacts from both museums. Desktop AR is an augmented reality system that allows any 2D monitor to display holographic images, achieved by tracking the user’s head position and rendering 3D images accordingly.
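
Head-coupled rendering of this kind (sometimes called “fish tank VR”) typically re-derives an asymmetric camera frustum every frame from the tracked eye position relative to the physical screen. The following is a generic sketch of that idea under common conventions, not Perception’s actual implementation:

```python
def off_axis_frustum(eye, screen_w, screen_h, near):
    """Asymmetric view frustum for a head-tracked display.
    eye: (x, y, z) viewer position relative to the screen center, in meters,
    with z the perpendicular distance from the screen plane (z > 0).
    Returns (left, right, bottom, top) clip extents at the near plane,
    as consumed by an OpenGL-style glFrustum projection."""
    ex, ey, ez = eye
    scale = near / ez  # project the physical screen edges onto the near plane
    left = (-screen_w / 2 - ex) * scale
    right = (screen_w / 2 - ex) * scale
    bottom = (-screen_h / 2 - ey) * scale
    top = (screen_h / 2 - ey) * scale
    return left, right, bottom, top

# Viewer 0.6 m from the center of a 0.5 m x 0.3 m monitor: symmetric frustum.
# Moving the head to the right (ex > 0) skews the frustum, keeping the
# rendered 3D scene registered with the physical screen surface.
print(off_axis_frustum((0.0, 0.0, 0.6), 0.5, 0.3, 0.1))
```

Because the frustum is recomputed per frame, objects appear to float in front of or behind the monitor as the viewer moves, which is the effect described above.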

These agreements come at a pivotal time for both the Augmented Reality and Arts and Culture sectors. Augmented Reality is taking over the technology sphere at pace, and has vast potential to mirror this growth in the museum, arts and culture sector. Following a tough year for museums due to Covid-19 halting footfall, this technology presents an opportunity to share some of the UK’s most valuable and historical artefacts with a much wider audience.

Both Imperial War Museums and the Science Museum Group are preparing to give Perception access to parts of their internationally-significant collections, with the aim of sharing an impactful selection of British history. This will provide both museums with the opportunity to reach an even larger global audience, which will have an impact long after the pandemic.


Gill Webber, Executive Director Content & Programmes at Imperial War Museums, says “There is a real desire within the museums sector to reach new audiences and explore innovative ways of sharing their collections. This technology is an exciting way to explore this and we are thrilled to be working with the Perception team on the project.”

Jonathan Newby, Acting Director, from the Science Museum Group, says “Digital exploration is part of our DNA so we’re really excited to be part of this project exploring the emerging holographic AR field which will really help grow our understanding of both the creative potential and how audiences respond. We look forward to developing this relationship and seeing how our audiences can benefit from this exciting new technology.”

Media Contacts:

Lucy Johnston / lucy.johnston@mediazoo.tv / 07522 287745

About Perception

Perception, maker of Holo-SDK, is a deep-tech company which is revolutionising the Augmented Reality sector by making it widely available across the globe, bringing objects to life through Desktop AR technology. Founded in Thailand in 2019, Perception is expanding into the UK with their immersive technology, to produce a creative desktop experience for individuals. Perception is set to reshape the arts and culture sector through partnerships with museums, galleries and artists which aim to preserve human history. Their marketplace The Morpheus Project, the first ever Holo-NFT art community, allows artists and galleries to display their holographic collectables and sell them to supporters. Their purpose-built Desktop AR platform allows any user to view 360-degree holographic images from the comfort of their own homes. Perception is increasingly making its way into the UK technology sector by partnering with app developers to get them on board with this state-of-the-art tech. More info: https://www.holo-sdk.com/

About Science Museum Group

The Science Museum Group is the world’s leading group of science museums, welcoming over five million visitors each year to five sites: the Science Museum in London; the National Railway Museum in York; the Museum of Science and Industry in Manchester; the National Science and Media Museum in Bradford; and Locomotion in Shildon. We share the stories of innovations and people that shaped our world and are transforming the future, constantly reinterpreting our astonishingly diverse collection of 7.3 million items spanning science, technology, engineering, mathematics and medicine. Standout objects include the record-breaking locomotive Flying Scotsman, Richard Arkwright’s textile machinery, Alan Turing’s Pilot ACE computer, Dorothy Hodgkin’s model of penicillin and the earliest surviving recording of British television. Our mission is to inspire futures - igniting curiosity among people of all ages and backgrounds. Each year, our museums attract more than 600,000 visits by education groups, while our touring exhibition programme brings our creativity and scholarship to audiences across the globe. More information can be found at sciencemuseumgroup.org.uk.

About Imperial War Museums

IWM (Imperial War Museums) tells the story of people who have lived, fought and died in conflicts involving Britain and the Commonwealth since the First World War. Our unique collections, made up of the everyday and the exceptional, reveal stories of people, places, ideas and events. Using these, we tell vivid personal stories and create powerful physical experiences across our five museums that reflect the realities of war as both a destructive and creative force. We challenge people to look at conflict from different perspectives, enriching their understanding of the causes, course and consequences of war and its impact on people’s lives.

IWM’s five branches which attract over 2.5 million visitors each year are IWM London, IWM’s flagship branch that recently transformed with new, permanent and free First World War Galleries alongside new displays across the iconic Atrium to mark the Centenary of the First World War; IWM North, housed in an iconic award-winning building designed by Daniel Libeskind; IWM Duxford, a world renowned aviation museum and Britain's best preserved wartime airfield; Churchill War Rooms, housed in Churchill’s secret headquarters below Whitehall; and the Second World War cruiser HMS Belfast.

The Global Entrepreneur Programme (GEP)

The UK Global Entrepreneur Programme (GEP) aims to attract some of the world’s most sustainable, early stage innovative companies and entrepreneurs to the UK to set up a British based global headquarters to sell and internationalise. Since 2004, hundreds of GEP companies from all over the world have raised more than £1 billion of venture capital through this programme. These companies seek to take advantage of the UK’s strong and open business environment, under the mentorship of successful serial entrepreneurs.

In February 2020, the UK introduced the Global Talent Visa, which enables the brightest and best tech talent from around the world, including those in ASEAN to come and join the UK’s digital technology sector. And earlier in June, the UK Government launched an £8m Digital Trade Network in Asia Pacific to support new international partnerships for the digital economy with the region. The three-year pilot will enhance the UK’s digital tech capability in Asia Pacific, bringing together digital trade policy, digital tech trade and investment promotion, and a new tech entrepreneur network with Tech Nation to support scaling tech businesses as they seek to internationalise.

About the Department for International Trade (DIT):

The UK's Department for International Trade (DIT) has overall responsibility for promoting UK trade across the world and attracting foreign investment to our economy. We are a specialised government body with responsibility for negotiating international trade policy, supporting business, as well as delivering an outward-looking trade diplomacy strategy.

How Walmart infuses kindness into de-escalation training


Is it possible to teach kindness? What role does kindness play in de-escalating a frustrated customer? As Walmart discovered, not only can we teach frontline workers to be more kind, but associates are eager for training on the topic because it builds both their skills and their confidence.

In this article, you’ll learn about one of the most successful customer service training programs deployed for the world’s #1 retailer, Walmart. A combination of empathy-building, de-escalation, and handling difficult conversations, the Virtual Reality-based experiences have exceeded nearly every expectation set by the organization in terms of adoption, feedback, and performance. And it's all about infusing more kindness into customer service. Maybe we should all take this training!

Read on to learn from Strivr, a company that has deployed over 50 VR training experiences, about how VR not only improves skills but also influences how employees feel: valued, confident, and engaged.

Read more here http://www.strivr.com/blog/be-kind/

Pico Chosen as BMW’s Hardware-of-Choice for New VR Experience


Pico Interactive’s headsets have been chosen by BMW in conjunction with We are Jerry (a sports consulting agency) and AR/VR experts GOVAR, for a new VR workshop & live experience for BMW i Motorsport’s Formula E engagement.


With Pico's VR headsets, select customers, sponsors and partners of BMW i Motorsport will get access to the world's first fully immersive Formula E experience.


BARCELONA, SPAIN, July 1, 2021 -- Pico Interactive, one of the world's leading developers of innovative virtual reality (VR) solutions for B2B use, will showcase for the first time at the Mobile World Congress (MWC) in Barcelona a groundbreaking VR workshop & live experience developed by Munich-based sports consulting agency We are Jerry and AR/VR experts GOVAR for BMW i Motorsport's Formula E engagement. The virtual experience runs on the ENGAGE VR platform, developed by VR Education, which makes it possible to incorporate not only all CAD and 360° data, videos, live streams, and presentations from BMW i Motorsport, but also content from sponsors and partners. To overcome COVID restrictions on live events, such as VIP hospitality events, We are Jerry and GOVAR created a virtual experience platform - the "BMW i Virtual Garage Experience" - for BMW i Motorsport, which offers even more possibilities than the live onsite experience.

Exceptional motorsport experience
The central question was: "How do we compensate for the COVID-related limitations for the partners and guests of BMW i Motorsport and, at the same time, create completely new target group-compliant customer experiences in the digital world, all as efficiently and synergistically as possible?" explains Karsten Streng, Founder and Managing Partner of We are Jerry.

It quickly became clear to all those responsible that the high standards and requirements of BMW i Motorsport could only be met with extremely powerful VR headsets. Guests should be presented with a seamless immersive experience that lives up to the premium standards of the BMW brand as an innovation leader. The choice quickly fell on standalone VR headsets from Pico Interactive: "Pico has by far the most powerful VR headsets, which also offer 6 degrees of freedom (6DoF) for spatial movement in the experience," says Stefan Göppel, Managing Director of GOVAR. "Spatial distance does not mean sacrificing social interaction. The virtual workshop & live experience for BMW i Motorsport is therefore a real encounter: thanks to the immersion, it creates the feeling of a live meeting.”

From the Garage Tour to The Roof Top Bar: Talk&Meet with the drivers
The newly created VR event platform offers significantly more options than "just" virtually mapping the previous hospitality events that accompanied Formula E. For example, CAD and 360° data of the BMW E race cars, videos, live streams, presentations and other content can be integrated into the experience and accessed individually by guests. In addition, BMW i Motorsport has the option to provide sponsors and partners of the Formula E team, such as Julius Baer, Einhell, Fortinet and others, with their own virtual experience areas, which they can use for their own customer engagement activities. Up to 70 guests can move around the virtual live experience at the same time, or, if required, even up to 1,000.

"The virtual workshop & live experience platform developed by We are Jerry and GOVAR opens up new ways and opportunities for us at BMW i Motorsport to work with sponsors, partners and customers," says Lutz-Philipp Kugler, Brand Cooperations BMW Group, summarizing the unique project. "Instead of setting up hospitality events over and over again for each Formula E race, we only had to develop the virtual platform once and can now flexibly change and adapt it. Our sponsors and partners have responded very positively to this new form of customer engagement and interaction - not only as an alternative during the Corona pandemic, but they welcome being able to create their own virtual experience spaces for their own content and experiences. This experience clearly demonstrates the future potential of virtual events."

The first BMW i Virtual Garage Experience will take place on July 1, 2021. Selected marketing decision-makers from BMW i Motorsport partners around the world will receive a special BMW i Formula E Experience box containing the Pico headsets as well as exclusive team merchandise. Using the Pico VR headset, guests will be able to enjoy the pre-installed virtual experience and visit unique locations.

During the garage tour, a BMW engineer welcomes the guests and introduces the technical concept of the electric racer. Guests can look through the BMW Formula E racing car as if they were using an X-ray machine and view all the technical details at their leisure. They will also get first-hand insights from the two Team BMW i Andretti Motorsport drivers, Maximilian Günther and Jake Dennis, who will be on hand to answer questions. Those who aren't afraid of fast-paced action can climb into the BMW i8 Safety Car as a co-driver and take a lap of the Monaco Grand Prix circuit in 360°. Additional content is available for guests to explore on their own. The Formula E hospitality experience can then be rounded off with a virtual drink in the Roof Top Bar, where guests can chat with one another or switch to a private mode to keep a conversation limited to selected participants. Another advantage of the ENGAGE platform is that the BMW Formula E experience can also be enjoyed in 2D desktop/laptop mode.

In addition to the tour through the garage of the BMW i Andretti Motorsport team, guests receive an X-ray view of the inner workings of the BMW Formula E race car, get background information from BMW engineers about the drivetrain, for example, and can experience the cockpit to get the race car drivers’ perspective. They can also take a virtual 360° lap of the racetrack as a co-driver in the BMW i8 Safety Car and then meet other guests in the Roof Top Bar for an informal exchange of ideas. Guests at the BMW i Virtual Garage Experience will receive a high-quality hospitality box in advance of each event, which will include exclusive team merchandise as well as standalone VR headsets from Pico Interactive: simply put on the headset and dive in; the Formula E Experience is pre-installed.

Samet Simsek
Havana Orange GmbH
+4989921315188 ext.
pico@havanaorange.de
Visit on social media:
LinkedIn

3D scanning specialist botspot converted into a stock corporation

Company expands and spins off its scanning services segment

Berlin, 30 June 2021: The Berlin-based enterprise botspot, one of the world's leading developers and manufacturers of multi-sensor 3D scanning systems, was converted into a German Aktiengesellschaft (AG) following a successful financing round (valuation: 20 million euros).

This was announced today in Berlin by Sascha Rybarczyk (Board of Management, botspot AG). Due to its strong growth in serial production and through several major research and development projects with leading car manufacturers, the company will move into new, significantly larger production and administrative headquarters in Berlin before the end of August this year. For its likewise growing business area of scanning services, a legally independent subsidiary is planned as a spin-off for the second half of the year, as well as the foundation of a distribution company at the same location.

Sascha Rybarczyk, Board of Management, botspot AG: “The enormous demand from various industries for 3D technology, particularly for botspot 3D scanners, is creating a great atmosphere of change. With the restructuring, botspot is setting the course to continue shaping the 3D market with unique innovative power, the best know-how and great passion.”

With the change of the legal form into a stock corporation, Sascha Rybarczyk (previously already general manager of the company) and Bernd Timmermann (new member with many years of experience in industrial production) were appointed to the Board of Management.

Co-founder and lawyer Markus Frank, software entrepreneur and lawyer Dr Peter Becker (Jurasoft AG) and marketing specialist and multiple founder Vian Feldhusen joined the Supervisory Board.

botspot AG is one of the world's leading developers and manufacturers of multi-sensor 3D scanning systems (e.g. photogrammetry & infrared). Since its founding in 2013, the company has become an innovation leader, developing a wide range of scanners as serial products or custom scan solutions that are used in many different industries (the export quota is 94 percent). In addition to various 3D scanning systems for the digitization of people and objects, botspot offers customized scan solutions for numerous applications in the fields of e-commerce, healthcare, industrial applications and VR/AR/MR.

Press contact:

Mr. Evgeny Stroh, Marketing & PR, botspot AG

E-mail: press@botspot.de

Phone: +49 152 53409690

Internet: www.botspot.de

VR for groups | Hyro Immersive Experience Rooms by Purple!

Purple Creative Innovators is about creating attention-grabbing moments, powered by visual storytelling and technology that amazes the senses.

As a business, you want to share your stories in an inspiring way. A way that sets you apart from the masses. Amaze your clients and employees by giving them a memorable brand experience. Conference rooms are not considered inspiring or innovative. You need to capture people’s attention in a way that keeps them focused and involved from start to finish.

The Hyro Immersive Experience room is a permanent set up installed at your office, store or other preferred space. It immerses visitors in a shared, spectacular video and audio experience. A concept created by Purple. The immersive room has proven its value for multiple purposes, ranging from VIP presentations, sales pitches, product training, onboarding employees, to strategic decision-making processes and creative sessions. It is truly inspiring.


How does it work? 

Inside a closed room, visitors are astonished by high-end audio and video technology while surrounded by custom-made spherical 360-degree videos, imagery, animations and special effects. The system is controlled remotely, via a tablet or a large multi-touch interactive table. Sessions are easily moderated via Purple’s Hyro Storytelling Software. Besides pre-loading content in the control system, moderators can also upload their own material and present it in multi-layer projection mode at any location in the room.


Taken on a customized visual journey, your audience will remember more than just your brand: you will impress them with an unforgettable experience that creates a long-lasting positive connection with your brand. An immersive room helps amplify the power of your brand by shortening sales cycles, increasing deal sizes, improving customer relations and streamlining important decision-making processes.

Interested in hearing more about Purple’s innovation? Send a message to susan@purple.nl

AVATAR DIMENSION BRINGS HOLOGRAMS TO WASHINGTON, D.C.


Company Officially Opens Doors as the Only Certified Microsoft Mixed Reality Capture Studio on East Coast



WASHINGTON, DC, June 14, 2021 – What do chart-topping artists, top defense leaders, influential museums and luxury fashion houses all have in common? They’re all turning to Avatar Dimension, the high-tech studio driving the creation of photorealistic human holograms placed into immersive environments to deliver stunningly realistic virtual experiences. Starting today, Avatar Dimension’s mixed reality volumetric capture studio has opened its doors in the Washington D.C. area as the only certified Microsoft Mixed Reality Capture Studio on the East Coast, and just one of seven in the world.

Take a Virtual Tour of Avatar Dimension’s New Washington D.C. Studio
tour.avatar-dimension.com

Even in its pre-operational phase, Avatar Dimension's clients already range from enterprise, military and government agencies to marketing, gaming, fashion and entertainment. Projects have ranged from the 2020 Air Force Global Futures Report, where a chief U.S. Air Force futurist was turned into a holographic presenter for the first-ever report in virtual reality; to the COCA (Center of Creative Arts, St. Louis, MO) Virtual Web Tour, the first-ever online experience that marries 360° images with photorealistic holograms to bring viewers inside the newly renovated campus guided by 3D docents; to the Microsoft Ignite 2021 Keynote, where 3D holograms of world-renowned marine biologists Edie Widder and Vincent Pieribone were captured as part of the Microsoft Mesh virtual presentation in AltspaceVR; to Balenciaga’s Afterworld: The Age of Tomorrow, where hologram models were created for its 3D virtual catwalk experience; and an exclusive 3D NFT project, in conjunction with Emmersive Entertainment, featuring internationally renowned music artist Flo Rida.

Immersive 3D experiences are growing quickly as a far more effective way to train, entertain, and educate because virtual content can be shared globally, at scale with greater impact, and more quickly with far less expense. Examples include training doctors on surgeries from any location; celebrities making what looks like a live event appearance when actually it’s a hologram; simultaneous 21st century training for military and first responders from wherever they’re stationed; and walking the Smithsonian galleries from the comfort of your home.

“From armed forces training programs taught by lifelike virtual instructors to legendary talent dropping into your living room for an exclusive holographic performance, we are on the cusp of an entirely new technology era,” explains Dennis Bracy, CEO, Avatar Dimension. “With our state-of-the-art volumetric capture studio, we aim to enable more businesses and government agencies to benefit from the blending of the digital and physical worlds.”

“As we’ve seen through the pandemic, the internet immediately met the challenges presented by providing virtual tools for society to help replicate the real day-to-day life experience,” said David Sabey, Chairman of Sabey Corporation, a lead investor in Avatar Dimension. “We are extremely pleased to have our Intergate.Ashburn data center host Avatar Dimension and expand the limitless benefits of a new digital frontier.”

“Dimension is thrilled to partner with Dennis and the Avatar Studio team,” said Simon Winchester, co-managing partner of London’s Dimension Studio. “Avatar Dimension brings together our world-class creative and technical talents to help organizations embrace valuable new opportunities in Mixed Reality and the Metaverse. Over the past 12 months, we've seen rapidly growing adoption of volumetric content, so it's an exciting time to launch our state-of-the-art studio on the East Coast.”

“Content creators are increasingly engaging volumetric video as a new medium to bring viewers more immersive, connected human experiences,” says Steve Sullivan, General Manager of Microsoft Mixed Reality Capture Studios. “Our mission is to deliver high-quality, accessible, affordable human holograms to fuel the growth of Mixed Reality, and we’re thrilled that Avatar Dimension will be our studio partner in Washington D.C., bringing this technology to the East Coast.”

About the Studio
Purposely located in the Sabey Company’s Intergate Ashburn complex near Washington, DC, Avatar Dimension is perfectly situated to work with the region’s many enterprises, government agencies, military centers, health care companies, and entertainment businesses, as well as customers globally. Key details include:

- Avatar Dimension is a partnership of Avatar Studios, of St. Louis, Dimension Studio of the UK and Sabey Data Centers of Seattle.
- 4,000 square foot studio includes a stage, green room, client area, and offices, with backup power, worldwide fiber connections, all in a highly secure facility.
- Content captures use Microsoft’s best-in-class capture and rendering pipeline along with 70 12-megapixel Volucam cameras from IO Industries for the best captures and most realistic fidelity. The cameras capture the action simultaneously, and the video files are then merged to create three-dimensional holograms.
- Located in Sabey Data Center’s Intergate.Ashburn complex in Ashburn, VA., the data center capital of the world.
- Stage operations are also mobile so that volumetric capture sessions can take place from nearly any location or event in the US.
- During COVID-19, Avatar Dimension safely built and operated a temporary stage using remote tools with Microsoft’s HoloLens 2. Safe captures will continue per local guidelines.

Press kit available at: https://www.zebrapartners.net/avatar-dimension-press-kit/

About Avatar Dimension
Avatar Dimension is driving the future of immersive video by offering the most advanced volumetric capture stage and studio in the world. Its mission is to imagine and create stunningly realistic virtual experiences for enterprise customers and their creative agencies, with a focus on building enterprise applications for training, government programs, marketing and entertainment. Avatar Dimension is the only certified Microsoft Mixed Reality Capture Studio on the East Coast, and consists of a collaboration between Avatar Studios of St. Louis, Dimension Studio of the UK and Sabey Data Centers of Seattle, with its premier studio based in Ashburn, VA. For more information, visit www.avatar-dimension.com.

Contact:

Name: Andrea Sausedo Piotraszewski

Email Address: andrea@zebrapartners.net

Website URL: https://www.avatar-dimension.com

HTX Labs Awarded SBIR Phase II Contract with the US Navy to Support Chief of Naval Air Training (CNATRA) Primary Flight Training Program

Houston, TX (June 22, 2021) - HTX Labs, developer of the EMPACT® Immersive Training Platform, announced today it has been awarded a new Small Business Innovation Research (SBIR) Phase II contract with the US Navy. HTX Labs worked jointly with Air Force and Navy contracting officers to successfully transfer its AFWERX Phase II contract across military branches, leveraging the results from previous Air Force efforts to deliver value to the Navy.

This new Navy SBIR award is targeted at further advancing the EMPACT platform to enable the development and delivery of immersive training content for both current and incoming student pilots with the Chief of Naval Air Training (CNATRA) Primary Flight Training Program.

In particular, the project will include enhancements to HTX’s T-6B virtual aircraft and EMPACT platform necessary to create and distribute portable, multi-modal immersive training to teach aircraft system academics, provide normal and emergency procedural training with consequential learning, and implement new analytics capture and visualization capabilities required to address the evolving training needs of Navy pilots.

“Development of virtual reality part-task training is important to the Navy. Chief of Naval Air Training is constantly evaluating new and innovative ways to improve pilot and flight officer training,” said LCDR Kerry “Rooster” Bistline, TRAWING FOUR AVENGER BATCELL (Innovation) Officer in Charge. “These types of systems offer the potential to further enhance student preparation, understanding, and application of procedures throughout the ground and flight phases of training.”

To provide Navy airmen with critical pilot training content, the HTX team will leverage and enhance their EMPACT Studio immersive content authoring product to satisfy NAVAIR A1-T6BAA-NFM-100 (NATOPS) flight manual requirements, allowing NAVAIR to create expert-led and student-centered interactive pilot training material. EMPACT enables the rapid creation and deployment of immersive training content, enabling Navy airmen to experience training on Virtual Reality (VR) head-mounted displays (HMDs), laptops, and tablets, allowing students to effectively train anytime, anywhere, on any device to more broadly scale and sustain immersive training across the Navy.

“We are excited to have successfully transferred our Air Force SBIR Phase II to the US Navy to continue the important work of leveraging immersive technology to train pilots more effectively, to shorten the training cycle, and to play a role in addressing the pilot shortage across the US military,” said Scott Schneider, co-founder and CEO of HTX Labs. “We look forward to expanding our capabilities to deliver immersive training for the T-6B as well as other airframes to support the Navy’s mission of developing highly prepared and proficient airmen.”

Varjo Unveils its Varjo Reality Cloud Platform For a True-to-Life Metaverse


New Varjo Reality Cloud platform will make teleportation and real-time reality sharing possible for the first time, paving the way for a new form of human interaction and universal collaboration


Helsinki, Finland – June 24, 2021 – Varjo™, the leader in industrial-grade VR/XR hardware and software, today announced a pioneering new reality with Varjo Reality Cloud. The new platform will enable virtual teleportation for the first time by allowing anybody to 3D scan their surroundings using a Varjo XR-3 headset and transport another person to that same exact physical reality, completely bridging the real and the virtual in true-to-life visual fidelity. This real-time reality sharing will usher in a new era in universal collaboration and pave the way for a metaverse of the future, transforming the way people work, interact, and play.

 

“We believe that Varjo’s vision for the metaverse will elevate humanity during the next decade more than any other technology in the world,” said Timo Toikkanen, CEO of Varjo. “What we’re building with our vision for the Varjo Reality Cloud will release our physical reality from the laws of physics. The programmable world that once existed only behind our screens can now merge with our surrounding reality – forever changing the choreography of everyday life.”

 

For the past five years, Varjo has been building and perfecting the foundational technologies needed to bring its Varjo Reality Cloud platform to market, such as human-eye resolution, low-latency video pass-through, integrated eye tracking and the LiDAR capability of the company’s mixed reality headset. As the only company to have already delivered these building-block technologies in market-ready products, Varjo is uniquely positioned to combine them with Varjo Reality Cloud, empowering users to enjoy the scale and flexibility of virtual computing in the cloud without compromising performance or quality.

 

Using Varjo’s proprietary foveated transport algorithm, users will be able to stream a real-time, human-eye-resolution, wide-field-of-view 3D video feed at single-digit megabytes per second to any device. This ability to share, collaborate in and edit one’s environment with other people makes human connection more real and efficient than ever before, eliminating the restrictions of time and place.

 

To further accelerate bringing the vision for Varjo Reality Cloud to life, Varjo today also announced the acquisition of Dimension10, a Norwegian software company that pioneers industrial 3D collaboration. Its virtual meeting suite is designed for architecture, engineering and construction teams and will become a critical component in making virtual collaboration possible within Varjo Reality Cloud.



Additionally, Varjo welcomed Lincoln Wallen to the company’s board of directors. Wallen currently serves as the CTO at Improbable, is a recognized scholar in computing and AI, and brings to Varjo extensive knowledge of large-scale cloud computing and of moving digital content production into the cloud. Previously, Wallen was CTO of DreamWorks, where he transitioned global movie production to the cloud, including the development of a cloud-native toolset for asset management, rendering, lighting, and animation.



Varjo Reality Cloud will first be available to existing customers and partners in Alpha Access starting later this year. For more information about Varjo’s new cloud platform and its vision for the metaverse, tune into a live, virtual event today, June 24, 2021, at 12:00pm ET via varjo.com.



About Varjo:  

Varjo (pronounced var-yo) is based in Helsinki and is creating the world’s most advanced VR/XR hardware and software for industrial use, merging the real and virtual worlds seamlessly together in human-eye resolution. www.varjo.com 

###

 

Media Contact:



press@varjo.com

Brittany Edwards

Carve Communications for Varjo

210-382-2165



Engineering & Computer Simulations Unveils First Haptics Technology Lab in Orlando

This unique lab will be used to enhance the effectiveness of simulation training for military and first responders.  


June 14, 2021 - (Orlando, FL) – Waymon Armstrong, CEO/president of Engineering & Computer Simulations (ECS), announced that ECS has opened one of the first haptics labs in Central Florida that is specifically focused on the integration of haptics products within various extended reality (XR) simulation and training programs. Located at the corporate headquarters, the lab will enable the ECS technology team to perform haptics research and development, as well as evaluation, assessment, integration and product development focused on haptics technology.

As one of its first projects, the team, led by Shane Taber, Vice President of Operations – Orlando, will evaluate and assess specific haptics vendor products for a research and development program that is being funded through 2024 by the Army’s Simulation and Training Technology Center (STTC), a division of the Combat Capabilities Development Command (CCDC) Army Research Laboratory’s Human Research and Engineering (HRED) Directorate. The vendor gloves being studied include the VRgluv, which was just delivered last week; the HaptX Gloves DK2, which was the first set delivered in Florida; and gloves from BeBop.

The lab space will allow the ECS team to perform additional critical studies and programs, such as front-end analysis and testing and assessment. The initial focus of the studies will be medical training, but plans are already underway to include aviation maintenance and marksmanship simulation programs as well.

Armstrong says: “We are excited to establish this in-house technology lab that allows us to further integrate and refine haptics technology in virtual, augmented, mixed, and extended realities. The ability to develop and test within this space ensures we continue to enhance the innovative global training products and services for our warfighters, first responders, and enterprise clients.” 

Paul Sohl, Rear Adm. USN (ret.) and CEO of the Florida High Tech Corridor Council adds: “In engaging with great leaders like Waymon, I've come to realize that tech is really about the positive impact it can have to benefit the people, neighborhoods, and communities we serve. Tech to serve others, such as our military service members and our first responders … now that’s cool in my mind.” 

Derived from the Greek word for touch, “haptic” is defined by Oxford Languages as “the use of technology that stimulates the senses of touch and motion, especially to reproduce in remote operation or computer simulation the sensations that would be felt by a user interacting directly with physical objects.” Haptic technology simulates the sensation of touch and feedback for the user, a capability that enhances the realism of ECS’s training simulations.

By pairing specialized haptics gloves with a virtual reality (VR) headset, ECS delivers training programs that offer a realistic sense of touch and natural interactions within an immersive training environment. For example, in a medical environment, the haptics integration would give combat medics and healthcare professionals the tools they need to improve the quality of their training and retention, and potentially save more lives.

Sheena Fowler, Vice President of Innovation for the Orlando Economic Partnership, states: “Our Orlando Tech Council aligns our region’s tech and innovation ecosystem resources to accelerate progress towards an innovation-based economy. We believe this unique haptics lab, established by our friends at ECS, will enhance Central Florida’s technology and innovation community and help to strengthen our region as one of the leading communities for innovation.” 

 

George Cheros, President & CEO at National Center For Simulation adds: “ECS has always been an amazing innovator and partner in our community and industry. It is no surprise that they are one of the first technology and training firms to dedicate lab space to research and advance haptic technology for implementation in our simulation and training programs.”  

 

About ECS 

ECS is an award-winning global training and technology solutions company, headquartered in Orlando, Florida with operations in Huntsville, Alabama and San Antonio, Texas. An industry innovator with a vast portfolio of training programs, ECS is a trusted provider of training, maintenance, and sustainment solutions for military service members and first responders around the world. http://www.ecsorl.com/

Contact: Lynne Garrow 

(407) 595-1978 (cell) 

lynne@mycapitalcommunications.com 

Uvisan UV-C Disinfection Cabinets are used in LBE venues, by the National Health Service, and in filming Shaun the Sheep!

Uvisan has a UV-C disinfection cabinet that disinfects 30 headsets in 5 minutes!

VR/AR headsets present an enormous risk of spreading infection, as they are placed on the user's head and face and are often shared among multiple users. Popular opinion holds that the facial region is the danger area of the headset, although interestingly, the vast majority of viral and bacterial transmission actually happens in the areas the hands touch (that's why it is so important we all wash our hands often). Uvisan cabinets provide 360° surface coverage and kill 99.99% of all viruses and bacteria in 5 minutes. Other UV-C disinfection products designed for VR headsets do not cover the outside surfaces where the hands touch, so they are not suitable for reducing viral transmission.

The cabinets also contain USB fast-charge ports for up to 30 headsets, giving a convenient place to charge and store your headsets, with robust lockable doors for additional safety.

  • 360-degree coverage - Disinfect the areas that matter.

  • Disinfect up to 30 headsets in 5 minutes - Quick and easy at scale

  • Uvisan cabinets kill 99.99% of all viruses - Protect those using your headsets.

  • Significantly cheaper than other alternatives

  • Lock, Store, Charge and Disinfect all in one cabinet.

Uvisan cabinets are used in multiple LBE venues, by the NHS (National Health Service) in the UK for its VR training suites, and by Aardman Studios in filming the latest installment of Shaun the Sheep, and they are recommended by HP.


VRARA Members save 15% off Uvisan cabinets! Email info@thevrara.com for your code!

More info https://www.uvisan.com


Global partnership announced between Schneider Digital and DAT/EM Systems International

The dual-screen, passive stereo 3D PluraView systems are 100% compatible with all DAT/EM products, including the flagship Summit Evolution Professional digital stereoplotter.


Anchorage/Miesbach, May 2021 – The software producer DAT/EM Systems International of Anchorage (USA) and the 3D hardware specialist Schneider Digital of Miesbach (Germany) have announced a global partnership agreement for the distribution of the latest 3D stereoscopic vision technology from Schneider Digital to DAT/EM customers around the globe. Under this cooperation, DAT/EM partners worldwide can now distribute the 3D PluraView stereo monitors and Schneider Digital workstations directly to their clients as complete photogrammetric workplace solutions, for example in combination with the Summit Evolution Professional digital stereoplotter.

“The combination of the state-of-the-art 3D PluraView displays with the Summit Evolution software provides the clearest visualization of stereo imagery for photogrammetrists worldwide,” said Jeffrey Yates, General Manager of DAT/EM Systems International. “The increased visual acuity allows map compilation to be more productive with greater accuracy.” 

DAT/EM software and Schneider Digital photogrammetry hardware products combine into the most precise, productive and user-friendly stereoscopic data-capture environment, with unparalleled display quality and the fast response times needed to work even with terabyte-sized imagery.

“We have been working very closely with DAT/EM International and DAT/EM Europe for many years and have built a trusted relationship which benefits our joint user community in the area of GEO-IT applications”, said Josef Schneider, the CEO and founder of Schneider Digital. “Together with DAT/EM on the software side, our objective always was and always will be to provide the best functionality and quality for the photogrammetry workstations and 3D PluraView monitors that we manufacture”. For the user, this means a completely homogeneous and integrated 3D stereo workstation solution with perfectly coordinated hardware and software components.

Further information at https://www.3d-pluraview.com/en/ 

About DAT/EM Systems International: 

DAT/EM Systems International, located in Anchorage, Alaska, USA, has been developing photogrammetric software since 1987. DAT/EM specializes in 3D stereo viewing and feature data collection software. Its products include the Summit Evolution digital stereoplotter and proprietary capture interfaces that allow Summit Evolution to digitize directly into MicroStation®, AutoCAD®, ArcGIS® or Global Mapper®. For more information, please visit DAT/EM’s website at https://www.datem.com/ or email sales@datem.com.



Schneider Digital – The company:

Schneider Digital is a global full-service solution provider for professional 3D-stereo, 4K/8K and VR/AR hardware. Based on its 25 years of industry and product experience as well as its excellent relationships with leading manufacturers, Schneider Digital offers innovative, sophisticated professional hardware products and customized complete solutions for professional use. Qualified advice and committed after-sales service are the company's own standards.

The Schneider Digital product portfolio includes the right professional hardware solution for the respective requirements in these areas: High resolution 4K/8K to multi-display walls. Schneider Digital is the manufacturer of its own powerwall solution smartVR-Wall and the passive stereo monitor 3D PluraView. Performance workstations and professional graphics cards from AMD and NVIDIA as well as innovative hardware peripherals (tracking, input devices, etc.) round off the product range. Many articles are in stock. This guarantees fast delivery and project realization.

Schneider Digital is an authorised service distributor of AMD FirePRO/Radeon Pro, PNY/NVIDIA Quadro, 3Dconnexion, Stealth int., Planar and EIZO. Schneider Digital products are used primarily in graphics-intensive computer applications such as CAD/CAM/CAE, FEM, CFD, simulation, GIS, architecture, medicine and research, film, TV, animation and digital imaging.

Further information is available at www.schneider-digital.com and www.3d-pluraview.com.


Schneider Digital press contact:

LEAD Industrie-Marketing GmbH

André Geßner

Hauptstr. 46

D-83684 Tegernsee

Tel.: +49 80 22 - 91 53 188

E-Mail: agessner@lead-industrie-marketing.de

Internet: www.lead-industrie-marketing.de


ThirdEye X2 MR Smart Glasses Now Compatible with AT&T FirstNet

Today, ThirdEye announced its solutions are available for use on AT&T FirstNet, the nationwide, high-speed wireless broadband communications platform dedicated to America’s first responders. The ThirdEye X2 MR Glasses and RespondEye, a software platform suite that connects to EMS backend systems and enables first responders to access data on demand, now support access to the FirstNet network. This enables First Priority capabilities on FirstNet, such as always-on priority and preemption for first responders. Additionally, ThirdEye is HIPAA certified for both its X2 MR Glasses and the RespondEye platform.

ThirdEye Now Available on FirstNet® Delivering Reliable Communications Capabilities for First Responders

Recently launched AR RespondEye Software Approved for use on FirstNet

 

PRINCETON, N.J., June 8, 2021 –  ThirdEye, a leader in augmented and mixed reality enterprise solutions, today announces it is available for use on FirstNet, the nationwide, high-speed wireless broadband communications platform dedicated to America’s first responders. This collaboration provides ThirdEye’s X2 MR Glasses and RespondEye software access with FirstNet. First responders can now securely and quickly obtain vital patient information through ThirdEye’s new telehealth solution. 

 

FirstNet, built with AT&T, is a public-private partnership with the First Responder Network Authority (FirstNet Authority) – an independent agency within the federal government. It’s designed with and for first responders and the public safety agencies and extended community that could be called on to support them. 

 

The ThirdEye X2 MR Glasses and RespondEye software support access to the physically separate and dedicated FirstNet network core, which enables First Priority® capabilities on FirstNet – always-on priority and, for first responders, preemption – and the FirstNet Band 14 spectrum. FirstNet eligible customers can feel confident the X2 MR Glasses combined with FirstNet services will provide the necessary critical connectivity in a reliable, highly secure and cost-effective manner. 

 

As part of the agreement through the FirstNet Embedded IoT Program – a program that allows industry-leading FirstNet-eligible devices to be combined with FirstNet to create a bundled, end-to-end solution for a single monthly fee for first responders, public safety agencies and the extended community – the ThirdEye solution will include FirstNet connectivity.

 

This collaboration comes after ThirdEye successfully launched its pilot programs with the Marcus Hook and Upper Merion Fire Departments in Pennsylvania. ThirdEye’s X2 MR Glasses with the RespondEye software help prevent first responders from coming in contact with potential COVID-19 patients using the attachable FLIR (forward-looking infrared) thermal sensor to detect elevated body temperatures – one of the symptoms of the disease. By using the smart glasses, medics can be hands-free and receive digital information displayed live in their field of view. 

 

“Our telehealth software assists first responders with their everyday tasks while also equipping them with the technology to help protect them from COVID-19,” said Nick Cherukuri, CEO and founder of ThirdEye. “FirstNet provides first responders access to the high-speed and reliable connectivity when facing emergencies. We continue to update the RespondEye’s features upon requests from the EMS teams as we work toward making the implementation of AR in healthcare the new norm.”

 

By wearing the smart glasses, a paramedic in the field can directly contact a doctor for assistance. The doctor can then stream live video from the scene to be displayed on a computer, tablet or smartphone at the hospital via the RespondEye platform. This allows the doctor to assess a patient and make treatment recommendations without having face-to-face contact. The RespondEye software is cross-compatible and runs on ThirdEye X2 MR Glasses as well as iOS/Android phones, tablets and web browsers. The software is currently free to first responders with tiered pricing for data usage being released soon.

 

“We’re pleased to welcome ThirdEye as a member of our FirstNet IoT Embedded Program,” said Scott Agnew, assistant vice president, product marketing, FirstNet Program at AT&T. “This allows us to further our mission to deploy, operate, maintain, and enhance the only nationwide wireless platform dedicated to public safety and the extended community that support public safety response.” 

 

At just 300 grams, ThirdEye’s X2 MR Glasses are the lightest mixed reality glasses on the market and are suited for comfortable, extensive wear in all conditions. The glasses run on Android 9.0 and are powered by a Snapdragon xR1 Qualcomm processor chip. ThirdEye also developed custom augmented reality first responder software via its RespondEye software platform suite, which connects to the EMS backend systems. This software combined with ThirdEye’s HIPAA certification enables first responders to access all of their data on demand and safely via the X2 MR Glasses.

 

ThirdEye achieved HIPAA certification for its X2 MR Glasses and RespondEye platform. HIPAA compliance is a requirement for vendors that handle any patient medical data, ensuring both the security and the confidentiality of patient information. 

 

For more information on ThirdEye, visit www.thirdeyegen.com. For more information on FirstNet, check out FirstNet.com 

 

FirstNet and the FirstNet logo are registered trademarks of the First Responder Network Authority. All other marks are the property of their respective owners.

###

 

About ThirdEye 

ThirdEye is a leader in smart glasses and AR/MR software development. While many companies today use just smart glasses or only software, ThirdEye provides a full end-to-end package for its customers and employees. It has hundreds of software developers creating apps ranging from games to entertainment to enterprise applications and its products retail around the world. From everyday consumers to Fortune 500 companies, ThirdEye is bringing the power of mixed reality globally. Mixed reality has the potential to change the way the world operates, and ThirdEye's vision is to help generate the future.   

 

Andrea Mazzola | Account Executive | Uproar PR

856-873-4444

amazzola@uproarpr.com

RECAP: 2021 VR/AR Global Summit - Metaverse, Convergence, and Adoption (by Jon Jaehnig, ARPost)

By Jon Jaehnig

What is the metaverse? Does it require blockchain? What’s the future of XR?

The VR/AR Association held their annual Global Summit for North America from June 2 through 4. For the second year in a row, the conference was held online but that didn’t stop it from delivering the kind of insights and announcements that we’ve come to expect from the annual event. Highlights included new platforms, metaverse definitions, tech convergence, and more.

“We will have an audience of thousands of people in attendance,” VR/AR Association Global Director, Kris Kolo, said during opening remarks. “We are thrilled to have you here with our hundreds of speakers and presenters.”

What the Smartest People in the Room Are Talking About

Our customary warning: even devoting three days to the conference, we weren’t able to catch everything. Here are some of the big trends we were able to glean from the hours of talks we took in during the event.

XR Adoption During COVID-19

The world (and the XR metaverse) is in a strange place as we look forward to the lifting of coronavirus restrictions while remaining in what is hopefully the end of the quarantine period that, as unwelcome as it was, offered huge benefits to the XR industry.

“Virtual became the new normal. That bodes well for the technology space in general,” 8th Wall Vice President of Product, Tom Emrich, said in a “Burning Questions” panel discussion on the first day of the summit.

This period in our shared history didn’t just increase the use of XR, it changed how people use and perceive XR technology and its use cases.

“VR is social. It’s not just about gaming anymore,” Cube XR LLC partner Larry Rosenthal said in the panel. “What COVID has done is it has made XR more like the telephone than the television.”

While speakers at different panels and presentations didn’t have a universal stance on what the future will look like, they did agree on one thing, as summed up by industry expert Deborah Worrell in an “Everything VR/AR” podcast recorded live at the summit: “We’re not going back to the way it was. There is no new normal.”

Why Isn’t the Future Here Yet?

Despite Rosenthal’s analogy, XR is still not as ubiquitous as the television, and there was no shortage of discussion about why not.

“The lens of meme theory tells us a very damning story of why AR failed to take up,” Auki Labs CEO, Nils Pihl, said in a talk on the impact of persistent AR. “When AR is hard to share, its [appeal] is limited.”

On the enterprise side, XR has been difficult to adopt and scale in part because of legacy management methods.

“There is a fundamental difference between managing smartphones and tablets and managing AR/VR devices,” ArborXR COO and co-founder, Jordan Williams, said in a talk on “Why Oculus for Business and Other Solutions Aren’t Working for XR Deployments.” “AR/VR devices are regularly shared between multiple users, and smartphones and tablets are usually not shared.”

One of the recurring themes at XR conferences that resurfaced at this event is that XR advocates tend to be most vocal during XR conferences.

“A lot of our conversations over the last ten years have been quite insular,” UgoVirtual CEO, Michael Cohen, said in a talk on hybrid events. “They’ve been about how we launch these amazing technologies,” rather than on the value of the experiences delivered.

The Metaverse and “Post-Reality”

Cohen wasn’t the only one who pointed out the shortfalls of XR’s branding. There was also a lot of contention over our understanding of one concept that was discussed almost continuously: The Metaverse.

“The term ‘metaverse’ and what it implies are still unknown to the public,” Outpost Capital Founding Partner, Ryan Wang, said in his talk “One more step into Metaverse: XR meets Blockchain.” “One of the key reasons is that it’s difficult to understand the term and what it means.”

Wang suggested that the term should apply to any platform that promotes a “digital lifestyle” including companies like Peloton. One benefit of this is that it answers the question of whether there will ever be a single dominant metaverse. Instead, there will be a constellation of purpose-built metaverse platforms and providers creating a “multiverse.”

Others argued for an even more user-friendly definition, though one that potentially reduces discussion on the matter.

“We already have a metaverse. It’s called the internet,” MetaVRse co-founder Alan Smithson said in the “Burning Questions” Panel. “What we’re doing now is creating more visual ways of interacting with it.”

No matter how you define the metaverse or the role that you think it plays, Google AR Partnership’s Global Head of Creative, Matthieu Lorrain, said that it was contributing to a period of “post-reality.”

“The radical transformation of our perceived reality will be the most significant cultural change in our lifetime,” Lorrain said in his talk “Welcome to Post-Reality.”

The “Convergence” of Emerging Technologies

The VR/AR Global Summit’s more robust conversations on the metaverse also tended to involve “the convergence” of XR with other emerging technologies, usually blockchain.

“With blockchain, you can claim the ownership of not just digital items, but digital land,” said Wang, who also commented on the ability of blockchain and edge computing to allow decentralized management of vast amounts of spatial data. “With the popularity of the creator economy, there are more and more players being incentivized to become creators.”

The idea of creators creating the metaverse themselves through blockchain-enabled peer-to-peer transactions inspired many speakers, though some admitted that we haven’t pinned this down yet.

“It’s kind of the Wild West right now, but what we’ll see is the development of a new parallel economy,” industry expert Amy Peck said during the “Burning Questions” panel. “It’s really a peer-to-peer economy and we’re seeing it already with digital goods and NFTs.”

In the same panel, futurist Anne Ahola Ward commented on NFTs potentially being a way for people to establish and secure entire virtual identities.

Announcements and Introductions

One of the big draws of the Global Summit is the industry announcements and introductions, of which there were plenty this year.

Involve XR from Lumeto

Raja Khanna introduced Involve XR from Lumeto, a multi-user synchronous immersive learning platform for medical training and crisis response training. The virtual studio is populated by interactive virtual equipment and a responsive AI-powered patient.

“Our goal as a company is to get out of the way of the trainers. We don’t want to hardcode the session into the experience,” Khanna, the company’s CEO, said in a talk introducing the platform. “It’s a big, ambitious goal. We’re not there yet but we’re getting there very, very quickly.”

The platform is piloting with the American College of Chest Physicians, as well as select universities in the United States and Canada, with plans to deploy more widely later this year.

HoloMedX

HoloMedX is another medical training platform, but it takes a different approach. The platform uses ray casting to construct a patient-specific 3D model from 2D scans, combined with spatial models of surgical implants. The model is then displayed on a Looking Glass holographic display and manipulated with an air controller designed after medical equipment.

This results in a hyper-realistic XR display without the use of a headset. The company’s goal is to use the technology, which is rolling out in the coming weeks, to train practitioners as well as to give patients a better pre-op understanding of a surgical procedure.
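In its simplest axis-aligned form, the ray-casting step described above can be sketched in a few lines. This is a toy illustration with synthetic data, not HoloMedX's actual pipeline:

```python
import numpy as np

def max_intensity_projection(volume: np.ndarray, axis: int = 0) -> np.ndarray:
    """Cast parallel, axis-aligned rays through a 3D volume (a stack of
    2D scan slices) and keep the brightest sample along each ray.
    This maximum-intensity projection is the simplest form of volume
    ray casting used to turn a slice stack into a single 3D-style view."""
    return volume.max(axis=axis)

# Synthetic "scan": a 4-slice stack of 3x3 images with one bright voxel.
volume = np.zeros((4, 3, 3))
volume[2, 1, 1] = 5.0

image = max_intensity_projection(volume)  # 2D projection of the whole stack
```

A real pipeline would cast rays at arbitrary angles, interpolate between voxels, and composite opacity as well as intensity; the axis-aligned maximum shown here is just the core idea.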

Announcements from Altered Ventures

The last two big announcements came from venture capital fund Altered Ventures: the launch of a VR art gallery for NFTs, and a partnership with and investment in Victoria VR, a blockchain-based VR MMO/RPG. Within Victoria, all assets can be bought and sold by users.

“We see the future of Victoria VR as a universal platform for VR experiences,” said Victoria VR co-founder and COO Adam Bém. “To create not just a game, but a virtual world where people are able to live and work.”

See Around in the Metaverse

The VR/AR Global Summit - Summer is behind us once again. However, that doesn’t mean the conversations need to stop. The next Global Summit takes place in Europe this fall, September 29 to October 1. The VR/AR Association also holds regional chapter events throughout the year, not to mention its more focused industry-specific forums. Maybe by next year, we’ll agree on what the metaverse really is.


Saab (a global defense and security company) is using Varjo technology for its new Gripen E/F simulators

Here’s the latest training and simulation use case from Varjo. Today, Saab announced an expanded collaboration with Varjo in which the company’s technology will be integrated in Saab’s new Gripen E/F simulators. More specifically, Varjo’s new XR-3 mixed reality headset will be used in all of Saab’s flight simulators to give pilots a truly immersive training experience with photorealistic visual fidelity so they can feel like they’re flying in the real world.

Saab and Varjo bring virtual reality to flight simulators

Saab and Varjo, which develops virtual and mixed reality technologies for professional use and was founded in 2016, have advanced their years-long collaboration to the point where Varjo’s technology will be integrated into Saab’s new Gripen E/F simulators. The collaboration has its roots in Varjo’s innovation, which was critical to the successful development of state-of-the-art simulations.

“When we did our first experiments with commercial devices, we received feedback from the pilots that they were unable to read text in virtual reality because the resolution was not sufficiently high. This was not a problem with Varjo’s technology,” says Stefan Furenbäck, Saab’s Head of Tactical Environment Simulation and Visualisation.

“Our collaboration has been smooth and straightforward. We understand each other’s needs and how we can solve our shared technological problems. Two companies with similar cultures, with innovation high on their agenda and ready to discuss everything,” Seppo Aaltonen, Varjo’s Chief Commercial Officer describes the collaboration.

Pilot training requires trainees to be able to read text and see even the smallest details. Pilots need to feel like they are flying in the real world while using the simulator. Up until now, it has been necessary to use CAVE or dome-shaped simulators to create a virtual reality experience that is as immersive as possible for the pilot. These simulators are very large and expensive, building their display systems requires a lot of work, and they are also difficult to move.

Conventional dome simulators have a flat screen, and the 3D view can only be achieved using special glasses. Varjo’s virtual reality headsets have separate screens for each eye, so they come with a built-in 3D capability.

Varjo’s Bionic Display is based on the idea of how the human eye works; you see everything in the middle of your field of vision in ultra-high resolution and anything in the peripheral vision in lower resolution, enabling a fully natural and smoothly performing virtual experience.

This is done by projecting patterns onto the surface of the eye with infrared LEDs; small cameras monitor the reflections, and the rendered resolution adapts to the movements of the eye. Human-eye-level resolution can be achieved without supercomputers if the system knows what point the eye is looking at at any given moment: the maximum computing power is always focused on that point.

Varjo’s technology uses video cameras to constantly capture the environment for the purposes of mixed reality. This means that the images from the real world work on the same principle – the high-resolution image comes from the focal point. This allows powerful gaming computers to be used to operate the technology instead of supercomputers.
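The gaze-contingent rendering idea described above can be sketched in a few lines; the zone radii and sampling levels below are illustrative values chosen for the example, not Varjo's actual parameters:

```python
import math

def sampling_level(pixel, gaze, fovea_radius=200.0, mid_radius=500.0):
    """Return a coarseness level for a pixel given the tracked gaze point
    (all coordinates in screen pixels): 1 = full resolution inside the
    fovea zone, 2 = half resolution in the mid zone, 4 = quarter
    resolution in the periphery. Compute is concentrated where the eye
    is actually looking, which is the core of foveated rendering."""
    distance = math.hypot(pixel[0] - gaze[0], pixel[1] - gaze[1])
    if distance <= fovea_radius:
        return 1
    if distance <= mid_radius:
        return 2
    return 4

gaze = (960, 540)  # gaze point reported by the eye tracker

level_center = sampling_level((960, 540), gaze)    # at the gaze point
level_periphery = sampling_level((1900, 540), gaze)  # far periphery
```

A renderer would use these levels to pick shading rates or render-target resolutions per screen region, re-evaluating every frame as the tracked gaze point moves.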

The collaboration between Saab and Varjo has worked very well for several years now, and it has expanded to Varjo technologies being integrated into the Gripen E/F fighter simulator.

“We’re finalising the basic functionalities in our own simulator so that we can use Varjo XR-3 headsets in all our flight simulators. We’ve previously carried out smaller, independent prototype-like projects but now we’re integrating them into our actual flight simulators,” says Furenbäck.