
Presented 360ViSi project at international conference

Online Educa Berlin is an annual global, cross-sector conference and exhibition on digital learning and training. The conference gathers participants from around the world to learn about developments in learning technologies.

The University of Stavanger was invited to talk about 360° video simulation and used the 360ViSi project as its case. Video producer Mari Linn Larsen and media engineer Kåre Spanne presented the project to an audience of 60 people.

“I am very pleased that so many wanted to hear about our experiences from the 360ViSi Erasmus+ Knowledge Alliance and were interested in learning how to use 360° simulation in their own education,” says Mari Linn Larsen.

The feedback from the participants was positive, and many of them had questions.

“The presentation sparked an interesting discussion in the room, where we could elaborate further on the kind of situations where 360° simulation can enhance education and increase the access to training,” Kåre Spanne explains.


Inspiring project meeting in Nottingham

Partners from all four countries participated in the project meeting. The aim was to bring all partners up to speed on each other’s cases and progress, as well as to prepare for the final stretch of the project.

UoN’s Immersive Suite

University of Nottingham (UoN) has its own Immersive Suite on campus, which the project partners were allowed to visit. Michael Taylor, Kathrine Wittingham and lecturer Gill Langmack demonstrated how the technology works and how UoN used it in the training of nurses.

In the Immersive Suite, images or video are projected onto three walls, allowing students to interact with the content through hotspots and quizzes.

Michael Taylor demonstrates how the Immersive Suite can be used.

Michael Taylor has tested the 360ViSi Community Nursing Case in the Immersive Suite, and the project partners were allowed to try the technology with a quiz. See some examples in the video below.

Video footage of the Immersive Suite in use.

360° images in Xerte

Fay Cross from the University of Nottingham presented how the Xerte technology is used to create Reusable Learning Objects, and how 360° images can be used in the system.

The Xerte Project aims to provide high quality free software to educators all over the world, and to build a global community of users and developers. The Xerte technology was developed at the University of Nottingham. Check it out here!

360 image with hotspots in Xerte.

Other topics covered on the first day were the implications of the EU web content accessibility guidelines (WCAG) for the design of Reusable Learning Objects (RLO) and 360° media cases specifically, and Design Thinking as a framework for RLO development.

Project meeting – Day 2

The second day of the transnational meeting started with focus on evaluation. Professor Heather Wharrad (UoN) introduced the evaluation requirements and toolkit for the project.

Each partner university gave an update on the development of and experiences from their respective cases, presented their methods for collecting data, and shared preliminary student feedback from each case.

Project partner Screen Story from Norway presented a business case with a new client, which is a direct result of their participation in the 360ViSi project.

Quasar Dynamics, project partner from Spain, has developed a digital tool which will enable educators to produce Reusable Learning Objects without help from technical experts.

Screenshot from Quasar Dynamics’ new 360° production tool, which is still under development.

The tool is developed based on needs identified by educators in the 360ViSi project. Quasar presented the beta version of the tool and invited everyone to try it and provide feedback.

Read more about the development of the tool.

CoViRR insights

At the end of Day 2, Dr Matthew Pears from the University of Nottingham shared best practices for co-creating Virtual Reality and 360° educational resources, based on insights from CoViRR, another Erasmus+ project the University of Nottingham has been a part of.

Final day

The last day of the meeting started with input about chatbots in education from the CEPEH project, presented by Stathis Konstantinidis, James Henderson and Matt Pears.

CEPEH is an Erasmus+ strategic partnership that aims to co-design and implement new pedagogical approaches and, in particular, chatbots for European Medical and Nursing schools.

From left: James Henderson, Matt Pears and Stathis Konstantinidis shared insights from the CEPEH project.

There is growing evidence that chatbots have the potential to change the way students learn and search for information. Chatbots can quiz existing knowledge, enable higher student engagement with a learning task, or support higher-order cognitive activities.

The 360ViSi project members were given a mini workshop showing how easy it can be to create a chatbot – about the 360ViSi project.

Way forward

The transnational project meeting was concluded after the partners had planned the final stretch of the project. This involves completing all tools and solutions, collecting data, evaluation, dissemination and final reporting.

Based on input from other projects, like CEPEH, and due to the successful cooperation in the 360ViSi project, the partners are already considering applying for new Erasmus+ projects, with the aim of combining chatbots and 360 media in education.


Turning 360° technology into business opportunities

Through this project we have confirmed that 360° pictures and videos can indeed be used for educational purposes. But can they also open up business opportunities for companies? The answer is yes.

Making art more accessible

By combining 360° photos and videos, Screen Story, in collaboration with Stavanger Art Museum, made three virtual tours with the purpose of making art more accessible to the masses.

Experiencing art can be enticing, but not everyone has the opportunity to visit an art museum, be it due to physical challenges, geographical impracticalities or an ongoing pandemic. By making a digital representation using 360° video and photo technology, anybody can visit Stavanger Art Museum and the exhibitions Kitty Kielland, Frida Hansen and The Hafsten collection.

“We have some pretty unique works of art here that are at the forefront of Norwegian art history,” says Hanne Beate Ueland, director of Stavanger Art Museum.

From shooting the Hafsten collection. Elin Lillebråten from Stavanger Art Museum guides the visitors through the virtual tour.

When the pandemic hit, the museum closed and everybody was sent home. New ways had to be found for visitors to enjoy the art.

“To experience an art museum is very much about walking into a space and being surrounded by the art. We talked to people who we respect within technology, and they suggested that we contact Screen Story, and we did. They had this project going on with 360° video. We immediately found that quite interesting and inspiring, and it turned out to be a project where we could collaborate quite easily,” Ueland continues.

Turning ideas into business

Screen Story took on the task and saw an opportunity to gain new know-how in the field of 360° technology. Project leader at Screen Story, Øyvind Torjusen, says it was a perfect fit.

“This proved to be a great case for the 360ViSi project. Screen Story’s main task in this project is to look at the transferability to business and create a 360° product which in turn can be commercialised. Having an actual need from an institution such as Stavanger Art Museum is a much better starting point than a made-up scenario,” he says.

And action!

The recording process took a total of six days, with over 170 images and 48 videos shot, all in 360 degrees. The script, written by the three mediators, each presenting a different exhibition, ran to about 18 pages. A whopping 3.25 TB of data was needed for all the material.

Shooting in a museum can be a challenge. Lighting conditions are often set to preserve the art, and are not optimal for shooting with a 360° camera, which requires a lot of light. And because of the nature of shooting in 360 degrees, there is nowhere to hide a light rig, which complicates shooting even more.

Stian Skjerping prepares to shoot the very first 360 project Screen Story did for Stavanger Art Museum, the exhibition “In the clouds”.

“We had to bring up the brightness of the fixed lights to the maximum while shooting, but even then it was a bit too dark in some of the rooms,” says Stian Skjerping, videographer at Screen Story.

This meant some extra hours in post-production to brighten the footage.

“But when you brighten the footage you add more noise to the image as well, which meant we had to use some noise removal tools to clean it up,” he continues.

Making a great user experience

When all the footage was colour corrected and cleaned up, the assembling of the tour began. Every image and video must be linked together so that the viewer can move seamlessly through the exhibitions. Along the way, new ideas and solutions were discussed to make the user experience the best possible.

The Hafsten collection is one of the 360 exhibitions Screen Story has made for Stavanger Art Museum.

“Stavanger Art Museum wanted a simpler way to reach more content, so we came up with a top bar with a menu where you can quickly navigate to all the videos, as well as get help navigating,” says Pål Berg Mortensen, editor at Screen Story.

Moving around a 360° virtual tour can be a challenge because it is a relatively new product, so there are no set conventions yet. It was therefore important for both Screen Story and the museum to make the tour as intuitive as possible.

“We made an intro where you get some instructions on how to navigate, hopefully making it less intimidating,” says Mortensen.

The museum also wanted to incorporate more information and content about single works of art.

“Early on there were discussions about including more information on individual art pieces, and how that would work. The solution was to create a pop-up menu that appears when the art piece is clicked. The user can then choose to watch a video about the piece, or see it in high quality and read more about it,” Mortensen explains.

More benefit than traditional video

Stavanger Art Museum was fascinated by the experience of being able to actually walk through the exhibitions, and sees several use cases for the 360° virtual tour.

“It’s something we are able to use, not only towards the general audience but also in our dialogue with other artists we are working with for the coming exhibitions,” Hanne Beate Ueland says.

She sees a clear benefit over “traditional” video:

“One of the most interesting things for us, that separates 360° videos and images from a normal video of an exhibition, is to give people an actual experience of being there and walking around, looking at the artworks, but also reading, and finding more information. That adds a complexity and quality to the project that we liked.”

“It’s great to see that the customer was satisfied with the tour and appreciates all the hard work behind it,” Skjerping says.

The potential for a “360° virtual tour product” is definitely there, and Screen Story has learned a lot in the process.

“We wanted to create a truly immersive experience where you can interact with the pieces of art, and absorb the atmosphere in the exhibitions. And I think we managed to do that. This is only the beginning of what is possible with the 360-technology,” Skjerping concludes.

You may also want to read these stories about 360 technology and business

EuroLeague Metaverse: Success case

The business side of 360 video simulation


Recorded four 360° cases

The video and pictures below were captured when 360ViSi project member the Catholic University of Valencia “San Vicente Mártir” (UCV) worked on four new cases.

The cases were recorded by UCV’s team of academics at the university’s Virtual Hospital.

Behind the scenes

This video shows the stages and topics to consider when recording 360° video for education.

The goal of the 360ViSi project is to create a teaching methodology applicable to 360° video, helping nursing students enhance their clinical, communication and team-working skills.

“Four 360° cases were recorded after creating a case script for each and carrying out a briefing with the technicians. Innovating in new teaching methodologies is something that we love doing for our nursing students!” says Esther Navarro, Dr of Nursing at UCV.

Image gallery from shooting of the case


Creating new editing tool for teachers

If a teacher realises that her students would benefit from training in an interactive 360° environment on a particular topic – wouldn’t it be great if she could easily create it herself, without help from others? One of the goals for the 360ViSi project is to make that possible.

It’s the teachers who know the curriculum, the material and the learning outcomes. They also know when visual representation and repetition would help students understand concepts. Being able to create 360° interactive educational content themselves would shorten the time and effort from idea to finished product. At least that’s our experience from the 360ViSi project.

Project partner Quasar Dynamics is now developing an editing tool for 360° videos, designed specifically for teaching purposes and easy for teachers to use, based on the needs we have identified during the 360ViSi journey.

Usability testing across borders

Lead developer at Quasar Dynamics, Jaime Diaz González, and his colleague Manel Almenar created two possible prototypes. We saw this as a great opportunity to share skills between countries and organisations.

Gloria Orri, UX designer at University of Stavanger, ran a usability test to evaluate the two prototypes, with participation from health professors from both University of Stavanger and The Catholic University of Valencia “San Vicente Mártir”.

Usability testing prototypes of the new editing tool.

Adjusting according to test results

The usability test gave insight into which direction the design should take to ensure that the editor will be intuitive to use for teachers who are not familiar with video editing tools. Another goal is to ensure that it covers all their needs for creating 360° content for education.

Qualitative interview after the usability test.

With the feedback from the usability test, Quasar Dynamics will build an editor for teachers that probably will be presented during autumn 2022.

And the best thing of all? When it is finished, the first edition of the tool will be available to use for everyone – free of charge!

Another 360ViSi editor and player

Project partner Turku University of Applied Sciences has also developed an editor, with the working title 360° Editor, which is suitable for larger and more complex scenarios. The target group for this tool is also teachers. The code will be published on the developer platform GitHub.

In addition, project partner ADE Ltd has developed 360°Player, which is an output tool for the simulation made in the 360° Editor.

Read more about the editor and player here.

Also relevant

Other stories about the 360 editor and player:

Testing the 360ViSi editor.

Developing the 360 player – a status


The business side of 360 video simulation

With the ambition to highlight business opportunities for private companies using 360 video, the day started with a presentation from Stavanger Art Museum and Screen Story. They talked about their successful collaboration with 360 visualisation of art exhibitions. Read more about it here.

ADE shared their experiences with the commercial and educational 360° and VR solutions they have developed. They also presented a Learning Management System (LMS) they have developed, which includes practical training in health care and the technical sector.

Quasar Dynamics presented their other projects with immersive technologies, for instance using VR goggles for people with different disabilities, and a horror PlayStation game they are soon to launch.

Guest input

Guest presenters VID Specialized University and the company Mediafarm have collaborated to create student-active learning activities.

They explained their journey from when they began working with 360° video, what they have learnt, and shared advice on everything from actors versus real patients, locations, scripts and how to create emotional impact, to camera placement, lighting, user interactions, playback solutions, subtitles, graphic interface needs, technical considerations and more.

The 360ViSi project team was invited to try out one of their education solutions with VR goggles.

Elizabeth Armstrong, UiS (in front), Javier Ortizá, Quasar Dynamics, and Mari Linn Larsen, UiS, deeply concentrated in a 360 environment.
Mike Taylor from the University of Nottingham tested one of VID’s 360 video education tools.
Arne Morten Rosnes (VID), Severin Ommundsen and Morten Orre (Mediafarm) shared their experiences and best practices from creating interactive 360 video for education.

Excursion to companies

The project team visited two different companies located in Stavanger, which both use simulation tools for training and education purposes – in very different ways.

Aker Solutions delivers integrated solutions, products and services to the global energy industry, and uses simulation extensively. For instance, the company simulates complex offshore operations in order to prepare staff, to reduce risk of accidents and injuries, and to enhance efficiency.

Aker Solutions’ simulation dome, where crane operators and other personnel can train and plan how to execute heavy lifts. Weather and waves may be changed to affect the situation. Clear communication with personnel on the platform deck is crucial, and is also part of the simulation.
Aker Solutions has an advanced simulation centre used to train personnel before complicated operations in the offshore oil and gas industry.

The team also visited Lærdal Medical, which creates realistic simulation-based learning for healthcare education, for instance the world-famous Resusci Anne manikin.

Bodil Bø from the Faculty of Health Sciences at University of Stavanger looks at the premature manikin from Lærdal Medical, which is used to train health personnel.
Lærdal Medical’s most advanced simulation manikin is called SimMan, seen here being programmed at the factory.

Transnational project meeting and workshop in Norway


The purpose of the meeting and workshop was to share information and learning outcomes from the project so far.

On the first day, all the project partners presented their case studies, analysed which features worked well and which did not, and shared information about the different tools they had tried out in their respective case studies.

Workshop well underway. Mike Taylor presents a case study from the University of Nottingham and tells the rest of the project team about the process, findings and results.
Esther Navarro from the Catholic University of Valencia (UCV) shares experiences and best practices from their own case studies.


Student involvement

Two student nurses from the University of Stavanger participated and gave feedback about how they experienced using one of the tools produced by the project, and also gave advice on elements that would make students interested and invested in using these kinds of tools.

Students David Kristoffer Nymoen Eltervaag and Tov Mosvær Sandland gave their honest feedback about a 360ViSi project from University of Stavanger which they had tested.

Trip to the famous Lysefjord

After a long day with interesting presentations and discussions, the project team spent the evening on team building and dinner.

A boat took the team into the beautiful Lysefjord to see the famous Pulpit Rock.

Socialising and getting to know each other.
Lysefjorden

360ViSi project presented at Science Week in Valencia

Science Week at the Catholic University of Valencia, Spain, is an annual event hosted by the Vice-Rectorate for Research aimed at all students and academics interested in science, innovation and new technologies in teaching. 

This time, Dr. David Fernández and Dr. David Sancho, members of the 360ViSi project, presented some of their projects within teaching innovation. One of them was the 360ViSi project.

You may see a summary of the topics in this video.


Technology and tools – an overview



1. The characteristics of 360° video

360° video or still images immerse the user in an environment generated from photography or video captured with 180°/360° cameras, that is to say, cameras that capture images or footage in all directions and generate a panoramic sphere that the user can turn and explore.

Technically, 360° cameras capture overlapping fields of view from multiple lenses pointing in different directions, or from several cameras mounted together in a sphere. Special software is used to stitch and compress the different images, usually into a rectangular frame called an equirectangular projection.
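To make the notion of an equirectangular projection concrete, the TypeScript sketch below maps a pixel coordinate in such a frame to a viewing direction on the panoramic sphere. The coordinate convention (top-left origin, full 360° x 180° coverage) is an assumption for the example, not taken from any specific camera or software.

```ts
// Map a pixel (u, v) in an equirectangular frame of size (width, height)
// to a unit viewing direction. u runs left to right, v runs top to bottom.
function equirectToDirection(u: number, v: number, width: number, height: number) {
  const lon = (u / width) * 2 * Math.PI - Math.PI;  // longitude: -π .. π
  const lat = Math.PI / 2 - (v / height) * Math.PI; // latitude: π/2 (top) .. -π/2 (bottom)
  return {
    x: Math.cos(lat) * Math.sin(lon),
    y: Math.sin(lat),
    z: Math.cos(lat) * Math.cos(lon),
  };
}

// Example: the centre of the frame points straight ahead (0, 0, 1).
console.log(equirectToDirection(1920, 960, 3840, 1920));
```

A 360° player does the inverse of this mapping: for every screen pixel it works out which direction the viewer is looking in and samples the corresponding point in the equirectangular frame.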

The result is a still image, or a linear running video, that allows for a view in all directions and multiple modes of visualisation on various devices:

  • Through a Virtual Reality device, in which the screen updates the scene being displayed according to the user’s head motion.
  • On a smartphone, by shifting your device in the direction you want the video to display.
  • On a computer, by scrolling through the window to focus on the different parts of the scene. This is also possible for smartphones and tablets.
  • On a conventional screen, in which case the user is not in control of the viewing angles.

The different devices and modes for projecting 360° imagery also allow for different pedagogical applications. Understanding these technical differences will be important in order to maximize the benefits of 360° video used in education.

The focus of the rest of this article is 360° video. However, many of the features and technologies mentioned may also apply to 360° still images, and, not least for use in education, a combination of the two formats may also be relevant.

2. A comparison: 360° video and VR

360° video can easily be mistaken for virtual reality (VR). Both are immersive media formats, but they differ greatly in how complex they are to produce and in their options for user interaction.

With the arrival and widespread availability of 360° recording cameras (Samsung, 360Fly, Ricoh, Kodak, Nikon, etc.), the already familiar term “Virtual Reality” was used in marketing to suggest what could be enjoyed with the new recording devices, as well as the images they produce. The fact that these images could also be reproduced on Head-Mounted Display (HMD) devices, or “goggles”, characteristic of VR technology, lent support to the use of the term. Still, no professionally accepted understanding of the term “Virtual Reality” applies to 360° video, which should instead be understood on the basis of its own, unique characteristics.

A prominent feature of VR is that it is made up of simulated environments, synthetically recreated with 2D or 3D imagery, which is exceptional when the need is to portray inaccessible or dangerous scenarios and environments. The prominent feature of 360° video, on the other hand, is its ability to photorealistically play back environments and actions as they were recorded, which is exceptional when the goal is documentation or observation of real scenarios, events and environments.

Understanding the strengths of the two media also provides insight into their limitations. Their visual characteristics are a fundamental aspect. Differences in navigation, interaction, production, playback and sharing are others, which must be considered when deciding how these media can best be utilized as teaching aids.

2.1    User interaction and navigation

Virtual motion is another evident difference between 360° video and VR. Since a 360° video is recorded in advance, the user is limited to looking around and following the directed timeline, with no option of moving either in the virtual space the video portrays or beyond the timeline it constitutes. Any movement must be directed at the time of recording. It should be noted that such recorded movements in a 360° video may unfortunately result in an unpleasant, dizzying experience for the viewer, who passively watches the video.

Still, from a pedagogical point of view, the fixed movements and timeline of a 360° video may also be understood as a favourable property. In contrast to the open environment of VR, the directed environment of a 360° video constitutes intentional story lines that guide the viewer towards an intended learning outcome. Shorter, interlinked story lines may very well be used as a strategy to expand the user’s room for manoeuvre, letting them decide at certain points which story line to follow.

In VR, the user controls the experience within the created, synthetic environment and the properties it has. Its key feature is that users virtually have the ability to move through the environments and interact with them at their own will.

Of the two media technologies, 360° video is basically the less flexible in terms of interactivity. However, interactivity can be added to a 360° video through a pointer, normally placed at the centre of the view, combined with a physical button on the playback device for complementary functions.
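As a rough illustration of how such a centre-of-view pointer can work, the TypeScript sketch below uses the three.js library (an assumption for the example; the project does not prescribe a specific library) to cast a ray from the middle of the screen and report which hotspot object, if any, the viewer is currently gazing at.

```ts
import * as THREE from "three";

// Gaze reticle sketch: cast a ray from the centre of the view and
// check whether it hits one of the hotspot meshes placed in the scene.
const raycaster = new THREE.Raycaster();
const screenCentre = new THREE.Vector2(0, 0); // (0, 0) = middle of the screen in normalised device coordinates

function gazeTarget(
  camera: THREE.Camera,
  hotspots: THREE.Object3D[]
): THREE.Object3D | null {
  raycaster.setFromCamera(screenCentre, camera);
  const hits = raycaster.intersectObjects(hotspots, false);
  return hits.length > 0 ? hits[0].object : null;
}
```

A confirm action, for example the physical button mentioned above or a dwell timer, would then trigger whatever content the returned hotspot represents.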

Using 360° video content on a PC screen enables an even wider range of hotspots and interactivity. Other input methods, such as the classic keyboard, mouse or joystick, are harder to adapt. It is through such devices that extended reality (XR) media technologies in general, including virtual reality (VR), augmented reality (AR) and mixed reality (MR), unfold their advantages. Adding motion sensors, with user interaction through hand movements for example, makes XR the preferred medium if a high degree of user interaction is the goal. However, applying these interactive features in XR normally requires expert developers as well as custom playback devices. In contrast, 360° video is generally easier to develop, and will normally require only off-the-shelf consumer tools for recording and playback.

3. Capture, author and playback

3.1    Recording 360° video

360° videos can be captured with consumer-grade 360° video cameras like the Insta360 ONE R. There are of course more professional models that can capture stereoscopic video or higher resolution. There are a few good practices for recording, but the basic idea is to treat the camera as an observer: position it at eye level and aim the main lens towards the main action.

With the device manufacturer’s software it is possible to convert raw 360° footage into a usable format fairly easily, without any video editing skills. Afterwards, any video editing software can be used to edit the footage. Special effects related to 360° video require dedicated tools; some editors, like Adobe Premiere or DaVinci Resolve, offer these as plugins. As the post-production requirements grow, so does the required skill set.

For more information on hardware, programmes and first steps in 360° video producing, the 360ViSi project has published a User Guide on Basic Production of 360-Degree Video. 

3.2    Authoring an interactive experience

There are multiple ways to build 360° video content into an interactive experience. The easiest way is to use an existing tool, like 3DVista, to create a 360° video tour and also include other embedded content.

These tour platforms usually allow the creator to include multiple action points between which the user can teleport. From each action point the user sees a different 360° video or image, which can also include overlays of other interactive content, usually additional contextual images, videos or text information. The 360° videos can also be embedded into a VR application (3DVista, 2021).
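Independently of any particular platform, a minimal data model for such a tour can be sketched as below. The TypeScript interfaces and field names are invented for illustration and do not reflect 3DVista’s actual format.

```ts
// Minimal, generic data model for a 360° tour: scenes linked by hotspots.
interface Hotspot {
  label: string;          // text shown to the user
  yaw: number;            // horizontal placement angle in degrees
  pitch: number;          // vertical placement angle in degrees
  targetSceneId: string;  // scene the user "teleports" to when activating the hotspot
}

interface Scene {
  id: string;
  mediaUrl: string;       // equirectangular video or still image
  hotspots: Hotspot[];
}

const tour: Scene[] = [
  {
    id: "entrance",
    mediaUrl: "entrance.mp4", // hypothetical file name
    hotspots: [{ label: "Go to the gallery", yaw: 90, pitch: 0, targetSceneId: "gallery" }],
  },
  {
    id: "gallery",
    mediaUrl: "gallery.mp4",
    hotspots: [{ label: "Back to the entrance", yaw: -90, pitch: 0, targetSceneId: "entrance" }],
  },
];
```

A player then only needs to load the media for the current scene, place the hotspots at their yaw/pitch positions on the sphere, and switch scene when a hotspot is activated.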

Interactive VR experiences are built using 3D game engines, like Unity or Unreal Engine, with a specialised software development kit (SDK) added on top. Game engines are suitable for VR development because they already handle real-time 3D rendering and user interaction. A 360° video or image can be used as the background of the application, allowing interactive 3D models to be overlaid on the video. The output of VR development is usually a desktop application, but there are options for different platforms (Valve, 2021), as referred to below.

Other XR experiences, like interactive augmented reality (AR) content, will in turn require other tools. AR is an XR technology that overlays and combines virtual data and synthetic graphics with real images and data from the physical world. Some of the current development tools are limited to the platform the content is built for. Common SDKs for AR development include ARCore (Android) and ARKit (iOS). These SDKs can be utilized in different development environments, like Android Studio and Xcode, and most common SDKs can also be used inside game engines, like Unity or Unreal Engine. There are also more specialized SDKs for different Head-Mounted Displays (HMD), like the Microsoft HoloLens (Chen et al., 2019).

3.3    Consuming 360° and XR media

Both XR and 360° media can be played on several different types of devices. When choosing a playback device, however, there are several factors to consider, such as the capacity for interactivity, the source from which the device collects and reproduces the content, as well as the availability, ease of use and comfort for the user, and not least, the price.

For 360° video, there are basically two ways of consuming the content. The most easily accessible is a flat screen with some pointing control, such as a monitor with a mouse or a mobile phone with a touchscreen or gyroscope. The other is a Head-Mounted Display (HMD), or goggles, as discussed further below. HMDs are more immersive, but they are also custom devices that often need to be attached to a computer.

With flat-screen devices, interaction with a 360° video is limited to clicking to select and rotating the view. On smartphones, it is also possible to use a stereoscopic 3D view with a Google Cardboard-like headset that presents a separate image to each eye.
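For the flat-screen case, a common browser-based approach is to map the equirectangular video onto the inside of a sphere and let the user rotate the camera. The sketch below uses the three.js library and a hypothetical video file name; both are assumptions for illustration, not tools or assets from the project.

```ts
import * as THREE from "three";

// Play an equirectangular 360° video on the inside of a sphere.
const video = document.createElement("video");
video.src = "tour.mp4";   // hypothetical equirectangular video file
video.loop = true;
video.muted = true;       // muted playback avoids most browser autoplay restrictions
video.play();

const texture = new THREE.VideoTexture(video);
const geometry = new THREE.SphereGeometry(500, 60, 40);
geometry.scale(-1, 1, 1); // flip the sphere so the texture faces inward

const sphere = new THREE.Mesh(geometry, new THREE.MeshBasicMaterial({ map: texture }));
const scene = new THREE.Scene();
scene.add(sphere);

const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// Render loop; camera rotation (mouse drag, gyroscope) would be wired up separately.
function animate() {
  requestAnimationFrame(animate);
  renderer.render(scene, camera);
}
animate();
```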

With VR headsets there is the possibility to interact with the environment through controllers and also move in a 3D synthetic space. These more complex interactions demand more from the 3D environment and are not useful features with basic 360° videos.

Consuming AR content varies between use cases. For entertainment-oriented or consumer-targeted applications, a smartphone is the go-to device, since AR-capable smartphones are common. Still, custom AR Head-Mounted Displays, like the Microsoft HoloLens, are the most suitable and will give the best immersive experience. AR headsets are, however, a lot more expensive than VR headsets and not targeted towards consumers.

3.3.1      Head-Mounted Display (HMD) devices

VR goggles for PC: The VR goggles that revolutionised virtual reality were the Oculus Rift, followed by others, like the HTC Vive. This type of goggles provides the best user experience, but they require a powerful PC with a good graphics processor capable of rendering satisfactory images.

This type of device is the closest thing to literally being inside a virtual environment. However, such a powerful sensation may make the user feel dizzy or unwell. Additionally, the cables limit movement and require that a distance of two to five metres is kept between the user and the PC. The installation of the device also needs to be thorough.

Autonomous VR goggles: There are also cordless virtual reality goggles. These autonomous goggles have a built-in processor, as well as a screen and sensors, all in one. They operate over a Wi-Fi or Bluetooth connection.

To start using this type of device the user simply needs to put them on, and the experience will start. They have buttons that enable navigation and can be configured to choose what you want to see. Most models are compatible with Samsung and the applications in their catalogue.

Recently, the Oculus Go has been commercialised, with an LCD screen and an integrated sound system that achieves full immersion in the VR experience. Oculus also has a more advanced standalone headset line-up with the Oculus Quest and the Oculus Quest 2 (Oculus, 2020). HTC is also preparing its Vive Standalone, a highly anticipated model with its own platform and a Qualcomm Snapdragon 835 processor.

VR goggles for smartphones (Google Cardboard and Gear VR): These devices are the cheapest option, although the experience is obviously not the same as with custom goggles. The way to use them is simple: a smartphone is placed in the goggle case, and thanks to the lenses of the goggles that make the 3D effect possible, the user gets a feeling of immersion in virtual reality. Their appeal lies in the simplicity, ease of access and low cost.

With a common smartphone, virtually everyone has the opportunity to enter an immersive environment. Only a cheap cardboard case to house the smartphone is necessary, either to play a 360° video from a video platform or to run an application designed for audio-visual immersion (Google, 2021).

3.3.2      Notes on the future technology of XR media displays

One feature that is steadily improving in 360° video and VR headsets is the resolution of cameras and screens, which by itself makes a learning experience more immersive.

Replacing the original Oculus Quest, the Oculus Quest 2, released in October 2020, improves the entry point for standalone VR headsets, with higher resolution, a higher refresh rate and improved processing power.

For PC-supported VR, the HP Reverb G2 headset was released in the fall of 2020. It has a much higher resolution than any consumer-grade VR headset so far. It also follows the Oculus Rift in removing the need for additional external sensors.

With the new PS5 coming out soon, a PSVR 2 is likely. This may expand the market for consumer-friendly VR.

Manufacturers like Nvidia and AMD are always pushing the limits of what a graphics processing unit (GPU) can do. The recent release of Nvidia’s RTX 3080/3090 opens for better graphical fidelity in VR, which enhances the immersive experience.

4. Collaboration in 360° and XR media

Real-time communication and multi-user interaction in synthetic virtual environments are common, typically in online games. This feature of XR technology is also useful in XR applications for education, opening up both for instructor-led activities and for collaborative training and simulation, where intelligence in the application may also be applied to influence the experience.

A particular benefit is the independence of space, which makes it possible to interact regardless of where the participants in the virtual activity are located.

Another characteristic is that the participants must be represented in the virtual environment by an avatar. For some activities this may be an advantage, particularly in simulations. For collaborative activities it is likely to be the opposite, as well as a complex and expensive solution for an activity that may be better solved by consumer online services for live video-based collaboration.

Recently, several services have been developed for the transmission of real-time 360° video. This solution is still little explored, but may prove to open new ways of both remote collaboration and educational activity, where each participant can remotely control the camera viewpoint as they like.

During the corona pandemic lockdowns, this technology has, for example, been reported to be very useful for remote site inspections and surveys. Avatour is one such promising online service for real-time 360° video conferencing, launched in spring 2020 during the first corona lockdown. The service is presented by Avatour as “…the world’s first multi-party, immersive remote presence platform” (Avatour, 2021).

Also in 2020, 3DVista, a leading provider of 360° video services and software, launched another approach to remote collaboration in a 360° video environment, called “Live Guided Tours”. Every participant can navigate the tour as they like, or follow the lead of the one controlling the tour. Building on the features of 3DVista’s 360° video editor, the experience may also include multimedia interactive hotspots, enriching the experience (3DVista, 2020), which is particularly relevant for educational purposes.

The latest addition to interaction in an XR immersive environment is the metaverse. In late 2021, Facebook (rebranded as Meta) launched the metaverse as their ultimate vision of a social media platform, which is also supposed to include Meta Immersive Learning, said to provide a “…pioneering immersive learning ecosystem”. With the addition of immersive technologies like AR and VR, the claim is that learning will “…become a seamless experience that lets you weave between virtual and physical spaces” (Meta, 2021).

It remains to be seen whether the metaverse will live up to its promise, or gain popularity among the public at all. Still, the vision pinpoints the potential of the XR media technologies, including 360° video, and of these media as the next step in virtual social interaction, and indeed learning.

Have a look at 360ViSi’s review of Meta Immersive Learning. 

5. References and sources

3DVista. (2020, May 28). Live-Guided Tours.
https://www.3dvista.com/en/blog/live-guided-tours/

3DVista. (2021). Virtual Tours, 360° video and VR software.
https://www.3dvista.com/en/products/virtualtour

Avatour. (2021).
https://avatour.co/

Chen, Y., Wang, Q., Chen, H., Song, X., Tang, H., Tian, M. (2019). An overview of augmented reality technology, J. Phys.: Conf. Ser. 1237 022082

Google VR. (2021). Google Cardboard. Google.
https://arvr.google.com/cardboard/

Hassan Montero, Y. (2002). Introducción a la Usabilidad. No Solo Usabilidad, nº 1, 2002. ISSN 1886-8592. Accessed October 20.
http://www.nosolousabilidad.com/articulos/introduccion_usabilidad.htm

Meta. (2021). Meta Immersive Learning.
https://about.facebook.com/immersive-learning/

Oculus. (2020, September 16). Introducing Oculus Quest 2, the next generation of all-in-one VR.
https://www.oculus.com/blog/introducing-oculus-quest-2-the-next-generation-of-all-in-one-vr-gaming/

Rustici Software. (2021). cmi5 solved and explained.
https://xapi.com/cmi5/

Sánchez, W. (2011). La usabilidad en Ingeniería de Software: definición y características. Ing-novación. Revista de Ingeniería e Innovación de la Facultad de Ingeniería. Universidad Don Bosco. Año 1, No. 2. pp. 7-21. ISSN 2221-1136.

Valve. (2021). SteamVR Unity Plugin.
https://valvesoftware.github.io/steamvr_unity_plugin/    


Metaverse potential for education

Image: Facsimile from Meta’s Oculus workrooms.

Analysis by Jaime Diaz, Quasar Dynamics

The metaverse is a concept defining the next generation of the internet: an immersive, multi-sensory experience based on the use of multiple devices and technological developments, a 3D universe that combines multiple virtual spaces.

How can the metaverse be used?

We understand the metaverse as the combination of technology and software that moves the user’s mind to an environment that is different in sensory perception from where they physically are. This would include the use of VR headsets, controllers, or haptic suits (see image).

A haptic suit (also known as tactile suit, gaming suit or haptic vest) is a wearable device that provides haptic feedback to the body. Photo: Teslasuit.

Immersive Education is designed to immerse and engage students in the same way that today’s best video games grab and keep the attention of players. It supports self-directed learning as well as collaborative group-based learning environments that can be delivered over the Internet.

Some of the possible uses that metaverses could have in educational experiences are described below.

Laboratory experiments

Real laboratory equipment, tools and machinery are expensive. Imagine a chemistry class where a teacher wants to observe the formation of the four acids that occur in acid rain. Such an illustrative presentation would require multiple pH sensors, test tubes, protective glasses, burets…

In addition to the economic realities, we should not forget the risks that these kinds of experiments entail.

Quasar Dynamics, one of the 360ViSi partners, created a complete 3D laboratory to emulate a very common practice in engineering degrees, where the students had to use a power supply, a function generator, an oscilloscope and a multimeter.

To interact with the environment, students used VR glasses and controllers. As students frequently “broke” the real tools by connecting the wrong poles, this was a good way to save costs while still learning through a practical method.

Museums and tours

Imagine if students could visit the most important museums in the world every week. The metaverse enables this mental transition to 3D museums.

Quasar Dynamics also developed a 3D art gallery where artists can upload their artworks, such as 3D sculptures, configure the gallery with furniture or add videos and audio recordings. You can visit the gallery here.

Immersive conferences

Meta has also released interesting software for real-time 3D conferences. Even with its business focus, it can be used in education, especially for students who have to interact remotely.

In the workrooms, attendees can wear VR headsets or just use flat-screen computers. They can draw on a virtual board and add images or other content for others to see. When using VR glasses, an avatar of the user is displayed, showing the user’s own tracked hands. This working method could be the future of meetings, work presentations and online education.

Advantages and disadvantages

The metaverse has the advantage of being a completely 3D experience in real-time, allowing the developer to build an environment without the barriers that reality imposes. For example, it is possible to recreate the interior of the International Space Station without having to be there.

However, the metaverse is a young technology that still needs development. Some environments are still laggy without a powerful, expensive device, and many companies do not yet support the technology.

In addition, there is some fear that the metaverse will create a generation of people with poor social skills due to a lack of interaction with the real world.

In summary, the metaverse is the future, not only for educational purposes but also as a lifestyle. We must ensure that students benefit from its technological advancements while still remaining competitive in real life.