Software review: Adobe Captivate

Easy-to-use software

By Esther Navarro, Nursing programme leader at UCV

Adobe Captivate is a software program used for creating e-learning content, such as interactive presentations, online courses and simulations. It is highly regarded in the e-learning industry because it is easy to use and offers a wide range of features for creating professional-quality content. Features that make Adobe Captivate a great program include:

  • Drag-and-drop interface: Adobe Captivate has a user-friendly interface that allows users to easily create and edit e-learning content by dragging and dropping various elements onto the canvas.
  • Responsive design: Adobe Captivate automatically adjusts the layout of the content to fit different screen sizes and resolutions, ensuring that it looks great on desktop computers, tablets, and smartphones.
  • Multimedia support: Adobe Captivate allows users to incorporate various types of media, such as videos, audio, and images, into their e-learning content, making it more engaging and interactive.
  • Advanced interaction options: Adobe Captivate provides a wide range of options for adding interactive elements to e-learning content, such as quizzes, surveys, and games, which can help to keep learners engaged.
  • Accessibility: Adobe Captivate includes features that make it easier for users to create accessible e-learning content that can be used by learners with disabilities.

Overall, Adobe Captivate is a powerful and user-friendly programme that is well-suited for creating professional-quality e-learning content.

Screenshot of Adobe Captivate dashboard in use.
Screenshot from UCV Valencia’s e-learning project produced with Adobe Captivate.

360ViSi editors

The 360ViSi project is also developing two brand new editors based on the needs discovered by educators in the project.

Read more about them here:

Findings on 360 video in other projects

CoViRR

Similar to the 360ViSi project, the Erasmus+ project titled “Co-creation of Virtual Reality Reusable e-Resources for European Healthcare Education (CoViRR)” aims to consider the approaches to enabling and producing 360° interactive video which can be embedded within Reusable Learning Objects (RLOs).

CoViRR has held a number of meetings, training events and international presentations on its strategic aims, development progress and findings to date. The project's work packages and deliverables include the co-creation and technical implementation of VR activities (as scenarios), together with an analysis of the feasibility and acceptability of reusable VR e-resources, documented as best practices and recommendations from a co-creational design perspective.

The dissemination of results will occur in the following ways:

  1. Virtual channels at local, national and international levels that inform about the created VR e-resources and attract more learners from other universities to analyse their feasibility and acceptability.
  2. Outcomes delivered through a multiplier event, journal publications, conferences, social media and websites, press, and internal and external partner networks.

More information is available from the CoViRR website.

The 360ViSi project has been able to benefit particularly from technical input from partners also involved in the CoViRR project. With both projects running concurrently, it has been a good opportunity for learning technologists, software developers and researchers alike to be able to exchange useful knowledge and common interests between all involved.

ARsim2Care

Following the same line as 360ViSi, another Erasmus+ project, titled “Application of Augmented Reality in Clinical Simulation (ARsim2Care)”, aims to enhance manikins with augmented reality so that certain nursing techniques can be practised on them.

The main goal of this project is to develop Augmented Reality (AR) software that, combined with anatomical models used in clinical simulation, allows students to learn how to perform invasive clinical procedures by helping them visualise the internal anatomical structures. The project is addressed at a European level, as the training of technical skills is a common need for European nurses.

The procedures compiled in a manual included in the ARsim2Care app are endotracheal intubation, arterial blood sampling, intramuscular injection, nasogastric tube insertion and suctioning via a tracheostomy tube.

This manual includes the detailed description of each procedure, and it illustrates the key internal anatomical structures to be visualized by the student during the procedure.

The ARsim2Care project was initiated before 360ViSi, and it is in its final stage. It has allowed the 360ViSi team to see how new technologies, particularly technologies related to VR and AR, can help students gain better practical/technical competencies. Researchers from both teams (ARsim2Care and 360ViSi) have been in contact to discuss results and exchange experiences.

Technology and tools – an overview



1. The characteristics of 360° video

360° video and still images immerse the user in a virtually generated environment: photography or video obtained with 180°/360° cameras, that is, cameras that capture images or footage in all directions and generate a panoramic sphere the user can turn and explore.

Technically, 360° cameras capture overlapping fields of view from multiple lenses pointing in different directions, or from several cameras mounted together in a sphere. Special software is used to stitch and compress the separate images, usually into a rectangular frame called an equirectangular projection.
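
To make the equirectangular mapping concrete, here is a minimal TypeScript sketch (illustrative only, not tied to any particular camera or player) that converts a pixel position in an equirectangular frame to a viewing direction on the sphere: the horizontal axis spans 360° of yaw and the vertical axis 180° of pitch.

```typescript
// Map a pixel (x, y) in an equirectangular frame of size (width, height)
// to a viewing direction on the unit sphere.
// Horizontal axis spans 360° (yaw), vertical axis spans 180° (pitch).
function pixelToDirection(
  x: number,
  y: number,
  width: number,
  height: number
): { x: number; y: number; z: number } {
  const yaw = (x / width) * 2 * Math.PI - Math.PI;    // -180° .. +180°
  const pitch = Math.PI / 2 - (y / height) * Math.PI; // +90° (top) .. -90° (bottom)
  return {
    x: Math.cos(pitch) * Math.sin(yaw),
    y: Math.sin(pitch),
    z: Math.cos(pitch) * Math.cos(yaw),
  };
}

// Example: the centre of a 4096×2048 frame points straight ahead.
console.log(pixelToDirection(2048, 1024, 4096, 2048)); // ≈ { x: 0, y: 0, z: 1 }
```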

The result is a still image, or a linear running video, that allows for a view in all directions and multiple modes of visualisation on various devices:

  • Through a Virtual Reality device, in which the screen updates the scene being displayed according to the user’s head motion.
  • On a smartphone, by moving the device in the direction you want to view.
  • On a computer, by scrolling through the window to focus on the different parts of the scene. This is also possible for smartphones and tablets.
  • On a conventional screen, in which case the user is not in control of the viewing angles.

The different devices and modes for projecting 360° imagery also allow for different pedagogical applications. Understanding these technical differences will be important in order to maximize the benefits of 360° video used in education.

The remainder of this article focuses on 360° video. However, many of the features and technologies mentioned also apply to 360° still images, and, not least in education, a combination of the two formats may be relevant.

2. A comparison: 360° video and VR

360° video is easily mistaken for virtual reality (VR). Both are immersive media formats, but they differ greatly in how complex they are to produce and in the user interaction they support.

With the arrival and widespread availability of 360° recording cameras (Samsung, 360Fly, Ricoh, Kodak, Nikon, etc.), the already familiar term “Virtual Reality” was used in marketing to describe what could be enjoyed with the new recording devices, as well as the images they produce. The fact that these images can also be reproduced on Head-Mounted Display (HMD) devices, or “goggles”, characteristic of VR technology, supported the use of the term. Still, no professionally accepted understanding of the term “Virtual Reality” applies to 360° video, which should instead be understood on the basis of its own, unique characteristics.

A defining feature of VR is that it consists of simulated environments, synthetically recreated with 2D or 3D imagery, which is exceptional when the need is to portray inaccessible or dangerous scenarios and environments. The prominent feature of 360° video, on the other hand, is its ability to play back environments and actions photorealistically, as they were recorded, which is exceptional when the goal is documentation or observation of real scenarios, events and environments.

Understanding the strengths of the two media also provides insight into their limitations. Their visual characteristics are a fundamental aspect. Differences in navigation, interaction, production, playback and sharing are others, which must be considered when deciding how these media can best be utilized as teaching aids.

2.1    User interaction and navigation

Virtual motion is another evident difference between 360° video and VR. Since a 360° video is recorded in advance, the user is limited to looking around and following the directed timeline, with no option to move either in the virtual space the video portrays or beyond the time span it covers. Any movement must be directed at the time of recording. It should be noted that such recorded camera movements in a 360° video may unfortunately result in an unpleasant, dizzying experience for the viewer, who watches the video passively.

Still, from a pedagogical point of view, the fixed movements and timeline of a 360° video may also be understood as a favourable property. In contrast to the open environment of VR, the directed environment of a 360° video constitutes intentional story lines that guide the viewer towards an intended learning outcome. Shorter, interlinked story lines may very well be used as a strategy to expand the user's room for manoeuvre, letting the user decide at certain points in the 360° video which story line to follow.

In VR, the user controls the experience within the created, synthetic environment and the properties it has. Its key feature is that users can virtually move through the environments and interact with them at will.

Of the two media technologies, VR and 360° video, the latter is basically the less flexible in terms of interactivity. However, interactivity can be added to a 360° video through a pointer, normally placed at the centre of the view, combined for complementary functions with a physical button on the playback device.

Using 360° video content on a PC screen enables an even wider range of opportunities when it comes to hotspots and interactivity. However, other methods of interaction, such as the classic keyboard, mouse or joystick, are more difficult to adapt. It is through such devices that the extended reality media technologies (XR) in general unfold their advantages, including virtual reality (VR), augmented reality (AR) and mixed reality (MR). Adding motion sensors, the potential for user interaction, for example by hand movements, indeed makes XR the preferred media if a high degree of user interaction is the goal. However, applying these interactive features in XR will normally require expert developers as well as custom playback devices. In contrast, 360° video is generally easier to develop and will normally require only off-the-shelf consumer tools for recording and playback.
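
As a rough illustration of how a centre-of-view pointer can drive interactivity in a 360° player, the TypeScript sketch below (hypothetical names and thresholds, not taken from any specific product) triggers a hotspot either when the device button is pressed or when the gaze dwells on it long enough.

```typescript
interface Hotspot {
  id: string;
  yaw: number;     // direction of the hotspot centre, in radians
  pitch: number;
  radius: number;  // angular radius in radians
  onSelect: () => void;
}

// Angular distance between two view directions given as yaw/pitch pairs.
function angularDistance(yaw1: number, pitch1: number, yaw2: number, pitch2: number): number {
  const cosAngle =
    Math.sin(pitch1) * Math.sin(pitch2) +
    Math.cos(pitch1) * Math.cos(pitch2) * Math.cos(yaw1 - yaw2);
  return Math.acos(Math.min(1, Math.max(-1, cosAngle)));
}

class GazePointer {
  private dwellStart: number | null = null;
  private current: Hotspot | null = null;

  constructor(private hotspots: Hotspot[], private dwellMs = 1500) {}

  // Call once per frame with the viewer's current yaw/pitch and button state.
  update(yaw: number, pitch: number, buttonPressed: boolean, now = Date.now()): void {
    const hit =
      this.hotspots.find(
        (h) => angularDistance(yaw, pitch, h.yaw, h.pitch) <= h.radius
      ) ?? null;

    if (hit !== this.current) {
      this.current = hit;
      this.dwellStart = hit ? now : null;
    }
    if (!this.current || this.dwellStart === null) return;

    // Trigger either by pressing the device button or by dwelling on the hotspot.
    if (buttonPressed || now - this.dwellStart >= this.dwellMs) {
      this.current.onSelect();
      this.dwellStart = null; // avoid repeated triggering
    }
  }
}
```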

3. Capture, author and playback

3.1    Recording 360° video

360° videos can be captured with consumer-grade 360° cameras like the Insta360 ONE R. There are of course more professional models that can capture stereoscopic video or higher resolution. There are a few good practices for recording, but the basic idea is to treat the camera as an observer: position it at eye level and aim the main lens towards the main action.

With the device manufacturer's software it is possible to convert raw 360° video into a usable format fairly easily, without any video editing skills. Afterwards you can use any video editing software to edit your footage. If you need special effects related to 360° video, you will need special tools; some video editors, like Adobe Premiere or DaVinci Resolve, offer these options as plugins. As the post-processing requirements grow, so does the required skill set.

For more information on hardware, programmes and first steps in 360° video producing, the 360ViSi project has published a User Guide on Basic Production of 360-Degree Video. 

3.2    Authoring an interactive experience

There are multiple ways to build 360° video content into an interactive experience. The easiest way is to use an existing tool, like 3DVista, to create a 360° video tour and also include other embedded content.

These tour platforms usually allow the creator to include multiple action points between which the user can teleport. From each action point the user sees a different 360° video or image, which can also include overlays of other interactive content, usually additional contextual images, videos or text information. The 360° videos can also be embedded into a VR application (3DVista, 2021).
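
As an illustration of how such a tour might be represented as data, here is a hedged TypeScript sketch; the structure and field names are assumptions for illustration and not 3DVista's actual format.

```typescript
type Overlay =
  | { kind: "image"; url: string }
  | { kind: "video"; url: string }
  | { kind: "text"; body: string };

interface TourHotspot {
  label: string;
  yaw: number;   // where the hotspot sits in the panorama, in degrees
  pitch: number;
  action:
    | { type: "teleport"; targetSceneId: string }  // jump to another action point
    | { type: "overlay"; content: Overlay };       // open contextual content
}

interface Scene {
  id: string;
  media: { type: "video" | "image"; url: string }; // the 360° footage for this point
  hotspots: TourHotspot[];
}

interface Tour {
  startSceneId: string;
  scenes: Record<string, Scene>;
}

// Resolve a hotspot: returns the next scene for teleports, the current one otherwise.
function nextScene(tour: Tour, current: Scene, hotspot: TourHotspot): Scene {
  return hotspot.action.type === "teleport"
    ? tour.scenes[hotspot.action.targetSceneId]
    : current;
}
```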

Interactive VR experiences are built using 3D game engines, like Unity or Unreal Engine, with a specialized software development kit (SDK) added on top. Game engines are suitable for VR development because they already deal with real-time 3D rendering and user interaction. A 360° video or image can be used as the background of the application, allowing interactive 3D models to be overlaid on the video. The output of VR development is usually a desktop application, but there are options for different platforms (Valve, 2021), as referred to below.
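
VR development itself typically happens in a game engine, but the core idea of a 360° video backdrop with interactive 3D overlays can also be sketched in a browser. The following TypeScript example assumes the three.js library and a video element with the id "pano"; it is a minimal illustration, not a production setup.

```typescript
import * as THREE from "three";

// Minimal sketch: a 360° video used as the background of an interactive scene.
// Assumes a <video id="pano"> element pointing at an equirectangular MP4.
const video = document.getElementById("pano") as HTMLVideoElement;
video.play();

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(75, innerWidth / innerHeight, 0.1, 1000);
const renderer = new THREE.WebGLRenderer();
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

// Project the video onto the inside of a sphere surrounding the camera.
const texture = new THREE.VideoTexture(video);
const geometry = new THREE.SphereGeometry(500, 60, 40);
geometry.scale(-1, 1, 1); // invert so the texture is visible from the inside
const sphere = new THREE.Mesh(geometry, new THREE.MeshBasicMaterial({ map: texture }));
scene.add(sphere);

// An interactive 3D overlay placed "in" the recorded environment.
const marker = new THREE.Mesh(
  new THREE.SphereGeometry(5, 16, 16),
  new THREE.MeshBasicMaterial({ color: 0xff8800 })
);
marker.position.set(0, 0, -100);
scene.add(marker);

function animate() {
  requestAnimationFrame(animate);
  renderer.render(scene, camera);
}
animate();
```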

Other XR experiences, like interactive augmented reality (AR) content will in turn require other tools. Augmented reality (AR) is an XR technology that overlays and combines virtual data and synthetic graphics with real images and data from the physical world. Some of the current development tools are limited to the platform the content is built for. Common SDKs for AR development include AR Core (Android) and AR Kit (iOS). These SDKs can be utilized in different development environments, like Android Studio and Xcode, and most common SDKs can also be used inside game engines, like Unity or Unreal Engine. There are also more specialized SDKs for different Head-Mounted Displays (HMD), like the Microsoft HoloLens (Chen et al., 2019).

3.3    Consuming 360° and XR media

Both XR and 360° media can be played on several different types of devices. When choosing a playback device, however, there are several factors to consider, such as the capacity for interactivity, the source from which the device collects and reproduces the information, the availability, ease of use and comfort for the user and, not least, the price.

For 360° video, there are basically two ways of consuming it. The most easily accessible is a flat screen with some pointing control, like a monitor with a mouse or a mobile phone with a touchscreen or gyroscope. The other way is via Head-Mounted Displays (HMDs), or goggles, as discussed further below. HMDs are more immersive, but they are also custom devices that often need to be attached to a computer.

With flat-screen devices the only way to interact with a 360° video is to click to select and rotate the view. On smartphones there is also the possibility of a stereoscopic 3D view, using a Google Cardboard-like headset that splits the image separately for the left and right eye.
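
A minimal sketch of this "click and rotate" interaction in TypeScript; the applyView callback is a stand-in for whatever renderer actually draws the panorama (an assumption for illustration).

```typescript
// Drag-to-rotate control for a 360° view on a flat screen.
// `applyView` stands in for the renderer that draws the panorama.
function attachDragRotate(
  element: HTMLElement,
  applyView: (yawDeg: number, pitchDeg: number) => void
): void {
  let yaw = 0;
  let pitch = 0;
  let dragging = false;
  let lastX = 0;
  let lastY = 0;

  element.addEventListener("pointerdown", (e) => {
    dragging = true;
    lastX = e.clientX;
    lastY = e.clientY;
  });
  element.addEventListener("pointerup", () => (dragging = false));
  element.addEventListener("pointermove", (e) => {
    if (!dragging) return;
    yaw -= (e.clientX - lastX) * 0.2;                             // 0.2° per pixel dragged
    pitch = Math.max(-90, Math.min(90, pitch + (e.clientY - lastY) * 0.2));
    lastX = e.clientX;
    lastY = e.clientY;
    applyView(yaw, pitch);
  });
}
```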

With VR headsets there is the possibility to interact with the environment through controllers and also move in a 3D synthetic space. These more complex interactions demand more from the 3D environment and are not useful features with basic 360° videos.

Consuming AR content varies between use cases. For more entertainment-oriented or consumer-targeted applications, a smartphone is the go-to device for consuming AR content, since AR-capable smartphones are common. Still, custom AR Head-Mounted Displays, like the Microsoft HoloLens, are the most suitable and give the best immersive experience. AR headsets are, however, a lot more expensive than VR headsets and not targeted at consumers.

3.3.1      Head-Mounted Display (HMD) devices

VR goggles for PC: The VR goggles that revolutionised virtual reality were the Oculus Rift, followed by others, like the HTC Vive. This type of goggles provides the best user experience, but they require a powerful PC with a good graphics processor capable of rendering satisfactory images.

This type of device is the closest thing to literally being inside a virtual environment. However, such a powerful sensation may make the user feel dizzy or unwell. Additionally, the cables limit movement and require that a distance of two to five metres is kept between the user and the PC. The installation of the device also needs to be thorough.

Autonomous VR goggles: There are also cordless virtual reality goggles. These autonomous goggles have a built-in processor, screen and sensors, all in one, and operate over a Wi-Fi or Bluetooth connection.

To start using this type of device the user simply needs to put them on, and the experience will start. They have buttons that enable navigation and can be configured to choose what you want to see. Most models are compatible with Samsung and the applications in their catalogue.

Recently, the Oculus Go has been commercialised, with an LCD screen and an integrated sound system aiming at full immersion in the VR experience. Oculus also has a more advanced standalone headset line-up with the Oculus Quest and the Oculus Quest 2 (Oculus, 2020). HTC is also preparing its Vive Standalone, a highly anticipated model with its own platform and a Qualcomm Snapdragon 835 processor.

VR goggles for smartphones (Google Cardboard and Gear VR): These devices are the cheapest option, although the experience is obviously not the same as with custom goggles. They are simple to use: a smartphone is installed in the goggles' case and, thanks to the lenses that create the 3D effect, the user gets a feeling of immersion in virtual reality. Their appeal is their simplicity, ease of access and low cost.

With a common smartphone, virtually everyone has the opportunity to enter an immersive environment. Only a cheap cardboard case to house the smartphone is needed, either to play a 360° video from a video platform or to run an application designed for audio-visual immersion (Google, 2021).

3.3.2      Notes on the future technology of XR media displays

One feature steadily improving in 360° video and VR headsets is the resolution of cameras and screens, which in itself makes a learning experience more immersive.

The Oculus Quest 2, released in October 2020 as a replacement for the original standalone Quest, improves the entry point for VR headsets, with higher resolution, a higher refresh rate and improved processing power.

For PC-based VR, the HP Reverb G2 headset was released in the fall of 2020. It has a much higher resolution than any consumer-grade VR headset so far, and it follows the Oculus Rift in removing the need for additional sensors.

With the new PS5 coming out soon, a PSVR 2 is likely to follow. This may expand the market for consumer-friendly VR.

Manufacturers like Nvidia and AMD are always pushing the limits of what a graphics processing unit (GPU) can do. The recent release of Nvidia's RTX 3080/3090 opens up for better graphical fidelity in VR, which enhances the immersive experience.

4. Collaboration in 360° and XR media

Real-time communication and multi-user interaction in synthetic virtual environments are common, typically in online games. This feature of XR technology is also useful in XR applications for education, opening up for instructor-led activities as well as collaborative training and simulated activities, where intelligence in the application may also be applied to influence the experience.

A particular benefit is independence of place, which makes it possible to interact regardless of where the participants in the virtual activity are located.

Another characteristic is that the participants have to be represented in the virtual environment by an avatar. For some activities this may be an advantage, particularly in simulations. For collaborative activities it is likely to be the opposite: a complex and expensive solution for an activity that may be better served by consumer online services for live video-based collaboration.

Recently, several services have been developed for the transmission of real-time 360° video. This solution is still not widely explored, but it may well open up new ways of both remote collaboration and educational activity, where each participant can individually control the camera viewpoint remotely as they like.

During the lockdowns of the Corona pandemic, for example, this technology has been reported to be very useful for remote site inspections and surveys. Avatour is one such promising online service for real-time 360° video conferencing, launched in 2020 during the first spring of the Corona lockdown. The service is presented by Avatour as “…the world's first multi-party, immersive remote presence platform” (Avatour, 2021).

Also in 2020, 3DVista, a leading provider of 360° video services and software, launched another approach to remote collaboration in a 360° video environment, called “Live Guided Tours”. Every participant can navigate the tour as they like or follow the lead of the one controlling the tour. Building on the features of 3DVista's 360° video editor, the tour may also include interactive multimedia hotspots that enrich the experience (3DVista, 2020), which is particularly relevant for educational purposes.

The latest addition to interaction in immersive XR environments is the metaverse. In late 2021, Facebook (now Meta) launched the metaverse as its ultimate vision of a social media platform, which is also supposed to include Meta Immersive Learning, described as providing a “…pioneering immersive learning ecosystem”. With the addition of immersive technologies like AR and VR, the claim is that learning will “…become a seamless experience that lets you weave between virtual and physical spaces” (Meta, 2021).

It remains to be seen whether the metaverse will live up to its promise, or gain popularity among the public at all. Still, the vision of the metaverse pinpoints the potential of XR media technologies, including 360° video, as the next step in virtual social interaction and, indeed, learning.

Have a look at 360ViSi’s review of Meta Immersive Learning. 

5. References and sources

3DVista. (2020, May 28). Live Guided Tours.
https://www.3dvista.com/en/blog/live-guided-tours/

3DVista. (2021). Virtual Tours, 360° video and VR software.
https://www.3dvista.com/en/products/virtualtour

Avatour. (2021).
https://avatour.co/

Chen, Y., Wang, Q., Chen, H., Song, X., Tang, H., & Tian, M. (2019). An overview of augmented reality technology. J. Phys.: Conf. Ser., 1237, 022082.

Google VR. (2021). Google Cardboard. Google.
https://arvr.google.com/cardboard/

Hassan Montero, Y. (2002). Introducción a la Usabilidad. No Solo Usabilidad, nº 1, 2002. ISSN 1886-8592. Accessed October 20.
http://www.nosolousabilidad.com/articulos/introduccion_usabilidad.htm

Meta. (2021). Meta Immersive Learning.
https://about.facebook.com/immersive-learning/

Oculus. (2020, September 16). Introducing Oculus Quest 2, the next generation of all-in-one VR.
https://www.oculus.com/blog/introducing-oculus-quest-2-the-next-generation-of-all-in-one-vr-gaming/

Rustici Software. (2021). cmi5 solved and explained.
https://xapi.com/cmi5/

Sánchez, W. (2011). La usabilidad en Ingeniería de Software: definición y características. Ing-novación. Revista de Ingeniería e Innovación de la Facultad de Ingeniería. Universidad Don Bosco. Año 1, No. 2. pp. 7-21. ISSN 2221-1136.

Valve. (2021). SteamVR Unity Plugin.
https://valvesoftware.github.io/steamvr_unity_plugin/    

Metaverse potential for education

Image: Facsimile from Meta’s Oculus workrooms.

Analysis by Jaime Diaz, Quasar Dynamics

The metaverse is a concept for the next generation of the internet, describing an immersive, multi-sensory experience based on the use of multiple devices and technological developments: a 3D universe that combines multiple virtual spaces.

How can the metaverse be used?

We understand the metaverse as the combination of technology and software that moves the user’s mind to an environment that is different in sensory perception from where they physically are. This would include the use of VR headsets, controllers, or haptic suits (see image).

A haptic suit (also known as tactile suit, gaming suit or haptic vest) is a wearable device that provides haptic feedback to the body. Photo: Teslasuit.

Immersive Education is designed to immerse and engage students in the same way that today’s best video games grab and keep the attention of players. It supports self-directed learning as well as collaborative group-based learning environments that can be delivered over the Internet.

Some of the possible uses that metaverses could have in educational experiences are described below.

Laboratory experiments

Real laboratory equipment, tools and machinery are expensive. Imagine a chemistry class where a teacher wants to demonstrate the formation of the four acids that occur in acid rain. Such an illustrative presentation would require multiple pH sensors, test tubes, protective glasses, burettes…

In addition to the economic realities, we should not forget the risks that these kinds of experiments entail.

Quasar Dynamics, one of the 360ViSi partners, created a complete 3D laboratory to emulate a very common exercise in engineering degrees, in which the students have to use a power supply, a function generator, an oscilloscope and a multimeter.

To interact with the environment, students used VR glasses and controllers. As students frequently “broke” the tools by connecting the wrong poles, this was a good way to save costs while still learning through a practical method.

Museums and tours

Imagine if students could visit the most important museums in the world every week. The metaverse enables this mental transition to 3D museums.

Quasar Dynamics also developed a 3D art gallery where artists can upload their artworks, such as 3D sculptures, configure the gallery with furniture or add videos and audio recordings. You can visit the gallery here.

Immersive conferences

Meta has also released interesting software for real-time 3D conferences. Even with its business focus, it can be used in education, especially for students who have to interact remotely.

In the workrooms, attendees can wear VR headsets or just use flat-screen computers. They can start drawing on a virtual board, add images or different content for others to see. When using VR glasses, an avatar of the user is displayed showing the user’s own tracked hands. This working method could be the future for meetings or work presentations for online education.

Advantages and disadvantages

The metaverse has the advantage of being a completely 3D experience in real-time, allowing the developer to build an environment without the barriers that reality imposes. For example, it is possible to recreate the interior of the International Space Station without having to be there.

However, the metaverse is a young technology that still needs development. Some environments are still laggy without a powerful, expensive device, and many companies do not yet support the technology.

In addition, there is some fear that the metaverse will create a generation of people with poor social skills due to a lack of interaction with the real world.

In summary, the metaverse is the future, not only for educational purposes but also as a lifestyle. We must ensure that students benefit from its technological advancements while still remaining competitive in real life.

Video meeting in a 360° environment


By setting up a video meeting inside a virtual environment, the participants can navigate, discuss and present on the same platform. 

Screenshot from testing this feature. The case is from the University of Stavanger's 360ViSi case on ward rounds.

Read more about the ward round case here!

About the 3DVista feature

According to 3DVista, it is possible to take guests “by the hand” on a shared virtual tour with personal guidance. With synchronised viewpoints, participants can take turns controlling the virtual tour.

Users are able to explore on their own, and the host can pick them back up anytime to follow the host’s route.

Live Guided Tours are accessible on both desktop and mobile devices, so individuals attending the live video meeting can dial in from their mobile phones as well. 

More than screen sharing

So why not just use Zoom with a normal virtual tour then? According to 3DVista, both quality and speed are better. Sharing a multimedia-intense virtual tour via screen share often results in jerky images, latency and an overall poor experience.

With 3DVista Live Guided Tours every participant is inside their own actual virtual tour and sees their own media with the participant video bubbles overlaid on top. This preserves the quality of the virtual tour images and the smoothness when moving through the rooms – even if they’re controlled by somebody else, according to 3DVista’s webpage.

Another advantage argued by 3DVista is that interactivity, the essence of a virtual tour, is preserved. Screen sharing a virtual tour would not allow participants to take control of the tour and explore by themselves. With 3DVista Live Guided Tours (and the permission of the host), any guest can deviate from the route, navigate on their own, open any hotspots they are interested in and feel in full control of the virtual visit.
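
The general idea can be sketched as follows; this is an illustrative TypeScript example of synchronising navigation state over a WebSocket, not 3DVista's actual protocol or API.

```typescript
// Illustrative message type for synchronising a shared tour.
interface TourState {
  sceneId: string;  // which action point the host is in
  yaw: number;      // where the host is looking
  pitch: number;
}

class GuidedTourClient {
  private following = true;

  constructor(
    private socket: WebSocket,
    private showScene: (state: TourState) => void  // renders the local copy of the tour
  ) {
    // The host broadcasts its state; followers apply it to their own local tour,
    // so only lightweight navigation data travels over the network.
    socket.addEventListener("message", (event) => {
      if (!this.following) return;
      this.showScene(JSON.parse(event.data) as TourState);
    });
  }

  // Guest temporarily leaves the guided route to explore on their own.
  explore(): void {
    this.following = false;
  }

  // Guest is "picked back up" by the host.
  follow(): void {
    this.following = true;
  }

  // Host side: publish the current navigation state to all participants.
  broadcast(state: TourState): void {
    this.socket.send(JSON.stringify(state));
  }
}
```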

By Kåre Spanne, Media Engineer at the University of Stavanger

Sources  

https://www.3dvista.com/en/products/lgt/

Safe learning with a patient surgeon

Imagine standing in an operation theatre and handing instruments to a surgeon. To nursing students, the impression can be nerve-wracking and may cause them to choose another area of specialization.

To ensure that the future does not lack nurses with the competence to work in this demanding field, Turku University of Applied Sciences has created a virtual 360° game that trains students' skills in identifying surgical instruments.

Nursing students testing the VR instrument game.

The game gives the students a boost in their competence and showcases the reality of working in an operating theatre.

“When turning my head, I saw the very realistic environment. I believe this game lowers the threshold to work in an operating theatre,” says Jasmine Pitkänen.

Pitkänen is a soon-to-graduate nursing student. Along with her classmates, she evaluated the learning experience. With a bit more practice on the game, Jasmine would be ready to jump in at the deep end.

Learning by doing was a rewarding method for the students. Instead of reading books or watching videos and learning passively, the students appreciated actively practicing and experiencing the lifelike situation. Or as Mikko Kinnunen put it:

"The experience was concrete, authentic and very realistic. I improved my score after rehearsing just once. When you practice, you learn."

The students appreciated that, unlike books and videos, the game indicated incorrect answers. This feature, along with the possibility to retry, was seen as a big bonus from a learning point of view.

“If I handed a wrong instrument, it was made clear, and I got a chance to try again,” said Riikka Mörsky.

Instead of having the instruments explained as a list in a book, the game showed all the instruments at a glance. When reading about the topic, it had not occurred to her how the situation would look in a real setting.

“If I had played this game at the beginning of my studies, I might have chosen to practice in an operating theatre.”

Valtteri Hannila appreciated the safe learning experience. Despite him handing over a wrong instrument several times, the surgeon stayed calm and the rehearsal continued. In real life, a surgeon might not be as understanding. Playing in front of others, however, added more stress to the situation.

During their nursing studies, the students had played a learning game before, but this was their first experience of a VR learning game. Based on their practice with the instrument game, both the students' experiences and their attitudes towards the learning method were positive.

See video and interviews of the students testing the instrument game.

Company cooperation across borders – because of the 360ViSi project

That led to a demo with the University of Stavanger in Norway and, based on their recommendation, the Finnish project team from Turku University of Applied Sciences was keen to hear more about the company and its products.

ONsim offers its customers, mainly in the medical sector, VR headsets and easy-to-use 360° training solutions, whereas Turku UAS, as part of the 360ViSi project, is both creating new methods for nursing education and developing the 360ViSi Editor, which enables practically anyone to create simulations from 360° videos.

With a mutual passion for VR and simulation, the meeting between ONsim and Turku UAS was a sparkling start to the cooperation. Both ONsim and Turku UAS teams are enthusiastic about developing not only their products but also each other’s ideas. With cross-fertilization in mind, both the project and the company are looking forward to future cooperation.

Zero surgical failures



Petteri Joenpolvi, CEO of the company ADESANTE, a start-up established out of the 360ViSi partner ADE, presented extended reality (XR), an umbrella term covering virtual reality (VR), augmented reality (AR) and mixed reality (MR).

“Every year 310 million patients are going through surgical procedures, and 50 million of them are experiencing some kind of complications. In the US, for example, 4000 people are injured every year due to surgical failures, of which 33 % experience permanent injury and 7 % wrongful death,” he explains.

ADESANTE has developed XR solutions for viewing medical images, planning surgical procedures, training surgeons and medical students, and supporting surgery.

The XR solution is used by university hospitals and general hospitals. It is easy to operate and gives a precise overview of the human anatomy. It is well suited for planning a surgical procedure and gives surgeons a better understanding of how to avoid surgical failures.

“Through ADE, the 360ViSi project will also benefit from the expertise of ADESANTE,” says project manager Atle Løkken.

Petteri Joenpolvi's presentation is available on YouTube.

https://www.surgeryvision.com/

The instrument game

During nursing studies, there is rarely enough time to practice the skills needed to make students feel competent and confident. When entering an operating room, students often feel particularly nervous. As a working environment it is strict and disciplined, which adds anxiety, especially when practice time in busy teaching premises is limited due to, for example, spatial resources.

Learning by selecting

As part of the 360ViSi project, Turku University of Applied Sciences' Turku Game Lab, together with the nursing studies programme, has started to solve the problem by creating an educational game set in an operating room.

When playing, the student first sees a 360° demo video of surgery. In front of the student, one of the nurses is handing instruments to the surgeon according to the surgeon's requests.

When the actual game starts, the student sees the surgeon and a selection of instruments. The task is to select and hand over the correct one according to the instructions. The key to the learning is the immediate feedback on the choices the student makes. At the end, the player sees the score, the time used per instrument and the number of correct and wrong answers.
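
As a simple illustration of that feedback loop, here is a hypothetical TypeScript sketch (not the actual game code) that records, per request, whether the handed instrument was correct and how long the choice took, and produces the end-of-round summary.

```typescript
interface Attempt {
  requested: string;   // instrument asked for by the surgeon
  selected: string;    // instrument the student handed over
  correct: boolean;
  seconds: number;     // time used for this instrument
}

class InstrumentRound {
  private attempts: Attempt[] = [];
  private startedAt = Date.now();

  // Called when the student hands an instrument; returns immediate feedback.
  handOver(requested: string, selected: string): boolean {
    const correct = requested === selected;
    this.attempts.push({
      requested,
      selected,
      correct,
      seconds: (Date.now() - this.startedAt) / 1000,
    });
    this.startedAt = Date.now(); // timing for the next instrument starts now
    return correct;
  }

  // End-of-round summary: score, wrong answers and time used per instrument.
  summary() {
    const correct = this.attempts.filter((a) => a.correct).length;
    return {
      score: correct,
      wrong: this.attempts.length - correct,
      timePerInstrument: this.attempts.map((a) => ({
        requested: a.requested,
        seconds: a.seconds,
      })),
    };
  }
}
```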

Unlike in simulations or internships, the game gives the student the possibility to rehearse countless times, which strengthens self-confidence and supports learning. Once the student has practised the basics, he or she can concentrate on aspects that go beyond them.

Using VR and 3D to explain the COVID-19 effect

Dr. David Fernández, a nurse, university senior lecturer and Head of the E-learning and New Technologies department (SENT) at the Catholic University of Valencia “San Vicente Mártir” (UCV), has used virtual reality (VR) to explain aspects of COVID-19 to his students.

Dr. Fernández has answered the questions received from participants on his Twitter account @enferdocente. He has visually explained the structure of the virus, how it spreads, and how it reproduces within our organism until it reaches the alveoli in the respiratory system.

“The problem with coronavirus is that it has a high affinity for the ACE-2 receptors that are found in the membranes of lung cells,” he says.

According to Dr. Fernández, VR offers specific benefits:   

“It allows the use of three-dimensional objects so students can see, experience and understand in a graphic way what is being explained. Moreover, it facilitates independent learning and it integrates different teaching methodologies, like escape rooms, through which students can play and learn at the same time.”

UCV is a partner university in the 360ViSi Erasmus+ project and cooperates in this particular case with a Spanish immersive tech company, Innoarea, to increase understanding of COVID-19 and how the virus affects lung cells. Innoarea has provided a virtual environment, Innorooms, a collaborative VR tool, in which Dr. Fernández gave his visual explanation.

The collaborative VR tool allows several people to meet in the same virtual space and is particularly oriented towards teaching and education. It offers the possibility to share experiences and knowledge with others regardless of where they are; in this way, students and lecturers can communicate through avatars when meeting physically is not feasible. The meeting was also streamed on YouTube.