
Principles for designing, using and sharing reusable learning objects

Sound principles for design are particularly relevant because the creation of reusable learning objects (RLOs) and immersive simulation scenarios is often undertaken by teachers themselves. 

According to official data from Google, about 70% of searches on YouTube include “how to” phrases, which indicates a strong demand for audiovisual resources that people use to learn practical skills rather than theory (Google, 2015). 

It should be emphasised that everything considered throughout the 360ViSi project is based upon a clear and sound pedagogical point of view.  

1. Pedagogical perspectives on 360° interactive video and related immersive XR-technologies

According to stakeholders like BlendMedia, interactive 360° video technology is beginning to take on the same importance as the arrival of video once did in education, with the difference that this technology also offers students immersion and a sense of “presence” that enhance empathy and a deeper understanding of the content (BlendMedia, 2020). 

According to various authors, including the well-known Benjamin Bloom (the American educational psychologist renowned for the “taxonomy of educational objectives”), the fact that the student interprets, focuses, creates, interacts with, and evaluates real situations will make their learning much more meaningful and long-lasting. 

Grossman et al. (2009) distinguished between three types of pedagogical practice teachers use to bring students closer to practical reality. First, there is representation, through which the student is provided with analysis resources such as video and interviews. Second, decomposition, in which the teacher dissects and analyses learning through discussion forums, debates and similar activities. Finally, approximation of practice, which allows students to approach their future professional reality without necessarily being in it (Ferdig and Kosko, 2020). 

That is why simulation plays an essential role in the learning of health science students (Kim et al., 2016). The creation of 360° videos in which the student feels immersed in a real environment will not only reduce simulation costs, but will also help the student to screen out distracting elements and focus on the learning objectives. 

Immersive environments, such as 360° videos, are highly relevant in the current rapid expansion of technology, given that their potential to improve learning through training from multiple perspectives, situations and experiences has already been demonstrated. 

Hallberg, Hirsto and Kaasinen (2020) stated that 360° technology can be used to facilitate learning related to the representation of spatial knowledge, participation and context, both experiential and collaborative, which makes this technology an effective tool for learning practical skills digitally. The main difference between this technology and others is that 360° technology allows us to create recorded virtual environments in which students can immerse themselves in a situation or action that seems completely real. 

To achieve optimal learning, it is essential that this technology meets the following characteristics: 

  •  That the user experience is satisfactory. That is to say, that it does not cause dizziness, that the recording quality is sufficient for the student to feel that they are “inside” the scene and that the interaction is as natural as possible. 
  •  That the virtual elements do not distort. Overlaid text or 2D elements are often placed on the spherical surface, which may cause distortions that distract the student. 

Balzareti (Balzaretti et al., 2019) states that promoting reflection among students through decision-making enhances learning effectiveness. 

According to Ferretti et al. (2018), there are four keys to the use of audio-visual media as a reflective method in learning: 

  1. Discover and describe 
  2.  Look for cause-effect links 
  3.  Exercise analytical thinking 
  4.  Identify possible alternatives 

It is thus important to bear in mind that what keeps students’ attention on 360° videos is not so much visual excellence or technological novelty as “enjoying” and “doing an interesting activity”; when compared with 2D video activities, these are the elements that stand out the most (Snelson & Hsu, 2019). 

For this reason, it is important that the audiovisual cues (hotspots) are well designed to direct users’ attention in virtual environments, since they are key to effective immersion and the feeling of spatial presence. A lack of such cues can lead to boredom, while overuse can prevent students from focusing their attention, as they switch from one item to another for fear of missing something important, which ultimately leads to frustration and stress. 

Regarding the pedagogical benefits of this technology, several studies exist, but the one carried out in 2020 by Hyttinen and Hatakka stands out, since they wanted to take the use of 360° video in teaching to the next level by using it for live teaching sessions. The results, rated very positively by the students, reflected some clear benefits: no need for physical materials (simulators, objects, actors etc.), a greater sense of presence in the student’s environment, the use of devices familiar to the student (the smartphone), more enjoyable learning, and greater concentration on the task, since there are no distracting elements. 

In order to achieve these benefits, it is important to bear in mind that a certain percentage of students will need to bridge the digital divide before they feel comfortable using this technology. This can be achieved through training, not only for students but also for teachers (Tan et al., 2020). 

Regarding the duration of 360° content, although the scientific literature on the use of this technology is still scarce, a systematic review published in early 2021 by Hamilton, McKechnie, Edgerton and Wilson in the Journal of Computers in Education identifies two approaches to length: either a duration set by the content creators, or content that lasts as long as the participant takes to complete the task. The vast majority of studies opted for short videos (with an average duration of 13 minutes). 

2. A framework for the design of reusable learning objects, including learning objects of 360° media

The ASPIRE framework has been successfully applied over many years by the 360ViSi project partner HELM (Health E-Learning and Media Team) at the University of Nottingham to help scaffold the design and development of learning resources, including Reusable Learning Objects (RLOs) and Open Educational Resources (OER) of any medium. 

There are many OER learning design frameworks available, but the ASPIRE framework is particularly suited for the 360ViSi project and the design of 360° video-based RLOs because it is flexible enough to support a community-of-practice style of development. It has been employed on many health-related learning resource projects involving health care professionals, patients, carers, charities and other related health organisations (Wharrad, 2021). 

Such stakeholder involvement helps to identify and align the requirements of the focused learner groups, which in turn helps to shape resource content and the way it is represented. Sharing stakeholder knowledge and expertise in this way supports the development of quality reusable learning objects that can be shared initially by all project partners and eventually with the wider OER community. 

ASPIRE is an acronym for all the steps of the framework involved in RLO creation:

  • Aims – helps to focus on getting the correct learning goal    
  • Storyboarding – allows the sharing of initial ideas and enables a community-based approach to resource design 
  • Population – combines storyboard ideas into a fully formed specification 
  • Implementation – a specification peer review is applied, and any concerns addressed before starting resource development  
  • Release – a second technical peer review is applied and approved before resource release  
  •  Evaluation – for research and development purposes to assess the resource’s impact 

(Windle, 2014). 

Read more about ASPIRE here.

A number of bespoke tools have been developed to help support the implementation of the framework within any given project. One of them is a specification tool, where the content author(s) can create a specification to support the ASPIRE framework’s population step and share it with the rest of the project team. This allows for greater inclusion of all team members during this important stage, as they are able to review, discuss and suggest multimedia such as graphics and video content.

Once written, the specification can then be accessed as part of a content peer review stage by a subject expert not previously involved in the project. This allows for a review by a ‘fresh pair of eyes’ ensuring that all the necessary content is included, and that the specification is robust enough for development. (Taylor, Wharrad and Konstantinidis, 2022) The inclusivity that this tool provides supports and encourages a community of practice approach to any given project’s development. 

3. Sharing reusable learning objects 

An online repository – a digital library or archive accessible via the internet – is the best way to share RLOs across universities and other interested parties. An online repository should have conditions of deposit and access attached.  

There are some good examples of online repositories currently available where academics and learners can access, share and reuse content openly released under a creative commons licence.

They include: 

  • https://www.nottingham.ac.uk/helmopen/, which contains over 300 free to use healthcare resources  
  • https://acord.my/, allowing open access to resources developed by university staff and healthcare and biomedical science students in Malaysia which have been released as part of the ACoRD project.  
  • https://www.merlot.org/merlot/, which holds a collection of over 90,000 Open Education Resources and around 70,000 OER libraries. (Merlot, 2022). 

For sharing RLOs between different Learning Management Systems (LMS), an e-learning standard such as xAPI or cmi5 should be used. cmi5 is a contemporary e-learning specification intended to take advantage of the Experience API (xAPI) as a communications protocol and data model, while defining the components necessary for system interoperability, such as packaging, launch, credential handshake and a consistent information model (Rustici Software, 2021). 
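To make the communications protocol concrete, the sketch below shows how a learning resource might report that a learner has completed a 360° RLO by posting an xAPI statement to a Learning Record Store (LRS). This is a minimal, illustrative sketch in TypeScript: the LRS endpoint, credentials and activity identifier are hypothetical placeholders, and in a cmi5-conformant launch these values would be supplied to the resource at run time rather than hard-coded.

```typescript
// Minimal sketch: posting an xAPI "completed" statement to a Learning Record Store.
// The endpoint, credentials and activity id are illustrative placeholders only.
const LRS_ENDPOINT = "https://lrs.example.org/xapi"; // hypothetical LRS
const AUTH_HEADER = "Basic " + btoa("lrs-user:lrs-password"); // hypothetical credentials

const statement = {
  actor: { objectType: "Agent", name: "Student Name", mbox: "mailto:student@example.org" },
  verb: { id: "http://adlnet.gov/expapi/verbs/completed", display: { "en-US": "completed" } },
  object: {
    objectType: "Activity",
    id: "https://example.org/rlo/community-nursing-360", // hypothetical RLO identifier
    definition: { name: { "en-US": "Community Nursing 360° scenario" } },
  },
};

async function sendStatement(): Promise<void> {
  const response = await fetch(`${LRS_ENDPOINT}/statements`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-Experience-API-Version": "1.0.3", // version header required by the xAPI specification
      Authorization: AUTH_HEADER,
    },
    body: JSON.stringify(statement),
  });
  if (!response.ok) {
    throw new Error(`LRS rejected the statement: ${response.status}`);
  }
}

sendStatement().catch(console.error);
```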

References and sources  

Balzaretti, N., Ciani, A., Cutting, C., O’Keeffe, L., & White, B. (2019). Unpacking the Potential of 360degree Video to Support Pre-Service Teacher Development. Research on Education and Media, 11(1), pp. 63–69. https://doi.org/10.2478/rem-2019-0009  

BlendMedia (2020). https://blend.media/  

Ferdig, R. E., Kosko, K. W., & Zolfaghari, M. (2020). Preservice Teachers’ Professional Noticing When Viewing Standard and 360 Video. Journal of Teacher Education, 72(3). 

Ferretti, F., Michael-Chrysanthou, P. & Vannini, I. (Eds.) (2018). Formative assessment for mathematics teaching and learning: Teacher professional development research by videoanalysis methodologies. Milano: FrancoAngeli. ISBN 9788891774637. http://library.oapen.org/handle/20.500.12657/25362  

Google (2015). https://about.google/stories/year-in-search/ 

Grossman, P., Compton, C., Igra, D., Ronfeldt, M., Shahan, E., & Williamson, P. W. (2009). Teaching Practice: A Cross-Professional Perspective. Teachers College Record: The Voice of Scholarship in Education, 111(9), 2055–2100. https://doi.org/10.1177/016146810911100905  

Hallberg, S., Hirsto, L., & Kaasinen, J. (2020), Experiences and outcomes of craft skill learning with a 360° virtual learning environment and a head-mounted display. Heliyon. https://doi.org/10.1016/j.heliyon.2020.e04705  

Hamilton, D., McKechnie, J., Edgerton, E. & Wilson, C. (2021). Immersive virtual reality as a pedagogical tool in education: a systematic literature review of quantitative learning outcomes and experimental design. Journal of Computers in Education. https://link.springer.com/article/10.1007/s40692-020-00169-2   

Hyttinen, M. & Hatakka, O. (2020). The challenges and opportunities of using 360-degree video technology in online lecturing: A case study in higher education business studies. Seminar.net. https://doi.org/10.7577/seminar.2870  

Kim, J., Park, J. H., & Shin, S. (2016). Effectiveness of simulation-based nursing education depending on fidelity: a meta-analysis. BMC Medical Education 16(152). https://doi.org/10.1186/s12909-016-0672-7  

Merlot (2022). MERLOT introduction. MERLOT Organization, viewed 14 June 2022. https://www.merlot.org/merlot/

Rustici Software. (2021). cmi5 and the Experience API. https://xapi.com/overview/; https://xapi.com/cmi5/ 

Snelson, C., & Hsu, Y. C. (2019). Educational 360-Degree Videos in Virtual Reality: a Scoping Review of the Emerging Research. TechTrends, 64(3), 404–412. https://doi.org/10.1007/S11528-019-00474-3 

Tan, S., Wiebrands, M., O’Halloran, K. & Wignell, P. (2020). Analysing student engagement with 360-degree videos through multimodal data analytics and user annotations. Technology, Pedagogy and Education 29(5). https://doi.org/10.1080/1475939X.2020.1835708  

Taylor, M., Wharrad, H., Konstantinidis, S. (2022). Immerse Yourself in ASPIRE – Adding Persuasive Technology Methodology to the ASPIRE Framework. In: Auer, M.E., Hortsch, H., Michler, O., Köhler, T. (eds). Mobility for Smart Cities and Regional Development – Challenges for Higher Education. ICL 2021. Lecture Notes in Networks and Systems, vol 390. Springer, Cham. https://doi.org/10.1007/978-3-030-93907-6_116 

Wharrad, H., Windle, R. & Taylor, M. (2021). Chapter Three – Designing digital education and training for health. In: Konstantinidis, S. Th., Bamidis, P. D. & Zary, N. (eds), Digital Innovations in Healthcare Education and Training. Academic Press, pp. 31–45. ISBN 9780128131442. https://doi.org/10.1016/B978-0-12-813144-2.00003-9 (https://www.sciencedirect.com/science/article/pii/B9780128131442000039)

Windle R. (2014). Episode 2.3: The ASPIRE framework [Mooc lecture]. In Wharrad H., Windle R., Designing e-learning for Health. Future-learn. https://www.futurelearn.com/courses/e-learning-health 


Creating new editing tool for teachers

If a teacher realises that her students would benefit from training in an interactive 360° environment on a particular topic – wouldn’t it be great if she could easily create it herself, without help from others? One of the goals for the 360ViSi project is to make that possible.

It’s the teachers who know the curriculum, the material and the learning outcomes. They also know when visual representation and repetition would help students understand concepts. Being able to create 360° interactive educational content themselves would shorten the time and effort from idea to finished product. At least that’s our experience from the 360ViSi project.

Project partner Quasar Dynamics is now developing an editing tool for 360° videos specifically for teaching purposes. It should be easy for teachers to use and is based on the needs we have identified during the 360ViSi journey.

Usability testing across borders

Lead developer at Quasar Dynamics, Jaíme Diaz González, and his colleague Manel Almenar created two possible prototypes. We saw this as a great opportunity to share skills between countries and organisations.

Gloria Orri, UX designer at University of Stavanger, ran a usability test to evaluate the two prototypes, with participation from health professors from both University of Stavanger and The Catholic University of Valencia “San Vicente Mártir”.

Usability testing prototypes of the new editing tool.

Adjusting according to test results

The usability test gave insight about what direction the design should take to ensure that the editor will be intuitive to use for teachers who are not familiar with video editing tools. Another goal is to ensure that it will cover all their needs to create 360° content for education.

Qualitative interview after the usability test.

With the feedback from the usability test, Quasar Dynamics will build an editor for teachers that probably will be presented during autumn 2022.

And the best thing of all? When it is finished, the first edition of the tool will be available to use for everyone – free of charge!

Another 360ViSi editor and player

Project partner Turku University of Applied Sciences has also developed an editor, with the working title 360° Editor, which is suitable for larger and more complex scenarios. The target group for this tool is also teachers. The code will be published on the developer platform GitHub.

In addition, project partner ADE Ltd has developed 360°Player, which is an output tool for the simulation made in the 360° Editor.

Read more about the editor and player here.



Findings on 360 video in other projects

CoViRR

Similar to the 360ViSi project, the Erasmus+ project titled “Co-creation of Virtual Reality Reusable e-Resources for European Healthcare Education (CoViRR)” aims to consider the approaches to enabling and producing 360° interactive video which can be embedded within Reusable Learning Objects (RLOs).

CoViRR has held a number of meetings, training events and international presentations on the strategic aims, development progress and findings to date. The work packages, as deliverables for this project, include the co-creation and technical implementation of VR activities (as scenarios) together with an analysis of feasibility and acceptability of reusable VR e-resources in the form of documented best practices and recommendations from a co-creational design aspect.

The dissemination of results will occur in the following ways:

  1. Virtual channels at local, national and international levels to inform about the created VR e-resources, attracting more learners from other universities to analyse feasibility and acceptability.
  2. Outcomes delivered through a multiplier event, journal publications, conferences, social media and websites, press, and internal and external partner networks.

More information is available from the CoViRR website.

The 360ViSi project has been able to benefit particularly from technical input from partners also involved in the CoViRR project. With both projects running concurrently, it has been a good opportunity for learning technologists, software developers and researchers alike to be able to exchange useful knowledge and common interests between all involved.

ARsim2Care

Following the same line as 360ViSi, there is another Erasmus+ project, titled “Application of Augmented Reality in Clinical Simulation (ARsim2care)”, which aims to enhance the use of manikins with augmented reality in order to carry out certain nursing techniques.

The main goal of this project is to develop an Augmented Reality (AR) software that, combined with clinical simulation anatomical models, allows students to learn how to perform invasive clinical procedures, helping them visualize internal anatomical structures. The project is addressed at a European level as the training of technical skills is a common need for European nurses.

The procedures compiled in a manual included in the ARsim2Care app are endotracheal intubation, arterial blood sampling, intramuscular injection, nasogastric tube insertion and suctioning via a tracheostomy tube.

This manual includes the detailed description of each procedure, and it illustrates the key internal anatomical structures to be visualized by the student during the procedure.

The ARsim2Care project was initiated before 360ViSi, and it is in its final stage. It has allowed the 360ViSi team to see how new technologies, particularly technologies related to VR and AR, can help students gain better practical/technical competencies. Researchers from both teams (ARsim2Care and 360ViSi) have been in contact to discuss results and exchange experiences.


Adding 360 video to the ASPIRE framework

By Michael Taylor, Learning Technologist and Project Developer at University of Nottingham.

Any successful project requires good planning, progress tracking and review to ensure that project costs, scope and deadlines are met. To this end, the Health E-Learning and Media team (HELM) at the University of Nottingham has devised the ASPIRE development framework, which also supports a pedagogical design methodology and has been applied by the team over many years in their development of Reusable Learning Objects (RLOs).

The ASPIRE framework for development of projects.

The challenge now is to see whether ASPIRE can be used successfully as a framework for 360 video development, or whether experience gained when producing 360 video content will highlight the need for major revisions to the ASPIRE framework, or even lead to the adoption of a totally new design methodology to cater for 360 video and other immersive technology development.

Why use ASPIRE?

One reason why ASPIRE was developed by HELM and is used instead of other available development frameworks is that we believe it fully supports a community participatory design approach to development.

When developing resources, we understand that including as many project stakeholders as possible is key. Such stakeholder involvement helps to identify and align the requirements of all the focused learner groups, which in turn helps to shape resource content and the way it is represented.

The ability to share stakeholder knowledge and expertise helps ASPIRE to assist the development of quality resources, which are fit for purpose and can be shared not only directly by all project stakeholders but also ultimately with the wider Open Educational Resource (OER) community. For example, the Community Nursing Scenario developed by the Nottingham team as part of the 360ViSi project may benefit from ASPIRE’s two step peer review process, which would allow experienced UK community nurses to review content for accuracy before starting development.

A second technical review could be carried out after the development and prior to release. Such quality assurance steps would help identify potential resource bugs before release.

What is ASPIRE?

ASPIRE is an acronym of all the development framework steps involved during resource development and after release.

  • Aims – helps project teams to focus on getting the right learning goals and objectives
  • Storyboarding – allows the sharing of initial ideas for resource content and enables a community-based approach to resource design
  • Population – combines storyboard ideas and allows the content author/s to write a fully formed specification, which includes a narrative and multimedia elements
  • Implementation – a specification/content peer review is provided by a suitable subject expert not yet involved in the project, allowing for suggestions and concerns to be raised and addressed before starting/implementing resource development.
  • Release – a second technical peer review is applied, and any issues are addressed and approved before resource release
  • Evaluation – using data tracking and feedback forms to collect data for research and development purposes to assess the resource’s impact.

An online toolkit to support the ASPIRE framework

A number of bespoke tools have been developed and implemented to help support the procedures and requirements of the ASPIRE framework within any given project. They co-exist alongside off-the-shelf commercial applications to deliver a full online toolkit supporting all ASPIRE development steps.

The toolkit proved invaluable during the recent pandemic when meeting in person was not possible, as it allowed online group participation and development through tool use. The toolkit includes an off-the-shelf storyboarding application which allows numerous participants to simultaneously access, review, discuss and record project ideas.

There is a bespoke specification tool that allows content authors to build on ideas raised at the storyboard stage and create, populate and implement an online development specification, which can be accessed and shared by the whole development team. The specification tool further helps with versioning issues which can arise when working with large stakeholder groups.

Example of the bespoke specification tool which can be shared with all project stakeholders.

Can 360 video methodology and other immersive technologies use ASPIRE?

It was fully anticipated from the outset of this project that the ASPIRE framework could successfully be used for 360 video and future immersive technology development. However, there was an expectation among those experienced in applying the ASPIRE methodology that some adaptation would be required.

Recent HELM collaborations with international development teams on sister Erasmus+ immersive 360 VR projects, such as CEPEH and CoViRR, had identified certain limitations when using the ASPIRE framework and associated tools. For example, our bespoke specification tool required manual modifications from each of the content authors to cater for 360 video and VR content development.

Employing Need Analysis methodology

To help understand how the development needs of 360 video content differ from those of more traditional multimedia content, a needs analysis was carried out by the 360ViSi partners. This ensured that all development requirements were recognised and recorded at the project start, to help identify any potential adaptations required for the ASPIRE methodology and associated tools.

A team of 12 developers and academics from all partners met regularly in the early stages of the 360ViSi project to set some overall project guidelines and create working practices. During one meeting the Nottingham team showcased their online ASPIRE specification tool, which can be accessed and shared by the whole team.

The needs analysis involved the review of four 360°/VR specifications, one belonging to each of the partners, for suitable use in the project. The results of these reviews and the specific requirements and recommendations from each team are presented below (see Table 1).

Results

It immediately became apparent that each partner specification asked for slightly different data input. For example, the form used by the University of Stavanger had text fields to enter such information as the duration of each scene and camera placement. This differed from the Universidad Católica de Valencia who wanted to record further information about each individual medical case.

Nottingham’s specification document asked for detailed prop information and the TUAS form was focused more on completing a simulation scenario as part of their bespoke tool development.

A decision was made by all partners to design a standard specification form that would include the features each partner deemed necessary for the development of immersive content. Although each partner recognised the ASPIRE specification tool’s potential, it was primarily designed to be used with Nottingham’s more traditional RLO format, and it was agreed that it required amendments to fit immersive project needs. The results are presented below.

Table 1. Critical review and suggestions from each project partner.

University of Stavanger
  • Pre-production section – common case understanding for all team members
  • Technical preparation – required equipment and how it should be used in the project
  • Communication – documentation – consent forms
  • Community of practice – inform all relevant parties about the project
  • Post-production – brief information on tools to develop content after the film shoot
  • Area to view test footage
  • Room description
  • Observation view – where to place the camera
  • Scenario description
  • Audio, hotspot text and narration if required
  • Rationale for producing in 360° instead of 2D video

The Catholic University of Valencia “San Vicente Mártir”
  • Specification learning outcomes
  • Case description (background information provided to the student prior to accessing the 360° video)
  • Hotspots – with or without static photos

The University of Nottingham
  • Required props
  • Required filming/lighting equipment
  • Cognitive learning outcomes
  • Scene description – signs, external sound etc.
  • Filming approach – camera settings – formats

Turku University of Applied Sciences
  • Required props
  • Simulator setup/manikin preparation
  • Observer tasks
  • Scenario lifesavers – case briefings
  • Pre-material required
  • Major problems encountered
  • Non-technical co-operation – staffing

A visual representation of intended additions to the specification tool can be viewed in the illustration below. The sections in green represent the current functionality available in the tool including the adding of:

  • text in the form of a narrative
  • multimedia assets
  • author ‘further comments’ pop-up box

The subsequent blue sections represent the additions recommended as part of a basic needs analysis through a series of meetings/consultations with 360ViSi project stakeholders.

Detailed explanation

The following section explains in more detail the suggested tool functionality that each segment represents in the illustration.

Filming Equipment

Description of cameras, microphones, lights etc. required for each shoot. This is an essential checklist for all the production teams.

Prop Listings

Detailed list of everything required in each scene e.g., from uniforms to nursing equipment through to everyday items such as bicycles, cars etc. Everything required in shot to make the scene as realistic as possible.

Scene Area / Description

What is to be highlighted in each video clip, e.g., streetlamp to highlight lighting security at night, or key safe to support access issues.

Observation View (Camera placement)

Not always easy to predict, especially if the production team is unfamiliar with the area that is to be filmed. However, it does help to focus the production team on the task at hand and gives a valuable guide on camera placement whilst on location.

Audio Description

Used for a narrative description of each scene. It helps guide the user to where the action is located when using 360° clips.

Scene Duration

Estimated length of scene, very useful whilst out on location and during post-production.

Scene Learning Objectives

The learning objectives to be met in each scene, broken down into Skills, Knowledge and General competence.

Hotspot Learning Element Placement

Used mainly in post-production to show the editor where each hotspot is required. This is also very useful during filming sessions, as a potential hotspot could become lost in the 360° camera’s stitch line, and it should guide the camera placement for every scene.

Hotspot Text

This information-based text is often overlooked by content authors and is left to the developer to write and include. This section would specify whether an image, 2D video or audio clip is to be added as part of the hotspot for the learner.
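To illustrate how these sections could be captured in a structured, shareable form, the sketch below models a single scene specification as a TypeScript data structure. The field names and example values are illustrative assumptions only; they do not represent the actual schema of the HELM specification tool.

```typescript
// Illustrative sketch of a 360° scene specification record.
// Field names and values are hypothetical and do not reflect the actual HELM tool schema.
interface HotspotSpec {
  label: string;                            // hotspot text shown to the learner
  media?: "image" | "2d-video" | "audio";   // optional attached media type
  placement: string;                        // e.g. "right of the front door"
}

interface SceneSpecification {
  title: string;
  filmingEquipment: string[];               // cameras, microphones, lights
  props: string[];                          // everything required in shot
  sceneDescription: string;                 // what is to be highlighted in the clip
  cameraPlacement: string;                  // observation view
  audioDescription: string;                 // narration guiding the user to the action
  estimatedDurationSeconds: number;
  learningObjectives: {
    skills: string[];
    knowledge: string[];
    generalCompetence: string[];
  };
  hotspots: HotspotSpec[];
}

// Example entry, loosely inspired by the community nursing scenario mentioned above.
const nightVisitScene: SceneSpecification = {
  title: "Arrival at the patient's home at night",
  filmingEquipment: ["360° camera", "tripod", "ambient microphone"],
  props: ["nurse uniform", "key safe", "bicycle"],
  sceneDescription: "Streetlamp highlights lighting security at night",
  cameraPlacement: "Eye level, by the front gate, facing the door",
  audioDescription: "Narrator directs attention to the key safe",
  estimatedDurationSeconds: 90,
  learningObjectives: {
    skills: ["Safe access to a patient's home"],
    knowledge: ["Lone-working policy"],
    generalCompetence: ["Situational awareness"],
  },
  hotspots: [{ label: "Key safe", media: "image", placement: "right of the front door" }],
};
```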

Discussion

Our work with immersive technology has highlighted an overall need to adapt some stages of the ASPIRE framework. For example, a recent project undertaken by HELM on the repurposing of case-based learning introduced a modification to the Storyboarding stage of the ASPIRE process. The aim of modifying the ASPIRE process was met, as the Storyboard stage was changed from collaborative visual idea generation to a decision-making tree and content-driven discussion [2].

Content creators also need to realise that a VR, XR or interactive 360° video resource is more immersive; training is therefore needed both on understanding how its design differs from more traditional web-based educational resources and on how to describe the content [3].

Early adopters of the ASPIRE framework for 360 videos for clinical skills [4] identified the need to adapt the ASPIRE framework and to provide more concrete tool functionality. This is in line with the recommendations made in this discussion.

In conclusion, the ASPIRE process is a recognised development methodology for creating high quality digital resources. [1] The needs analysis carried out during this project provided an opportunity to adapt our existing bespoke specification tool currently used in the ASPIRE process. 360 and VR technologies are the next generation of media and will be increasingly adopted for use within learning resources. Creating robust specifications prior to creating the media for these resources is crucial to ensure future high quality and efficiency in development.

Future developments

Adaptation of existing paper-based tools is currently under review and our peer review forms will shortly be joining the other bespoke tools available in supporting the online implementation of the ASPIRE process.

Acknowledgments

This work was supported by the ERASMUS+ Knowledge Alliance “Increase access to training in European health education through 360° video simulation technology, 360ViSi,” and the ERASMUS+ Strategic Partnership in Higher Education “Co-creation of Virtual Reality reusable e-resources for European Healthcare Education CoViRR” (2018-1-UK01-KA203-048215) projects of the European Union.

References

1. Wharrad, H., Windle, R., Taylor, M.: Designing digital education and training for health. 10.1016/B978-0-12-813144-2.00003-9, (2021).

2. Pears, M., Henderson, J., and Konstantinidis, S.: Repurposing Case-Based Learning to a Conversational Agent for Healthcare Cybersecurity. 10.3233/SHTI210348, (2021).

3. Konstantinidis, S., Wharrad, H., Nikolaidou, M, M., Antoniou, P., Neokleous, K., Schiza, E., Matsangidou, M., Frangoudis, F., Hatziaros, M., Avraamides, M., Bamidis, P, D., and Pattichis, C, S.: Training the Trainers Curriculum on Co-Creation of Virtual Reality Reusable EResources: 12th International Conference on Education and New Learning Technologies, (Edulearn20). 5752-5761. (2020).

4. Schiza, E.C., Hadjiaros, M., Matsangidou, M., Frangoudes, F., Neocleous, K., Gkougkoudi, E., Konstantinidis, S. and Pattichis, C.S.: Co-creation of Virtual Reality Re-Usable Learning Objectives of 360° video scenarios for a Clinical Skills course. IEEE 20th Mediterranean Electrotechnical Conference (MELECON) (pp. 364-367). IEEE. (2020).


The business side of 360 video simulation

With the ambition to highlight business opportunities for private companies using 360 video, the day started with a presentation from Stavanger Art Museum and Screen Story. They talked about their successful collaboration with 360 visualisation of art exhibitions. Read more about it here.

ADE shared their experiences with commercial and educational 360 and VR solutions they have developed. They also presented a Learning Management System (LMS) they have developed, which includes practical training in health care and the technical sector.

Quasar Dynamics presented their other projects with immersive technologies, for instance using VR goggles for people with different disabilities, and a horror PlayStation game they are soon to launch.

Guest input

Guests VID Specialized University and the company Mediafarm have collaborated to create student-active learning activities.

They explained their journey from the beginning of working with 360 video, what they have learnt and shared advice on everything from actors vs real patients, locations, scripts, how to create emotional impact, placement of camera, lighting, user interactions, playback solutions, subtitles, graphic interface needs, technical considerations and more.

The 360ViSi project team was invited to try out one of their education solutions with VR goggles.

Elizabeth Armstrong, UiS (in front), Javier Ortizá, Quasar Dynamics, and Mari Linn Larsen, UiS, deeply concentrated in a 360 environment.
Mike Taylor from the University of Nottingham tested one of VID’s 360 video education tools.
Arne Morten Rosnes (VID), Severin Ommundsen and Morten Orre (Mediafarm) shared their experiences and best practices from creating interactive 360 video for education.

Excursion to companies

The project team visited two different companies located in Stavanger, both of which use simulation tools for training and education purposes – in very different ways.

Aker Solutions delivers integrated solutions, products and services to the global energy industry, and uses simulation extensively. For instance, the company simulates complex offshore operations in order to prepare staff, to reduce risk of accidents and injuries, and to enhance efficiency.

Aker Solution’s simulation dome, where crane operators and other personnel can train and plan how to execute heavy lifts. Weather and waves may be changed to affect the situation. Clear communication with personnel on the platform deck is crucial, and also part of the simulation.
Aker Solutions has an advanced simulation centre used to train personnel before complicated operations in the offshore oil and gas industry.

The team also visited Lærdal Medical who creates realistic simulation-based learning for healthcare education, for instance the world-known Resusci Anne manikin.

Bodil Bø from the Faculty of Health Sciences at University of Stavanger looks at the premature manikin from Lærdal Medical, which is used to train health personnel.
Lærdal Medical’s most advanced simulation manikin is called SimMan, shown here being programmed at the factory.

Transnational project meeting and workshop in Norway


The purpose of the meeting and workshop was to share information and learning outcomes from the project so far.

On the first day, all the project partners presented their case studies and analysis of the features that worked well, experiences on what did not work, and shared information about the different tools they had tried out in their respective case studies.

Workshop well underway. Mike Taylor presents case study from the University of Nottingham, and tells the rest of the project team about the process, findings and results.
Esther Navarro from the UCV Catholic University Valencia sharing experiences and best practices from their own case studies.


Student involvement

Two student nurses from the University of Stavanger participated and gave feedback about how they experienced using one of the tools produced by the project, and also gave advice on elements that would make students interested and invested in using these kinds of tools.

Students David Kristoffer Nymoen Eltervaag and Tov Mosvær Sandland gave their honest feedback about a 360ViSi project from University of Stavanger which they had tested.

Trip to the famous Lysefjord

After a long day with interesting presentations and discussions, the project team spent the evening with teambuilding and dinner.

A boat took the team to the beautiful Lysefjorden to look at the famous Pulpit Rock.

Socialising and getting to know each other.
Lysefjorden

360ViSi project presented at Science Week in Valencia

Science Week at the Catholic University of Valencia, Spain, is an annual event hosted by the Vice-Rectorate for Research aimed at all students and academics interested in science, innovation and new technologies in teaching. 

This time, Dr. David Fernández and Dr. David Sancho, members of the 360ViSi project, presented some of their projects within teaching innovation. One of them was the 360ViSi project.

You may see a summary of the topics in this video.


Technology and tools – an overview



1. The characteristics of 360° video

360° video or still images immerse the user in a virtually generated environment through photography or video obtained with 180°/360° cameras, that is, cameras that capture images or footage in all directions and generate a panoramic sphere that users can turn and explore.

Technically, 360° cameras capture overlapping fields of view from multiple lenses pointing in different directions, or from several cameras mounted together in a sphere. Special software is used to stitch and join the different images, usually into a rectangular frame called an equirectangular projection.
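As an illustration of how an equirectangular frame is turned back into an explorable sphere at playback time, the sketch below uses the open-source three.js library to map a 360° video as a texture onto the inside of a large sphere, with the camera placed at its centre. The video file name is a placeholder, and look-around controls are omitted for brevity.

```typescript
import * as THREE from "three";

// Minimal sketch: rendering an equirectangular 360° video on the inside of a sphere.
// The video file name is an illustrative placeholder.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// Play the equirectangular video in a hidden <video> element and use it as a texture.
const video = document.createElement("video");
video.src = "scenario-equirectangular.mp4"; // hypothetical file
video.loop = true;
video.muted = true;                         // many browsers require muting for autoplay
video.play();
const texture = new THREE.VideoTexture(video);

// A large sphere viewed from the inside; the camera sits at its centre.
const geometry = new THREE.SphereGeometry(500, 60, 40);
geometry.scale(-1, 1, 1);                   // invert so the texture faces inward
const material = new THREE.MeshBasicMaterial({ map: texture });
scene.add(new THREE.Mesh(geometry, material));

// Render loop; mouse or gyroscope look-around controls would be added here.
function animate(): void {
  requestAnimationFrame(animate);
  renderer.render(scene, camera);
}
animate();
```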

The result is a still image, or a linear running video, that allows for a view in all directions and multiple modes of visualisation on various devices:

  • Through a Virtual Reality device, in which the screen updates the scene being displayed according to the user’s head motion.
  • On a smartphone, by shifting your device in the direction you want the video to display.
  • On a computer, by scrolling through the window to focus on the different parts of the scene. This is also possible for smartphones and tablets.
  • On a conventional screen, in which case the user is not in control of the viewing angles.

The different devices and modes for projecting 360° imagery also allow for different pedagogical applications. Understanding these technical differences will be important in order to maximize the benefits of 360° video used in education.

The focus of the remainder of this article is 360° video. However, many of the features and technologies mentioned may also apply to 360° still images, and, not least for use in education, a combination of the two formats may also be relevant.

2. A comparison: 360° video to VR 

360° video can easily be misunderstood as the same as virtual reality (VR). Both are immersive media formats but differ greatly in how complex they are to produce and by their properties for user interaction.

With the emergence and widespread availability of 360° recording cameras (Samsung, 360Fly, Ricoh, Kodak, Nikon, etc.), the already familiar term “Virtual Reality” was used in marketing to suggest what could be enjoyed with the new recording devices, as well as the images they produce. The fact that their images could also be played back on Head-Mounted Display (HMD) devices, or “goggles”, which are characteristic of VR technology, supported this use of the term. Still, no professionally accepted understanding of the term “Virtual Reality” applies to 360° video, which should instead be understood on the basis of its own, unique characteristics.

A prominent feature of VR is that it is made up of simulated environments, synthetically recreated with 2D or 3D imagery, which is exceptional when the need is to portray inaccessible or dangerous scenarios and environments. The prominent feature of 360° video, on the other hand, is its ability to photorealistically play back environments and actions as they were recorded in real time, which is exceptional when the goal is documentation or observation of real scenarios, events and environments.

Understanding the strengths of the two media also provides insight into their limitations. Their visual characteristics are a fundamental aspect. Differences in navigation, interaction, production, playback and sharing are others, which must be considered when deciding how these media can best be utilized as teaching aids.

2.1    User interaction and navigation

Virtual motion is another evident difference between 360° video and VR. Since a 360° video is recorded in advance, the user is limited to looking around and following the directed timeline, with no option to move either in the virtual space the video portrays or beyond the time it covers. Any movement must be directed at the time of recording. It should be noted that such recorded movements in a 360° video may unfortunately result in an unpleasant, dizzying experience for the viewer, who passively watches the video.

Still, from a pedagogical point of view, the fixed movements and timeline of a 360° video may also be understood as a favourable property. In contrast to the open environment of VR, the directed environment of a 360° video constitutes intentional story lines that guide the viewer towards an intended learning outcome. Shorter story lines that are interlinked may very well be used as a strategy to expand the room for manoeuvre of users, who at certain points in a 360° video can decide which story line to follow.

In VR, the user controls the experience within the created, synthetic environment and the properties it has. Its key feature is that users virtually have the ability to move through the environments and interact with them at their own will.

Of the two media technologies, 360° video is basically the less flexible in terms of interactivity. However, interactivity can be added to a 360° video through a pointer, normally placed in the centre of the view, combined with a physical button on the playback device for complementary functions.
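One common way to implement such a centre-of-view pointer is gaze-based selection with a dwell timer: a ray is cast from the middle of the view, and a hotspot is activated if the user keeps looking at it for a short time. The sketch below illustrates the idea with three.js; the dwell time, the hotspot meshes and the callback are assumptions made for the example, not a description of the 360ViSi tools.

```typescript
import * as THREE from "three";

// Minimal sketch of gaze-based hotspot selection: a ray is cast from the centre of the
// view each frame; if it keeps hitting the same hotspot mesh for long enough, activate it.
// The scene, camera and hotspot meshes are assumed to exist (e.g. from the playback sketch above).
const raycaster = new THREE.Raycaster();
const screenCentre = new THREE.Vector2(0, 0); // normalised device coordinates of the view centre

let gazeTarget: THREE.Object3D | null = null;
let gazeStart = 0;
const DWELL_MS = 1500;                        // assumed dwell time before activation

function updateGaze(
  camera: THREE.Camera,
  hotspots: THREE.Object3D[],
  onActivate: (hotspot: THREE.Object3D) => void
): void {
  raycaster.setFromCamera(screenCentre, camera);
  const hits = raycaster.intersectObjects(hotspots, false);
  const current = hits.length > 0 ? hits[0].object : null;

  if (current !== gazeTarget) {
    // Gaze moved to a new hotspot (or away from all hotspots): restart the dwell timer.
    gazeTarget = current;
    gazeStart = performance.now();
  } else if (current && performance.now() - gazeStart > DWELL_MS) {
    onActivate(current);                      // e.g. show overlay text or jump to the next scene
    gazeTarget = null;
  }
}
// updateGaze() would be called once per rendered frame inside the render loop.
```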

Using 360° video content on a PC screen enables an even wider range of opportunities when it comes to hotspots and interactivity. However, other methods of interactivity, such as the classic keyboard, mouse or joystick, are more difficult to adapt. It is through such devices that extended reality (XR) media technologies in general, including virtual reality (VR), augmented reality (AR) and mixed reality (MR), unfold their advantages. With the addition of motion sensors, the potential for user interaction, e.g. by hand movements, indeed makes XR the preferred medium if a high degree of user interaction is the goal. However, applying these interactive features in XR will normally require expert developers as well as custom playback devices. In contrast, 360° video is generally easier to develop and will normally require only off-the-shelf consumer tools for recording and playback.

3. Capture, author and playback

3.1    Recording 360° video

360° videos can be captured with consumer-grade 360° video cameras like the Insta360 ONE R. There are of course more professional models that can capture stereoscopic video or higher resolutions. There are a few good practices for recording, but the basic idea is to treat the camera as an observer: position it at eye level and aim the lens towards the main action.

With the device manufacturer’s software it is possible to convert raw 360° videos into a usable format quite easily, without any video editing skills. Afterwards, any video editing software can be used to edit the footage. Special effects related to 360° video require special tools; some video editors, like Adobe Premiere or DaVinci Resolve, offer these options as plugins. As the post-processing requirements grow, so does the required skill set.

For more information on hardware, programmes and first steps in 360° video producing, the 360ViSi project has published a User Guide on Basic Production of 360-Degree Video. 

3.2    Authoring an interactive experience

There are multiple ways to build 360° video content into an interactive experience. The easiest way is to use an existing tool, like 3DVista, to create a 360° video tour and also include other embedded content.

These tour platforms usually allow the creator to include multiple action points between which the user can teleport. From each action point the user sees a different 360° video or image, which can also include overlays of other interactive content, usually additional contextual images, videos or text information. The 360° videos can also be embedded into a VR application (3DVista, 2021).
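As a rough illustration of the structure such a tour takes (not 3DVista’s actual data format), the sketch below models a tour as a set of linked action points, each with its own 360° media, overlay content and teleport hotspots. All identifiers and file names are invented for the example.

```typescript
// Illustrative model of an interactive 360° tour: named action points linked by
// "teleport" hotspots, plus optional overlay content. Not an actual 3DVista schema.
type Overlay =
  | { kind: "image"; url: string; caption?: string }
  | { kind: "video"; url: string }
  | { kind: "text"; body: string };

interface TeleportHotspot {
  label: string;          // shown to the user
  targetPointId: string;  // action point the user jumps to
}

interface ActionPoint {
  id: string;
  media: { type: "360-video" | "360-image"; url: string };
  overlays: Overlay[];
  teleports: TeleportHotspot[];
}

// A tiny two-point tour: the user starts outside the house and can enter the living room.
const tour: ActionPoint[] = [
  {
    id: "outside",
    media: { type: "360-video", url: "outside.mp4" }, // placeholder file
    overlays: [{ kind: "text", body: "Locate the key safe before entering." }],
    teleports: [{ label: "Enter the house", targetPointId: "living-room" }],
  },
  {
    id: "living-room",
    media: { type: "360-video", url: "living-room.mp4" },
    overlays: [{ kind: "image", url: "medication-list.png", caption: "Medication chart" }],
    teleports: [{ label: "Go back outside", targetPointId: "outside" }],
  },
];

// Simple lookup a player could use when the user activates a teleport hotspot.
function nextPoint(currentId: string, label: string): ActionPoint | undefined {
  const current = tour.find((p) => p.id === currentId);
  const jump = current?.teleports.find((t) => t.label === label);
  return jump ? tour.find((p) => p.id === jump.targetPointId) : undefined;
}
```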

Interactive VR experiences are built using 3D game engines, like Unity or Unreal Engine, with a specialised software development kit (SDK) added on top. Game engines are suitable for VR development because they already deal with real-time 3D rendering and user interactions. A 360° video or image can be used as the background of the application, allowing the addition of interactive 3D model overlays on top of the video. The output of VR development is usually a desktop application, but there are options for different platforms (Valve, 2021), as referred to below.

Other XR experiences, like interactive augmented reality (AR) content will in turn require other tools. Augmented reality (AR) is an XR technology that overlays and combines virtual data and synthetic graphics with real images and data from the physical world. Some of the current development tools are limited to the platform the content is built for. Common SDKs for AR development include AR Core (Android) and AR Kit (iOS). These SDKs can be utilized in different development environments, like Android Studio and Xcode, and most common SDKs can also be used inside game engines, like Unity or Unreal Engine. There are also more specialized SDKs for different Head-Mounted Displays (HMD), like the Microsoft HoloLens (Chen et al., 2019).

3.3    Consuming 360° and XR media

Both XR and 360° media can be played on several different types of devices. When choosing a playback device, however, there are several factors to consider, such as the capacity for interactivity, the source from which the device collects and reproduces the information, as well as availability, ease of use and comfort for the user and, not least, the price.

For 360° video, there are basically two ways of consuming the content. Most easily accessible are flat screens with some pointing control, such as a monitor with a mouse or a mobile phone with a touchscreen or gyroscope. The other way is via Head-Mounted Displays (HMDs), or goggles, as discussed further below. HMDs are more immersive, but they are also custom devices that often need to be attached to a computer.

With flat-screen devices, the only way to interact with a 360° video is to click to select and to rotate the view. On smartphones it is also possible to use a stereoscopic 3D view, with a Google Cardboard-like headset that presents a separate image to each eye.

With VR headsets there is the possibility to interact with the environment through controllers and also move in a 3D synthetic space. These more complex interactions demand more from the 3D environment and are not useful features with basic 360° videos.

Consuming AR content varies between use cases. For more entertainment-oriented or just consumer targeted applications, a smart phone is the go-to device to consume AR content. This is due to AR supported smart phones being common. Still, the custom AR Head-Mounted Displays, like the Microsoft HoloLens, are the most suitable and will give the best immersive experience. AR headsets are however a lot more expensive than VR headsets and not targeted towards consumers.

3.3.1      Head-Mounted Display (HMD) devices

VR goggles for PC: The VR goggles that revolutionised virtual reality were the Oculus Rift, followed by others, like the HTC Vive. This type of goggles provides the best user experience, but they require a powerful PC with a good graphics processor capable of rendering satisfactory images.

This type of device is the closest thing to literally being inside a virtual environment. However, such a powerful sensation may make the user feel dizzy or unwell. Additionally, the need for cables limits movement and requires that a distance of between two and five metres be kept between the user and the PC. The installation of the device also needs to be thorough.

Autonomous VR goggles: There are also cordless virtual reality goggles. These autonomous goggles have built-in processors, as well as a screen and sensors, all in one. They operate via a Wi-Fi or Bluetooth connection.

To start using this type of device the user simply needs to put them on, and the experience will start. They have buttons that enable navigation and can be configured to choose what you want to see. Most models are compatible with Samsung and the applications in their catalogue.

Recently, the Oculus Go has been commercialised with an LCD screen and an integrated sound system that achieves a high level of immersion in the VR experience. Oculus also has a more advanced standalone headset line-up with the Oculus Quest and the Oculus Quest 2 (Oculus, 2020). HTC is also preparing its Vive Standalone, a highly anticipated model with its own platform and a Qualcomm Snapdragon 835 processor.

VR goggles for smartphones (Google Cardboard and Samsung Gear VR): These devices are the cheapest option, although the experience is obviously not the same as with the custom goggles. They are simple to use: a smartphone is installed in the goggles case, and thanks to the lenses that make the 3D effect possible, the user gets a feeling of immersion in virtual reality. Their appeal is simplicity, ease of access and low cost.

With a common smartphone, virtually everyone has the opportunity to enter an immersive environment. Only a cheap cardboard case to house the smartphone is necessary either to play a 360° video from a video platform or an application designed for audio-visual immersion (Google, 2021).

3.3.2      Notes on the future technology of XR media displays

One feature steadily improving in 360° cameras and VR headsets is the resolution of the cameras and screens, which in itself will make a learning experience more immersive. 

Replacing the old Oculus standalone VR headset, the Quest, the Oculus Quest 2, released in October 2020, improves the entry point for VR headsets. The new model has a higher resolution, a higher refresh rate and improved processing power. 

For PC-supported VR, the HP Reverb G2 headset was released in the autumn of 2020. It has a much higher resolution than any consumer-grade VR headset so far. It also follows recent Oculus headsets in removing the need for additional external sensors. 

With the new PlayStation 5 coming out soon, a PSVR 2 is likely to follow. This may expand the market for consumer-friendly VR.

Manufacturers like Nvidia and AMD are always pushing the limits of what a graphics processing unit (GPU) can do. The recent release of Nvidia’s RTX 3080/3090 opens up for better graphical fidelity in VR, which enhances the immersive experience. 

4. Collaboration in 360° and XR media

Real-time communication and multiuser interaction in synthetic virtual environments are common, typically in real-time online games. This feature of XR technology is also useful in XR applications for education, opening up both for instructor-led activities and for collaborative training and simulated activities, where intelligence in the application may also be applied to influence the experience.

A particular benefit is independence of space, which makes it possible to interact regardless of the location of the participants in the virtual activity.

Another characteristic is that the participants have to be represented in the virtual environment by an avatar. For some activities this may be an advantage, particularly in simulations. For collaborative activities it is likely to be the opposite, as well as a complex and expensive solution for an activity that may be better served by consumer online services for live video-based collaboration.

Recently, several services have been developed for the transmission of real-time 360° video. This solution is still not widely explored, but may prove to open up new ways of both remote collaboration and educational activities, where all participants can individually control the camera viewpoint remotely as they like.

During the lockdowns of the corona pandemic, for example, this technology was reported to be very useful for remote site inspections and surveys. Avatour is one such promising online service for real-time 360° video conferencing, launched in 2020 during the first spring of the corona lockdown. The service is presented by Avatour as “…the world’s first multi-party, immersive remote presence platform” (Avatour, 2021). 

Also in 2020, 3DVista, a leading provider of 360° video services and software, launched another approach to remote collaboration in a 360° video environment called “Live Guided Tours”. Every participant can navigate the tour as they like or follow the lead of the one controlling the tour. Building on the features of 3DVista’s 360° video editor, the tour may also include interactive multimedia hotspots (3dvista, 2020), which is particularly relevant for educational purposes.

The latest addition to interaction in immersive XR environments is the Metaverse. In late 2021, Facebook launched the Metaverse as its ultimate vision of a social media platform, which is also supposed to include Meta Immersive Learning, explained as providing a “…pioneering immersive learning ecosystem”. With the addition of immersive technologies like AR and VR, the claim is that learning will “…become a seamless experience that lets you weave between virtual and physical spaces” (Meta, 2021).

It remains to be seen whether the Metaverse will live up to its promise, or will gain popularity among the public at all. Still, the Metaverse vision pinpoints the potential of XR media technologies, including 360° video, as the next step in virtual social interaction and, indeed, learning.

Have a look at 360ViSi’s review of Meta Immersive Learning. 

5. References and sources

3dvista. (2020, May 28). Live-Guided Tours.
https://www.3dvista.com/en/blog/live-guided-tours/

3DVista. (2021). Virtual Tours, 360° video and VR software.
https://www.3dvista.com/en/products/virtualtour

Avatour. (2021).
https://avatour.co/

Chen, Y., Wang, Q., Chen, H., Song, X., Tang, H., & Tian, M. (2019). An overview of augmented reality technology. J. Phys.: Conf. Ser., 1237, 022082.

Google VR. (2021). Google Cardboard. Google.
https://arvr.google.com/cardboard/

Hassan Montero, Y. (2002). Introducción a la Usabilidad. No Solo Usabilidad, nº 1, 2002. ISSN 1886-8592. Accessed October 20.
http://www.nosolousabilidad.com/articulos/introduccion_usabilidad.htm

Meta. (2021). Meta Immersive Learning.
https://about.facebook.com/immersive-learning/

Oculus. (2020, September 16). Introducing Oculus Quest 2, the next generation of all-in-one VR.
https://www.oculus.com/blog/introducing-oculus-quest-2-the-next-generation-of-all-in-one-vr-gaming/

Rustici Software. (2021). cmi5 solved and explained.
https://xapi.com/cmi5/

Sánchez, W. (2011). La usabilidad en Ingeniería de Software: definición y características. Ing-novación. Revista de Ingeniería e Innovación de la Facultad de Ingeniería. Universidad Don Bosco. Año 1, No. 2. pp. 7-21. ISSN 2221-1136.

Valve. (2021). SteamVR Unity Plugin.
https://valvesoftware.github.io/steamvr_unity_plugin/    


Metaverse potential for education

Image: Facsimile from Meta’s Oculus workrooms.

Analysis by Jaime Diaz, Quasar Dynamics

The metaverse is a concept describing the next generation of the internet: an immersive, multi-sensory experience based on the use of multiple devices and technological developments; a 3D universe that combines multiple virtual spaces.

How can the metaverse be used?

We understand the metaverse as the combination of technology and software that transports the user’s mind to an environment that is perceptually different from where they physically are. This includes the use of VR headsets, controllers or haptic suits (see image).

A haptic suit (also known as tactile suit, gaming suit or haptic vest) is a wearable device that provides haptic feedback to the body. Photo: Teslasuit.

Immersive Education is designed to immerse and engage students in the same way that today’s best video games grab and keep the attention of players. It supports self-directed learning as well as collaborative group-based learning environments that can be delivered over the Internet.

Some of the possible uses that metaverses could have in educational experiences are described below.

Laboratory experiments

Real laboratory equipment, tools and machinery are expensive. Imagine a chemistry class where a teacher wants to demonstrate the formation of the four acids that occur in acid rain. Such an illustrative presentation would require multiple pH sensors, test tubes, safety glasses, burettes…

In addition to the economic realities, we should not forget the risks that these kinds of experiments entail.

Quasar Dynamics, one of the 360ViSi partners, created a complete 3D laboratory to emulate a very common exercise in engineering degrees, where the students have to use a power supply, a function generator, an oscilloscope and a multimeter.

To interact with the environment, students used VR glasses and controllers. Since students frequently “broke” the tools by connecting the wrong poles, this was an effective way to save costs while still learning through a hands-on, practical method.

Museums and tours

Imagine if students could visit the most important museums in the world every week. The metaverse enables this mental transition to 3D museums.

Quasar Dynamics also developed a 3D art gallery where artists can upload their artworks, such as 3D sculptures, configure the gallery with furniture or add videos and audio recordings. You can visit the gallery here.

Immersive conferences

Meta has also released interesting software for real-time 3D conferences. Even with its business focus, it can be used in education, especially for students who have to interact remotely.

In the workrooms, attendees can wear VR headsets or simply use flat-screen computers. They can draw on a virtual board and add images or other content for others to see. When using VR glasses, the user is displayed as an avatar with their own tracked hands. This way of working could be the future of meetings and presentations in online education.

Advantages and disadvantages

The metaverse has the advantage of being a completely 3D experience in real-time, allowing the developer to build an environment without the barriers that reality imposes. For example, it is possible to recreate the interior of the International Space Station without having to be there.

However, the metaverse is a young technology that still needs development. Some environments are still laggy if you do not have a powerful, expensive device, and many companies do not yet support such hardware.

In addition, there is some fear that the metaverse will create a generation of people with poor social skills due to a lack of interaction with the real world.

In summary, the metaverse is the future, not only for educational purposes but also as a lifestyle. We must ensure that students benefit from its technological advancements while still remaining competitive in real life.


Video meeting in a 360° environment


By setting up a video meeting inside a virtual environment, the participants can navigate, discuss and present on the same platform. 

Screenshot from testing this feature. The case is from the University of Stavanger’s 360ViSi case on ward rounds.

Read more about the ward round case here!

About the 3DVista feature

According to 3DVista, it is possible to take guests “by the hand” on a shared virtual tour with personal guidance. The view is synchronized between participants, and you can take turns controlling the virtual tour.

Users are able to explore on their own, and the host can pick them back up anytime to follow the host’s route.

Live Guided Tours are accessible on both desktop and mobile devices, so individuals attending the live video meeting can dial in from their mobile phones as well. 

More than screen sharing

So why not just use Zoom and screen-share a normal virtual tour? According to 3DVista, both quality and speed are better with Live Guided Tours. Sharing a multimedia-intensive virtual tour via screen share often results in jerky images, latency and an overall undesirable experience.

With 3DVista Live Guided Tours, every participant is inside their own copy of the virtual tour and sees their own media, with the participants’ video bubbles overlaid on top. This preserves the quality of the virtual tour images and the smoothness of moving through the rooms, even when the tour is controlled by somebody else, according to 3DVista’s webpage.

Another advantage argued by 3DVista is that interactivity, the essence of a virtual tour, is preserved. Screen sharing a virtual tour would not allow participants to take control of the tour and explore for themselves. With 3DVista Live Guided Tours (and the permission of the host), any guest can deviate from the route, navigate on their own, open any hotspots they are interested in and feel in full control of the virtual visit.
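
The key idea, as described by 3DVista, is that each participant renders the tour locally while only a small amount of shared state is exchanged. The sketch below is a minimal illustration of that principle under stated assumptions; the WebSocket endpoint, message shape and helper functions are invented for the example and are not 3DVista’s actual protocol.

```typescript
// "Shared state, local rendering": the guide broadcasts a tiny state object and
// every viewer renders the tour from their own local copy, so image quality does
// not depend on a video stream of someone else's screen.

type TourState = { sceneId: string; yaw: number; pitch: number };

const socket = new WebSocket('wss://example.org/tour-session'); // hypothetical session server

// Guide side: publish the current state a few times per second (a few bytes, not pixels).
function publishState(state: TourState): void {
  socket.send(JSON.stringify(state));
}

// Viewer side: apply the guide's state only while "follow" mode is on, so a
// participant can deviate from the route and explore on their own at any time.
let followGuide = true;

socket.onmessage = (event: MessageEvent<string>) => {
  const state: TourState = JSON.parse(event.data);
  if (followGuide) {
    loadScene(state.sceneId);                     // rendered locally, full quality
    setCameraOrientation(state.yaw, state.pitch); // smooth, no screen-share compression
  }
};

// Placeholders for the local tour renderer (hypothetical helper functions).
declare function loadScene(sceneId: string): void;
declare function setCameraOrientation(yaw: number, pitch: number): void;
```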

By Kåre Spanne, Media Engineer at the University of Stavanger

Sources  

https://www.3dvista.com/en/products/lgt/