A small insight into a few VR/AR/MR headsets, glasses and other devices

We still don’t know 100% what long-term effects VR in particular can have on our health or our brain. A brain MRI during use is not possible with VR, because the head has to be kept still, which is difficult to guarantee in this case. VR has many positive uses: it can help overcome trauma, evoke a realistic level of empathy, reduce pain or even cure phobias. However, stimulus overload can, in the worst case, also create a new trauma. The warnings, prompts to take breaks, age limits and documents that have to be signed before using VR should therefore never be ignored or dismissed as overcautious. This is precisely why it is important to consider in advance which technology to use for which purpose, and whether a positive goal can be achieved with it.

In my research, I was mainly interested in headsets, glasses or other devices that are relatively easy to put on, as safe as possible to use and that ideally don’t completely shut out your real environment, or are at least easy to take off. Which technology, headset or pair of glasses will end up being the best fit for my project will become clear over time. So here is a small selection of technologies that interest me at the moment:

The mixed reality headset Microsoft HoloLens 2

  • This headset is put on and tightened with the help of a knob and a headband
  • Eye tracking
  • The headset does not need to be taken off because the visor can be folded upwards
  • The display does not have to be aligned precisely with the eyes thanks to the technology used in the glasses (lasers, mirrors, waveguides)
  • Not yet immersive enough for the normal consumer
MR headset Microsoft HoloLens 2

The first standalone mixed reality glasses, the Lynx-R1

  • Does not require external tracking sensors or other devices
  • Optical hand tracking
  • Digital elements can be added to a real-time video feed of the environment, captured by two cameras on the headset
  • VR and AR at the same time
  • Multiple cameras are used to film the environment
MR glasses Lynx-R1

Small VR glasses from Panasonic

  • Ordinary, commercially available VR glasses are much bigger and bulkier
  • Stereo speakers are integrated
  • Is put on like a normal pair of glasses 
  • Spatial tracking
  • Positional tracking via inside-out tracking cameras, which track the position of the head-mounted display and the controllers
Panasonic’s “VR Glasses”

AR glasses Rokid Vision 2

  • Must be connected by cable to a smartphone, laptop or tablet
  • Is put on like a normal pair of glasses 
  • Integrated speakers for stereo sound
  • The glasses are operated by voice control
  • There are specially developed scenarios, such as a Fantasy World: an immersive space in which the user can interact with the world through head, gesture or voice control
  • The user can move freely in the virtual space through room tracking
  • No word yet on when it will hit the market
AR glasses Rokid Vision 2

VR Arcade Game from The VOID or Sandbox VR

The VOID and Sandbox VR are actually both on the verge of going out of business. Due to the corona crisis, all arcades had to close, and at The VOID, Disney withdrew several important licenses, such as Star Wars, because the company could no longer pay for them on top of its expensive equipment and the associated debts. Still, the concept behind it is very exciting. Here are a few key points about The VOID:

  • Using a headset, motion-capture cameras, 3D precision body tracking, haptic suits and props such as flashlights or blasters standing in for weapons, players can explore the game in a physical environment and interact with a virtual world at the same time
  • Fully immersive through VR and at the same time physical in the game by making virtual objects resemble physical objects
  • So you are immersed in the game as the main character, and depending on the virtual world and story, you have to complete certain tasks as a team
The VOID Trailer

Sources

  1. Microsoft’s Hololens 2: A $3,500 Mixed Reality Headset for the factory, not the living room, Dieter Bohn (24.2.2019), https://www.theverge.com/2019/2/24/18235460/microsoft-hololens-2-price-specs-mixed-reality-ar-vr-business-work-features-mwc-2019
  2. Lynx-R1: Erste autarke Mixed-Reality-Brille vorgestellt, Tomislav Bezmalinovic (4.2.2020), https://mixed.de/lynx-r1-erste-autarke-mixed-reality-brille-vorgestellt/
  3. CES 2021: Panasonic zeigt extra-schlanke VR-Brille, Tomislav Bezmalinovic (12.1.2021), https://mixed.de/ces-2021-panasonic-zeigt-extra-schlanke-vr-brille/
  4. Rokid Vision 2: AR-Brille kommt in neuem Design, Tomislav Bezmalinovic (15.1.2021), https://mixed.de/rokid-vision-2-ar-brille-kommt-in-neuem-design/
  5. The VOID (2021), http://www.thevoid.com/what-is-the-void/
  6. The Void: Highend-VR-Arcade steht vor dem Aus, Tomislav Bezmalinovic (17.11.2020), https://mixed.de/the-void-highend-vr-arcade-steht-vor-dem-aus/

Augmented Reality Contact Lens

In the process of my research, I came across a company that has presented a prototype of an augmented reality contact lens. The company, Mojo Vision, was founded in California in 2015 and has since been trying to bring to market a contact lens that sits so close to the pupil that the eye can’t see the lens itself, allowing virtual elements to become visible. The Mojo Lens is currently still a prototype and must first pass testing to be certified as safe and effective to use. It will probably take a few more years before the first medical tests start.

Mojo Lens

What is fascinating about the prototype is that technical components such as chips, displays or sensors can be made so small that they can be combined in a contact lens. 

Visualization of the display of the lens through VR

The contact lens should only display information or items when they are wanted. The lens should therefore understand when it is needed and when it would be more of a nuisance. The state of the current prototype (shown in 2020) – or, more precisely, of the display of the Mojo Lens – can currently be tried out with a VR headset. In this world you can see, for example, the current weather or weather forecasts, the amount of traffic, or a calendar when you focus your eyes on a certain spot in the image.

The mission and goal of Mojo Lens, however, is to use this technology to help people who have severely impaired vision. The lens should be able to highlight important people or things or zoom in on objects to be of help. Only then would the lens be accessible to the public.

Plan of the past, present and future of the Mojo Lens

Sources

  1. Mojo Vision’s Augmented Reality Contact Lenses Kick off a Race to AR on Your Eye, Jason Dornier (17.1.2020), https://singularityhub.com/2020/01/17/mojo-visions-augmented-reality-contact-lenses-kick-off-a-race-to-ar-in-your-eye/
  2. Mojo Lens The World’s First True Smart Contact Lens, Mojo Lens (2021), https://www.mojo.vision/mojo-lens
  3. The making of Mojo, AR contact lenses that give your eyes superpowers, Mark Sullivan (16.1.2020), https://www.fastcompany.com/90441928/the-making-of-mojo-ar-contact-lenses-that-give-your-eyes-superpowers

UX, UI in VR, MR

Framework for VR

In terms of user experience, the rules for 3D are different from those for 2D. Therefore, there must also be other or new design processes or models that incorporate this very option of interaction.
The important thing in VR or MR is to find the right balance between interaction in the virtual world and the most suitable tools. Since VR or MR only works if users can interact with the virtual world, it is important to consider how much influence the user has. A very poor user experience arises, for example, when cut scenes last too long and the user switches from active participant to passive observer. If the visualization of the world, the sound or acoustics, and all haptic stimuli are right, the user can move seamlessly through the world. It helps when a spacious and familiar environment is used, when the sound or acoustics change naturally, and when the user gets feedback on their actions or on changes in the world. Since the virtual world usually seems very large and gives the user a great deal of freedom, it is useful to include directional cues for orientation. These guide the user in a certain direction, tell them what to do, or show them certain destinations. This creates greater immersion and makes the purpose of the experience easier to understand.

The company Punchcut developed a VR Experience Framework in which any interactive world can be placed. The framework consists of three axes: actual to rendered/simulated reality, fixed point to free movement, and passive to active participation. Time also plays an important role in VR and MR, because the environment and the experience are constantly changing, so it must be taken into account as well.
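To make the three axes more concrete, here is a minimal sketch of how an experience could be located on them. The class and attribute names are my own choices for illustration; Punchcut’s framework is a conceptual model and does not prescribe any code.

```python
from dataclasses import dataclass

@dataclass
class VRExperienceProfile:
    """Position of an experience on the three framework axes (0.0 to 1.0).

    reality:       0.0 = actual reality   ... 1.0 = fully rendered/simulated
    movement:      0.0 = fixed viewpoint  ... 1.0 = free movement
    participation: 0.0 = passive observer ... 1.0 = active participant
    """
    reality: float
    movement: float
    participation: float

    def describe(self) -> str:
        # Compact, human-readable summary of where the experience sits
        axes = [
            ("rendered", self.reality),
            ("free-moving", self.movement),
            ("active", self.participation),
        ]
        return ", ".join(f"{name}={value:.1f}" for name, value in axes)

# Example: a seated 360° film is fully rendered, has a fixed viewpoint
# and keeps the viewer mostly passive
cinematic = VRExperienceProfile(reality=1.0, movement=0.1, participation=0.2)
print(cinematic.describe())
```

Plotting several experiences this way makes it easy to see, for instance, that a 360° film and a room-scale game occupy very different corners of the same framework.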

A user can only trust the environment, empathize with the world, or change if the physics of the world are developed far enough. By “change” I mean that the user can learn or develop certain skills, for example in medical applications or when rehearsing difficult situations in VR. Even though medical applications rarely contain narrative elements, the basic rules of designing for VR still need to be followed. Emotions can also be generated by very simply designed rooms.
It has been shown that a customizable avatar with a different skin color, gender or body type can be freeing for some users, making them more likely to drop inhibitions and feel braver or more adventurous. Most important, however, are the hands, as these are used to interact with the virtual world. The most suitable interaction option is hand tracking, i.e. no attached UI elements such as controllers. The reason is that it allows users to put themselves completely in the avatar’s shoes. Especially when the feeling of presence is very strong, it is important to let users reacclimatize after the VR experience so they can get used to real life again. This can take anywhere from a few seconds to several minutes. It can help to guide the user slowly from the virtual to the real world: taking off the headset, slowly bringing in the sound of the real world, or restoring awareness of one’s own orientation.
It can happen that people hesitate before using VR, because they are either put off by the headset, the controllers or the fact that they dive with their whole body into another world. To convince potential users otherwise, you can try to introduce them to VR slowly and let them explore the world and its features first. In the best case, virtual reality can even seem like magic.
Furthermore, motion or simulation sickness is a big issue. People can suffer from nausea, vomiting, paleness, dizziness and headaches if their physical perception of themselves differs from what they experience in VR – for example, when the eyes think the body is moving even though it is not. To prevent this, such sensory conflicts must be anticipated and designed out in advance.

UX principles

Bill West came up with 8 best practice principles to ensure a good user experience in VR. This is a summary:

  1. The virtual world should resemble a real world as much as possible. Components of this are, for example, lighting mood, shadows or backgrounds 
  2. Users must have a clear role and know exactly what the goal and task of the VR application is
  3. The user is only truly integrated when he forgets that he is in a virtual environment. Users must therefore be involved through interactive elements 
  4. The more senses are engaged, the better the VR experience is
  5. Users can quickly become overwhelmed, which is why it is so important to focus the attention on the important things. If something does not serve a purpose, it should be left out
  6. To avoid confusing users, interactions should be consistent and cues should be introduced in both visual and audio forms
  7. The correct placement of objects is important in order not to cause discomfort to the user. The best distance between the user and the object is between 1 and 20 meters
  8. Safety and comfort are important issues. There should be enough space for the user to move around safely. Motion sickness can be avoided by eliminating all conflicts between different sensory inputs. In addition, it should always be possible to pause the VR experience
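Principle 7’s distance range can be turned into a tiny placement check. This is a sketch of my own, not code from West’s article; only the 1–20 meter range comes from the list above.

```python
def comfortable_placement(distance_m: float,
                          min_m: float = 1.0,
                          max_m: float = 20.0) -> bool:
    """Return True if an object sits inside the comfortable viewing range
    (1-20 m per principle 7 above)."""
    return min_m <= distance_m <= max_m

def clamp_distance(distance_m: float,
                   min_m: float = 1.0,
                   max_m: float = 20.0) -> float:
    """Move an out-of-range object to the nearest comfortable distance."""
    return max(min_m, min(distance_m, max_m))

print(comfortable_placement(0.4))   # too close to the user: False
print(clamp_distance(0.4))          # pushed out to 1.0 m
print(clamp_distance(35.0))         # pulled in to 20.0 m
```

A scene-building tool could run such a check over all interactive objects and warn the designer before users ever experience the discomfort the principle describes.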

Ergonomics in XR

Iron Man

The subject of VR has been addressed in movies like Iron Man for quite some time. However, a helmet like the one Tony Stark wears would be exhausting after ten minutes at the latest, because the user interface is overloaded and would overwhelm the user. The following two diagrams should help make the VR experience as ergonomic and pleasant as possible:

Sources:

  1. An Experience Framework for Virtual Reality, Jared Benson, Ken Olewiler, Joy Wong Daniels, Vicky Knoop, Reggie Wirjadi (02.06.2016), https://medium.com/punchcut/an-experience-framework-for-virtual-reality-f8b3e16856f7
  2. Design Insights for Virtual Reality UX, Jared Benson, Ken Olewiler, Joy Wong Daniels, Vicky Knoop, Reggie Wirjadi (08.06.2016), https://medium.com/punchcut/design-insights-for-virtual-reality-ux-7ae41a0c5a1a
  3. 8 Ways to Create a Better UX in Virtual Reality, Bill West (20.06.2019), https://learningsolutionsmag.com/articles/8-ways-to-create-a-better-ux-in-virtual-reality
  4. Designing User Experience for Virtual Reality (VR) applications, Sourabh Purwar (04.03.2019), https://uxplanet.org/designing-user-experience-for-virtual-reality-vr-applications-fc8e4faadd96
  5. Picture Iron Man: https://www.pearlriverflow.com/the-mire/2019/5/6/the-iron-man-hud-is-terrible

Impact of VR on our emotions

As I want to use XR (extended reality, which includes AR, VR and MR) for my master’s thesis, and since my current topic is prosopagnosia and I want to improve the lives of people affected by it, I thought it would be useful to research whether and how much VR or MR can influence our emotions. There are by now quite a few studies on the subject, and I looked into some of them. But first it is important to clarify the terms mood and emotion, because the two are not the same. A mood does not have to have a specific cause; it is long-term and low in intensity, whereas emotions are triggered by specific things, are short-lived and are much more intense.
Emotions can be measured or recognized in different ways, because each person responds differently depending on their lifestyle and culture. In most cases, however, the main characteristics used to measure emotions are subjective evaluation and perception, actions, facial or vocal expressions, heart rate, skin reactions and a few others. This is where user experience comes into play, especially in VR. To ensure a good UX, certain areas need to be covered: the feeling of presence within the environment (immersion), the system’s ability to respond to user input, and the user’s involvement in the virtual environment. Involvement can be passive or active, but the user must always be part of the environment or able to do something with it.

But back to the studies.

The Virtual Counselling Environment

The first study deals with the extent to which virtual people can have an influence on the emotions of the user.

To investigate this, a computer-generated counselling environment was created. The furnishings were modeled on a normal therapy room: sofas, paintings, books, a table and a chair on which the user can sit down. The user should feel as if they are in this virtual world while sitting on a real chair. The patient is a virtual human called ‘Justin’; the therapist is the user wearing the headset. A training environment was thus set up in which virtual patients can be treated by novice clinicians, and those therapists can practice challenging situations. To make it more realistic, the complete process is included in the virtual environment: coming in, reactions, statements and leaving the room. The virtual human has basic movements, human facial features and a human voice. The whole system only works if speech recognition is built in, because only then is a positive response possible and the user is not frustrated.
Surprisingly, the virtual human could simulate human emotions so well that the user could recognize them.

Virtual Counselling Environment [1]

Study results

  • VR has a strong influence on the emotions of the user
  • A natural human computer interaction has an impact on the user
  • Speech recognition is important, because only then it seems more natural and the emotional level is much stronger
  • Interaction via controllers has less influence on the emotional response

Virtual Park Scenarios

The next study is about whether a scenario of a fictional VR park can evoke joy, sadness, boredom, anger and anxiety. 
The Velten Mood Induction Procedure method was chosen, which means that under controlled conditions images, film clips or music are shown which evoke temporary emotional states. This method tests effects of emotional states, memory or change in information processing.
In addition, electrodermal activity (EDA) was applied for this study. 

„Electrodermal activity (EDA) refers to the variation of the electrical properties of the skin in response to sweat secretion. By applying a low constant voltage, the change in skin conductance (SC) can be measured non-invasively (Fowles et al., 1981).“[3]
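Since skin conductance under a constant applied voltage follows directly from Ohm’s law (G = I / V), the measurement the quote describes can be sketched in a few lines. The 0.5 V drive voltage below is an assumption for illustration, not a value from the study.

```python
def skin_conductance_uS(current_uA: float, voltage_V: float = 0.5) -> float:
    """Skin conductance G = I / V (Ohm's law), returned in microsiemens.

    With a constant applied voltage, the measured current is directly
    proportional to conductance; microamps divided by volts give
    microsiemens. The 0.5 V default is an assumed, typical drive voltage.
    """
    if voltage_V <= 0:
        raise ValueError("applied voltage must be positive")
    return current_uA / voltage_V  # uA / V = uS

# Example: 2.5 uA measured at 0.5 V corresponds to 5.0 uS
print(skin_conductance_uS(2.5))
```

In an EDA study, a stream of such conductance values over time is what gets analyzed for the phasic responses (brief peaks) that accompany emotional arousal.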

To evoke emotions, the feeling of presence must be considered above all. Presence is often defined as the sense of “being there” in a virtual environment, and it is because of this feeling that an emotion can occur. A greater sense of presence creates a much stronger reaction: the user behaves, feels or thinks as if they were in a comparable real situation, because it does not feel as if the experience comes only from technology. Frame rate also affects the feeling of presence.
For the study, 120 students were selected to first relax for 5 minutes and then explore the park in the virtual environment in first-person view for 5 minutes.

Scenarios [2]
  1. Joy: sunny, calm, quiet daytime scenario with chirping birds and non-player characters
  2. Anger: constant, unnerving sound of heavy construction work
  3. Boredom: dull, monotonous scenario with the sound of distant traffic
  4. Anxiety: gloomy night-time scene, dim light, owl sounds, silhouettes of non-player characters
  5. Sadness: grey, rainy day, the sound of dripping rain, non-player characters carrying umbrellas and walking hastily

Study results

  • Joy, anger, anxiety and boredom have triggered the planned emotional states
  • The park for sadness failed, because boredom and sadness often accompany one another
  • Most negative emotions are accompanied to a small degree by other negative emotions

Feelings of fear and anger in VR

The purpose of this study was to find out to what extent the user’s emotions depend on the medium used, in this case VR versus a computer screen. To be more specific, fear and anger were selected and compared with each other, because these two negative-valence emotions show clear differences: the willingness to take risks increases with anger, while it decreases with fear. Both emotions have in common that they are negative and strongly arousing. They differ not only in risk-taking but also in the control over the body and the awareness of the emotion.
To trigger anger, a short clip from the movie ‘My Bodyguard’ was chosen, in which a group of teenagers bullies weaker people. For fear, the subjects had to play the VR horror game ‘Play with Me’. Both stimuli were shown to the subjects on a computer screen as well as in VR.

Trailer My Bodyguard (1980)
Trailer Play with me (Game)

Study results

  • Decisions can change depending on the emotional state in which the person is at the time
  • People take more risk when they are angry
  • VR has a much higher impact on decision-making behavior compared to the use of a computer

Emotional reactions are intensified much more by VR

To get to the bottom of this, the GAPED picture database was used. GAPED stands for Geneva Affective Picture Database and contains 730 pictures intended to trigger different states of mind. These include negative pictures, such as snakes or depictions of mistreatment (injured animals or butchering), positive ones, such as puppies, nature scenes or laughing faces, and neutral pictures of inanimate objects (such as bicycle spokes).
Participants had to answer three questions and rate their emotional state on a scale after viewing 24 selected images for 4 seconds.

Study results

  • For all images, the arousal was much higher with VR than with a computer screen
  • Photos with snakes or spiders were associated with phobias, so the arousal was even higher
  • The much higher arousal can be explained by the user diving into the virtual world and blocking out reality
  • The attention span may also be higher when virtual reality is used

Sources:

  1. The effect of two different types of human-computer interactions on user’s emotion in virtual counseling environment, Tu Ziqi, Weng Dongdong, Cheng Dewen, Shen Ruiying, Fang,Hui, Bao Yihua (10.2019), https://booksc.org/book/80763010/fcb00e
  2. Is virtual reality emotionally arousing? Investigating five emotion inducing virtual park scenarios, Anna Felnhofera, Oswald D.Kothgassnera, Mareike Schmidt, Anna-Katharina Heinzle, Leon Beutl, Helmut HlavacsbIls, Kryspin-Exner (10.2020), https://www.sciencedirect.com/science/article/abs/pii/S1071581915000981
  3. A continuous measure of phasic electrodermal activity, Mathias Benedek, Christian Kaernbach (30.06.2010), https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2892750/
  4. The Feeling is Real: Emotion Elicitation in Virtual Reality, Sahinya Susindar, Mahnoosh Sadeghi, Lea Huntington, Andrew Singer, Thomas K. Ferris (11.2019), https://journals.sagepub.com/doi/pdf/10.1177/1071181319631509
  5. Can Virtual Reality Increase Emotional Responses (Arousal and Valence)? A Pilot Study, Sergio Estupiñán, Francisco Rebelo, Paulo Noriega, Carlos Ferreira, Emília Duarte (2014), https://link.springer.com/chapter/10.1007/978-3-319-07626-3_51
  6. Physiological Measures of Presence in Stressful Virtual Environments, Frederick Phillips Brooks Jr., Michael Meehan, Brent Insko, Mary C. Whitton (08.2002), https://www.researchgate.net/publication/2529722_Physiological_Measures_of_Presence_in_Stressful_Virtual_Environments

Impact of VR on what we think and do

VR, AR and MR are already changing the way we think. The experiences, thoughts and emotions we have while using VR, for example, keep us engaged long after we take off the headset and return to the real world. Because these experiences in an interactive, virtual world affect us just as much as face-to-face interactions, we feel the same way about them. VR can trigger emotions and feelings as if you were really, physically there. These experiences are such powerful tools for building empathy because it makes little difference to our brains whether we perform an action ourselves or someone else does; the mirror neurons respond the same way. Our brains can’t tell the difference between real and virtual, which is why both bad and good experiences can have real emotional effects.

It can even happen that our brain thinks we have a different body. This occurs when sensory, visual and perceptual feedback match. For example, it has been found that people who used an avatar with a lighter or darker skin color in VR showed less racial prejudice after leaving the virtual world. Other advantages include training for very dangerous situations, because these virtual experiences trigger real emotional and physical reactions. However, it may be necessary to include real-world elements or constraints to remind the user of the virtual nature of their world, grounding them again and again. If no limitations or aids are built in and the brain becomes too accustomed to the virtual body, the user can lose control and make decisions they would never make in the real environment. To keep the brain aware of boundaries, it can help to have the user put on a kind of vest to slip into the avatar’s body, or walk across some kind of bridge. If VR is used for therapy, the goal is for the user to benefit and learn from the scenario and fully embrace the experience as his or her own.

Because VR can do both good and bad, Michael Madary and Thomas Metzinger drafted two lists and recommendations for the research ethics of VR and for the use of VR by the general public. These lists are at least first steps because the extent to which VR impacts us is very complex to understand. And because it is so complex, there is a need to think about ethics.

A very short summary of both lists combined into one:

  1. No permanent or serious harm should be caused to the subject or user.
  2. Every participant needs to be informed about lasting and serious behavioural impacts resulting from VR and that not every consequence is known.
  3. Media and researchers should be transparent, especially when virtual reality is being discussed as a medical treatment, and they should avoid overstating the benefits of VR.
  4. There must be awareness of dual use, which is when the technology is used for something other than its original intent. „Torture in a virtual environment is still torture.“ (Madary & Metzinger, 2016, p. 19).
  5. Responsible handling of sensitive data that can be recorded through VR (eye movement, emotions, body movements) must be ensured and the trust of users must not be abused by researchers or commercial companies.
  6. When it comes to advertising, data protection must also be taken into account. Consumer behavior can be seriously influenced, since advertising can be precisely adapted to the user and could therefore influence entire mental mechanisms.

Sources:

  1. VR changes how we think. Now what?, Artefact group (n.d.), https://www.artefactgroup.com/ideas/vr-changes-how-we-think/
  2. We’re Already Violating Virtual Reality’s First Code of Ethics, Daniel Oberhaus (06.03.2016), https://www.vice.com/en/article/yp3va5/vr-code-of-ethics
  3. Recommendations for Good Scientific Practice and the Consumers of VR-Technology, Michael Madary, Thomas Metzinger (02.2016), https://www.researchgate.net/publication/295083641_Recommendations_for_Good_Scientific_Practice_and_the_Consumers_of_VR-Technology

The History of VR, AR and MR

1838: Sir Charles Wheatstone found that humans perceive depth because the brain combines the two slightly different images each eye sees of the same object. He went on to invent the stereoscope.

1935: Nearly 100 years later, sci-fi author Stanley Weinbaum presents a fictional model for VR. The story, “Pygmalion’s Spectacles,” is about VR glasses in which the character is fully immersed in an interactive environment when wearing them.

1956: Morton Heilig, a filmmaker and inventor, invented the Sensorama. It gave the feeling of an immersive 3D world. Six short films were developed for this booth and inside there were full-color displays, audio, vibration, smell and atmospheric effects like wind. 

1960: Four years later, Morton Heilig invented the first head-mounted display (HMD), called the Telesphere Mask.

1961: Comeau and Bryan, both engineers, invented the first motion-tracking HMD. However, it was only used by the military to detect threats from a distance. After that, this technology was developed mainly for the military, because they had the money to fund it; it was used above all to build better flight simulators.

1968: Ivan Sutherland, a computer scientist, and Bob Sproull created the first virtual reality HMD, called “The Sword of Damocles.” It could only show simple virtual wireframe shapes, but the perspective changed with the user’s head movements – the birth of AR. Because the device was too heavy, it had to be suspended from the ceiling.

1969: Myron Krueger, a computer artist, created Glowflow, a computer-generated environment that could respond to the people inside it.

1972: The General Electric Corporation built a computerized flight simulator. Using three screens, this simulator had a 180-degree field of view.

1975: Myron Krueger invented the first interactive VR platform VIDEOPLACE without the use of HMD or gloves. Through the use of projectors, video cameras, video displays, position sensing and computer graphics, the user could see the imitated movements of his silhouette.

1978: The Aspen Movie Map was developed to familiarize soldiers with remote locations. Stop-motion images were shot from a first-person perspective out of a car – an early precursor to Street View.

1979: McDonnell-Douglas Corporation integrated VR into an HMD. It was called the VITAL Helmet and was for military use only. To adjust the computer-generated images, the technology followed the pilot’s eye movements.

1980: StereoGraphics company invented stereo vision glasses.

1982: Sandin and DeFanti invented the Sayre gloves, which monitored the movements of the hands – the beginning of gesture recognition.

1985: Thomas Zimmermann and Jaron Lanier founded VPL Research Inc. They were the first to sell VR glasses and gloves.

1986: Thomas Furness developed the Super Cockpit. It was the first flight simulator with computer-generated 3D maps. The pilot could control the plane through gestures, speech and eye movements. A year later, Jaron Lanier made the term “virtual reality” known to the public.

1989: Scott Foster founded Crystal River Engineering Inc. Through this company, real-time binaural 3D audio processing was developed.

1990: Jonathan Waldern introduces Virtuality. It was the first mass-produced VR entertainment system and was an arcade machine for gamers. That same year, Tom Caudell coins the term augmented reality.

1991: Antonio Medina, a NASA scientist, develops a VR system to control the Mars rover robots. SEGA tries to release the first VR headset for the general public, but it never ships – officially because the VR effect was so realistic that people could hurt themselves, though the real reason may have been that the processing power was too limited.

1992: Louis Rosenberg of Armstrong Laboratories, USAF, invents the first immersive mixed reality system, called Virtual Fixtures.

1994: Paul Milgram and Fumio Kishino describe mixed reality as “anywhere between the extrema of the virtuality continuum.”

1995: Affordable VR headsets for home use were released

1997: Georgia Tech and Emory University created VR war zone scenarios for veterans to help them recover from PTSD.

2001: Z-A Production released the first PC-based cubic room. It was called the SAS Cube.

2007: Google introduced the Street View virtual map.

2010: Google adds stereoscopic 3D mode to Street View.

2012: Palmer Luckey develops the prototype of the Oculus Rift. He was 18 years old at the time.

2014: Facebook buys Oculus VR, Sony announces it is working on a VR headset, Google releases Cardboard, and Samsung announces the Samsung Gear VR.

2015: Since then, the technology has evolved significantly and is now used in many areas, from teaching to fighting diseases to gaming.

Sources:

  1. History of VR – Timeline of Events and Tech Development, Dom Barnard (06.08.2019), https://virtualspeech.com/blog/history-of-vr
  2. The History of Artificial and Virtual Reality AR/VR, o. A. (05.06.2020), https://avtsim.com/the-history-of-ar-vr/

Difference between VR, AR and MR

To build a basic understanding of the differences between VR, AR and MR, I started my research here:

Virtual Reality:

  • The most widely used of the three technologies
  • It allows users to immerse themselves fully in another world
  • With head-mounted displays, screens, sensors, gloves, etc., the images can be seen and interacted with
  • Users can immerse themselves in, and interact with, simulations of real-life situations or entire environments
  • To use VR, you need the required program or software; display devices such as a TV, laptop, projector, smartphone, tablet, gaming console, OPI, etc.; and interactive devices such as a keyboard and mouse, haptic devices, a joystick, etc.

Augmented Reality:

  • It combines a direct or indirect view of the real physical environment with digital elements (like Pokémon Go, for example)
  • While users remain in their real environment, virtual elements are generated in real time
  • Through a device, these virtual elements can be seen and interacted with
  • To use AR, you need a program that can mix the real and virtual worlds; capture devices such as a webcam, video camera or smartphone for object recognition or geolocation; devices that can display the images, such as a smartphone, laptop or computer display, or a projector; and activators that determine when which virtual elements should be displayed
VR, AR and MR.
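To make the idea of an "activator" a little more concrete: a geolocation-based AR app essentially keeps a list of virtual elements, each tied to a real-world trigger location, and shows an element whenever the user enters its trigger zone. The following is a minimal sketch of that logic; the element names, coordinates and radii are invented for illustration, not taken from any real app.

```python
import math

# Hypothetical virtual elements, each tied to a real-world trigger location
# (latitude, longitude) and a radius in metres within which it becomes visible.
ELEMENTS = [
    {"name": "virtual_statue", "lat": 48.2082, "lon": 16.3738, "radius_m": 50},
    {"name": "info_overlay",   "lat": 48.2100, "lon": 16.3600, "radius_m": 30},
]

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two coordinates (haversine)."""
    r = 6371000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def active_elements(user_lat, user_lon, elements=ELEMENTS):
    """Return the names of all elements whose trigger zone contains the user."""
    return [
        e["name"]
        for e in elements
        if distance_m(user_lat, user_lon, e["lat"], e["lon"]) <= e["radius_m"]
    ]
```

Standing right at the first trigger location, `active_elements(48.2082, 16.3738)` would return `["virtual_statue"]`; a real AR app would run a check like this continuously and hand the result to the rendering layer.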

Mixed Reality:

  • Digital elements and the real environment are combined – a mixture of virtual reality and real life
  • Real objects in the physical world are masked by virtual objects so that it feels as if the virtual objects really exist
  • With VR, you are immersed in a virtual world that does not exist; with MR, the virtual world feels immersed in the real world, and the two merge
  • For MR, you need not only sensors, displays and headsets but, most importantly, an environment whose location exists in both the real and the virtual world
Mixed Reality is the result of blending the physical world with the digital world.

Sources:

  1. Virtuality Continuum's State of the Art, Héctor Olmedo (2013), https://www.researchgate.net/publication/257388578_Virtuality_Continuum%27s_State_of_the_Art
  2. What is Mixed Reality?, Brandon Bray (26.08.2020), https://docs.microsoft.com/en-us/windows/mixed-reality/discover/mixed-reality

VR as an assistance for people with Prosopagnosia

The biggest motivation and reason why I chose to study design is to help people who are struggling with difficult life situations. Design makes it possible to support people, regardless of the medium. That is why my chosen topic revolves around the condition prosopagnosia, or face blindness, in combination with the use of VR. Prosopagnosia is a condition in which a person cannot recognize faces and therefore often cannot identify other people by sight.

Prosopagnosia

To go a little deeper into this condition: about 2 percent of the population have prosopagnosia, or face blindness. Because of this condition, which can be hereditary, affected people may not even recognise the faces of their own family members. To live as normally as possible, they develop certain tricks. To cover up the condition, they sometimes use their cell phone as a safety mechanism or bury themselves in a book; in the worst case, this can lead to social anxiety. Many affected people also memorise characteristics of the people around them, for example facial features, jewelry, height, skin color, body movement, weight and so on. However, if a person has no distinctive characteristics, memorising things like hair color, hair length or the shape of the mouth or nose does not really work, since many people fit the same general type.

Affected people say that they are afraid to tell others about their condition. Because they do not recognise others on the street, they fear coming across as unfriendly or arrogant. If people with prosopagnosia are away from their familiar surroundings for a long time, they may at first fail to recognise even people they know well, because the features they learned to recognise become blurred in their memory. So they constantly wear a mask to cover up their situation, both to avoid seeming rude and to avoid hurting anyone, because nobody wants to be forgotten. Being forgotten makes people question their own worth, and it makes them question the worth of the person with the condition, who apparently could not even take the time to remember their face.

Best Cases

The vision I have for this topic is to help in some way. After I started thinking about how to approach it and which technology could be of help, I kept coming back to virtual reality. Nowadays, the use of VR or AR is becoming more and more popular in schools, at work and even in medicine. To mention a few companies interested in helping people with such conditions: one start-up, for example, wants to help people with prosopagnosia. It developed an AR app, working in combination with smart glasses, that reminds the user who the person in front of them is.

Augmented Reality To Tackle Face-Blindness (Start-up Social Recall)

Another company uses VR to immerse people with disabilities, those who are limited in their movements or can perform only certain actions, in a virtual world. Through this gamification, they automatically make larger movements than they otherwise would. This benefits them physically, and they have fun because they can perform certain tasks and are focused on solving them.

Virtual reality as a new therapy for patients with strokes (Start-up rewellio)

And there are many more companies and ideas like these that aim to help people. One company focuses on supporting people who have suffered a stroke. Learning or practicing with VR or AR helps patients retain information much more easily and quickly.

My Vision

The problem I want to solve, and the vision I have, is to find a way or develop something that helps this group of people and provides assistance from the beginning. In this safe environment, and of course with the technology, patients are not overwhelmed but can get used to the new situation at their own pace. Other advantages are that self-confidence increases and patients are perhaps a little better prepared to deal with the new situation in reality. I think these technologies have many advantages and can and should be used to help in the best way possible.

This is a really well-produced short film I found about prosopagnosia:

Sources

  1. Die Zukunft der Arbeit/ AR und VR Trends für 2019 laut Experten, Vicki Lau (07.02.19), https://www.valamis.com/de/blog/die-zukunft-der-arbeit-ar-und-vr
  2. Mit VR gegen die Angst/ Invirto bietet App-Therapie und Virtual Reality, Julius Beineke (03.02.20), https://t3n.de/news/vr-gegen-angst-invirto-bietet-1248233/
  3. SocialRecall – Augmented Reality To Tackle Face-Blindness, Rebecca Hills-Duty (14.09.18), www.vrfocus.com/2018/09/augmented-reality-to-tackle-face-blindness/
  4. rewellio – Virtual Reality als neue Therapie für Schlaganfall-Patienten, Andrea Losa (27.12.19), https://futurezone.at/start-ups/virtual-reality-als-neue-therapie-fuer-schlaganfall-patienten/400705860
  5. WalkinVR – Virtual Reality for People with Disabilities, WalkinVRDriver (no date), https://www.walkinvrdriver.com
  6. TED Talk: VR Therapy, Unlocking the Potential of VR, Brian Boyle (22.10.19), http://www.youtube.com/watch?v=qxd-ppIDfjw
  7. Galileo: Mein Leben als Gesichtsblinde, Galileo (01.10.16), http://www.youtube.com/watch?v=bDGTKQAKHKY
  8. Prosopagnosia / Thriller Short Film, Hugo Keijzer (01.01.15), vimeo.com/129415238 
  9. TED Talk: How being Face-Blind made it easier to see people, Fleassy Malay (16.02.19), https://www.youtube.com/watch?v=q3sZaoPQSc4