3.1.0 Overview

This section outlines the design principles that inform the Guidelines as seen in Section 4. These principles provide guidance about:

  • The design and development process of a wayfinding and digital navigation system for vision impaired people
  • The high level thinking for creating audio instructions
  • The usage of sound in a wayfinding system for vision impaired people

3.1.1 Involve users in the process

To ensure that a wayfinding system is useful and usable, it is essential to validate it with users in the environment where it will be deployed. Involve users throughout the design and development process so that the system is validated in its real setting.

It is recommended that a wide range of people with vision impairments be invited to participate in the validation of the system. The recruitment factors that need to be considered are the following:

  • Level of vision impairment: from partially sighted to blind
  • Wide age group: 18-55 years old
  • Gender: even split
  • Mobility aids: no aid, symbol cane, long cane and guide dog
  • Confidence in independent travel: from people who travel independently worldwide to people who only do familiar local routes without assistance
  • Tech literacy: smartphone users from basic to expert (self-assessed)
  • Venue familiarity: from not familiar at all to everyday users of the venue

Feedback and validation from vision impaired people should be sought:

  • To understand user needs for each different type of built environment: For example, understand the main points of interest, the landmarks used along the route, the safest route options, etc. Research methods that could be used in this phase are interviews, observations, surveys and walkthroughs.
  • To validate the system while it is being developed: This might include incremental testing of the following:
    • The installation and the configuration of wayfinding technology, such as Bluetooth Low Energy beacons (see Section 5.1 for best practices)
    • The information and the terminology used in the audio instructions (for information on how to design the appropriate audio instructions see Section 3.2 and the guidelines for each environmental element under Section 4.1)
    • The usability and accessibility of the digital navigation service
    Research methods that could be used in this phase are observations and think-aloud one-to-one sessions in which participants try out the system.

3.1.2 Focus on the environment not the technology

Vision impaired people pay considerable attention to their remaining senses in order to understand their environment better and to identify environmental landmarks and clues. It is important therefore that they should have as few distractions as possible from the technology that is assisting them.

With fewer distractions, vision impaired people are better able to memorise new routes. When they have memorised a route their reliance on digital navigation technology for that particular route will be reduced and their feeling of confidence and independence will be increased. Confidence in using technology may also lead vision impaired people to use more routes in less familiar or more complex environments.

3.1.3 Use simple and concise messages

The “less is more” principle applies when designing audio instructions. Wayfindr trials and the work of researchers [3] have shown that vision impaired people do not like long or very detailed instructions due to the time required to think about and process the information.

Limiting messages only to those that are relevant and essential to navigate a route will greatly assist vision impaired people. This means excluding information that is not relevant to their chosen route, for example, those obstacles and objects that can be detected and easily avoided with their mobility aid (long cane or guide dog).

3.1.4 Use active words

Audio instructions that need to communicate movement must include active words, i.e. verbs. For example, a phrase like “The stairs are in front of you” does not imply any action. Some people might take the instruction literally and take no action when they reach the stairs. Instead, a phrase like “Walk forward and take the stairs up” makes the action clear.

Verbs used in audio instructions must be carefully considered. Some verbs are vague and open to different interpretations. An example is the verb “to bear”. When combined with a direction, as in “bear left”, it might cause the vision impaired person to move through a space in ways that were not anticipated when the messages were constructed.
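This principle can be sketched as a simple content check that flags instructions lacking an action verb. The sketch below is purely illustrative: the verb list and the function name are assumptions made for the example, not part of any standard.

```python
# Illustrative sketch only: flag audio instructions that contain no action
# verb. The verb list is an assumption for this example, not a standard.
ACTION_VERBS = {"walk", "turn", "take", "continue", "cross", "follow", "go", "keep"}

def has_active_verb(instruction: str) -> bool:
    """Return True if the instruction contains at least one action verb."""
    words = instruction.lower().replace(",", " ").split()
    return any(word in ACTION_VERBS for word in words)

print(has_active_verb("Walk forward and take the stairs up"))  # True
print(has_active_verb("The stairs are in front of you"))       # False
```

A check like this could be run over a venue's full instruction set before deployment, so that passive phrasings are caught during authoring rather than in user trials.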

3.1.5 Provide reassurance information

Some vision impaired people may not easily relate to distance information given in feet or metres. Distance information described in terms of the number of steps may be more familiar but there are factors to be considered. The length of each step may vary based on the person’s height and how tired they are feeling. Even if the calibration of someone’s step length is accurate, it has been reported in research studies [2] that it is demanding for an individual to count or memorise the number of steps required.

In enclosed environments such as a railway station, distance information may not be needed. Instead, reassuring feedback can be sufficient to make a vision impaired person feel secure that they are moving in the right direction or approaching the next landmark on their route. However, in large open areas a rough estimate of the distance they will travel before the next instruction might make them feel more comfortable.
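As a sketch of how a navigation service might apply this, the hypothetical function below chooses between a reassurance message and a rough distance estimate depending on the environment type. The wording, parameter names and rounding granularity are all assumptions for illustration.

```python
def next_message(environment: str, distance_m: float, landmark: str) -> str:
    """Pick a reassurance message in enclosed spaces, or a rough distance
    estimate in large open areas. Wording and the 10 m rounding are
    illustrative assumptions, not prescribed values."""
    if environment == "enclosed":
        # No distance needed: reassure the user that they are on track.
        return f"Keep going, you are approaching {landmark}"
    # In open areas a rough estimate adds comfort; round to the nearest
    # 10 metres so the figure does not suggest false precision.
    rough = max(10, round(distance_m / 10) * 10)
    return f"Continue for about {rough} metres towards {landmark}"

print(next_message("enclosed", 37, "the escalators"))
print(next_message("open", 37, "the main exit"))
```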

3.1.6 Provide an instruction at every decision making point

When vision impaired people are first learning a route, it is important to have an audio instruction at every decision making point on the route, even if the user is not changing direction. The need for an instruction at every decision making point is likely to diminish as people become familiar with the route.
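One way to identify decision making points is to treat the venue's walkable paths as a graph and flag every node where three or more paths meet, since these are the places where a wrong turn is possible. The corridor map below is hypothetical; this is an illustration, not a prescribed method.

```python
from collections import defaultdict

def decision_points(edges):
    """Return nodes with three or more connecting paths: places where the
    route could diverge, so an instruction should be issued even when the
    user continues straight ahead."""
    degree = defaultdict(int)
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    return {node for node, d in degree.items() if d >= 3}

# Hypothetical corridor map of a small station.
corridors = [("entrance", "hall"), ("hall", "platform"),
             ("hall", "exit"), ("hall", "lifts")]
print(decision_points(corridors))  # {'hall'}
```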

3.1.7 Provide different techniques for diagonal directions

There are various techniques for communicating directions for different angles. These techniques are divided into two broad categories depending on the user’s point of reference:

  • Egocentric frame of reference is when the spatial layout and orientation is communicated based on the individual’s current location and viewpoint. For example, “Keep moving forwards, the escalators are in front of you” is information described based on an egocentric frame of reference. The most common ways to communicate directions like this are by using:
    • Clock faces, when the space is divided into 12 parts as on a clock face with the user situated in the middle of the dial, i.e. with 12 o’clock indicating straight ahead and 6 o’clock behind. This technique seems to be very popular with older people, as it is taught during O&M training; however, younger generations of vision impaired people have little experience of clocks with fixed numbers and moving hands, which makes the technique confusing for them. During Wayfindr trials many also reported that it is difficult to distinguish between 1 and 2 o’clock. Thus, clock faces should be used only to communicate general direction.
    • Degrees, when directions and position are communicated using degrees. For example, “turn 45 degrees to your right.” As with clock faces, various vision impaired people expressed concerns about distinguishing between, for example, 45 and 60 degrees.
    • Proportional, when instructions such as “straight ahead,” “to the right,” “diagonally to the left” or “slightly to your left” are used to communicate positioning and direction. Again, this technique is open to interpretation by users. For example, during Wayfindr trials some vision impaired people asked, “how many degrees diagonally do I have to turn?” It has been observed that people interpret the word “slightly” in different ways.
    • Orthogonal axis means that direction is communicated based on right angles to a line going ahead, behind, left or right. In this way the risk of confusion – as with diagonal directions above – is reduced. On the other hand users may have to follow a less efficient or longer route based only on orthogonal directions.
  • Allocentric frame of reference is when the layout and the relationships between objects are described independently of the individual’s current location and viewpoint. For example, “the escalators are 10m straight ahead after the ticket barriers” indicates a fixed relationship between these two landmarks. Allocentric frames of reference are useful because they allow individuals to create a memorable mental image of an environment that can help them recover should they get lost or make a detour. The most common technique using an allocentric frame of reference is:
    • Cardinal coordinates, when position and directions are communicated as “north,” “southeast,” “northwest” etc. Vision impaired people have reported [1, 5] that they find it difficult to orientate themselves based on cardinal coordinates, as they have to translate them into an egocentric frame of reference before use. The use of cardinal coordinates also varies from one country to another, and it can be difficult for anyone, sighted or not, to orientate themselves using them indoors.

Based on the above, vision impaired people have no single preferred way of communicating directions. Their preferences are based on various factors, including previous Orientation & Mobility education, experience of independent travel, familiarity with analogue clock metaphors and familiarity with existing digital navigation services. One size does not fit all.
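To illustrate how a digital navigation app might let users choose their preferred technique, the sketch below converts a relative bearing in degrees into either a clock-face or a proportional phrase. The sector boundaries, wording and function names are assumptions made for this example, not standardised values.

```python
def to_clock(bearing: float) -> str:
    """Relative bearing in degrees (0 = straight ahead, clockwise positive)
    mapped to the nearest clock-face hour."""
    hour = round((bearing % 360) / 30) % 12
    return f"{12 if hour == 0 else hour} o'clock"

def to_proportional(bearing: float) -> str:
    """Coarse proportional phrasing; sector boundaries are illustrative."""
    b = bearing % 360
    if b < 23 or b >= 338:
        return "straight ahead"
    if b < 68:
        return "diagonally to the right"
    if b < 113:
        return "to the right"
    if b < 248:
        return "behind you"
    if b < 293:
        return "to the left"
    return "diagonally to the left"

def describe(bearing: float, style: str = "clock") -> str:
    """Honour the user's preferred frame of reference."""
    return to_clock(bearing) if style == "clock" else to_proportional(bearing)

print(describe(90, "clock"))         # 3 o'clock
print(describe(45, "proportional"))  # diagonally to the right
```

Storing the chosen style as a user preference would let the same routing data serve every technique without re-authoring instructions.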

A suggestion for further investigation on how to provide this choice to users through a digital navigation app can be found in Section 4.3.1.1.

NB. This functionality, i.e. enabling users to choose their preferred technique for diagonal directions, is not currently demonstrated in the Wayfindr Demo iOS app v0.4.

3.1.8 Provide auditory cues

As seen in Section 2.4 “Landmarks and clues”, auditory information plays an important role in the everyday life of a vision impaired person, whether this auditory information is environmental, e.g. a car passing by, or pre-determined, e.g. any sound design component such as a keypad sound or alert.

Any pre-determined auditory cues, i.e. sound design components, designed specifically for wayfinding need to reflect a very high level of care and attention in their design, with the aim of guiding and assisting the user in a safe and reassuring manner.

The predetermined auditory information for digital wayfinding, i.e. the sound design components, comes in two forms:

  • Notification alerts, which are sound alerts that precede the audio instructions
  • Synthesised voice commands, which are computer-generated voice commands often associated with the accessible use of a modern smartphone or computer, e.g. VoiceOver mode on iOS, Windows Narrator or TalkBack mode on Android. These are vital tools for vision impaired people.
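The ordering of these two components (notification alert first, then synthesised speech) can be sketched as a simple queueing rule. The queue, the sound asset name and the function below are placeholders for whatever audio and screen-reader APIs the platform provides; they are not part of any real app.

```python
# Sketch of the ordering constraint: a notification alert always precedes
# the synthesised voice instruction. "notification_chime" is a hypothetical
# sound asset; a real app would hand both items to the platform's audio
# and screen-reader facilities (e.g. VoiceOver on iOS, TalkBack on Android).
def enqueue_instruction(instruction: str, audio_queue: list) -> None:
    audio_queue.append(("alert", "notification_chime"))
    audio_queue.append(("speech", instruction))

queue = []
enqueue_instruction("Turn left and take the escalator up", queue)
print(queue[0])  # ('alert', 'notification_chime')
print(queue[1])  # ('speech', 'Turn left and take the escalator up')
```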

Specific guidelines around sound design along with the proposed Wayfindr sounds can be found in Section 4.3.2.

3.1.9 Divide the route into clear segments

Research has shown that a very common wayfinding technique for vision impaired people is to divide the route into a number of segments that they can memorise and follow based on identified landmarks [4]. This technique is called piloting or, in the psychology of learning, chaining.

During Wayfindr trials vision impaired people reported that they would like to know where they were heading as this helps them create an image of the space in which they are moving. Their preference is to move around in relation to particular areas or landmarks instead of just simply following instructions. Additionally, some people enjoy having an extra layer of information to “colour” their journey to give themselves a deeper understanding of their current environment. Therefore, it is good to divide the route into a number of segments.
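A route divided into landmark-bounded segments can be represented with a simple data structure, as in the hypothetical sketch below. The station landmarks, field names and announcement wording are illustrative assumptions only.

```python
from dataclasses import dataclass

@dataclass
class Segment:
    start: str           # landmark where the segment begins
    end: str             # landmark where the segment ends
    instructions: tuple  # ordered audio instructions within the segment

# Hypothetical station route divided into memorable, landmark-bounded segments.
route = (
    Segment("main entrance", "ticket barriers",
            ("Walk forward", "The ticket barriers are ahead")),
    Segment("ticket barriers", "platform 2",
            ("Turn right after the barriers", "Take the stairs down")),
)

def announce(segment: Segment) -> str:
    """State where the segment is heading before giving step-by-step
    guidance, helping the user build a mental image of the space."""
    return f"Heading towards {segment.end}"

print(announce(route[0]))  # Heading towards ticket barriers
```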

An example with the route segments in a Mainline Rail and Metro Station can be found in Section 4.2.1.

3.1.10 References

  1. Chen, H. E., Lin, Y. Y., Chen, C. H., & Wang, I. (2015). BlindNavi: a navigation app for the visually impaired smartphone user. In Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (pp. 19-24). ACM. (last accessed: 24 February 2016)
  2. Kalia, A. A., Legge, G. E., Roy, R., & Ogale, A. (2010). Assessment of indoor route-finding technology for people with visual impairment. Journal of visual impairment & blindness, 104(3), 135. (last accessed: 24 February 2016)
  3. Nicolau, H., Guerreiro, T., & Jorge, J. (2009). Designing guides for blind people. Departamento de Engenharia Informatica, Instituto Superior Tecnico, Lisboa. (last accessed: 24 February 2016)
  4. Swobodzinski, M., & Raubal, M. (2009). An indoor routing algorithm for the blind: development and comparison to a routing algorithm for the sighted. International Journal of Geographical Information Science, 23(10), 1315-1343. (last accessed: 24 February 2016)