2.1 Why an Open Standard?

There are an estimated 285 million people worldwide living with sight loss.

This can often lead to isolation, poverty and depression. Of the estimated 2 million vision impaired people in the UK, almost half say they would like to leave their home more often. At the moment many vision impaired people are unable to travel independently, instead relying on other people to help them get around or just not venturing out at all.

What if vision impaired people were empowered to navigate independently using the smartphone they already have in their pockets?

Emerging indoor navigation technologies such as Bluetooth Low Energy (BLE) Beacons hold the key to opening up the world for vision impaired people. However, in order to achieve the greatest impact globally, there is a pressing need to develop a consistent standard to be implemented across wayfinding systems. This will truly open up a world where vision impaired people are no longer held back by their sight loss, removing barriers to employment, to seeing friends and family and engaging in their community.

Accessibility needs to be ‘baked’ into the roll-out of all indoor navigation services. Venues need reassurance that their investment in the installation of indoor navigation services will improve the customer experience, for all their customers. The Wayfindr Open Standard aims to do just that.

When individuals and organisations get behind a purposeful vision, solutions to what previously seemed like big challenges become attainable.

The aim is that this Open Standard will help lower the barrier for built-environment owners and digital navigation services to make their environments, products and services inclusive from the outset as we continue to weave technology into our cities.

Once the Open Standard is adopted across the built-environment and digital navigation services alike, vision impaired people will benefit from a consistent, reliable and seamless navigation experience.

2.2 How are the guidelines informed?

Six main resources have informed the content of the Open Standard to date:

  • The work of the Working Group that consisted of the following members:
    • Martine Abel-Williamson, Objective Leader – Access to the Environment and Transport, World Blind Union
    • Nicole Holmes, Technology Accessibility Officer, Guide Dogs NSW/ACT
    • Manuel Ortega, Head of R&D, Ilunion Technology and Accessibility
    • Kelly Prentice, Mobility Specialist, Guide Dogs NSW/ACT
    • John Welsman, Policy Lead on Transport and Travel, Guide Dogs UK
  • The qualitative user research conducted by Wayfindr in the trials in London and Sydney. In total, 75 user sessions have taken place in these trials.
  • Academic research that supports the findings in Wayfindr trials. Wherever used, academic research resources can be found under “References”.
  • Factual information about vision impairment and the navigation techniques of vision impaired people. The resources for this information can also be found under “References”.
  • Input and feedback from industry experts: Professor Peter Barker OBE, Alan Brooks – Mobility Specialist, Henry Daw – Sound Designer and Audio Branding Consultant & Owner of Oblique Sound, Ann Frye – Independent Accessibility Consultant, Google Beacon Platform Team, Sue Sharp – RLSB, Alastair Somerville – Sensory Design Consultant at Acuity Design, Marten van Doorn – O&M expert, Hylke Bron – Co-founder of Movin
  • Input and feedback from Wayfindr Community members such as BlindSquare, BlueCats, Estimote, Guide Dogs NSW/ACT, Kontakt.io, Nominet Innovation

2.3 How to use the Open Standard

Each section of the Wayfindr Open Standard might be relevant to different audiences. Some of the audiences that will benefit from the Open Standard are:

  • Venue owners and their access consultants that want to make their estate accessible
  • Developers and designers of navigation products and services that offer wayfinding for vision impaired people
  • Researchers who are conducting research or experiments in the area of wayfinding for vision impaired people

Anyone who is involved in projects about wayfinding for vision impaired people should become familiar with Section 2.5 “Learning about mobility of vision impaired people”, which provides an introduction to the world of vision impaired people and their navigation and mobility techniques.

If you are a venue owner or an accessibility stakeholder, the following sections will be the most relevant:

  • Section 1.10 “Specific features, landmarks and objects” with guidelines about different types of built-environments, such as a rail station, and the environmental elements that are likely to be found in them.
  • Section 2.11 “Wayfinding technologies” that includes information and recommendations about the installation, configuration and maintenance of specific technologies that can be used for wayfinding, such as Bluetooth Low Energy beacons.

If you are involved in digital navigation services as a researcher or designer, the following sections include the most relevant content:

  • Section 6 “Designing for vision impaired people” provides an overview of the design principles that underpin the Wayfindr Open Standard as well as a taxonomy of the elements that formulate an audio instruction.
  • Section 1.10 “Specific features, landmarks and objects” with guidelines about the information that needs to be provided when vision impaired people interact with or confront various elements in their environments, such as entrances, escalators, lifts etc.
  • Sections 1.12 and 2.12 with guidelines about the functionality that should be provided through a digital navigation service in order to provide a good user experience for vision impaired people. Section 2.12 also features guidelines around sound design, as sound is an important aspect in navigation for vision impaired people.

If you are involved in digital navigation services as a developer, the following sections will be relevant:

  • Section 6 “Designing for vision impaired people”, which provides a taxonomy of different types of instructions that are used in a wayfinding system.
  • Section 1.12 with guidelines about the functionality that should be provided through a digital navigation service in order to provide a good user experience for vision impaired people.
  • Section 12.2 “Open Source Wayfindr Demo mobile app” with information about the latest version of the Wayfindr Demo mobile app and a link to a GitHub repository with access to the source code. This app is open source and intended for testing and demonstration purposes.

2.4 What is a Candidate Recommendation?

The Candidate Recommendation is a document of greater maturity than the Working Draft.

The release of a Candidate Recommendation triggers a period of public review, during which extensive feedback is sought from the Wayfindr Community and the public.

If you have any feedback on the Candidate Recommendation or the Open Standard, you can send it to hello@wayfindr.net.

2.5 Learning about mobility of vision impaired people

5.0.0.        Purpose of this section

This section provides an introduction to vision impairment and the mobility aids and techniques used by vision impaired people to navigate their way around in the course of their everyday lives. The aim is to inform anyone who is involved in installing and operating wayfinding systems in built-environments as well as creating navigation services and products for vision impaired people.

A person with a vision impairment moves through an environment processing what is happening in their immediate personal space to negotiate objects and changes in surface level, whilst keeping orientation to their final destination. Each blind or vision impaired person will use the strategies that work best for them and the information that is pertinent to them to achieve this. This information may be gathered from residual vision (very few vision impaired people in the world have no light perception), tactile information (e.g. underfoot surface texture, or following the building line), auditory information (e.g. auditory landmarks and cues) and other sensory information (e.g. smells from a coffee shop or bakery, or vibrations from an aid such as a cane).

Wayfinding technologies such as beacons can add another layer of information to use when moving through an environment. They may provide awareness of items in the person’s immediate space that they will need to negotiate. They can also provide confirmation that the person is on the way to their final destination, or has arrived.

5.0.1.        Some facts about vision impairment

According to the World Health Organisation (WHO) [9, 10], there are an estimated 285 million vision impaired people worldwide. The vast majority live in less developed countries, in low income settings and are aged over 50. The most common causes of vision impairment globally in 2010 were:

  1. refractive errors – error in how the eye focuses light – 42%
  2. cataracts – a clouding of the lens in the eye – 33%
  3. glaucoma – a group of eye diseases which result in damage to the optic nerve and vision loss – 2%

The World Health Organisation (WHO) [9, 10] classifies the levels of vision impairment as follows:

  • Normal (full) vision – no visual impairment
  • Moderate vision impairment
  • Severe vision impairment
  • Blindness

About 15% of people who are registered as having vision loss cannot see anything at all [10]. The remaining 85% may have residual vision or other types of low vision and may have difficulties with colour, light, form or movement perception.

5.0.2.        Primary mobility aids

The most commonly used primary mobility aids in many countries are a (long) white cane and a guide dog (“seeing eye dog”). Vision impaired people might be using one or both of them depending on the environments they are travelling through. Generally, guide dog users have long cane skills, as good orientation and mobility skills are required to ensure that the guide dog is guiding the person who is vision impaired in the correct manner and along the routes that the user specifies.

In those countries and cultures that use them, the primary role of the guide dog is to walk in a straight line avoiding obstacles in the person’s path of travel. It is the job of the guide dog user, however, to give the dog commands and direct the dog to find certain features of the environment and get to a destination. Guide dog users often ask their guide dog to find a specific landmark such as a door, steps or escalator, a task that is called “targeting”. Guide dogs are able to memorise a route; however, the user always needs to be aware of the environment they are travelling through and cannot rely totally on the guide dog to get to a destination. The guide dog needs support and encouragement from the user to ensure that the person is safe and gets to a destination.

There are three types of cane that a person who is vision impaired may use:

  • Long cane: Used by people with little or no vision.
    • A long cane allows the user to detect obstacles and hazards, drop-offs/kerbs, ground level changes and stairs in the path of travel.
    • A long cane provides information from the environment that assists orientation. For example, the cane user can detect changes in surface textures between grass and concrete to follow a footpath.
    • Many cane users experience an increase in confidence because they hesitate less about the safety of the next step.
    • A long cane improves the user’s posture, because they don’t need to feel the ground with their feet while travelling or walk with the head down to check the surface directly at their feet.
  • Identification/symbol cane: This cane is a tool that allows the general public to identify a person as having a vision impairment. This cane is not designed to make contact with the ground surface, but may be used to check the height of a step or drop-off/kerb.
  • Support cane: The white support cane allows a person who requires balance and stability support to be identifiable as having a vision impairment.

A navigation service should not replace primary mobility aids; rather, it should be treated as an orientation tool, used along with other skills, to augment the user experience and reassure its users.

5.0.3.        Orientation and Mobility (O&M) training

In many countries, some vision impaired people receive Orientation & Mobility (O&M) training.  This is usually one-to-one training with a Mobility Specialist and the vision impaired person learns techniques and skills that will help them to travel safely and independently in their environment.

In the context of Orientation & Mobility training, orientation refers to the knowledge of the individual of where they are located in an environment and how they will get to a destination confidently and safely [6].

Some of the skills that may be covered in this type of training include using residual vision, sensory awareness, understanding how objects relate to each other in one’s environment, how to search for places and objects, personal safety and cane usage.

The techniques for long cane training are outlined below:

  • The cane is used with the index finger held alongside the cane so that the cane acts as an extension of the finger to provide a tactile experience of the ground one pace ahead of the user. Each user will be measured and fitted with a cane to match their height, length of stride and walking speed.
  • Touching or Constant Contact: The cane user moves the cane in an arc from left to right, slightly wider than their shoulder width, while keeping constant contact with the ground surface. This allows them to preview the environment prior to moving forward in order to detect features such as barriers, upward or downward slopes and obstacles.
  • Two-point touch: This is where the cane user taps the cane to the left and right instead of keeping constant contact with the ground. This allows the person to get audio feedback about the environment from the cane taps.
  • Shorelining is following a wall, kerb, hedge or other contrasting surface to the one a person is walking on in order to maintain a specific orientation while travelling through environments to arrive at a decision-making point. Shorelining allows a person to arc the cane either by constant contact or two-point touch technique to cross an open space or easily travel towards a known point in a crowded environment. Shorelines are an important element of how a vision impaired person navigates. There are two shorelines:
    • Inner shoreline, in the junction of a ground surface and a wall
    • Outer shoreline, in the junction of pavement and kerbline
  • Trailing is different to shorelining in that the technique allows a person to trail a contrasting surface such as a wall with their hand. The degree to which people use shorelining or trailing depends on the individual, their orientation and mobility skills and the environment they are in at any one time. Generally shorelining is the preferred technique in public places.

Importantly, those who are experienced travellers may not use shorelining or trailing. Each traveller differs in their approach to different techniques, so designers should not refer to shorelines or trailing as the only techniques to use in their environments. Rather they should design a system that refers to different features of the environment.

5.0.4.        Landmarks and clues

Primary landmarks are defined as objects that are always found in the environment and are difficult to miss at a particular location, such as kerbs or a change in walking surface.

Clues are more transient and may include sounds, smells, changes of temperature etc. For example, the noise of machinery (which may only be present during the working day), the smells of a baker’s shop or dry cleaners, or the change of temperature as you enter the open door of a building (provided the air conditioning or heating is working).

Landmarks and clues play an important role in wayfinding and navigation, as they reassure the individual that they are walking in the right direction as well as helping to place themselves in the space [1].

Sighted people can easily identify features in the environment that they use for this purpose. Vision impaired people also make use of landmarks and clues as they move, although these may be different to those used by sighted people. Orientation & Mobility experts define clues and landmarks as “any familiar object, sound, smell, temperature, tactile or visual clue that is easily recognised, is constant and has a discrete permanent location in the environment that is known to the traveller.” A clue can include sounds, smells, temperature, tactile clues etc., whereas a landmark is a specific and permanent feature which is familiar to the user.

Vision impaired people tend to use different clues and landmarks than sighted people, sometimes more detailed ones [2, 7]. Most seem to be those closest to the individual [6], in other words in the areas that can be touched by the long cane or felt through the soles of the feet.

Non-visual clues such as wind direction, the smell of a bakery or soap shop, or heat from the sun are more inconsistent and so are considered less reliable. In order to increase the reliability of the clues used, vision impaired people often combine the use of different senses. For example, they may use a tactile landmark followed by an auditory clue in order to confirm that they are approaching a landmark [2].

None of the above should be seen as supporting the myth that blindness sharpens other senses but rather, that it makes vision impaired people pay more attention to their available senses in order to cognitively process the information from the environment.

5.0.5.        The safest, not the fastest or shortest route

For most digital navigation services, the main criteria for calculating the shortest route are distance and time taken.  Some navigation services have additional criteria such as quieter routes, less stressful routes and those with fewer changes.

During Wayfindr trials, and in the work of other researchers [3, 5, 8], vision impaired people reported that they are willing to walk further provided that the longer route is considered safer and easier to manage. For example, instead of crossing a hotel lobby where there are a lot of people waiting with luggage, they might prefer to walk all the way round the lobby rather than face obstacles and the potential danger of collision and/or loss of orientation. This makes their journey less stressful and enhances confidence. Thus the shortest or quickest route may not be appropriate for some vision impaired people.

Routes should be planned using a combination of what is considered a safer route and less challenging for the individual and with the advice of Orientation and Mobility specialists. A number of different routes should be considered as some travellers will prefer certain routes depending on their travel skills, the situation and where they need to get to at a given time. Guide dog users however will utilise the dog’s ability to avoid obstacles, pedestrians and other hazards along with its ability to target the interim or final destination, which may result in them taking the shortest, most direct route.

Some partially sighted people, for example confident independent travellers, may still prefer the shortest or quickest route to their destination. Therefore, it should not be assumed that a route which is difficult for one vision impaired person will be equally difficult for another.

Note: This functionality, offering alternative and personalised routes, is not currently demonstrated in the Wayfindr Demo iOS app v0.4.

5.0.6.        User Preference for discreet technology

Holding a device

During Wayfindr trials, vision impaired people reported that they would feel vulnerable holding a smartphone in their hand when navigating public spaces, e.g. because of the risk of having it stolen. They also pointed out that with one hand usually holding their primary mobility aid, they need the other hand to be free to hold handrails or their ticket. For these reasons most people would prefer to keep the smartphone in their pocket.

Even with the device in a pocket, it should be able to communicate with Bluetooth beacons and trigger instructions at the appropriate time. Depending on the orientation of the Bluetooth antenna and on signal reflections, instructions might be triggered earlier than intended, but this should not cause a problem for users if the beacons have been installed and calibrated following the best practices in Section 2.11.01 “Bluetooth Low Energy beacons”.

Listening through headphones

In busy and noisy environments, hearing audio instructions from a smartphone speaker, particularly when in a pocket, will be a challenge. During Wayfindr trials and in the work of other researchers [4], vision impaired users reported that they do not like using headphones whilst moving around as they block out auditory clues and warnings in the environment. Bone conducting headphones might offer a good solution.

Additionally, vision impaired people have reported that ideally they would like to use wireless headphones to avoid wires becoming tangled. Bluetooth headphones might be a solution. However, due to the limitations of Bluetooth technology, there may be a time delay in the delivery of the audio instruction to the Bluetooth headphones that might be a problem for the user.

5.0.7.        References

  1. Allen, G. L. (1999). Spatial abilities, cognitive maps, and wayfinding. Wayfinding behavior: Cognitive mapping and other spatial processes, 46-80.
  2. Fryer, L., Freeman, J., & Pring, L. (2013). What verbal orientation information do blind and partially sighted people need to find their way around? A study of everyday navigation strategies in people with impaired vision. British Journal of Visual Impairment, 31(2), 123-138.
  3. Gaunet, F., & Briffault, X. (2005). Exploring the functional specifications of a localized wayfinding verbal aid for blind pedestrians: Simple and structured urban areas. Human-Computer Interaction, 20(3), 267-314. http://lpc.univ-amu.fr/dir_provence/dir/gaunet/articles/Exploring%20the%20functional%20specifications%20of%20a%20localized%20wayfinding%20verbal%20aid.pdf (last accessed: 24 February 2016)
  4. Golledge, R., Klatzky, R., Loomis, J., & Marston, J. (2004). Stated preferences for components of a personal guidance system for nonvisual navigation. Journal of Visual Impairment & Blindness (JVIB), 98(03). http://www.geog.ucsb.edu/~marstonj/PAPERS/2004_JVIB_GMLK.pdf (last accessed: 24 February 2016)
  5. Helal, A. S., Moore, S. E., & Ramachandran, B. (2001). Drishti: An integrated navigation system for visually impaired and disabled. In Wearable Computers, 2001. Proceedings. Fifth International Symposium on (pp. 149-156). IEEE. http://www.cs.umd.edu/class/fall2006/cmsc828s/PAPERS.dir/wearableConf-1.pdf (last accessed: 24 February 2016)
  6. Long, R. G. & Giudice, N. A. (2010). Establishing and maintaining orientation for orientation and mobility. In B. B. Blasch, W. R. Wiener & R. W. Welch (Eds.), Foundations of orientation and mobility (3rd ed. Vol.1: History and Theory, pp. 45-62). New York: American Foundation for the Blind. http://www.vemilab.org/sites/default/files/Long%20%26%20Giudice(2010)-%20Orientation%20and%20mobility%20(foundations%20of%20O%26M).pdf (last accessed: 24 February 2016)
  7. Passini, R., & Proulx, G. (1988). Wayfinding without vision an experiment with congenitally totally blind people. Environment and Behavior, 20(2), 227-252. http://www.stolaf.edu/people/huff/classes/Psych130F2010/Readings/Passini’88.pdf (last accessed: 24 February 2016)
  8. Swobodzinski, M & Raubal, M. (2009). An indoor routing algorithm for the blind: development and comparison to a routing algorithm for the sighted. International Journal of Geographical Information Science, 23(10), 1315-1343. http://www.raubal.ethz.ch/Publications/RefJournals/Swobodzinski&Raubal_IndoorBlindWayfinding_IJGIS09.pdf (last accessed: 24 February 2016)
  9. World Health Organisation, Fact Sheet No 213, Blindness: Vision 2020 – The Global Initiative for the Elimination of Avoidable Blindness. http://www.who.int/mediacentre/factsheets/fs213/en/ (last accessed: 24 February 2016)
  10. World Health Organisation, Fact Sheet No 282, Visual impairment and blindness. http://www.who.int/mediacentre/factsheets/fs282/en/ (last accessed: 24 February 2016)

2.6 Designing for vision impaired people

6.0.0.        Purpose of this section

This section outlines the design principles that underpin the Wayfindr Open Standard beyond those included in Section 1, the different types of audio instructions and the various structural elements of an audio instruction.

6.0.1.        Additional Design principles

6.0.1.0.   Overview

This section outlines the design principles that inform the guidelines as seen in Section 1. These principles provide guidance about:

  • The design and development process of a wayfinding and digital navigation system for vision impaired people
  • The high level thinking for creating audio instructions
  • The usage of sound in a wayfinding system for vision impaired people

6.0.1.1.   Involve users in the process

In order to ensure that a wayfinding system is useful and usable, validating the system with users within the environment in which it will be deployed is key. Involve users in the design and development process so that the system is validated in its real setting.

It is recommended that a wide range of people with vision impairments are invited to participate in the validation of the system. The recruitment factors that need to be considered are the following:

  • Level of vision impairment: from partially sighted to blind
  • Wide age group: 18-55 years old
  • Gender: even split
  • Mobility aids: no aid, symbol cane, long cane and guide dog.
  • Confidence in independent travel: from people who travel independently worldwide to people who only do familiar local routes without assistance
  • Tech literacy: smartphone users from basic to expert (self-assessed)
  • Venue familiarity: from people not familiar with the venue at all to everyday users of the venue

Feedback and validation from vision impaired people should be sought:

  • To understand user needs for each different type of built-environment: For example, understand the main points of interest, the landmarks used along the route, the safest route options etc. Research methods that could be used in this phase are interviews, observations, surveys and walkthroughs.
  • To validate the system while it is being developed: This might include incremental testing of the following:
    • The installation and the configuration of wayfinding technology, such as Bluetooth Low Energy beacons (see Section 2.11.01 for best practices)
    • The information and the terminology used in the audio instructions (for information on how to design the appropriate audio instructions see Sections 1.8-1.9 and the guidelines for each environmental element under Section 1.10)
    • The usability and accessibility of the digital navigation service
  • Research methods that could be used in this phase are observations and think-aloud one-to-one sessions of participants trying out the system.

6.0.1.2.   Focus on the environment not the technology

Vision impaired people pay considerable attention to their remaining senses in order to understand their environment better and to identify environmental landmarks and clues. It is important therefore that they should have as few distractions as possible from the technology that is assisting them.

With fewer distractions, vision impaired people are better able to memorise new routes. When they have memorised a route their reliance on digital navigation technology for that particular route will be reduced and their feeling of confidence and independence will be increased. Confidence in using technology may also lead vision impaired people to use more routes in less familiar or more complex environments.

6.0.1.3.   Provide auditory cues

As seen in the Section 5.0.4 “Landmarks and clues”, auditory information plays an important role in the everyday life of a vision impaired person, whether this auditory information is environmental, e.g. a car passing by, or pre-determined, e.g. any sound design component such as a keypad sound or alert.

Any pre-determined auditory cues, i.e. sound design components, designed specifically for wayfinding, need to reflect a very high level of care and attention in their design, with the aim to guide and assist the user in a safe and reassuring manner.

The predetermined auditory information for digital wayfinding, i.e. the sound design components, comes in two forms:

  • Notification alerts, which are sound alerts that precede the audio instructions
  • Synthesised voice commands, which are computer-generated voice commands often associated with the accessible use of a modern smartphone or computer, e.g. VoiceOver mode on iOS, Windows Narrator or TalkBack mode on Android. These are vital tools for vision impaired people.

Specific guidelines around sound design along with the proposed Wayfindr sounds can be found in Section 2.10.01.

6.0.1.4.   Divide the route into clear segments

Research has shown that a very common wayfinding technique for vision impaired people is to divide the route into a number of segments that they can memorise and follow, based on identified landmarks [3]. This technique is called piloting or, in the psychology of learning, chaining.

During Wayfindr trials vision impaired people reported that they would like to know where they were heading as this helps them create an image of the space in which they are moving.  Their preference is to move around in relation to particular areas or landmarks instead of just simply following instructions. Additionally, some people enjoy having an extra layer of information to “colour” their journey to give themselves a deeper understanding of their current environment. Therefore, it is good to divide the route into a number of segments.

2.7 Effective messaging for audio instructions

2.8 Different types of audio instructions

2.9 Guidelines

9.0.0.        Guidelines for various environmental elements

Suggestions for further investigation

The following paragraphs are not guidelines but suggested areas for future investigation through user research.

S9.0.0.1 – Advising to keep to one side on a two-way route

On a route where people are moving in both directions, advise vision impaired users to keep on the side of the route that is best for the direction in which they are travelling.  For example:

“Turn left and move forward. Keep to the left side of the corridor after turning.”

S9.0.0.2 – Announcing the pathway type

There are different types of pathways, such as corridors, tunnels, ramps etc. It is suggested that the type of pathway the user is following is announced. For example:

“Turn left and move forward into the tunnel.”

S9.0.0.3 – Advising which side of stairs to use

As with Escalators (Section 1.10), on some staircases users will be advised to use one part of the stairs, i.e. either the left or the right. Vision impaired people need to be guided towards that side of the stairs. Suggested example:

“Move forward and take the stairs up to the ticket hall. This is a long flight.
Keep to the left side of the staircase.”

S9.0.0.4 – Announcing open riser stairs

Open riser staircases present particular challenges for vision impaired people. For example, there is a danger that their white cane or their feet will get caught in the open risers. In such situations, it may be helpful to include notice of that detail in the audio instruction. Suggested example:

“Move forward and take the stairs up to the ticket hall.
This is an open riser staircase.”

References

  • Approved Document M: access to and use of buildings, volume 1: dwellings (2015), Department for Communities and Local Government. https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/506503/BR_PDF_AD_M1_2015_with_2016_amendments_V3.pdf (last accessed: 2 April 2016)
  • BS EN 81-70:2003, Safety rules for the construction and installation of lifts. Particular applications for passenger and goods passenger lifts. Accessibility to lifts for persons including persons with disability, British Standards Institute.
  • BS 8300:2009+A1:2010, Design of buildings and their approaches to meet the needs of disabled people. Code of practice. British Standard Institute.
  • 12 Lifts, escalators and moving walks: Facilities for persons with disabilities.

9.0.1.        Platforms

Suggestions for further investigation

The following paragraphs are not guidelines but suggested areas for future investigation through user research.

S9.0.1.1 – Determining position in relation to the platform length

Vision impaired passengers require information to determine exactly where they are on the platform in relation to the platform length. This information is valuable because, again, it provides more clues to help them make sense of the space. Some vision impaired people might also want to board a specific section of the train if this will help them find the exit more easily when they leave the train. For example:

“You are standing at the front part of the platform.”

“You are standing in front of carriage 3.”

“You are standing between carriages 2 and 3.”

Note that these messages might not work for island platforms where the direction of trains might be different or might not be as effective if the trains do not have a consistent number of carriages.

S9.0.1.2 – Determining orientation in relation to the direction of travel

To assist vision impaired passengers it is recommended that there is an announcement about the direction of approach of the train.  For example:

“The train will arrive from your left as you face the platform edge.”

S9.0.1.3 – Announcing close proximity of two platforms

Where two platforms are very close to each other, but are not island platforms, vision impaired passengers require guidance to the correct platform for the train they wish to take. For example:

“You are now at the platform.
[Central] line trains leave from the platform on the left.”

S9.0.1.4 – Warning if platform is part of pedestrian route

In some cases the route to another platform or an exit goes via a platform. As platforms are a potentially risky area for passengers it is recommended that vision impaired users are advised that they are required to walk along a platform so that they may be aware of the platform edge.

S9.0.1.5 – Announcing the nearest way out before leaving the train

On exiting a train vision impaired people will generally want to move away from the platform as quickly and safely as possible by the nearest exit.

Taking into account the need for the safest route, as seen in Section 5.0.5, it is therefore advisable to provide directions that, where possible, avoid routes involving negotiating other platforms.

2.10 Guidelines for mobile app development

10.0.0.        Guidelines for mobile app functionality

Suggestions for further investigation

The following paragraphs are not guidelines but suggested areas for future investigation through user research.

S10.0.0.1 – Enabling selection of directional instructions

As mentioned, there are various strategies for communicating directions to vision impaired users, and different approaches in different countries. Vision impaired users have no single clear preference for the mode of communication, as long as they are able to pick the means they prefer.

It is recommended therefore to enable users to choose between the following (an illustrative sketch of rendering one turn in several of these formats follows the list):

  • Clock face directions, for example “turn at 1 o’clock” (common in the UK)
  • Degrees, for example “turn at 45 degrees to your right”
  • Proportional directions, for example “turn slightly to your left”
  • Cardinal coordinates, for example “turn facing east” (common in the USA)
  • Using the smartphone’s internal compass, which will allow for more accurate angular directions.
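
As an illustration only, the following Swift sketch shows how a single relative bearing might be rendered in some of these formats. The names and thresholds are hypothetical, not taken from the Wayfindr Demo app, and cardinal directions are omitted because they additionally require the compass heading.

```swift
/// Hypothetical sketch: render one relative bearing (degrees, clockwise,
/// 0 = straight ahead) in the user's preferred direction format.
enum DirectionFormat {
    case clockFace, degrees, proportional
}

func describeTurn(bearing: Double, format: DirectionFormat) -> String {
    // Normalise to -180...180, where negative means "to the left".
    let angle = (bearing.truncatingRemainder(dividingBy: 360) + 540)
        .truncatingRemainder(dividingBy: 360) - 180
    let side = angle < 0 ? "left" : "right"
    switch format {
    case .clockFace:
        // Map the angle onto the nearest clock hour (12 = straight ahead).
        var hour = Int((angle / 30).rounded())
        if hour <= 0 { hour += 12 }
        return "turn at \(hour) o'clock"
    case .degrees:
        return "turn at \(Int(abs(angle).rounded())) degrees to your \(side)"
    case .proportional:
        switch abs(angle) {          // illustrative thresholds
        case 0..<15:   return "continue straight ahead"
        case 15..<60:  return "turn slightly to your \(side)"
        case 60..<120: return "turn \(side)"
        default:       return "turn sharply to your \(side)"
        }
    }
}
```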

S10.0.0.2 – Enabling dictation for selection of options

It is recommended to provide vision impaired users with the means to dictate their selection from a list of options, as this reduces the effort of making the selection on their device. It is also useful for the user to be able to ask for the previous instruction to be repeated.

S10.0.0.3 – Providing guidance to the nearest help point

Where a passenger feels that they may have got lost in a controlled environment such as a station, it is important to provide guidance on how to seek help in an emergency. It is possible to integrate this function with the venue owner’s operations. Alternatively, guide vision impaired people to the nearest of the help points that can generally be found at various locations in a venue.

S10.0.0.4 – Suggesting the safest, not the shortest or fastest route

As mentioned in Section 2.5.0.5, many vision impaired people prioritise route safety over distance and duration. Investigating accessibility levels along each route to the destination enables this metric to be calculated. However, what is considered a “safe” route is subjective to each individual, and more investigation is needed in order to better define the aspects that make a route “safe”.

S10.0.0.5 – Enabling users to choose voice type

This suggestion was removed because the operating system of the smartphone determines which voices are available to the user.

S10.0.0.6 – Displaying the instructions as text

Some people might use braille displays to read the instructions, while others might have sufficient residual vision to read text on screen. Therefore, apart from communicating navigation instructions with audio, it might also be helpful to display them as text on the screen.
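
A minimal sketch of this on iOS, assuming instructions are already available as strings (the class name is hypothetical): the spoken instruction is mirrored in a label, and an accessibility announcement is posted so that VoiceOver, and with it a connected braille display, presents the same text.

```swift
import UIKit

/// Hypothetical label that mirrors each spoken navigation instruction
/// as on-screen text.
final class InstructionLabel: UILabel {
    func show(_ instruction: String) {
        numberOfLines = 0
        text = instruction
        // Notify assistive technologies so VoiceOver (and an attached
        // braille display) presents the new instruction as well.
        UIAccessibility.post(notification: .announcement, argument: instruction)
    }
}
```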

S10.0.0.7 – Enabling users to choose their mobility aid

Enabling users to choose a route based on their mobility aid (white cane, guide dog, no aid) offers a more personalised user experience.

A cane user may prefer a route with good tactile clues and landmarks. A guide dog user may prefer an alternative route that suits the dog’s ability with straight line working or “targeting” doors, steps, turns etc.

Based on their primary mobility aid, different instructions can be provided.

For example:

  • For white cane users and no aid users: there may be no need to mention wide gates.
  • For no aid users: since these users are more likely to have functional or residual vision, they will be able to see the layout of the space. To make the system more responsive to their needs the instruction should be to “take the first left” or “take the second right.” You can also use proportional directions instead of clock faces or degrees. For example, “go through the gates on your left” instead of “turn 45 degrees to your left and go through the gates.”

S10.0.0.8 – Enabling saving of frequently used places

To reduce the effort required to input destinations it is recommended that users are enabled to save their most frequent destinations.

S10.0.0.9 – Enabling saving of personal landmarks

As described in Section 2.5.0.4 “Landmarks and clues”, vision impaired people use various landmarks along a route.  These landmarks might be different for every individual and might be used repeatedly for various purposes. For example, a vision impaired person might use the landmarks as a reassurance that they are on the right track, as a meeting point etc.

Thus, allowing users to save personal landmarks, described in language that is meaningful to them, is likely to provide a more personalised user experience.

10.0.1.        Guidelines for sound design

10.0.1.0.   Overview

As seen in Section 2.6 “Provide auditory cues”, sound plays an important role in the wayfinding experience of vision impaired people, whether it is environmental, e.g. a car passing by, or pre-determined, e.g. any sound design component such as a keypad sound or alert. The predetermined auditory information for digital wayfinding, i.e. the sound design elements, comes in two forms:

  • Notification alerts, which are sound alerts that precede the audio instructions
  • Synthesised voice commands, which are computer-generated voice commands often associated with the accessible use of a modern smartphone or computer, e.g. VoiceOver mode on iOS or TalkBack mode on Android. These are vital tools for vision impaired people.

This section mainly includes guidelines about the sound design of notification alerts, as the synthesised voice commands are native to the mobile operating system.

10.0.1.1.   Using sound to attract people’s attention

Using sound to attract people’s attention ahead of an audio instruction is helpful for the following reasons:

  • It allows users to keep their attention on the environment and to focus on the technology only when they are required to listen to an instruction. If there is no clear differentiation between the audio instruction and environmental noises, users will be constantly focusing on the technology for fear of missing an audio instruction.
  • Similarly, without the use of sound to attract the user’s attention, environmental noises may override the audio instruction, resulting in users missing it.
  • In addition, requesting the user’s attention only when it is needed enables them to engage in conversation with companions. The sound will signal to them that an instruction is coming up, giving them the time to shift the focus of their attention.

10.0.1.2.   Using different notification alerts for different purposes

Consider using two core notification alerts that serve different purposes (a sketch of how an alert might be sequenced with the spoken instruction follows the list):

  • General Notification Alert is a short 2-note alert heard immediately prior to all audio instructions, excluding the ‘Journey Completed’ audio instruction (see below). This notification alert prepares vision impaired people for any upcoming voice instruction, as seen in Section 2.6 “Provide auditory cues”. The proposed Wayfindr sound is integrated in the Wayfindr Demo iOS App v0.4.
  • Journey Completed Notification Alert is a short 3-note alert heard immediately prior to any voice command that confirms the scheduled journey has been completed. The proposed Wayfindr sound is also integrated in the Wayfindr Demo iOS App v0.4.
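
As an illustration, the sketch below plays an alert and then speaks the instruction once the alert has finished, using AVFoundation. The class name and the bundled sound file names (general_alert.caf, journey_completed.caf) are hypothetical placeholders, not the actual Wayfindr assets.

```swift
import AVFoundation

final class InstructionAnnouncer: NSObject, AVAudioPlayerDelegate {
    private var player: AVAudioPlayer?
    private let synthesiser = AVSpeechSynthesizer()
    private var pendingInstruction: String?

    /// Play the relevant notification alert, then speak the instruction
    /// once the alert has finished playing.
    func announce(_ instruction: String, journeyCompleted: Bool = false) {
        let sound = journeyCompleted ? "journey_completed" : "general_alert"
        guard let url = Bundle.main.url(forResource: sound, withExtension: "caf") else { return }
        pendingInstruction = instruction
        player = try? AVAudioPlayer(contentsOf: url)
        player?.delegate = self
        player?.play()
    }

    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
        guard let instruction = pendingInstruction else { return }
        pendingInstruction = nil
        synthesiser.speak(AVSpeechUtterance(string: instruction))
    }
}
```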

The notification alerts should alert the user without causing undue alarm or stress. They should be minimally designed, functional sounds that are simple and concise, providing reassurance whilst preparing the user for the imminent audio instruction. The proposed Wayfindr sounds follow these principles and their use in digital navigation services is encouraged.

The two Wayfindr notification sound alerts above act as a coherent pair, yet they are varied enough to suit their individual functions. The two identically pitched notes used for the general notification alert are short and generic, acting as a quick-fire prompt for the upcoming voice instruction, whereas the three notes used for the journey completed alert have a rising pitch, indicating resolution: the successful completion of the intended journey.

Paired functional sounds (or UI sounds) are common practice, especially within the context of smartphone use – renowned examples include the “Listen” and “Confirmation” sounds used for the personal assistant “Siri” in Apple iOS, or the similarly-paired “Google Now” sounds.  Other examples of paired UI sounds typically include an on and off, lock and unlock, and connect and disconnect pairing.  The paired sounds will typically be similar in tonality or instrumentation (sharing design principles that contribute to a consistent UX), but varied enough in their form to communicate a specific function.

10.0.1.3.   Distinguishing notification alerts from other sounds

The notification alerts should be audible within a noisy environment and be distinguishable from existing phone operating system alerts, ensuring the wayfinding experience is unique, consistent and recognisable.

Suggestions for further investigation

The following paragraphs are not guidelines but suggested areas for future investigation through user research.

S10.0.1.1 – Using new sound alerts for more specific purposes

Additional notification alerts can be considered for future updates of the Open Standard – such as specific warning alerts, categorised alerts for different objects or any other contextually-aware notifications.

2.11 Wayfinding technologies

11.0.0.        Purpose of the section

There are various technologies that can be used for wayfinding purposes. The most common are GPS, Wi-Fi, RFID and Bluetooth.

In this section, built-environment owners who want to provide wayfinding solutions to make their environments accessible, and developers who might be involved in wayfinding projects, will find recommendations and best practices covering the whole lifecycle of a technology, from installation to long-term maintenance.

The Wayfindr research trials so far have been conducted using Bluetooth Low Energy beacons. Thus, this version of the Open Standard includes recommendations about this type of technology.

11.0.1.        Bluetooth Low Energy beacons

11.0.1.0.   Purpose of this section

This section contains considerations and recommendations that are related to the whole lifecycle of using Bluetooth Low Energy (BLE) beacons as a technology solution for digital wayfinding.

Venue owners, their accessibility stakeholders and developers who are involved with beacon deployment can find information on:

  • Installation of BLE beacons in a built environment
  • Configuration of the BLE beacons in order to make the most out of the installation
  • Maintenance and Operational considerations that should be taken into account in order to manage a fleet of installed BLE beacons.

11.0.1.1.   What is a Bluetooth Low Energy beacon?

Bluetooth is a global wireless communication standard for exchanging data over short distances – it was introduced in 1994 [3].

Bluetooth Low Energy (BLE) is a technology applied in the Bluetooth 4.0 protocol and above. It was specifically designed with low power consumption in mind, to support applications and services that require nearby connected devices (smartphones, tablets and laptops, for example). Most recent mobile devices support BLE.

A BLE beacon is an electronic device that repeatedly transmits a radio signal at periodic intervals. The signal’s broadcast frequency and transmit power can be manually adjusted (see Section 2.11.0.1). This radio signal carries data that allow each specific beacon to be identified by compatible devices once they are in range.

When a device receives the data from a BLE beacon, it can estimate how far away the beacon is located. An application running on the device can then detect the presence of BLE beacons and take actions depending on the estimated distance and the data received. For example, in audio-based wayfinding for vision impaired people, the relevant audio instruction is triggered. All the logic for triggering the correct audio instructions is in the app. See for example the Wayfindr Demo iOS app v0.4.
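
As a minimal illustration, on iOS (13 and later) an app can range beacons with Core Location and react when a known beacon is estimated to be near. The UUID below is illustrative, and the instruction lookup is a hypothetical hook, not the Wayfindr Demo app’s actual logic; the app would also need the usual location-permission entries in its Info.plist.

```swift
import CoreLocation

final class BeaconListener: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    // Illustrative UUID; a real deployment would use the venue's own.
    private let constraint = CLBeaconIdentityConstraint(
        uuid: UUID(uuidString: "F3AF8C82-A58A-45A1-B9B6-21E3CF47DED9")!)

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startRangingBeacons(satisfying: constraint)
    }

    func locationManager(_ manager: CLLocationManager,
                         didRange beacons: [CLBeacon],
                         satisfying constraint: CLBeaconIdentityConstraint) {
        // Core Location reports an estimated proximity per ranged beacon.
        for beacon in beacons where beacon.proximity == .near || beacon.proximity == .immediate {
            triggerInstruction(major: beacon.major.intValue, minor: beacon.minor.intValue)
        }
    }

    private func triggerInstruction(major: Int, minor: Int) {
        // Hypothetical hook: look up and play the audio instruction
        // associated with this beacon's Major/Minor pair.
    }
}
```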

11.0.1.2.   Sources of Bluetooth signal distortion

The BLE signal is transmitted in the 2.4 GHz radio frequency. This means that the BLE signal may be distorted by interference from specific elements in the environment [2], such as:

  • Metallic surfaces bouncing the signal off the surface in unexpected ways as it is unable to penetrate the material
  • Water absorbing BLE signal
  • Human body mass absorbing and distorting BLE signal
  • Concrete and bulletproof glass absorbing the signal
  • Marble and bricks absorbing the signal
  • Electronic devices operating in the 2.4 GHz frequency, thus emitting signal on the same radio frequency which might overlap with the beacon signal
  • Fluorescent lighting emitting signal in the 2.4 GHz frequency, thus likely to distort beacon signal in unexpected ways
  • Power sources such as electric railroad tracks or power lines also causing interference

When the Bluetooth signal is distorted, the mobile device will receive a signal that does not reflect the real situation, e.g. the distance to the BLE beacon as read by the device might not be accurate. Installing BLE beacons following the best practices proposed in the following section will mitigate this risk.

11.0.1.3.   BLE beacon installation

Purpose of this section

Venue owners and other parties involved in BLE beacon deployment can find a description of the two main approaches for installing BLE beacons  (the proximity-based and the trilateration-based approach) along with best practices for BLE beacon positioning.

Proximity-based approach

Installing BLE beacons following a proximity-based approach means that beacons are placed only at decision making points where people need to be instructed. These decision making points might be before and after doors, staircases and at pathway intersections.

Advantages of this approach

  • The main advantage of the proximity-based approach is that a small number of BLE beacons is needed to complete this type of installation.
  • As a result, the costs of BLE beacon procurement, installation, configuration and maintenance are reduced.

Disadvantages of this approach

  • The decision making points in a venue where BLE beacons will be placed need to be decided carefully.
  • There are likely to be areas that the BLE beacon signal does not reach.
  • It might be difficult to work successfully in large open areas.
  • Misdirection or user error that leads a user out of the covered area is non-recoverable.
  • To detect the orientation of users, a combination of technologies might be needed, relying on the smartphone’s orientation capabilities.

Note: The Wayfindr Demo iOS app v0.4 is developed for beacon deployments with a proximity-based approach.

Trilateration-based approach

Installing BLE beacons following a trilateration-based approach means that BLE beacons are placed so that they provide coverage to the whole area. In this way, the location of the user’s smartphone is estimated by measuring the distances from the three closest BLE beacons using a trilateration algorithm [7].
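
For reference, a minimal 2D sketch of the idea (an illustrative implementation, not the algorithm of any particular product): given three beacon positions and estimated distances, the three circle equations can be linearised into a 2x2 system and solved, here with Cramer's rule.

```swift
import Foundation

struct Point { var x, y: Double }

/// Estimate a 2D position from three beacon positions (p1-p3) and the
/// estimated distances to them (r1-r3). Subtracting the first circle
/// equation from the other two yields a linear 2x2 system.
func trilaterate(_ p1: Point, _ r1: Double,
                 _ p2: Point, _ r2: Double,
                 _ p3: Point, _ r3: Double) -> Point? {
    let a1 = 2 * (p2.x - p1.x), b1 = 2 * (p2.y - p1.y)
    let c1 = r1*r1 - r2*r2 + p2.x*p2.x - p1.x*p1.x + p2.y*p2.y - p1.y*p1.y
    let a2 = 2 * (p3.x - p1.x), b2 = 2 * (p3.y - p1.y)
    let c2 = r1*r1 - r3*r3 + p3.x*p3.x - p1.x*p1.x + p3.y*p3.y - p1.y*p1.y
    let det = a1 * b2 - a2 * b1
    guard abs(det) > 1e-9 else { return nil } // collinear beacons: no fix
    return Point(x: (c1 * b2 - c2 * b1) / det,
                 y: (a1 * c2 - a2 * c1) / det)
}
```

In practice the distances derived from Bluetooth signal strength are noisy (see the sources of distortion above), so real systems typically smooth the measurements and use more than three beacons where possible.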

Advantages

  • The main advantage of this approach is that the majority of the venue area is covered by BLE beacon signal, and as a result there are unlikely to be areas where the user’s position cannot be estimated.
  • With the trilateration method, the orientation of the user’s smartphone device can be determined dynamically and as a result, the instructions given to the user can reflect that orientation.

Disadvantages

  • A larger number of BLE beacons is required to achieve trilateration compared to the proximity-based approach. This means increased costs for beacon installation, configuration and maintenance.
  • Location accuracy cannot be guaranteed, as there are a few variables that are likely to affect the stability of the Bluetooth signal, such as the ones mentioned in Section 2.11.0.1 “Sources of Bluetooth Signal Distortion”.
  • Changes to a built environment such as new construction, signage, temporary fixtures, and sources of 2.4GHz being introduced or removed can change the profile of the space and affect a trilateration algorithm. If physical changes are common, system calibration will be an on-going task and expense.
  • Digital navigation services that do not use a trilateration algorithm to calculate routes might have difficulty providing a good user experience in built-environments where beacons have been installed with trilateration in mind. Thus, ensure that beacons are positioned on decision making points, landmarks or points of interest, as described in the following section.

Best practices for BLE beacon positioning

BLE beacons can be installed in both indoor and outdoor environments. There is no single way to install beacons, as layout and material vary across different environments. The following best practice guidelines can be applied for efficient positioning across environments:

  • Place the BLE beacons above head height (above 2.5 metres) in order to avoid interference from human body mass, which is likely to absorb a BLE beacon’s signal in busy environments.
  • If the ceiling is up to 4 metres high, then place the BLE beacon on the ceiling.
  • If there is an arch, then place the BLE beacon at the top and centre of the arch.
  • If the ceiling or the top of arch is higher than 4 metres then use walls, placing the BLE beacon at a height of around 2.5 metres (up to +1 metre) from the floor level. Alternatively, if possible, suspend the BLE beacon from the ceiling on a cable.
  • When the optimal position of a beacon interferes with other venue elements such as metallic signage, major power conduit or fluorescent lighting, place the BLE beacon 1 metre away from these elements.
  • If the BLE beacon is to be placed in a corridor less than 4 metres wide, then place the BLE beacon in the middle of the corridor to cover the full width equally.
  • If the corridor is wider than 4 metres, consider using more BLE beacons to cover the area evenly. In this case, place a BLE beacon at 4 metre intervals. For example, in an entrance of a venue with multiple doors, the area is likely to be wider than 4 metres. In this instance, to cover all doors with beacon signal, place a BLE beacon every 4 metres.
  • Place a BLE beacon 4 +/- 1 metres before any landmarks or points of interest that people are likely to go through or interact with. This should be the case both for the proximity-based and the trilateration-based approach to installation. These landmarks and objects might be, for example:
    • Entrance doors
    • Pathways
    • Escalators
    • Stairs
    • Lifts
    • Ticket control gates

Read more about the information that needs to be provided when interacting with these landmarks in Section 1.10 “Guidelines for various environmental elements”.

  • Most BLE beacons are suitable for outdoor installations; check with your BLE beacon supplier whether their BLE beacons are waterproof and what temperatures they work well in.
  • Consider the orientation of the BLE beacon’s directional antenna. Depending on the manufacturer, some BLE beacons might not emit an all-round symmetrical signal, but instead emit a signal of elliptical form depending on which way the BLE beacon antenna is facing. Ask your BLE beacon manufacturer for details and orientate the BLE beacon in a way that supports your needs.

11.0.1.4.   The parameters of a BLE beacon

Configuration settings of BLE beacons vary based on the protocol on which they are configured. There are two main beacon protocols open to any beacon manufacturer in the market at the moment:

  • iBeacon: The iBeacon protocol [1] is a communication format developed and introduced by Apple in 2013 and is based on Bluetooth Low Energy technology. The protocol is compatible with any iOS or Android device that supports Bluetooth 4.0 and above. The minimum operating system requirement is iOS 7 or Android 4.3 (Jelly Bean).
  • Eddystone: Eddystone is an open beacon format developed and introduced by Google in 2015 [4] and is based on Bluetooth Low Energy technology. It is compatible with both iOS and Android devices that support Bluetooth 4.0 and above.

11.0.1.5.   iBeacon


iBeacon Identifiers

A BLE beacon configured with the iBeacon format transmits its Unique ID, an advertising packet that contains three customisable identifiers: the UUID, the Major and the Minor. These three identifiers make an iBeacon identifiable and distinguishable from other iBeacons.

  • Universally Unique Identifier (UUID): It is a 16-byte (128-bit) number that can be used to identify a large group of BLE beacons. It is formatted as 32 hexadecimal digits, split into 5 groups separated by hyphens [8].
    An example of a UUID is f3af8c82-a58a-45a1-b9b6-21e3cf47ded9.
    It is common practice to use the same UUID for all the BLE beacons that belong to the same company or organisation. For example, a transport operator can use the same UUID for all the stations they manage. This is only a common practice and not a constraint, as the same company or organisation can generate more than one UUID. Also, although it is called a “unique identifier”, it is possible for different companies or organisations to be using the same UUID.
  • Major: It is a 16-bit integer that can be used to identify a subgroup of BLE beacons under the same UUID. It is common practice to use the same Major for all the BLE beacons that belong to the same region or venue of the organisation. In the transport operator example, the BLE beacons that belong to the same station would share the same Major. In this case, the Major becomes the venue identifier.
  • Minor: It is a 16-bit integer that can be used to identify an individual beacon within a group of beacons with the same Major.

Things to keep in mind:

  • The iBeacon format requires all three identifiers to be assigned.
  • These three identifiers are advertised publicly. This means that anyone with an app or a device that can detect BLE beacons will be able to capture these identifiers. However, this does not necessarily mean that they can connect to them. Many BLE beacon manufacturers have solutions to prevent “piggybacking” onto a fleet of BLE beacons.
  • Since Major and Minor are integers, they cannot contain characters other than digits. Therefore, every venue needs to be mapped to a number, which will very often be the value of the Major.
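
For illustration, the sketch below shows how these identifiers are typically consumed on an iOS device using Apple’s Core Location framework. It is a minimal sketch, not part of the Open Standard: the class name, region identifier and Major value are hypothetical, and the UUID reuses the example above.

    import CoreLocation

    // A minimal sketch of detecting an iBeacon fleet, assuming the scheme
    // described above: one UUID per organisation, one Major per venue.
    final class BeaconMonitor: NSObject, CLLocationManagerDelegate {
        let locationManager = CLLocationManager()

        // Hypothetical organisation-wide UUID (the example UUID from this section).
        let organisationUUID = UUID(uuidString: "F3AF8C82-A58A-45A1-B9B6-21E3CF47DED9")!

        func start() {
            locationManager.delegate = self
            locationManager.requestWhenInUseAuthorization()

            // Major 1 = one venue (e.g. a single station). Omitting `minor`
            // matches every beacon installed in that venue.
            let venueRegion = CLBeaconRegion(proximityUUID: organisationUUID,
                                             major: 1,
                                             identifier: "station-1")
            locationManager.startMonitoring(for: venueRegion)
        }

        func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
            print("Entered venue: \(region.identifier)")
        }
    }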

11.0.1.6.   Eddystone

Whereas an iBeacon transmits a single advertising packet containing the UUID, Major and Minor, Eddystone BLE beacons broadcast more than one type of advertising packet [4]. These advertising packets are called “frames”.

  • The Eddystone-UID frame broadcasts a unique 16-byte BLE Beacon ID. It consists of two parts: the 10-byte Namespace ID and the 6-byte Instance ID. The Namespace ID can be used to specify a particular group of beacons, similar to the UUID in iBeacon. The Instance ID can be used to identify individual beacons in a large fleet of beacons (a decoding sketch follows this list).
  • Eddystone-EID is a component of the Eddystone specification for BLE beacons. Eddystone-EID BLE beacons broadcast an identifier that changes every few minutes. The identifier can be resolved to useful information by a service that shares a key (the Ephemeral Identity Key, or EIK) with the individual BLE beacon. Any use of Eddystone-EID requires both a BLE beacon and a resolving service (such as the Google Proximity Beacon API).
  • The Eddystone-URL frame broadcasts a URL in a compressed encoding format. Once received by a device, the URL is decoded and the user can choose whether to visit it. Although this advertising packet might not be very helpful for wayfinding applications, it is useful for applications intended for web content discovery.
  • The Eddystone-TLM frame broadcasts data about the BLE beacon’s own operation, the so-called telemetry information. This data is useful for monitoring the fleet of beacons. When the Eddystone-TLM frame is transmitted, the following data can be captured:
    • BLE beacon battery level, in an encrypted format using the shared key, similar to Eddystone-EID
    • The time the BLE beacon has been active since it was last switched on
    • The number of frames, i.e. advertising packets, the BLE beacon has transmitted since it was last switched on
    • BLE beacon temperature
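
As an illustration of the Eddystone-UID frame described above, the sketch below decodes the Namespace ID and Instance ID from advertisement data received via Apple’s Core Bluetooth framework. The byte offsets follow the public Eddystone specification; the function name is our own, and error handling is kept to a minimum.

    import CoreBluetooth

    // Eddystone frames arrive as service data under the 16-bit service UUID 0xFEAA.
    let eddystoneServiceUUID = CBUUID(string: "FEAA")

    // A minimal sketch of decoding an Eddystone-UID frame:
    // byte 0: frame type (0x00 = UID), byte 1: Ranging Data (Tx power at 0 m),
    // bytes 2-11: 10-byte Namespace ID, bytes 12-17: 6-byte Instance ID.
    func parseEddystoneUID(_ advertisementData: [String: Any]) -> (namespace: String, instance: String)? {
        guard let serviceData = advertisementData[CBAdvertisementDataServiceDataKey] as? [CBUUID: Data],
              let frame = serviceData[eddystoneServiceUUID],
              frame.count >= 18,
              frame[0] == 0x00
        else { return nil }

        let namespace = frame[2..<12].map { String(format: "%02x", $0) }.joined()
        let instance  = frame[12..<18].map { String(format: "%02x", $0) }.joined()
        return (namespace, instance)
    }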

11.0.1.7.   Estimating the distance from a BLE beacon

As discussed in Section 2.11.0.1, the Bluetooth signal is open to distortion from different sources. As a result, it is difficult to accurately estimate the distance between a BLE beacon and a device. Depending on the BLE beacon format used, i.e. iBeacon or Eddystone, various parameters can give an indication of the beacon’s distance from the device:

  • Received Signal Strength Indicator (RSSI): This is an indication of the BLE beacon signal strength as measured by a device. The RSSI is measured in dBm (decibel-milliwatts). The higher the RSSI value (i.e. the closer to zero, since values are negative), the stronger the BLE beacon signal. Based on changes in the RSSI value, we can tell whether a user is heading towards or away from a BLE beacon. RSSI values are specific to each beacon manufacturer and depend on how they have calibrated their BLE beacons (see “Measured Power or Ranging Data” in Section 2.11.0.1). The RSSI values as read by smartphones also vary between devices, as they depend on the Bluetooth chip each device has on board.
  • Proximity zones: For the iBeacon format only, the area around a beacon is divided into four proximity zones based on the RSSI:
    • Immediate, when the device is very close to the beacon, at a distance of less than 50 centimetres
    • Near, when the device is estimated to be between 50 centimetres and 3 metres from the beacon
    • Far, when the device is further away or the signal is fluctuating due to distortions
    • Unknown, when the distance cannot be estimated, mainly because the device is too far from the BLE beacon or the signal is distorted

The proximity zones can be used as filters to trigger content in context. For example, when a device enters a BLE beacon’s “Near” zone, one set of content can be triggered, whereas in the “Immediate” zone a different set of content can be triggered (see the ranging sketch after this list).

  • Accuracy: This is a parameter in the iBeacon format only, indicating the proximity value measured in metres. More specifically, it indicates the one-sigma horizontal accuracy, a parameter used in statistics [6]. The iBeacon documentation suggests that the Accuracy parameter can be used to differentiate between beacons with the same proximity value, but advises that it should not be used to identify the exact distance of a user’s device from a beacon, because Accuracy is affected by various sources of signal distortion and might not reflect the actual distance in metres.
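
Bringing the above together, the sketch below ranges beacons in a region with Core Location, switches on the four proximity zones to trigger content in context, and adds a common log-distance approximation of distance from RSSI. The BeaconMonitor extension and the estimatedDistance helper are our own illustrative assumptions, not part of the iBeacon format, and remain subject to the signal distortions discussed above.

    import CoreLocation

    // Continues the BeaconMonitor sketch above; content triggers are illustrative.
    extension BeaconMonitor {
        func startRanging(in region: CLBeaconRegion) {
            locationManager.startRangingBeacons(in: region)
        }

        func locationManager(_ manager: CLLocationManager,
                             didRangeBeacons beacons: [CLBeacon],
                             in region: CLBeaconRegion) {
            for beacon in beacons {
                switch beacon.proximity {
                case .immediate: print("Immediate: trigger arrival content")
                case .near:      print("Near: trigger approach instructions")
                case .far:       print("Far: detected, but too distant to act on")
                case .unknown:   print("Unknown: distance cannot be estimated")
                @unknown default: break
                }
                // `accuracy` is the one-sigma estimate in metres; it should not
                // be treated as an exact distance.
                print("minor \(beacon.minor): ~\(beacon.accuracy) m, RSSI \(beacon.rssi) dBm")
            }
        }
    }

    // A common (approximate) log-distance estimate, assuming a path-loss
    // exponent n (about 2 in free space, typically higher indoors):
    //   distance ≈ 10^((measuredPower - rssi) / (10 * n))
    func estimatedDistance(rssi: Int, measuredPower: Int, pathLossExponent n: Double = 2.0) -> Double {
        pow(10.0, Double(measuredPower - rssi) / (10.0 * n))
    }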

11.0.1.8.   BLE Beacon configuration settings

Regardless of the protocol format, i.e. iBeacon or Eddystone, a BLE beacon can be configured by adjusting the following parameters to achieve the best results:

  • Advertising Interval: It specifies how often a BLE beacon transmits its signal. Values range from 100 milliseconds to 2,000 milliseconds. The shorter the interval, the more often the BLE beacon transmits its signal, and the more stable that signal is. However, reducing the interval, i.e. making the BLE beacon emit its signal more often, has a big impact on the BLE beacon’s battery life. In most cases an advertising interval between 300 and 350 milliseconds maintains a good balance between signal stability and battery life.
  • Broadcasting or Transmission (TX) Power: It determines how far a BLE beacon emits its signal. The broadcasting power is measured in dBm (decibel-milliwatts) and ranges between -100 dBm and +20 dBm. The higher the power, the further the signal travels. The maximum distance a Bluetooth Low Energy signal can travel is several hundred metres, assuming there are no walls or other sources of signal distortion.

Finding the right broadcasting power for a BLE beacon depends on the context. For example, a BLE beacon installed in a large concourse might need to be set to a higher broadcasting power. In cases where a BLE beacon must be triggered only within the boundaries of a point of interest or landmark, the broadcasting power needs to be lower so that the BLE beacon is not detected from further away.

The broadcasting power also has an impact on the BLE beacon’s battery life, though not as great as that of the advertising interval. As a rule of thumb, battery usage is around 30% higher at maximum power than at minimum power.

  • Measured Power or Ranging Data: This parameter is used in estimating the distance from a BLE beacon (see Section 2.11.0.1). iBeacon and Eddystone define it differently. In the iBeacon protocol, it is called Measured Power and is the expected Received Signal Strength Indicator (RSSI) value at a distance of one metre from the BLE beacon. In the Eddystone protocol, it is called Ranging Data and is referenced at zero metres from the BLE beacon. However, the Eddystone specification [4] proposes to “measure the actual output of the BLE beacon from 1 meter away and then add 41 dBm to that. 41 dBm is the signal loss that occurs over 1 meter.”
  • Many BLE beacon manufacturers apply a default factory calibration for the Measured Power that cannot be changed. However, it is advised that when beacons are installed indoors, Measured Power or Ranging Data samples are taken in situ and the value is set accordingly. In this way the real environment, with its potential sources of Bluetooth signal distortion, is taken into account (see Section 2.11.0.1).

Note that the Eddystone protocol uses the Ranging Data parameter only in the UID, EID and URL frames.
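
As a worked example of the relationship above: if the average RSSI measured 1 metre from a beacon is -60 dBm, an iBeacon would be given a Measured Power of -60 dBm, while an Eddystone beacon would be given a Ranging Data value of -60 + 41 = -19 dBm. A trivial sketch of the conversion, following the specification’s guidance quoted above (the function name is our own):

    // Ranging Data (0 m reference) = RSSI measured at 1 m + 41 dBm,
    // per the Eddystone specification's calibration guidance.
    func eddystoneRangingData(fromRSSIAtOneMetre rssi: Int) -> Int {
        rssi + 41
    }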

11.0.1.9.   Maintenance and operational considerations

As well as considering the installation and configuration of BLE beacons, it is important to consider longer-term maintenance and operation, so that the fleet of installed BLE beacons remains functional and up to date.

Things to consider:

  • Which department in the organisation will own the BLE beacon infrastructure and be responsible for its installation, configuration and maintenance?
  • What happens when the BLE beacon battery is due to run out? There are two options: replace the whole BLE beacon, or replace the battery inside it. The latter ensures continuity, as the beacon itself does not change.
  • What are the organisational needs for battery health data collection? There are various options available (a decoding sketch for battery telemetry follows this list):
    • The battery health data can be collected manually by someone in the organisation, who walks past all BLE beacons with a mobile device that collects battery levels
    • The battery level can be collected and sent automatically any time a mobile device connects to the BLE beacon. In this case, the collection of battery levels is crowd-sourced from people passing through the venue
    • Some BLE beacons can connect to the Internet themselves when in range of a wi-fi network. In this case they can send their battery levels automatically at periodic intervals
  • What are the organisational needs for a dashboard to monitor the status of the installed BLE beacon fleet?
  • How will the system administrator be notified when a beacon battery is running low, in order to arrange maintenance?
  • How can battery life be preserved while the venue is closed?
  • What security measures can be taken to mitigate the risk of someone piggybacking on the fleet of BLE beacons without the organisation’s permission? Although the BLE beacon identifiers are transmitted publicly, there are ways to increase the security of the beacon fleet.
  • Where does the data from the smartphone of a user travelling with the BLE beacon network ultimately end up?
  • How is the users’ travel data stored, deleted, shared and protected?
  • How can the whole system be disabled when needed, for example while system maintenance is in progress?
  • What options for BLE beacon firmware upgrades does the beacon manufacturer provide? Updating the firmware keeps the beacon up to date with the manufacturer’s latest software and security updates.
  • How well do the BLE beacon manufacturer’s SDK and API perform, and how often are they updated?
  • To what extent are the BLE beacon manufacturer’s SDK and API documented? A well-documented SDK and API help developers integrate them into their mobile app.
  • How engaged is the online community around the beacon manufacturer, should support or recommendations be needed when issues arise?
  • How responsive is the beacon manufacturer’s support team, and what capability do they have to escalate questions to a developer when required?
  • What capability does the beacon manufacturer provide to facilitate sharing the beacon network with third-party developers who might be interested in utilising it?
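
For the crowd-sourced battery collection option above, one approach is to decode the battery voltage from unencrypted Eddystone-TLM frames on passing devices and report it to a back-end. The sketch below follows the public TLM frame layout; the function name is our own, and encrypted TLM (as broadcast alongside Eddystone-EID) would first need to be resolved with the shared key.

    import Foundation

    // A minimal sketch of reading battery voltage from an unencrypted
    // Eddystone-TLM frame: byte 0 = 0x20 (TLM frame type), byte 1 = 0x00
    // (unencrypted version), bytes 2-3 = battery voltage in millivolts (big-endian).
    func batteryMillivolts(fromTLMFrame frame: Data) -> Int? {
        guard frame.count >= 14,
              frame[0] == 0x20,
              frame[1] == 0x00
        else { return nil }
        return Int(frame[2]) << 8 | Int(frame[3])
    }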

11.0.1.10.   References

2.12 Open Source Wayfindr Demo mobile app

12.0.0.   The aim of the Wayfindr Demo mobile app

We know that there is a community of individuals and organisations worldwide interested in exploring digital wayfinding for vision impaired people. We want to support those planning to explore audio wayfinding in different types of built-environments, e.g. transport, retail, health, cultural and entertainment venues. We have open-sourced the Wayfindr Demo mobile app used in all the Wayfindr trials to date, to provide developers, designers and researchers around the world with a free, open tool to conduct their research in context.

The app also serves as a demonstration of Sections 1.12 and 2.12, “Guidelines for mobile app development”, of the Wayfindr Open Standard.

We invite all interested parties to download the Wayfindr Demo mobile app, customise it to their needs, run their experiments and share new versions of the app with the Wayfindr community.

We would like to see the app become an evolving open tool to aid the research and development of wayfinding systems for vision impaired people.

12.0.1.   How was the Wayfindr Demo mobile app developed?

The Wayfindr Demo app has been developed over the last few years. Its functionality is based on user needs identified in our trials and has been improved through continuous iteration.

The reasoning behind this functionality, and the guidelines for developing digital navigation services for vision impaired people, can be found in Sections 1.12 and 2.12, “Guidelines for mobile app development”, of the Wayfindr Open Standard.