Wayfindr’s Public Transport Hackathon

Wayfindr
January 31, 2017

Over the 21st and 22nd of January, Wayfindr held its first Public Transport Hackathon at Imperial College London, in partnership with OrganiCity and London Underground. A group of 30 generous people gave up their weekend to do something that could very well end up benefitting every commuter in London. The idea behind it was to explore how Bluetooth beacons and APIs could be used to make travel better for vision impaired people and, in turn, for all the citizens of London.


The task: Help vision impaired people commute independently


The task was to find a solution to a challenge faced by vision impaired people when travelling. We asked the teams to do this by using a test API, created especially for the event by London Underground, in combination with Bluetooth beacons and smartphones. We also wanted to test the OrganiCity tools that were available for people to use: TinkerSpace, Scenarios and the OrganiCity APIs.

Each solution would be presented in a disused part of Earl’s Court station and assessed by Wayfindr CEO Umesh Pandya, Belen Palacios from OrganiCity and Rakesh Gaur, Head of Innovation at Transport for London (TfL).

It was wonderful to have so many people come down to work on ideas that would help vision impaired (VI) people to travel on the tube independently. The day started with intros from the three main partners. The first was from our CEO Umesh Pandya, who spoke about Wayfindr’s mission to empower vision impaired people to navigate the world independently, and about our journey so far.

Next up was Belen Palacios, designer and researcher for Future Cities Catapult, the company that created the OrganiCity project. She talked about OrganiCity’s aim: to put people at the centre of the development of future cities.

Lastly, we had Turan Suleyman, Electronic Engineer at TfL, who built the Beacon API that people used over the weekend. He talked about how the API worked. His aim for the weekend was to see the different ways that people would use the API, and how he could improve it.
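
Although the details of that test API weren’t published alongside this post, the basic loop the teams had to build can be sketched roughly: a phone detects nearby Bluetooth beacons, asks the API what the strongest one marks, and turns the answer into an audio cue. The Python sketch below is purely illustrative; the beacon IDs, locations and lookup shape are our own assumptions, not the real TfL API.

    # Minimal, hypothetical sketch of the flow the teams worked with: detect nearby
    # beacons, look up what the strongest one marks, and speak a cue. The beacon IDs,
    # locations and lookup shape are invented for illustration - the real TfL test
    # API was not published alongside this post.

    def fetch_beacon_metadata(beacon_id: str) -> dict:
        # Stand-in for a call to the beacon API (in practice an HTTP request from
        # the phone); returns what the beacon marks plus a short guidance hint.
        fake_api = {
            "ec-001": {"location": "ticket hall", "hint": "The gates are straight ahead."},
            "ec-014": {"location": "top of the escalator", "hint": "Keep to the right-hand side."},
        }
        return fake_api.get(beacon_id, {"location": "an unknown area", "hint": ""})

    def describe_nearest_beacon(detected: list) -> str:
        # Pick the strongest beacon from a Bluetooth scan and build a spoken cue.
        if not detected:
            return "No beacons in range."
        nearest = max(detected, key=lambda b: b["rssi"])  # strongest signal is roughly the closest
        meta = fetch_beacon_metadata(nearest["id"])
        return f"You are near the {meta['location']}. {meta['hint']}".strip()

    # Example with made-up scan results (beacon id plus received signal strength in dBm)
    print(describe_nearest_beacon([{"id": "ec-001", "rssi": -62},
                                   {"id": "ec-014", "rssi": -80}]))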

Getting to grips with the challenges of travel for vision impaired people


In teams, people mapped out the challenges they had experienced on public transport, then began working on solutions to those problems. Later in the day, a group of vision impaired commuters joined each team to discuss the ideas they had come up with and to help the teams better understand their needs, so that the solutions they developed would be genuinely useful.

VI participants giving feedback at hackathon


When the VI participants left, it was time for everyone to start coding. The atmosphere in the room was great: you could hear people talking over ideas, using the Scenarios, assessing TfL’s API and trying to work out the best way to implement their ideas. The teams were coding and iterating on their designs late into the night.


Operating a Sunday Service


On Sunday, people got straight back to work to meet the midday deadline for submitting their ideas. TfL kindly offered a disused part of Earl’s Court Underground station in which to trial the final ideas, allowing the teams to test in an environment similar to the one where a finished product would be used. Sixteen Bluetooth beacons had been placed around the ticket hall, the escalator and the walkway.

Five teams came down on Sunday, and each tackled the issue of independent travel in a different way, from quick decision making on the platform to an end-to-end journey. Each team had interesting ideas and gave a great presentation.

Team 1 – Sound of the Underground
Sound of the Underground wanted to reassure people that they were travelling in the right direction without overloading them with information.

Team 2 – Step On/Step Off
Step On/Step Off came up with a voice application to help vision impaired commuters cope with the stresses of getting on and off trains.

Team 3 – WalkPal
WalkPal addressed two of the biggest challenges any vision impaired person faces on the London Underground: indoor navigation (i.e. travelling from the main gate to a platform) and understanding the train destination.

Team 4 – Sound Guided Navigation
Sound Guided Navigation wanted to create audio guidance that was as intuitive and non-intrusive as possible. They developed a ping-based localiser with 3D audio, so that users could “follow the pings” between beacons (see the sketch after the team list). They also developed a computer vision-based empty seat finder.

Team 5 – Speakin’ Stations
Speakin’ Stations created audio and text cues for navigating through a station.
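
Team 4’s “follow the pings” idea is the easiest one to illustrate. One plausible way to drive such a cue (our assumption, not a description of their actual implementation, which also layered 3D audio on top) is to estimate distance from a beacon’s signal strength with a standard path-loss model and let the pings come faster as the user closes in:

    # Rough sketch of a "follow the pings" cue. The constants and the mapping from
    # distance to ping rate are illustrative; Team 4's real implementation (which
    # also used 3D audio) was not published.

    def estimate_distance(rssi: int, tx_power: int = -59, path_loss_exponent: float = 2.0) -> float:
        # Standard log-distance path-loss model: tx_power is the calibrated RSSI
        # at one metre for the beacon, and the result is a distance in metres.
        return 10 ** ((tx_power - rssi) / (10 * path_loss_exponent))

    def pings_per_second(distance_m: float) -> float:
        # Ping faster as the user approaches the target beacon, clamped to a sane range.
        return max(0.5, min(4.0, 4.0 / max(distance_m, 1.0)))

    for rssi in (-55, -65, -75, -85):
        d = estimate_distance(rssi)
        print(f"RSSI {rssi} dBm -> roughly {d:.1f} m -> {pings_per_second(d):.1f} pings per second")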


Who came out on top?


All the ideas impressed the judges, especially as they were created in such a short amount of time. There could only be one winner though, and the judges agreed that WalkPal was the best idea of the weekend, because it integrated the beacon API, the Bluetooth beacons and a smartphone, and it took into account the needs of a VI user, making it a genuinely useful product.

Winning Team – WalkPal

What happens next?


For us, the weekend was a huge success, and it seemed our participants felt the same. We look forward to running more events like this in the future. We would like to thank OrganiCity and Transport for London for their support in this project, and of course a huge thank you to all the people who gave up their time and brainpower to make it such a great event.


If you are interested in hearing about similar events in the future, please get in touch with us here.

Wayfindr

Our team combines the digital product and user-centred design expertise of ustwo with the Royal London Society for Blind People’s 175 years of experience working with blind people.