On the challenges faced by the visually challenged, from a designer’s perspective.
During my second semester at NID (National Institute of Design), I was, along with my batch-mates, given an opportunity to develop a live project in collaboration with the Railway Design Center. One of the briefs given to us was to improve the passenger information systems. A passenger information system (PIS) is an automated system for supplying users of public transport with information about the nature and state of a public transport service, through visual, voice or other media.
There are over 25 million people in India who cannot see, and who effectively have no access to the passenger information infrastructure in railway stations.
I first started my train of thought along the lines of removing clutter from the information system, and perhaps using alternative media like Augmented Reality to simplify the design of the information systems.
But instead of re-designing and improving the existing system, I decided to first consider the users that thronged the stations and what their needs and wants might be.
It was then I realized how we, as a society, have been very unkind to the specially-abled. Around 1.5% of India’s population can be categorized as legally blind, which is a huge number considering India’s large population. This means there is a huge chunk of Indians who don’t have access to the information railway stations provide, rendering them unable to travel on trains independently.
On further research, I learned that around 40% of the world’s blind population lives in India. India, though, is not known for accessible infrastructure. Compared to Western countries, where infrastructure is far more accessible and inclusive, India has a long way to go to catch up.
It was then I realized how we all have been blind to the visually challenged and their challenges.
Even something as basic as braille labelling for seat numbers, or tactile maps (which, I learned later, are not very helpful) inside railway stations, was only introduced in 2016, 153 years after Indian Railways was established.
It was imperative that I take this opportunity to do something, in my capacity, for the visually challenged. This was the process I followed to arrive at a concept solution.
I had never met a visually impaired person. I realized I knew nothing about how they went about their routines, the struggles they face, the happy parts of their lives, and so on. I did a bit of secondary research, browsing online through the medical side of the spectrum and reading about the various types and degrees of impairments: macular degeneration, glaucoma, diabetic retinopathy, cataracts and others. In a developing country like India, most cases of blindness would be preventable if we had a proper and affordable healthcare system.
I was introduced to Dr. Habeeb C, Professor of English at Farook College, Calicut, who has been visually challenged from birth. He is also the vice president of the Kerala state association for the visually challenged. Talking to him and observing him, I learned a lot about how the visually challenged live their lives: how they use the cane for navigation, how they use their mobile phones, the mobile applications they rely on, the way they operate the applications we all use, and so on. He introduced me to many more people, and every one of them had stories and pieces of advice to share. I didn’t want these to be user interviews; they were user conversations where actual experiences and stories flowed on their own. I stayed in correspondence with him throughout the length of this project through mail and messages.
But I couldn’t actually observe a visually challenged person in the railway station performing the tasks I wanted to study, so I had to do a bit of role play myself to find the pain points. An unblind person thinking in the shoes of a visually challenged person would not be very accurate in identifying the actual pain points, and Dr. Habeeb later helped me eliminate many of the problems I had identified, ones I had thought were really valid. It was truly an exercise in trying to completely understand and empathise with someone totally unlike oneself.
The normal person’s perspective of how it is for the blind is nothing like how it actually is. We assume everything to be dark for them.
The reason we know darkness is because we have known light.
Someone who is completely blind from birth has their own sense of the world inside their head. It is not darkness they perceive; it is nothingness. By nothingness, I mean a total lack of sensory information from the eyes. It is not black or dark as we would assume. Though many of them can tell when a light is being flashed at their face (because the light is picked up by a part of the retina separate from the rods and cones that handle overall vision), there is a complete absence of vision. It is not something we, the unblind, can imagine.
For starters, try imagining how it would be to see through something that doesn’t perceive light, say, something like an elbow.
Don’t just imagine it. Close your eyes, try to pass your sense of vision on to your elbow, and look through it.
That’s the closest you can get to perceiving blindness. People who became blind later in life experience something different: bright flashes of colors and shapes that keep changing constantly (the brain trying to make up for the lack of vision), somewhat like bokeh. It is not as exciting as it sounds, as it is not voluntary, and after so much light, one starts longing for darkness.
I was too blind to this, all this while.
Armed with a lot of pain points, I set about creating a solution. One of my first concepts was to use computer vision. There are many applications that use image recognition to help the visually challenged, like Microsoft’s Seeing AI, but in this particular context it would not help the visually challenged and would only add to their worries. Then there is BlindSquare, which uses GPS with audio assistance to guide users; it is not available in India yet. Nearby Explorer is a similar application and is available in India. But GPS is not a context-appropriate solution inside a railway station: GPS is only accurate to about 7.8 meters (95% of the time), and some stations in India are only that long.

Physical Web solutions based on Bluetooth Low Energy (BLE) beacons, like iBeacon and Eddystone, were considered next; they are already used for indoor navigation inside shopping malls and airports. A beacon is kind of like a lighthouse: it repeatedly transmits a single signal that other devices can see. Instead of emitting visible light, though, it broadcasts a radio signal made up of a combination of letters and numbers, transmitted at a regular interval of approximately 1/10th of a second. A Bluetooth-equipped device like a smartphone can “see” a beacon once it’s in range, much like sailors looking for a lighthouse to know where they are. An app can estimate your exact location from the signal strength of the current beacon and your proximity to other beacons.
Beacons are pretty cheap and long-lasting, so beacon-based navigation seemed perfect for indoor spaces. I started reading more about beacon technology, its pros and cons.
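To give a feel for how an app turns a beacon’s signal strength into a distance, here is a minimal sketch of the standard log-distance path-loss model. The default values (a calibrated power of -59 dBm at 1 meter, a path-loss exponent of 2) are common assumptions, not figures from this project; real indoor deployments need per-beacon calibration.

```python
import math

def estimate_distance(rssi: float, tx_power: float = -59.0, n: float = 2.0) -> float:
    """Estimate the distance (in meters) to a BLE beacon from its signal strength.

    rssi:     received signal strength in dBm, as reported by the phone
    tx_power: calibrated RSSI at 1 m (beacon-specific; -59 dBm is a common default)
    n:        path-loss exponent (about 2 in free space, higher in cluttered indoor spaces)
    """
    return 10 ** ((tx_power - rssi) / (10 * n))
```

In practice, RSSI fluctuates heavily indoors, so apps smooth readings over time and rank beacons by proximity rather than trusting any single distance estimate.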
A persona was created based on all the interviewed users and a user scenario was charted out to identify the location markers that would be relevant during navigation.
The location markers that would need beacons were identified and classified broadly as convenience markers and navigational markers.
The following navigational markers were identified:
- Floor — Tactile paving, Ramp, Step, Slope Pathways.
- Door — Types of doorways (security entry, automatic entry, open entry).
- Obstacles — Existence of objects (e.g., trash cans).
- Elevator — Location of control panel inside the elevator, Door opening and closing.
- Escalator — The correct standing side (left or right), Directions to the adjacent escalator(s).
- Stairs — Shape, Number of steps and landings.
The following convenience markers were also identified:
- Shop — Type of shops, Name.
- Restaurant — Cuisine, Name.
- Toilets — Gender, Wheelchair accessibility.
- Waiting room — Class, Facilities.
- Buggy assistance — Availability.
- Information enquiry.
- Railway Protection Force.
- Station Master.
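One way the two marker categories above might be modeled in software is as tagged records attached to beacon IDs. The beacon IDs, field names and example values below are my own assumptions for illustration, not data from the project:

```python
from dataclasses import dataclass, field
from enum import Enum

class MarkerType(Enum):
    NAVIGATIONAL = "navigational"  # floors, doors, obstacles, elevators, escalators, stairs
    CONVENIENCE = "convenience"    # shops, restaurants, toilets, waiting rooms, assistance

@dataclass
class BeaconMarker:
    beacon_id: str          # identifier broadcast by the BLE beacon
    marker_type: MarkerType
    label: str              # what the app announces to the user
    details: dict = field(default_factory=dict)  # marker-specific attributes

# Hypothetical markers for one platform:
markers = [
    BeaconMarker("b-101", MarkerType.NAVIGATIONAL, "Escalator",
                 {"stand_on": "left", "adjacent": "b-102"}),
    BeaconMarker("b-214", MarkerType.CONVENIENCE, "Toilet",
                 {"gender": "any", "wheelchair_accessible": True}),
]
```

Keeping the type as an enum lets the app filter announcements, for example reading out only navigational markers while the user is moving and convenience markers on request.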
Once all the markers were identified, I started on the next phase of the project.
It was time to start the prototyping process. Apps for the visually challenged are not something we are used to seeing in the mainstream, so I was unsure how to begin. I first listed the key features on post-it notes and then identified their hierarchy in the flow, based on the user conversations. One feeling echoed by everyone I interviewed was the need to feel in control, and to know whether they are on the right path. The app has to interact with the user more than the user interacts with the app: it has to have provisions for emergencies and quick-access features, and it should keep the user constantly updated and reassured. Sketches and wireframes were made, and based on the information hierarchy, the user flow was defined.
When it came to audio design, the focus was on simplifying the way-finding without overloading the user with too much information, while still ensuring the user feels safe and in control. Most adults can hold between 5 and 9 items in their short-term memory. This idea was put forward by Miller (1956), who called it the magic number 7: he thought short-term memory could hold 7 (plus or minus 2) items because it only had a certain number of “slots” in which items could be stored. So a set of 6 quick gestures was defined for emergency shortcuts. One element identified as important during navigation is the binaural earcon.
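The article does not list the six gestures themselves, so the sketch below is purely hypothetical: made-up gesture and action names showing how a small, memorizable gesture set might map to shortcuts, with an unrecognized gesture falling back to a safe action rather than doing nothing.

```python
# Hypothetical mapping of six quick gestures to shortcut actions.
# None of these names come from the iCan project; they only illustrate the idea
# of keeping the set small enough to fit in short-term memory (Miller's 7 +/- 2).
QUICK_GESTURES = {
    "two_finger_double_tap": "repeat_last_instruction",
    "three_finger_swipe_down": "where_am_i",
    "three_finger_swipe_up": "next_landmark",
    "two_finger_long_press": "call_for_assistance",
    "shake": "emergency_alert",
    "two_finger_swipe_left": "cancel_navigation",
}

def handle_gesture(gesture: str) -> str:
    # Unknown input falls back to repeating the last instruction,
    # which reassures the user instead of leaving them in silence.
    return QUICK_GESTURES.get(gesture, "repeat_last_instruction")
```

The fallback choice reflects the interview finding above: when in doubt, the app should reassure the user that it is still listening.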
For unblind users, meaningful information is encoded visually in icons. Earcons are the audio counterparts of icons; the alert sound for a chat notification is an example. A visually challenged user can make sense of a location if they can hear a sound originating from it. Dr. Habeeb, for instance, can hit the stumps with a ball with perfect accuracy; all he needs is a clap from someone standing just behind the stumps. Binaural audio is simply how we hear sounds naturally (bi: two; aural: related to ears), but it can be recorded or synthesized and optimized for headphones to recreate the perception of distance and direction, making audio more immersive.
Bone conduction headphones are recommended for an optimum experience, so that the user does not miss valuable audio cues from the environment, which are just as important. Bone conduction headphones recreate the feeling that a voice is being played inside the head, while the user can still sense where a sound is coming from. Using the signal strength from the marker beacon and the gyroscope of the mobile phone, an application can recreate the sound binaurally.
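A minimal sketch of the direction part of that idea, assuming the app already knows the user’s heading (from the phone’s orientation sensors) and a bearing to the beacon: true binaural rendering uses head-related transfer functions (HRTFs), but a constant-power stereo pan is enough to illustrate how the earcon can be made to sound as if it comes from the marker’s direction.

```python
import math

def relative_bearing(user_heading_deg: float, beacon_bearing_deg: float) -> float:
    """Angle of the beacon relative to where the user is facing, in [-180, 180).

    0 means straight ahead, positive means to the user's right.
    """
    return (beacon_bearing_deg - user_heading_deg + 180.0) % 360.0 - 180.0

def stereo_gains(relative_deg: float) -> tuple[float, float]:
    """Constant-power pan: (left, right) channel gains for a sound at the given angle.

    A crude stand-in for HRTF-based binaural rendering: a sound to the right
    gets more right-channel energy, and vice versa.
    """
    # Map [-90, 90] degrees to a pan position in [0, 1] (0 = full left).
    pan = (max(-90.0, min(90.0, relative_deg)) + 90.0) / 180.0
    left = math.cos(pan * math.pi / 2)
    right = math.sin(pan * math.pi / 2)
    return left, right
```

Distance could then be layered on top by scaling overall volume with the beacon-derived distance estimate, so that the earcon grows louder as the user approaches the marker.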
Check out my Behance project for more updates on this project, which I have titled iCan.
Source: https://uxdesign.cc/when-i-realised-i-was-blind-969f77c5bb97