How can we design differently to keep users (esp. children) secure?

© Markus Spiske

I recently attended the Designing for Children Talkoot in Helsinki, which focused on integrating child rights into our daily design process. We were 70+ designers, developers, activists, researchers and children’s product builders working together to help designers consider the needs of children and other vulnerable users. In short, it was a phenomenally valuable learning experience.

© Noora Kumpulainen

Despite the cheery picture, what I learned at the event made me deeply uncomfortable and motivated me to significantly change my own design method, and help others do the same. This post is the first step and contains my initial thoughts and suggestions for things we can do. I’m starting work on building a more cohesive guide and tools to use, and will post again when they’re ready to test.

NOTE: This is a long post, so if you’re already convinced of the need, you can skip ahead to ‘What Problems Did We Identify?’ for the problems, or to ‘So, How Can We Improve?’ for concrete suggestions and benchmarks from products that do it right. Right at the end, there’s a series of tips on how to test with children.

Why is this a Problem?

Fact #1: Safety is not part of the standard design curriculum or process

Whether they go to university or not, product builders (designers & devs) are largely self-taught from online resources: short tutorials, inspecting other people’s code, or long, multi-month self-paced bootcamps (pictured below).

Photo from Udemy. Red lines mine.

I have personally taken the first two courses in that picture, and many more similar ones after. I heavily recommend them to everyone I meet, especially Colt Steele’s classes. But not one of the 357+305 videos dealt with user safety or minimum standards for keeping my users safe (and happy, rather than merely engaged), other than authentication (building a login system).

Similarly, when I was learning to be a designer, I largely taught myself from online resources on top of several project courses at Aalto with real clients and real funding. 99% of that time was spent figuring out how to deliver value to our users and keep them engaged with our service.

Precious little time was spent figuring out how to ease the UX journey for users in a crisis, or thinking about keeping non-target demographics like children safe on our service. We did not run scenarios for child or child-predator users. At school, we largely did not consider predator use cases at all, since we were building ‘minimum viable products’.

The picture below is often used to illustrate MVPs. But people only point out the core feature: ‘wheels aren’t an MVP, but a skateboard is. You can use it to move around.’ No one points out that keeping users safe is also a minimum standard. A skateboard may be an MVP, but it is not a safe or accessible one for most. This needs to be stressed every time this picture is shown.

Fast-forwarding to work life: since I left school, things have gotten a lot better. For the services we build now, we think about user safety a lot. In fact, we often point it out to our clients, many of whom are shocked to realise they have never thought about anything like this. But we shouldn’t be shocked.

Emphasis mine

There is no stage in the standard design/dev sprint for a user safety audit. No quick rules of thumb, no popular exercises, no simple safety checklist to follow for MVP safety.

We need to build these things.

It’s still easy to think mainly about target users and general predators. It’s still easy to miss secondary users like children or disabled people, and their predators. In rushed situations like hectic design sprints, it’s easy to fall back on old habits and skip thinking about user safety altogether while frantically getting to MVP.

But user safety and wellbeing need to be part of the core curriculum when learning, and part of the MVP when building. They aren’t right now, which brings us to Fact #2.

Fact #2: No one is responsible for user or child user safety and there are no ‘minimum’ standards for safety

You give your personal data to every small online service every day. In many cases, that data is hosted extremely insecurely, invisibly to you, and swapped around without your consent. There are no legally enforced minimum safety standards for most services outside regulated fields like healthcare.

Web developers, for example, are largely free to be as incompetent as they like until a better coder points it out or some crisis happens (e.g. a leak, or a user being harshly victimised on the platform). It shocked me to go backstage on an app I’d used every day, which had my payment info and phone number, and find out that it was storing user passwords in a way that took about ten seconds of effort to crack.

It happened again at a hackathon, where the custom registration system stored all our sensitive data in unsecured plaintext. That hackathon had drawn thousands of attendees annually before this was discovered. If I gave all the examples I can think of (*cough* the LinkedIn and Twitter hacks *cough*), this article would be several books long.
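For the builders reading this: the baseline fix for plaintext or weakly hashed passwords is an adaptive one-way hash, so a leaked database never contains the passwords themselves. A minimal sketch in TypeScript, assuming the widely used bcrypt npm package:

```typescript
import bcrypt from "bcrypt";

// Work factor: each +1 doubles the hashing time, which is what makes
// brute-forcing a leaked hash expensive.
const SALT_ROUNDS = 12;

// On registration: store only the salted hash, never the password.
async function hashPassword(password: string): Promise<string> {
  return bcrypt.hash(password, SALT_ROUNDS);
}

// On login: compare the attempt against the stored hash.
async function verifyPassword(attempt: string, storedHash: string): Promise<boolean> {
  return bcrypt.compare(attempt, storedHash);
}
```

That’s roughly a dozen lines; there is no excuse for plaintext.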

Here’s a headline from today’s paper:

‘Fitness tracking app Strava gives away location of secret US army bases’.

Now think about a child publicly posting their daily run route through a forest for any predator to see.

The new GDPR rules in the EU are looking to fix some of this. One example is that companies now have to appoint a data protection officer who is personally responsible for keeping user data safe.

But who is responsible for making the entire service itself safety-first?

Fact #3: User safety is not just about data

All of this ignores the hundreds of ‘dark patterns’ that web developers and large companies use to deceive users. I’m also not mentioning how the lack of things like simple user feedback systems, community monitoring and moderation can lead to disaster. That was on purpose. It’s easy to fall into the trap of thinking that ‘safety’ is just about keeping user data secure. It isn’t. It’s also about designing against other forms of predatory behaviour, and thinking about how users can seek help in a crisis.

Quick scary fact: children can permanently damage their hands by using devices meant for large adult hands, through repeated tapping and stretching.

Another one: for almost every service, there is a HUGE document you have to agree to that protects the business, explains its rights, and absolves it of any blame, ever. Where is the document with your rights? Do you know them? Do you have any? If something goes wrong, what are your rights and options? What could go wrong, and how can you prepare for it?

© Huffington Post UK. Emphasis mine.

The fact that you don’t know the answers to those questions is the problem. If you don’t know, your children don’t know either, and they are being taught every day not to care.

We need a better system. One that speaks human language, not legalese. Protect your users because they don’t necessarily know how to protect themselves.

Fact #4: Popular resources and Design/Dev tools don’t consider this problem

These days, designers and developers use a ton of code snippets and frameworks built by other people when designing. This can mean something like Bootstrap (a framework for building websites and web apps), which approximately 20% of the top million websites use.

Bootstrap — “the world’s most popular framework for building responsive, mobile-first sites”

Think of Bootstrap as Lego blocks for coders. We put together MVP sites quickly using these blocks to solve common problems like pricing pages or photo albums (pictured). But again, there is no emphasis on user safety or wellbeing.

There is no Bootstrap for safety features. No popular resource (that I know of) lets me quickly get to minimum user safety. Hell, there aren’t even easy-to-read minimum standards or a specific, validated checklist I can always use.

Even if there are resources I’m missing, builders shouldn’t have to recognise the problem themselves and go looking. The standard tools we use should include user safety.
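To make that concrete, here is a rough sketch of what one Lego block in a ‘Bootstrap for safety’ might look like: a reusable help button, written as a standard Web Component in TypeScript. The element name, styling and default help URL are mine, purely for illustration:

```typescript
// A hypothetical drop-in "get help" component, in the spirit of a
// Bootstrap-style library of safety features.
class HelpButton extends HTMLElement {
  connectedCallback() {
    const helpUrl = this.getAttribute("href") ?? "/help";
    const button = document.createElement("button");
    button.textContent = "Get help";
    // Generous touch target so small or unsteady hands can hit it.
    button.style.minWidth = "48px";
    button.style.minHeight = "48px";
    button.addEventListener("click", () => {
      window.location.href = helpUrl;
    });
    this.appendChild(button);
  }
}

// Usable on any page as <help-button href="/help"></help-button>.
customElements.define("help-button", HelpButton);
```

A library of blocks like this (help buttons, reporting forms, consent prompts) would let builders reach minimum safety as quickly as they currently reach a pricing page.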

A quick shout out and request:

Pretty please? I’ll even help if you need it.

This is the best collection of free ‘learn design by example’ books I know. I learned a lot from them. Please make a separate book for user safety and accessibility, UXPin? We clearly need it.

Fact #5: Children, disabled users and other vulnerable users are rarely designed for if the product is not for them

I’ll keep this short: even babies are on Facebook, either on their own accounts or posted about constantly by their parents. 0–3-year-olds are on YouTube watching videos that aren’t meant for them.

Children and vulnerable users can and most likely will use your product, even if it’s not meant for them. Think of how to eliminate or minimise the damage when they do.

This wasn’t for children. They watched it anyway.

Similarly, disability doesn’t revoke your internet access, or your ability and right to use any non-digital service. And yet, if I google for information on disabled users, the top results are about product building, but they solve problems for the developer, not for disabled users.

‘nuff said

Of course, this is a cherry-picked example. A few more searches and some keyword fine-tuning get you to what you’re looking for. But ‘accessible design’ is a keyword you know only if you already know about this problem, not if you’re a beginner or junior designer/developer. And it’s taught very, very poorly at most universities and online courses for product builders.

Design for accessibility is woefully lacking in finished products. My company (Eficode) runs more UX tests every year than any other company in Finland. By and large, those tests find a ton of accessibility problems in services large and small. This shouldn’t be a problem in an age when a quick googling leads to amazing accessible-design guidelines and checklists. But it is.

Fact #6: Solutions to this need to be native, not external

Two of my teammates at the weekend event worked with an NGO that does amazing work protecting children from all sorts of abuse. They (and many organisations like them) also develop great apps with amazing safety features. One example is a big emergency-button app that users can use to report problems or crises, online or offline.

But they all struggle to get users to adopt the solution.

Unless something systemic changes and we’re taught to use a specific app like this from birth, these outside-in solutions can only help those who already know about them and use them.

Safety and accessibility features should be built in. Don’t leave the responsibility to someone else to haphazardly fix. This is your responsibility as a builder.

What Problems Did We Identify?

This is the average life of a product. It is designed, goes into beta, gathers a bunch of users and evolves if it’s popular enough to stay alive. Often, users eventually leave for greener pastures and the product dies or pivots.

Problems in Initial Design & Testing

This stage includes ideation, prototyping, development of MVP, user testing, iteration and refinement. Essentially everything up to open registration of users, and all testing after that.

  • If children are not the core users, they are rarely tested with or considered a group worth testing with. Their use cases (especially at different ages) are not thought about in general.
  • There is no job or position responsible for user safety (by contrast, EU GDPR law mandates a data protection officer for user data rights).
  • The tools, free code snippets, frameworks and tutorials used by product builders have no components for user safety. An open library of these should be built and distributed via popular platforms (e.g. Bootstrap, Udemy, Lynda, etc.).
  • Designing for worst-case scenarios and considering child predator use cases is necessary but forgotten.
  • Digital literacy and the ability to protect oneself online are wrongly assumed.
  • Safety, accessibility and privacy features should be open source, so that other builders can use them too. Facebook is leading by example.
  • Parents are usually unable or unwilling to parent their child online the way they would offline.
  • On the flip side, on certain services parents can violate children’s privacy rights and access their personal data without consent.

Problems in Marketing, Registration and Onboarding

This stage includes marketing to the user, registration and the product’s onboarding and manuals.

  • During onboarding, users are typically not informed of their rights or what to do to get help.
  • Services collect data they don’t need.
  • There is a long page of rights (the privacy policy) protecting the company, and nothing like it for the user.
  • Marketing uses plenty of dark design patterns to trick children and vulnerable users into giving away data and money.
  • Manuals and guides are written as long, inaccessible text that almost no one reads, especially children.

Problems during Active Use and Ongoing Development of Product

  • Safety and privacy features should be front and center, not hidden away to be hunted for. Be proud of your efforts to keep users happy and safe.
  • A visible equivalent of an emergency button for crises is recommended.
  • There are usually no options for quick, anonymous, free-text feedback (e.g. as in Jodel). Feedback and reporting forms are usually very complex, especially for children.
  • Few businesses monitor for dangerous keywords that signal distress and crisis (e.g. suicide, ethnic slurs, sexual content).
  • Fewer still offer support proactively, not just when asked (e.g. Facebook’s suicide watch).
  • Retention should not be the only metric for your impact. Time on screen, physical harm to the user, etc. should be considered.
  • In specialised crises like sexual abuse, ask for expert help or consult local NGOs and nonprofits that do field work. It is easy to make things worse. Always involve police and law enforcement when illegal behaviour is suspected.
  • User safety (not just security) audits are uncommon, it is not widely known where and how to get them, and it is difficult to know who is actually qualified to do them.

Problems after User Exit/Product Death

  • Users (especially children) forget that services they used still have data on them.
  • Products that fail, change or merge often forget to dispose of collected data responsibly.
  • Mergers happen for the sole purpose of acquiring more user data.
  • User data is sold to recoup losses from product death.
  • Actively ask why users are leaving or unhappy, not just with a web form.

So, How Can We Improve?

First off, I’m starting to build a guide as well as reusable components for building safer services. If you want to help with this, send in your ideas/suggestions to [email protected]. Warning, there’s no money in it for you (or me).

Ideas & Benchmarks

#1. Anonymous, free-text feedback

Jodel is an app that students, kids, service workers and others use to talk to each other anonymously and without fear of judgement. Users make channels that function like tribes, get support from people who have the same thoughts and troubles, and gain karma from talking about what they’re thinking.

I keep wondering why more companies don’t have a Jodel-like system for feedback. It takes seconds to voice your thoughts on Jodel and apps like it, but for most forums you need to sign up, and most feedback/reporting forms are a mess of complicated dropdowns and fields.

If I get frustrated and ragequit feedback forms and error reporting, I’m pretty sure that children have the same issues.

This feedback could be visible publicly, and/or privately if the company doesn’t want to air its dirty laundry.

For an example of this, check the Finnkino example from the next point.
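To show how little code this takes, here is a minimal sketch of an anonymous feedback endpoint in TypeScript with Express. The route name, length cap and storage stub are my assumptions; the point is that it deliberately stores no account, IP or other identifying data:

```typescript
import express from "express";

const app = express();
app.use(express.json());

// Hypothetical Jodel-style feedback endpoint: no login, no IP logging,
// just one capped free-text field.
app.post("/feedback", (req, res) => {
  const text = String(req.body?.text ?? "").trim();
  if (text.length === 0 || text.length > 1000) {
    return res.status(400).json({ error: "Feedback must be 1-1000 characters." });
  }
  // Persist only the text and a timestamp; deliberately no user identifiers.
  saveFeedback({ text, receivedAt: new Date() });
  res.status(201).json({ ok: true });
});

// Stand-in for a real datastore call.
function saveFeedback(entry: { text: string; receivedAt: Date }): void {
  console.log("feedback:", entry);
}

app.listen(3000);
```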

#2. Anonymity (and always the choice to use it)

I’ve worked and studied with a lot of Germans in my time, and their way of using services like Facebook always impresses me. It’s common there not to use your full name or your own picture as a profile photo. Here’s a headline that made me very happy:

Facebook ordered to allow fake user names in Germany — The Verge

Finnkino movie theater feedback app

Anonymity should always be an option. I use my photo and full name everywhere now, but mainly because I lead a non-controversial, boring life, have no major insecurities, and no one regularly victimises me.

A lot of people don’t have that luxury, often straight from childhood. They have to choose between making themselves vulnerable to attack and being cut out from basic services that everyone else uses.

The example to the left is from Finnkino, the Finnish movie theater chain’s mobile app. In addition to the usual feedback forms, they have a simplified one for kids, which can be used with just an avatar and offers free-text feedback like I recommended in #1.

#3. What would a modern forum built for vulnerable users look like?

This. Heimo is one of the most impressively designed forums I’ve ever seen. I strongly recommend going through it and learning from it. They have a lot of mechanics that should be the basic norm.

Heimo

#4. Look for trouble actively. Don’t wait for a crisis

Does this even need a reason? For example, Facebook has a ‘suicide watch’ in beta that offers help to people who seem to be messaging about suicide and death a lot.

Not every company has the resources for this, but a) there are open-source solutions for this sort of thing, and b) it is a basic duty of companies to moderate their forums and online communities.

If a gaming company can make toxic idiots (a key market share for them) less impactful, so can you.

Run keyword searches for slurs and block them out like Steam does (pictured above). Do more searches for words associated with victimisation, and have good user-reporting or karma systems (again, there are open-source versions and microservices).
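A first pass at this can be as simple as a keyword scan that flags posts for human review. The word list and actions below are illustrative placeholders, not a production-grade moderation system; real lists should come from moderation experts and cover the languages your users speak:

```typescript
// Illustrative-only keywords; source real lists from experts and NGOs.
const DISTRESS_KEYWORDS = ["suicide", "kill myself", "self harm"];

interface Flag {
  keyword: string;
  action: "offer_support" | "notify_moderator";
}

// Scan a post and flag it for human follow-up. Never auto-punish on
// keywords alone: context matters, and distress signals should trigger
// proactive support, not just removal.
function scanPost(text: string): Flag[] {
  const lower = text.toLowerCase();
  const flags: Flag[] = [];
  for (const keyword of DISTRESS_KEYWORDS) {
    if (lower.includes(keyword)) {
      flags.push({ keyword, action: "offer_support" });
    }
  }
  return flags;
}
```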

#5. Don’t Ask for Data You Don’t Need

This is a general rule of thumb, but especially during user registration, stop asking for information you don’t need. This is very prevalent among e-commerce companies, which ask for everything from my dog’s first name to every single detail of my payment info.

Also, offer options that allow for anonymity and reduce your own information-security burden. Examples are bank transfer and cash-on-delivery options in e-commerce. They’re rare, but so necessary. I shouldn’t have to give my card info to small companies with no guarantee of how they store it.

Thomann.de, one of the largest online music instrument retailers
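One way to enforce data minimisation is to write your registration schema down and justify every field; anything without a justification gets cut. A toy sketch (the fields and justifications are mine, for illustration):

```typescript
// Minimal registration schema: every field must justify its existence.
interface Registration {
  email: string;       // needed: account recovery and receipts
  displayName: string; // needed: shown in the UI; may be a pseudonym
  // No birthday, no phone number, no address until an order actually
  // needs shipping. Card details belong with the payment provider,
  // never in your own database.
}

// Accept only the declared fields and drop anything extra the client
// sends, so "bonus" data never reaches storage.
function sanitiseRegistration(input: Registration): Registration {
  return { email: input.email, displayName: input.displayName };
}
```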

#6. Talk at eye level, and talk so that you’re understood

Documentation for what to do in a crisis is almost always written in complicated language and/or indecipherable legalese. It’s written as an afterthought. It’s written to cover the company’s ass when things go wrong. It’s not written to be read.

In fact, most onboarding and help documents are written for quite competent adults who speak good English, or whatever the developers’ native tongue is. I often have no idea about my rights or what to do when online and offline services go wrong or fail me. And I bet most of you don’t either. It’s easy to frame this from a child’s perspective, but it applies to every user, including me.

Aim to be better. Write in simple, short sentences that can be easily Google translated. Don’t use buzzwords. Don’t use jargon.

help.medium.com

A good benchmark is Medium’s help documentation. It’s written for clarity and from the perspective of users having different sorts of problems. Short words, helpful icons, a good search.

If you’re designing a help section for kids, tone is very important. They won’t suffer through dry text like we adults are conditioned to. Speak like humans speak, not like people write in university papers.

#7. Safety features should be front and center

In a year when we hear almost every day about how the services we use most harm us greatly, it goes without saying that companies that actually make the effort to take care of their users should be proud of it.

Instead, safety features and what to do in a crisis are always hidden away.

Figma, one of the companies I admire most in the world, has a different approach. Figma is the world’s first fully collaborative online design tool. They have a help and chat button always present in the bottom-right corner of the app, and their support experience is unmatched. You can even start a live chat with support staff right there, without leaving the app or going looking. That’s what customer support should be.

Figma’s Help and Live chat support functions

Tools, Tutorials & Reusable Components

This is by far the most pressing problem I can think of right now.

The tools developers and designers use have little to no focus on user safety. The tutorials we use to learn how to build are similarly lacking.

© Udemy’s Top Courses for developers.

That picture above is scary. The web is full of tutorials on how to become a developer, but almost none of them have anything to teach about developer ethics or how to keep users safe AND well. Most programming books don’t have anything to say about this either.

And we wonder why we’re in the situation we’re in these days.

When recruiting, companies need to start asking for certifications on user safety. Universities need to treat it as more than an afterthought. Startups need to think of it as part of any MVP.

We need to teach ourselves to care about user wellbeing and consider it a fundamental part of UX and the design process. The problem isn’t that Udemy and sites like it are bad. In fact, it’s the best place I’ve found to learn the technical parts of development. But it needs to also teach and push the non-technical parts. I’ve learned a lot about programming for free from YouTube. Check out Level Up Tuts and Brad Traversy if you doubt just how high-quality a developer education you can get for free right now. But for the life of me, I can’t find any popular user safety guides anywhere.

Moreover, we need a Bootstrap or Foundation for safety features, and a strong open-source culture around them. Code blocks, snippets, components and microservices that focus on the stuff most startups, companies and developers forget and don’t know to look for.

So we need to make these things. Many people have (just look at the picture from the Talkoot at the top of the article). Please start too.

Design Methodology

#1. Journey-map different sorts of crises and worst-case scenarios

The simplest fix is to just start thinking about these things in the first place. Make an experience map for different crisis scenarios.

It can be this easy to think about user crises: emojis and a straight line. Highs are high, lows are low.

The picture above is very simplified, but it’s a good tool to use when thinking of user safety and crisis.

Side note: this is the journey for a user who knows they have a problem. Also think of what happens when people don’t know they’re being victimised.

#2. Experience Matrices

This is a huge problem, with diverse effects on many stakeholders. Make matrices with all variables you can think of paired with different worst case scenarios. An example is shown below:

Examine how different types of crises affect different user groups. Here, users = children

Have different members of your team and your test users fill these out, predicting the crises they suspect or have seen or been through. This is a good way to make problems visible, draw on every team member’s expertise and experience with common problems, and encourage design solutions for the issues found. Explore together how the product prevented or worsened known scenarios.

Also consider involving child/user safety experts and UX experts.

#3. Design against predators and crises during the normal design sprint

Think of the data you’re collecting and/or making visible, and how it could be used to prey on your users. The Strava example above is a good one to keep in mind. Another comes from a Glasgow University experiment: users of a run-tracking app were giving away the exact locations of their houses precisely because they turned off tracking right before they got home (to hide their house location), which marked exactly where they lived.

Caution users against giving away their info, and against the common ways people endanger themselves.
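One concrete defence is to never publish the precise ends of a route at all, rather than relying on users to toggle tracking. A rough sketch, assuming routes arrive as arrays of GPS points; the 500-metre trim radius is an arbitrary choice of mine:

```typescript
interface Point {
  lat: number;
  lon: number;
}

// Approximate distance between two GPS points in metres (equirectangular
// approximation; accurate enough at the scale of a privacy buffer).
function distanceMetres(a: Point, b: Point): number {
  const R = 6371000; // Earth radius in metres
  const toRad = (deg: number) => (deg * Math.PI) / 180;
  const x = toRad(b.lon - a.lon) * Math.cos(toRad((a.lat + b.lat) / 2));
  const y = toRad(b.lat - a.lat);
  return Math.sqrt(x * x + y * y) * R;
}

// Drop all points within `radius` metres of a route's start and end, so
// a published run never reveals exactly where the user lives.
function trimRoute(route: Point[], radius = 500): Point[] {
  if (route.length === 0) return route;
  const start = route[0];
  const end = route[route.length - 1];
  return route.filter(
    (p) => distanceMetres(p, start) > radius && distanceMetres(p, end) > radius
  );
}
```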

#4. What to think about when testing with children

Time

  • Children have short attention spans. 30 minutes to an hour is the recommended testing time.
  • Test in short bursts with breaks in between, and include breaks for play. Don’t cram too much into the hour.

Environment

  • The environment and time of testing really matter. Kids have very defined schedules and places where they are at different times of the day.
  • Consider doing the test somewhere they feel natural, like their own house or a playground. If the environment is unfamiliar, give them time to adjust to it, and consider repeated sessions: results change as they get more familiar with the environment.

Physical

  • Consider how children will physically use your product: children have small hands, so they need bigger touch targets, and they rest their palms on the screen while using it (see the sketch after this list).
  • Children of different ages (0–5, 6–10, 10–15, 15–18) act very differently.
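Some of this can even be checked automatically. Here is a small sketch you could run in a browser console to flag touch targets below the common 48-pixel mobile guideline (the threshold is a rule of thumb, not a formal standard for children, who may need even more):

```typescript
// Flags interactive elements whose touch targets are likely too small
// for children's fingers.
const MIN_TARGET_PX = 48;

document.querySelectorAll<HTMLElement>("button, a, input").forEach((el) => {
  const rect = el.getBoundingClientRect();
  // Skip elements that aren't rendered, then warn about small targets.
  if (rect.width > 0 && (rect.width < MIN_TARGET_PX || rect.height < MIN_TARGET_PX)) {
    console.warn("Touch target too small:", el, `${rect.width}x${rect.height}px`);
  }
});
```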

Consider the role of guardians and caretakers

  • They will always see you as an authority figure.
  • Kids look to adults for small signals and change their behaviour around them, so you get a biased result when adults are present. Keep this in mind.
  • In a design sprint, it is common to start ignoring kids to get work done fast. Adults in the tests often overrule kids and tell them ‘no, that is not what you think’.
  • Think of how kids’ parents and caretakers are affected by your product. E.g. toy packages are mainly for adults, even if the toys are for children.

Avoid Rigidly Structured Tests

  • If you create a testing framework that’s too defined, you’re getting your own perspective, not the kid’s. You’re defining boundaries that the kid doesn’t have in real life. No rigid instructions.
  • If structure is absolutely necessary, consider giving them tasks instead of instructions.

Peers

  • Kids act more naturally and talk like kids when they have peers around them, as opposed to doing the test alone with adults.
  • A good method is to ask them to discuss with other kids why they use the service and what they like or dislike.

#5: UNICEF child online safety assessment tool

UNICEF has made a free, easy-to-use and constantly evolving safety assessment tool that can tell you how your company is treating its most vulnerable users. Go through it. It’s a simple Excel sheet that will expose serious holes in your system.

I’m Chitrak (AKA Chichi). I’m a Digital Product Designer at Eficode.

I design and build apps, websites, web apps, aftermovies, print and web graphics. I also teach others to do the same.

If you want to discuss what you read above, or anything related to digital product building or environmental efforts, email me at [email protected]



Source link https://uxplanet.org/how-to-stop--at-user-safety-6417edc3b1d6?source=rss—-819cc2aaeee0—4
