I was recently asked by Dirk Knemeyer and Jon Follett to discuss the future of creative jobs. You can listen to that podcast below. What follows captures and expands on my thoughts.
When people are asked about this sort of thing they tend to think their own fields are more stable than others. That’s not always ego or a false sense of security (although it can be); it’s that change happens incrementally and is internalised quickly. Looking at my own field critically, there are a few factors already driving change that will only gain momentum in the next 10 years.
People in UX share several traits that will continue to serve us well:
- We’re always curious. We’re always hungry to learn new skills, new tools, new approaches. This is our fundamental creative superpower.
- We tend toward abstraction. If traditional design is about “look” then we’re about “feel.” That means the “why” is our cornerstone.
- We gravitate toward systems. Details are of course important but we don’t look at things in only one context. As a result we aim to create cohesion (not consistency) across many touchpoints.
- We deal in strategic concerns. We understood early that our success as a new profession was going to be based on solving problems for people and driving results for business. As a result it was easy for us to become a fulcrum as “design thinking” took root in organisations.
- We’re ambitious and transform at rapid rates. It can be off-putting and quite mercenary, but we always chase the next big thing. It’s a kind of career camouflage: career advancement as a self-defence mechanism. The march from UX to product design to service design is an example of this shapeshifting.
Ethics scandals run amok in the tech world today.
- Dark UX is tarnishing our reputation as a community. UX has prided itself on being the “champion of the end-user” but we’ve been piss-poor at demonstrating that. We need to earn that responsibility again.
- Regulation is underway. Europe and Canada have been leaders so far. GDPR and other approaches are beginning to influence policymakers in the US.
- Tech’s Hiroshima Moment is coming. The atom bomb changed physics as a profession and science writ large. Despite all the scandals — from Tesla to Uber to Facebook — technology hasn’t been forced to look inward. Yet.
AI will change our ways of working and how we define our professional value.
- AI will change the creative industries. Sophisticated algorithms will affect us in much the way robots and automation have affected manufacturing. That means repetitive tasks that can be modelled with ease will be handled by code.
- Creativity and curation will become synonymous. Design will supply better training data, visualise the structure of algorithms, craft interfaces for AI and more. This will improve the value of AI for people and not just businesses. As a tribe we’re more diverse and inclusive than engineering. This will help solve some of the pernicious bias issues we see today.
- Designers will change AI. Engineers have so far had near-exclusive domain over the development of AI. That’s led to a lot of weird bias that designers will have to help fix.
3 new roles
Based on all that, the following roles are likely:
Data viz practitioners will specialise as AI Visualisers or Algorithm Designers. Whatever it’s called, it will be their job to help us better understand how complex AI works. Their stock in trade will be visualisations and interfaces. Right now AI and algorithms are a black box. Legislation, consumer-protection concerns, corporate social responsibility programmes and brand differentiation will all make it necessary to expose AI’s inner workings in understandable ways. Even if that’s guesswork or just scratching the surface, it will be critical in the near future. These designers will upskill in data science and be able to command hefty fees.
Design researchers and data scientists will see the quantitative side of their jobs automated. The qualitative side will become more important, but it will be driven by data even more than today. I suspect they’ll shapeshift into Data Ethnographers. It will be their role to question the CULTURE of the datasets used to train AI. Who generated the data? Who assembled it? When was it last updated? They’ll use existing skills, like recruiting test candidates, in new ways. They will be the first line of defence against bias in AI. They’ll work with algorithm designers, engineers, and more.
Corporations love adding niche roles to the C-suite as a way to show they care about an issue. Today we have positions like “Chief Diversity Officer” and “Director of Data Privacy.” Soon we’ll see the Chief Ethics Officer emerge. Their main role will be to imagine what Dan Hon calls “negative externalities”: all the terrible ways business decisions can fuck shit up. Zuckerberg and other CEOs have long used lack of imagination as a defence. “We couldn’t imagine bad actors doing something like this” is a common refrain in big tech apology tours. In the future they won’t have that excuse. The Chief Ethics Officer will make a shedload of money in combat/hazard pay for taking this sort of heat.
Share your thoughts
How do you see roles changing in the next 10 years or so? What factors will drive those changes? How will we evolve to cope? I’d love to know what you think.