Run an A/B test
An A/B test (sometimes called a split test) measures two or more variants of content by serving them randomly to users and determining which performs better. These tests are fairly easy to implement and can provide direct, measurable results on word choice.
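The mechanics are simple enough to sketch. Here's a minimal illustration in Python of the two pieces an A/B test needs: a way to bucket each user into a variant consistently (so a returning visitor sees the same copy), and a way to compare conversion rates afterward. The function names and the two-variant setup are hypothetical, not from any particular testing tool.

```python
import hashlib

def assign_variant(user_id: str, variants=("A", "B")) -> str:
    """Bucket a user into a variant using a stable hash of their id,
    so the same user always sees the same version of the copy."""
    digest = hashlib.md5(user_id.encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

def conversion_rate(conversions: int, views: int) -> float:
    """Fraction of views that converted (e.g. clicked the button)."""
    return conversions / views if views else 0.0

# After the test runs, compare the two rates:
rate_a = conversion_rate(30, 200)   # variant A: 30 clicks / 200 views
rate_b = conversion_rate(48, 200)   # variant B: 48 clicks / 200 views
winner = "A" if rate_a > rate_b else "B"
```

In practice you'd also want a significance check before declaring a winner, which most A/B testing tools handle for you.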
Check out social media
Many social media management tools provide ways to tag response trends, quantify hashtag use, and much more. Some even offer sentiment analysis (i.e., the computational identification of opinions expressed in text, categorized as positive, negative, or neutral).
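To make the idea concrete, here's a deliberately naive keyword-based sketch of that positive/negative/neutral categorization. Real sentiment tools use trained models, not word lists; the word sets below are invented for illustration only.

```python
# Toy word lists -- hypothetical, purely for illustration
POSITIVE = {"love", "great", "easy", "helpful"}
NEGATIVE = {"hate", "broken", "confusing", "slow"}

def sentiment(text: str) -> str:
    """Classify text as positive, negative, or neutral by counting
    which toy word list its words overlap with more."""
    words = set(text.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"
```

A production tool does far more (negation handling, sarcasm, emoji), but the output shape, one of three buckets per post, is the same.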
Get a list of hashtags where relevant conversations are happening. Spend some time on each social media property searching for your product name or other relevant search terms too.
Talk to customer support
Get regular reports from your customer support team. What trends can you find in support tickets? What issues are people having with products? Which specific feature is causing trouble?
Beyond calls, chats, and emails, you can look at passive data too. Peek into the analytics of your online help content (FAQs, help articles, troubleshooting videos, etc.). Which pages are accessed most? Which pages are people spending an unusual amount of time on? (Bonus points if your help content is equipped with a feedback mechanism — this makes evaluating issues even easier.)
Read Google Play, App Store, and Amazon reviews
If you work on an app, then you have a built-in feedback system with app store ratings and reviews. Same with physical products on Amazon. Check these areas often (and especially after a new release) for valuable nuggets of info from users.
Conduct random, in-person polling
In the UX Research world, these are called “intercepts.” Whatever you call them, the concept is simple: approach people unfamiliar with your work and ask them for feedback. Oh, and you’ll get much more participation if you offer a small incentive — even a $5 Starbucks gift card. (I’ve done this while conducting surveys outside of Starbucks.)
This Medium article, “Testing for UX writers,” has some helpful tips, including methods to measure content, like a cloze test.
Ask another content strategist or UX writer
There’s a big content community outside your desk. Here is a quick list of places you can find like-minded folks. Members of these communities are friendly and always willing to help answer a question or give feedback, so reach out!
Provide a feedback link
This practice is becoming more and more common — and I love it. Place a link to give feedback in areas where users can easily access it. Use a mailto link, create a simple form, or just offer a smiley-face rating system to get an idea of how your content is performing.
You can also create a vanity URL like yourproduct.com/feedback.
Browse Reddit
Find subreddits based on your product (or a related topic) to get insight into the types of conversations — and issues — users are having. Back when I worked at Motorola, design team members would subscribe to current product subreddits like r/MotoX and r/Android.
Mine search data
Check out your website’s internal search data for clues to customer confusion. What words or phrases are searched most often? Which searches return zero results? This info can point you to sources of customer confusion and also clue you in to missing content altogether.
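This kind of mining can be a few lines of code once you have the logs. Here's a minimal sketch assuming a hypothetical log format of (query, result_count) pairs; the sample data is invented for illustration.

```python
from collections import Counter

# Hypothetical internal search log: (query, number of results returned)
search_log = [
    ("reset password", 14),
    ("cancel subscription", 0),
    ("cancel subscription", 0),
    ("reset password", 14),
    ("dark mode", 3),
    ("cancel subscription", 0),
]

# Most-searched phrases point to what users care (or are confused) about
top_queries = Counter(q for q, _ in search_log)

# Searches that return nothing point to missing content
zero_result = Counter(q for q, n in search_log if n == 0)

print(top_queries.most_common(3))
print(zero_result.most_common(3))
```

In this sample, “cancel subscription” is both the most frequent query and a zero-result one — a strong signal that a help article is missing.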
Google Trends is a fun tool to get an idea of searches outside of your site. Plug in terms specific to your industry, product, or features and get a better understanding of usage over time and location.
Read professional reviews
When professional reviewers cover your product, they often do it in great detail. And they don’t hold back their opinions. Any issues they have are likely going to be problems for the average user too.
Once you’ve finished the article, don’t forget to peruse the comments. There’s often more gold there — straight from the customer or user’s mouth — that people tend to overlook.