Documentation metrics: Measuring the value of documentation (w/ Bob Watson)
Valeria Hernandez

6 min read · Aug 18, 2024

On Friday, July 12th, 2024, the Hackmamba team hosted an X space featuring Bob Watson, a senior technical writer at Google, who talked about metrics that matter most in technical writing. This article summarizes key takeaways from the conversation. The recording is available on our Hackmamba X account if you missed the live session.

What do we mean by “quality of documentation”?

That's a million-dollar question. Coming from an engineering background, where testing is an integral part of the software development process, I realized that testing product documentation isn't easy. That question led me to complete two graduate degrees, and I still don't have a definitive answer.

The hardest part is defining what you want to do because documentation often lacks a clear purpose. Before determining if documentation is working, you need to define what “working” actually means. There are different types of documents: references, tutorials, and conceptual topics; each has a specific goal.

To decide if documentation is working, it is important to follow these steps:

  • First, define the purpose of each document within your documentation. How well you define the purpose depends on what you want to accomplish, and readers can provide valuable input.
  • Second, collect data over a long period of time.
  • Third, analyze the data and filter out the noise. Data may be scattered, so statistical analysis is usually needed.
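As a rough illustration of the third step (not from the conversation), noisy page-view data can be filtered with a simple statistical baseline: establish the mean and standard deviation of a metric, then flag only the days that deviate meaningfully from it. The numbers below are hypothetical.

```python
import statistics

def flag_anomalies(daily_views, threshold=2.0):
    """Flag days whose page views deviate from the baseline
    by more than `threshold` standard deviations."""
    mean = statistics.mean(daily_views)
    stdev = statistics.stdev(daily_views)
    return [
        (day, views)
        for day, views in enumerate(daily_views)
        if abs(views - mean) > threshold * stdev
    ]

# Illustrative data: a stable baseline with one suspicious spike.
views = [120, 115, 130, 125, 118, 122, 400, 119]
print(flag_anomalies(views))  # [(6, 400)]
```

Everything else in the series is ordinary fluctuation; only the spike on day 6 would warrant investigation.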

For example, during my time at Amazon, we saw significant changes in landing page traffic metrics, but nothing significant in our documentation had changed. Upon investigation, it turned out we had renamed pages, and redirections got lost during that process. Customers went to the home page when they couldn't find the page they were looking for. After we corrected the redirections, it took three to six months of tracking the metrics for the corrections to take effect and for the data to stabilize.
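A lost-redirect problem like the one described above can be caught with a simple audit script. This is a hedged sketch, not Amazon's actual process; the paths and the `redirect_map` structure are hypothetical.

```python
def find_missing_redirects(renamed_pages, redirect_map):
    """Return old paths that were renamed but have no redirect entry."""
    return [old for old, _new in renamed_pages if old not in redirect_map]

# Hypothetical renames and the server's redirect map.
renamed = [
    ("/docs/setup", "/docs/getting-started"),
    ("/docs/api-v1", "/docs/api"),
]
redirects = {"/docs/setup": "/docs/getting-started"}
print(find_missing_redirects(renamed, redirects))  # ['/docs/api-v1']
```

Running a check like this after every page rename would surface broken redirects immediately, instead of months later through traffic anomalies.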

In conclusion, the key to gathering good product documentation data is to start by defining a clear goal for your technical documents. Once you've categorized technical documents into basic types (reference topics, tutorials, or landing pages), track metrics according to your goals.

What can metrics tell us? What can’t they tell us?

Metrics can tell you whether something has happened, but they can never tell you why. When you start finding whys in the numbers, that’s a sign that you’ve gone too far and misused the metrics. The numbers can only show if an event occurred; they can’t give you an underlying cause.

For example, when I noticed the traffic increase on the landing page, the metric changed, but I had to investigate why it occurred. After investigating, I realized the server redirections were incorrect, which I suspected was the reason for the change in page views. Generally, metrics tell you how much of something happened, but discovering the why requires further investigation.

I would like to see more technical writers pairing metrics with user testing. Watching a few customers interact with your documentation can tell you more in 30 minutes than six months of metrics can. I have found this approach to provide much more value.

If you think that your documentation is doing great, observing three customers use your docs will tell you if that is the case. If three people stumble over the same thing, then it’s a clear signal that something needs fixing. I often compare this to watching people walk down a sidewalk: if you see three people trip over the same spot, how many more do you need to see before you decide to fix it?

Using metrics to understand what happened and user testing to determine the why can help you gain the insights needed to make meaningful improvements.

What are vanity metrics, and how can technical writers spot them?

Vanity metrics are numbers that tell you nothing about the goal of your document. The example I always use is page views. What does it mean that 100 customers visit your website? Page views might be useful for overview pages, as their purpose is to let people know about your product. However, page views on reference topics don’t mean anything.

Take airplane safety instructions, for instance: not many people read them, but that doesn't diminish their importance. The same applies to reference documents. Even if only 10% of your audience visits a reference document, it's still crucial for 100% of your audience to have it available. This is why I often highlight page views as an example of a vanity metric. They might matter for overview topics but offer little value for reference content.

For tutorials, page views fall somewhere in between. While it’s useful to know that customers are finding the content, it doesn’t indicate whether they understand the instructions.

Are there any universal metrics that can be used to gather data on all types of documentation?

Not that I've encountered. This is because each type of document has a different purpose. To effectively assess documentation, it's essential to categorize it by audience and desired outcome. You should know how you want the reader to be different when they're done.

For example, with an overview topic, the goal might be for the reader to leave with a better understanding of what the product does. In contrast, with tutorials, we want readers to know how to perform a specific task. Writers need to be intentional in the experience they want to produce with their documentation. Once the goals are defined, you can proceed to writing the content.

If readers successfully complete the first step of a tutorial, you can reasonably assume they’ll proceed to steps two and three. However, to verify this, you’ll need to run tests and gather feedback from readers to see if your assumptions are correct.
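The assumption about readers progressing through tutorial steps can be tested if you track an event per step. As a minimal sketch (the event counts and step model are illustrative, not from the conversation), the step-to-step completion rate exposes exactly where readers drop off:

```python
def step_completion_rates(step_counts):
    """Given the number of readers observed at each tutorial step,
    compute the fraction who continued from one step to the next."""
    return [
        round(nxt / cur, 2) if cur else 0.0
        for cur, nxt in zip(step_counts, step_counts[1:])
    ]

# Hypothetical event counts for a three-step tutorial.
counts = [200, 150, 140]
print(step_completion_rates(counts))  # [0.75, 0.93]
```

Here the sharp drop between steps one and two (only 75% continue) is the place to watch readers stumble in a user test.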

Having a clear purpose for each piece of content gives you direction for writing. But you need to find goals that you can later test. You may find yourself going back and forth between setting goals and figuring out how to measure their success.

What are the most common mistakes technical writers make when measuring documentation?

I think it is mostly about misunderstandings, looking for shortcuts, and misapplying the metrics. A frequent mistake is working backward from the numbers rather than starting with a clear purpose.

People often say, “We need to start tracking this number.” The real question is, “Why? What do we do with that information?” Management often pressures technical writers to provide measurable data: “Everyone has a number; show us yours.” This pushes writers to rely on easy metrics, like page views, without considering their relevance.

Another misunderstanding is relying on Google Analytics, which is primarily a marketing tool. While it can offer some insights for technical documentation, it's not always the best fit. It works to a degree, and since it's widely used, many technical writers are tempted to rely on it.

In summary, to follow a data-driven approach, start from the goal of the document and work your way toward the metrics that best reflect that goal.

How do you approach measuring the Return On Investment (ROI) of technical documentation?

That’s a challenging question to answer. I get a little bit defensive about it because if I have to prove the ROI of documentation in my organization, it has already failed. Let’s prove the ROI of building a new feature or the ROI of engineering. That question is malformed. Technical documentation is simply another feature of the product.

If you don’t think it belongs in the product, don’t put it there or pay for it. If you think it belongs in the product and have a budget, then pay for it. Don't pay for it if you don’t think it’s worth that much. Business decisions involve costs across the board, from engineering and marketing to customer support and documentation. If you don’t think the return is worth the money, then stop paying for it; it is as simple as that.

But proving ROI in a strict sense isn't practical. For example, how do you measure the ROI of adding a new button on a website? It’s not an investment but a necessary engineering and product development cost.

Documentation is no different—it’s part of the expense of creating a product. The real issue often lies in the technical writer’s role. Writers need to show that their work isn't just about writing down features; it’s about having clear goals for the documentation.

You prove its value by demonstrating that the documentation helps users achieve specific goals. Collecting data about how users meet their goals through the documentation doesn't prove ROI; it shows how documentation enhances the customer experience.

In short, documentation is not an investment but an expense of developing a product.

When you come to a new documentation project, and the team doesn’t know what success looks like, where should you start?

Start with the overview topics because those are the easiest ones to work with and measure. Overview topics serve a similar function to marketing content and can be measured using marketing metrics. Once the overview is in place, move on to tutorials. After readers learn about the product, they’ll naturally progress to the next stage and likely consult tutorials. While it may take a year or two to gather meaningful data, you can supplement your efforts by conducting user tests in the meantime.

What tools have you personally used to track the metrics of documentation you have worked on?

I primarily use Google Analytics to measure the performance of documentation because it is free and widely available. In some teams, we’ve also utilized external surveys to gather feedback and track metrics.

Would setting a metric goal before creating the documentation help measure it?

Not necessarily. If you have documentation and a benchmark, then you can decide what you want to increase or decrease. However, the challenge with setting metric goals is that no metric can measure the performance of every aspect of your documentation.

How can you measure the impact of documentation on reducing support requests or calls?

Some support systems can link support calls to help content and integrate content interactions with customer support requests. When you can connect those dots, you can see which support requests came in through the help pages, and conversely, which visitors were diverted from support to the help documentation center.
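As a hedged sketch of that kind of integration (the ticket records and `referrer` field are hypothetical; real support systems expose this differently), you could split exported tickets by whether the customer arrived from a help page:

```python
def tickets_from_help_pages(tickets, help_prefix="/docs/"):
    """Split support tickets by whether the customer arrived
    from a help page, based on a hypothetical `referrer` field."""
    from_docs = [
        t for t in tickets
        if t.get("referrer", "").startswith(help_prefix)
    ]
    return len(from_docs), len(tickets) - len(from_docs)

# Hypothetical ticket records exported from a support system.
tickets = [
    {"id": 1, "referrer": "/docs/install"},
    {"id": 2, "referrer": "/pricing"},
    {"id": 3, "referrer": "/docs/api"},
]
print(tickets_from_help_pages(tickets))  # (2, 1)
```

Tracking that split over time would show whether documentation changes shift the balance of tickets away from the docs.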

However, without such integration, it’s difficult to measure whether your documentation is helping reduce customer support costs. Proving this is especially challenging for reference documentation, as it serves broader reader goals that are not constrained to the web. In this type of documentation, it’s hard to distinguish whether users succeeded because of your documentation or external resources like Reddit or StackOverflow.

Wrapping Up

We thank Bob Watson for sharing his expertise on documentation metrics for technical writers with the Hackmamba Creator community. To conclude, we list the key takeaways from our conversation with Bob:

  1. Establish a goal for each document. Start by establishing a clear goal for every document you write and making that goal measurable.
  2. Pick the right metrics. Carefully evaluate which metrics will help you measure that goal and collect the data. Avoid using metrics simply because they are common.
  3. Analyze the data and create a baseline. This will help you filter out irrelevant fluctuations and accurately assess your documentation's true impact on the reader's experience.
  4. Conduct user research to uncover underlying reasons for metrics change. Documentation metrics can only tell you if something happened and the magnitude of the change. Observe readers using your documentation to discover why metrics change and note where they stumble.

At Hackmamba Creators, we constantly invite industry experts to share their expertise with our technical writing community. We would love to have you join us and hear which topics you would like us to discuss next or who you would like us to invite. Join our technical writers' community to connect with other technical writers and take part in our community activities.

About the author

Valeria Hernandez is a Community Manager at Hackmamba. She enjoys connecting with people and building communities. A self-described introvert, Valeria's approach combines empathy, technical insight, and strategic thinking to create a welcoming environment.
