This post will provide you with a framework for choosing technical documentation metrics to report. It’s a few conceptual steps above “5 easy tech doc metrics to blow your product team’s mind.” This is some of the thought behind those metrics.
Trigger warning: This article refers to content metrics reports as “storytelling.” I realize the term is overused and gag-inducing for some.

How this framework helped me
In my experience, discussions with tech doc writers about choosing which metrics to track can quickly become confusing and overwhelming. Many doc writers tend to be all trees and no forest: they focus on whichever metrics help them with the daily tasks of writing and updating individual docs.
On the other hand, I want to tell a story about content performance to stakeholders, and keep writers informed about trends in the performance and health of their docs. There's validity in both points of view, but it's important to understand the difference in objectives and approach. Following are some basic ideas for organizing discussions about tech doc metrics.
Two principles to keep in mind when you choose metrics to report
When choosing metrics to report to your product team, manager, and other stakeholders, keep the following principles in mind:
- Busy stakeholders want to quickly see: How are the tech docs doing? This means you’ll provide a snapshot of a handful of primary metrics that show performance. Keep the narrative clear, simple, and trackable from week to week or month to month. Don’t report on numbers alone; provide a brief analysis explaining trends in content performance and health.
- If you report a metric, expect to be accountable for its performance. This is a compelling reason not to report on lots and lots of metrics to product teams and other stakeholders. They are very likely to want you to take accountability for the performance of those metrics, as in: “Why is that number down this week? What are you going to do about it?”
Reporting on fewer metrics isn’t weaseling out of accountability. If you report to stakeholders on key metrics for traffic, search performance, and customer success, you’ll have plenty to be accountable for.
Three categories of tech doc metrics
In discussions with other tech doc writers and managers, I’m finding it useful to organize metrics into three basic categories:
- Performance metrics
- Diagnostic metrics
- Operational metrics
These categories help me focus on the objectives of the metrics under discussion and reduce the confusion that often accompanies choosing among a lot of available data.
Performance metrics
Doc performance metrics are the core measures you report to stakeholders. They provide a snapshot of the reach of your docs and the level of interest in them. Examples are:
- Traffic, such as visits, page views, and unique visitors
- Percentage of referrals from search
- The top articles by visits or page views
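To show how little raw data a snapshot like this needs, here is a minimal sketch in Python that computes these numbers from a hypothetical per-page-view CSV export. The file name, the column names, and the referrer_type values are assumptions, not a real analytics schema; substitute whatever your analytics tool actually provides.

```python
# Minimal sketch: weekly performance snapshot from a hypothetical CSV export
# with one row per page view and columns: page, visitor_id, referrer_type.
import csv
from collections import Counter

def weekly_snapshot(path):
    page_views = 0
    visitors = set()
    search_referrals = 0
    views_by_page = Counter()

    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            page_views += 1
            visitors.add(row["visitor_id"])
            views_by_page[row["page"]] += 1
            if row["referrer_type"] == "search":
                search_referrals += 1

    return {
        "page_views": page_views,
        "unique_visitors": len(visitors),
        "pct_from_search": round(100 * search_referrals / page_views, 1) if page_views else 0,
        "top_articles": views_by_page.most_common(5),  # top articles by page views
    }

print(weekly_snapshot("pageviews_week.csv"))  # hypothetical export file
```

Whatever tool produces the numbers, the point is the same: a handful of figures, reported the same way every week or month, with a sentence or two of analysis on top.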
Diagnostic metrics
Diagnostic metrics help when you need to understand and fix a problem with doc performance and health. Many are content metrics you wouldn’t choose to report routinely to stakeholders.
Some examples:
- Inbound referrers
- Search queries
- Verbatim customer comments
Diagnostic and performance metrics overlap. Further, there may be good reason to move a diagnostic metric into the performance metric category. It depends on the strategic objectives of your content.
However, it's useful to put most doc metrics in the diagnostic category to reduce noise in your reports, while still acknowledging that writers need them for their daily work.
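As an illustration of the difference, here is a sketch of the kind of drill-down I have in mind when an article's traffic drops: pull its top inbound referrers and the search queries that lead to it. The file names, column names, and article path are all hypothetical; the detail this produces is exactly what belongs in diagnosis rather than in the stakeholder report.

```python
# Sketch of a diagnostic drill-down for a single article, using two hypothetical
# exports: pageviews.csv with "page" and "referrer" columns, and queries.csv
# (search-console style) with "page" and "query" columns.
import csv
from collections import Counter

def top_values(path, page, column, n=10):
    """Count the most common values of one column for one article."""
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["page"] == page and row[column]:
                counts[row[column]] += 1
    return counts.most_common(n)

article = "/docs/getting-started"  # hypothetical article path
print("Top referrers:", top_values("pageviews.csv", article, "referrer"))
print("Top queries:", top_values("queries.csv", article, "query"))
```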
Operational metrics
Operational metrics aren’t web metrics; they measure how your organization does its work. Operational metrics may vary significantly among organizations, depending on workflows and project tracking methods. Here are some that matter to my doc team:
- Progress on tasks that address customer feedback and impediments
- Rate of response to customer feedback (Git issues for my team)
- Number of new or updated articles published in a given period
There’s a good case to be made for reporting some operational metrics, such as progress documenting customer pain points, to your doc stakeholders.
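For instance, here is a rough sketch of how a response-rate figure might be computed from an export of feedback issues. The record fields and the seven-day target are made up for illustration; adapt them to however your team actually tracks feedback.

```python
# Sketch: share of customer-feedback issues answered within a target window.
# The issue records below are illustrative; in practice you'd load them from
# your tracker's export (GitHub issues, Jira, etc.).
from datetime import date, timedelta

issues = [
    {"opened": date(2019, 5, 1), "first_response": date(2019, 5, 2)},
    {"opened": date(2019, 5, 3), "first_response": None},  # no response yet
    {"opened": date(2019, 5, 6), "first_response": date(2019, 5, 20)},  # too slow
]

def response_rate(issues, within_days=7):
    """Percentage of issues that received a first response within the window."""
    responded = sum(
        1 for i in issues
        if i["first_response"]
        and (i["first_response"] - i["opened"]) <= timedelta(days=within_days)
    )
    return round(100 * responded / len(issues), 1) if issues else 0.0

print(f"{response_rate(issues)}% of feedback issues answered within 7 days")
```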
Learn more about web metrics and reporting
To learn about performance and diagnostic web metrics, see the following:
- For a quick intro, check out my post Revenge of the English majors. Under “Measure results,” I describe a handful of basic performance and diagnostic metrics to track for your tech docs.
- For a much deeper dive: Web Analytics 2.0 by Avinash Kaushik, analytics evangelist for Google and author of the blog Occam's Razor.
Categories: Content strategy, Data
Comments
Wow, Carolyn, you are absolutely slaying it in this blog! So on point and so much useful information. Thanks!
Can you talk a little more about what you mean by document health? Is this along the lines of Redundant/Outdated/Trivial (ROT) content?
Great question, Ed! And thank you for sharing an acronym I wasn’t familiar with. There are a few ways to measure document health for tech docs, and they may vary with your business strategy. For public-facing cloud platform documentation, I measure health in part by standard web performance metrics: Traffic trends, % of search referrals, and bounce rate. I also look at an operational metric: Burn-down of work items from customer feedback and pain points.
Redundant/outdated/trivial content and other content lifecycle concerns could be classified as operational, but ROT has significant impact on search referrals and overall traffic. Moz talks about this content as "cruft" and has shown that it drags down search performance (plus, we know it's confusing to customers and creates extra work for tech writers). Cruft has different implications for content marketers – the primary audience for Moz – but the ideas and practices around it are useful for those of us with public-facing docs and strong competitors. Here's the Moz article: https://moz.com/blog/clean-site-cruft-before-it-causes-ranking-problems-whiteboard-friday