
Quality content is a hard sell. Sure, people get that content is important — but getting people to invest the time and resources needed to make content great isn’t easy.
It’s not enough to tell decision makers you need quality content. To make the case for it, you have to demonstrate success and failure. Selling content strategy is a continuous process: you must show how content quality impacts business goals and user needs.
This is a tall order. As content strategist Melissa Rach says on the value of content: “Most people understand that content has value. Big value. They just can’t prove or measure the ROI [return on investment]. And, therefore, they have no concept of how much content is worth.”
So, how do you determine whether content is good or bad? How do you know if it’s working as you’d hoped?
Content governance is not possible without content measurement. You can’t define content and resource needs without understanding the value and effectiveness of your content.
How Do You Measure Content Quality?
Fundamentally, there are two types of content measurement: quantitative and qualitative.
You can think of quantitative and qualitative as what vs. why. Quantitative analysis can tell you what users are doing — how they’re interacting with your content. Qualitative analysis can tell you why they are on your site — what their intent is and whether your content is communicating clearly.
Together, these two forms of analysis help paint a well-rounded picture of content value. It’s no good knowing what is happening if you don’t know why it’s happening. And it’s no good understanding why if you don’t know what got users there in the first place or what they’re doing.
When it comes to web analytics, I’m equally enthusiastic and cautious. Web analytics provides easy access to valuable insights — not just for content governance but also content planning. However, when used poorly, it can confuse and mislead rather than guide and inform.
In order to make good use of web analytics, you need to understand its strengths and weaknesses.
What Web Analytics Can’t Do
1. Provide a complete content measurement solution
It’s a common mistake to use web analytics as a default content assessment tool. Remember, it’s only one side of the content measurement equation. As content strategist Clare O’Brien says, organizations are overly obsessed with analytics data:
Broadly speaking — and thanks largely to the ubiquity and ease of access to Google Analytics (GA) — businesses have become fixated by traffic volumes, bounces, sources, journeys and subsequent destinations and the like and aren’t looking to learn more.
We have to think bigger when it comes to content assessment. On its own, web analytics can be misleading.
2. Provide accurate data
One of the reasons web analytics is so compelling for data nerds is that numbers appear definitive and actionable. But, in reality, no analytics tool provides completely accurate data. Differences in data collection methods, reporting errors, and users who block tracking all compromise accuracy.
(But don’t worry — I’ll soon tell you why this inaccuracy is okay.)
3. Adequately answer ‘why?’
As I mentioned, web analytics can help us understand what users are doing and how they interact with our content. However, it can’t answer why they are interacting with our content.
Web analytics can’t adequately replace qualitative analysis or even a single user telling you why they visited your website and why they left.
What Web Analytics Can Do (And Why It’s Great)
1. Quantitatively evaluate web content quality
There are many definitions of web analytics, but the clearest and most succinct I’ve found is on Wikipedia:
"Web analytics is the study of online behavior in order to improve it."
Indeed, that is the strength of web analytics. By understanding how people use your website, you’re empowered to discover and assess content problems, which is the first step toward positive change.
2. Comparative analysis: measure website trends
Stumped by the notion that web analytics can’t provide accurate data? As promised, fear not! This is okay because the power of web analytics lies in trends, not in individual numbers.
Without context, single metrics are meaningless. Knowing that you received 8,000 admissions website pageviews last month isn’t as important as knowing that those 8,000 pageviews are a 25 percent increase over the same month last year. That’s progress.
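If it helps to see the arithmetic, here’s a minimal sketch of that year-over-year comparison in Python. The figures are illustrative; a 25 percent increase to 8,000 pageviews simply implies a baseline of 6,400 the year before.

    def percent_change(current, previous):
        """Percentage change from the previous period to the current one."""
        return (current - previous) / previous * 100

    # Illustrative admissions pageview counts, not real data
    previous_year = 6400  # same month, one year earlier
    this_month = 8000     # most recent month

    print(f"{percent_change(this_month, previous_year):.0f}% change")  # -> 25% change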
3. Challenge and validate assumptions
We make assumptions every day about how people use our website and what information is most valuable. I can’t count the number of website redesigns I’ve seen that were guided by assumptions about content needs and user goals.
While some of these assumptions are best tested through a comprehensive content analysis, web analytics can help validate or disprove them before they become costly.
4. Demonstrate how your website meets established business goals and users’ needs
As important as qualitative content analysis is, its findings rarely make the case for quality content on their own. People need concrete data to assess value.
It’s not enough to simply say that Sally doesn’t want to fill out your two-page inquiry form. It’s more effective to show that the inquiry form has an 80 percent abandonment rate. Gut instincts are good, but numbers are better.
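Abandonment rate is just as easy to compute. Here’s a rough sketch with made-up inquiry form numbers, assuming your analytics tool can report how many visitors started the form and how many submitted it:

    def abandonment_rate(starts, completions):
        """Share of visitors who started the form but never submitted it."""
        return (starts - completions) / starts * 100

    # Hypothetical inquiry form numbers for one month
    starts = 500
    completions = 100

    print(f"{abandonment_rate(starts, completions):.0f}% abandonment")  # -> 80% abandonment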
5. Enable stakeholders and content owners to measure the success of their own content
As we know, content governance in higher ed is not a one-person job. It involves numerous departments, content owners and other stakeholders who are charged with making decisions about content. Unfortunately, most of these content stakeholders are not content experts or skilled at assessing content performance.
With planning, web analytics can provide content stakeholders with relevant web metrics to evaluate the success of their content. This is huge. I also find that the more people are aware of how their content is being used, the more likely they are to care about maintaining it. Win, win!
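To make this concrete, here’s one way you might prepare those stakeholder-friendly numbers. This is only a sketch, and every name in it is an assumption: it presumes you’ve exported page-level data to a CSV with hypothetical columns called section, pageviews, entrances and bounces, where section maps each page to its content owner.

    import pandas as pd

    # Hypothetical export: one row per page, tagged with the owning section
    df = pd.read_csv("pageviews_export.csv")

    summary = df.groupby("section")[["pageviews", "entrances", "bounces"]].sum()
    summary["bounce_rate"] = summary["bounces"] / summary["entrances"] * 100

    # A short, owner-friendly report for each section
    for section, row in summary.iterrows():
        print(f"{section}: {row['pageviews']:.0f} pageviews, "
              f"{row['bounce_rate']:.0f}% bounce rate")

Whatever tool you use, the idea is the same: each content owner sees the handful of numbers that describe their own content, not a firehose of sitewide data.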
What Is Next?
This post kicks off a series on web analytics and content assessment. I’d like to discuss how we can be smart about our use of web analytics and our approach to governance and measurement.
If there are analytics topics you’d like to see covered as part of this content measurement series, let me know. I’m taking requests!
Update 11/8/12: Check out the second post in this series on web analytics and content assessment, A Web Analytics Framework for Content Analysis.