In a Google Webmaster Hangout, Google’s John Mueller explained which low-traffic pages to noindex and which ones not to worry about.
Are Low Traffic Pages Harmful?
It’s generally understood that removing low-performing pages is a good idea. Low-quality pages tend to attract little traffic and should be either noindexed or removed entirely.
That assumption is at the heart of the question John Mueller answered.
The question is specifically about a news site, but Mueller’s answer broadens to be useful to more than just news sites.
This is the question:
We’re publishing news and articles.
For example, we have 100 new articles every day and ten of them give us 95% of the organic search traffic. Another 90 go nowhere.
We’re afraid that Google can decide our website is interesting only for 10%.
There’s an idea to hide some boring local news under noindex tag to make the overall quality of all publishing content better.
What do you think?
How Google Analyzes Website Quality
Google’s Mueller first discusses how Google’s algorithm reviews individual web pages and the entire site to understand its quality level.
His answer was general, meaning it applies regardless of whether it’s a news site or any other kind of site.
This is what Mueller said:
In general, we do look at the content on a per page basis.
And we also try to understand the site on an overall basis, to understand how well is this site working, is this something that users appreciate. If everything is essentially working the way that it should be working.
So it’s not completely out of the question to think about all of your content and think about what you really want to have indexed.
Now Mueller focuses on news sites.
He states that traffic isn’t necessarily the metric to use for judging whether a news web page is low quality.
But especially with a news website, it seems pretty normal that you’d have a lot of articles that are interesting for a short period of time, that are perhaps more of a snapshot from a day to day basis for a local area.
And it’s kind of normal that they don’t become big, popular stories on your website.
So from that point of view, I wouldn’t necessarily call those articles low quality articles, for example.
So, just because a news article isn’t popular doesn’t mean it’s low quality.
John Mueller then advises on how to know when content is truly low quality.
He highlights issues such as content that is hard to read, broken English, and content that is poorly structured. Then he says what to do if you have a mix of good and poor quality content.
This is what he said:
On the other hand, if you’re publishing articles from … hundreds of different authors and they’re from varying quality and some of them are really bad, they’re kind of hard to read, they’re structured in a bad way, their English is broken.
And some of them are really high quality pieces of art, almost that you’re providing. Then creating that kind of a mix on a website makes it really hard for Google and for users to understand that actually you do have a lot of gems on your website…
So that’s the situation where I would go in and say, we need to provide some kind of quality filtering, or some kind of quality bar ahead of time, so that users and Google can recognize, this is really what I want to be known for.
And these are all things, maybe user-submitted content, that is something we’re publishing because we’re working with these people, but it’s not what we want to be known for.
Then that’s the situation where you might say, maybe I’ll put noindex on these, or maybe I’ll initially put noindex on these until I see that actually they’re doing really well.
So for that, I would see it making sense that you provide some kind of quality filtering.
But if it’s a news website, where… by definition, you have a variety of different articles, they’re all well-written, they’re reasonable, just the topics aren’t that interesting for the long run, that’s kind of normal.
That’s not something where I’d say you need to block that from being indexed. Because it’s not low quality content. It’s just less popular content.
John Mueller made an important point about diagnosing an article for quality. He essentially said to look at the content itself to determine whether the low traffic is because the topic is not popular or because the article is poorly written.
Just because a web page is not popular does not mean it’s low quality. Content like that won’t reflect poorly on a site.
Low traffic can be a flag to alert you to a possible problem. But it’s not the problem itself.
Take a look at the content and determine whether the low traffic is because:
- The web page information is outdated (not good, should be improved)
- The web page is thin (not okay)
- The web page is on a topic that’s not popular (that’s okay)
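For pages that genuinely fall into the poor-quality bucket, the noindex directive Mueller mentions goes in the page’s HTML head. A minimal illustration (the comments describe standard behavior documented by Google):

```html
<!-- Place inside the <head> of a page you want kept out of Google's index.
     Links on the page can still be crawled unless you also add "nofollow". -->
<meta name="robots" content="noindex">
```

The same directive can also be sent as an HTTP response header (`X-Robots-Tag: noindex`), which is useful for non-HTML resources such as PDFs. Either way, Googlebot must be able to crawl the page to see the directive, so don’t block noindexed pages in robots.txt.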