#15 – Great Content Can’t DESERVE To Rank
“Produce great content”
That’s what we need to do, right? Content Marketing is the new black, and we need to be bloody good at it if we want our sites to rank.
Sean recently proposed that some SEO folk are waiting with bated breath and crossed fingers that Google’s next update will reward quality content that deserves to rank. I sincerely hope that people don’t actually believe such garbage, and I’d like to dig into why it could never actually happen.
For the purpose of this discussion we’ll consider only the content medium of copy, since that is the most basic form of content, and most easily digestible by our spidery friends.
How can Google go about detecting quality content?
It could run through various checks and comparisons:
- Is this copy identical to another page?
- Are there chunks of copy that are largely similar to other places online?
- How is the text positioned on the page?
- Does this copy occupy a central theme or does it contain semantic oddities?
- Is this copy sound in terms of grammar, spelling and syntax?
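To be clear, none of this is Google’s actual algorithm, but the first two checks on that list are mechanical enough to sketch in a few lines. Here’s a toy illustration: exact duplication via content hashing, and near-duplication via Jaccard overlap of word shingles (the function names and the shingle size `k=3` are my own illustrative choices, not anything Google has published):

```python
import hashlib
import re


def shingles(text, k=3):
    """Break text into lowercase word k-grams ('shingles')."""
    words = re.findall(r"[a-z']+", text.lower())
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}


def is_exact_duplicate(a, b):
    """Check 1: is this copy identical to another page? Compare content hashes."""
    return hashlib.sha256(a.encode()).hexdigest() == hashlib.sha256(b.encode()).hexdigest()


def jaccard_similarity(a, b, k=3):
    """Check 2: are chunks of copy largely similar to other text?
    Jaccard overlap of shingle sets: 1.0 = identical, 0.0 = no shared k-grams."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

Note what these checks can and can’t do: they will happily flag scraped or spun copy, but a score of 0.0 tells you nothing about whether either piece of writing is any *good* – which is rather the point of this post.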
Hmmm, so Google knows how to tell if content is duplicate, poorly structured or incoherent. Of course, we know this – it’s exactly what Panda has been hammering sites for over the last couple of years. Google attempts to surface better quality sites simply by removing the lower quality ones.
But can Google look at two unique blog posts written on the same subject and determine which is of the higher quality? I sincerely doubt it.
A more pertinent question might be: ‘can humans do it?’
The literary world
The best-selling books of all time, according to Wikipedia, include among their top 10 authors the likes of Tolkien, Dickens and Lewis. A fair shout, many would argue. Yet nestling in at the number 9 spot is none other than The Da Vinci Code, by Dan Brown. Those familiar with Mr Brown’s work will know that his use of language is… clumsy, at best. His writing is painful to read, and he is not short of critics, as evidenced by this collection of his ‘worst sentences’.
Other works that have achieved massive commercial success yet are notoriously badly written include 50 Shades of Grey and the Harry Potter series. I must admit to enjoying the travails of Master Potter, although the prose can’t half make you wince at times.
The thing is, having discussed my distaste for such poorly written works with quite a few people over the years, I’ve found that some people simply don’t notice that the writing is bad. Others don’t give a shit. Well-educated, highly intelligent people, no less. I know this makes me sound pompous and arrogant, but hopefully it gets my point across – everybody has a different definition of quality.
Google MUST rely on external factors
Content exists to communicate meaning, and communication is an exchange of information. Content is only effective if it succeeds in conveying that meaning to its audience. But the audience is not a robot.
Since every person has fundamentally different experiences, opinions and ideologies, it follows that they will disagree on what is ‘good’.
The notion that content can, in itself, deserve to rank is flawed.
If people can’t agree on content quality, then Google certainly can’t make that call algorithmically. And we’ve only considered text! Imagine Google trying to grade images qualitatively, never mind animation, music or videos. This is art not science.
Google must rely on good old external factors, such as links, social shares and author associations. And at the end of the day, Google doesn’t give a shit that a Dan Brown book reads like a soap opera on steroids – if their users want it then Google will give it to them.