Every so often, I’ll see a conversation about SEO that looks pretty interesting. I’ll dive into the thread and start reading the comments to get a feel for the ebb and flow of the discussion.
Then…BAM!
Someone assigns Google sorcerer-like powers that defy the common rules of physics, and they end up derailing the conversation with stupid conspiracy nonsense and half-wit commentary.
Today’s example was around link building.
This individual talked about how he’d built out a bunch of links to his site prior to Christmas and didn’t see much impact from his efforts. He felt that some of the links he’d built were really good links that came from what Moz deemed to be high-authority sites.
There were some good comments about giving it time, since Google may not have crawled and indexed all of those links yet. Another person made a really good point that his content might not be good enough for Google to rank it for anything useful, irrespective of the number of links.
Those were valuable comments and gave the guy who raised the issue plenty to think about – in my opinion, both were probably true.
Then the original poster commented that perhaps Google devalues links from sites that don’t get much traffic or aren’t updated regularly. He made an offhand remark about PBNs and ghost-town sites, which led me to think that MAYBE his link building wasn’t all “white hat” and his effort was simply wasted, which would make sense.
At this point, crazy turned up to the party wearing a toga and looking to do keg stands between bong hits.
Someone said that they believed that Google evaluated every link based on things like bounce rate, time on page, quality of content… On the originating page.
Let that sink in for a second…
Every single link on the internet would be evaluated on a bunch of metrics and then individually ranked in terms of value.
For reference, the last time Google gave an indication of the number of pages it indexes was back in November 2016. At that point in time there were about 130 trillion pages that Google had crawled. You don’t need advanced calculus to work out that the number of links would probably be in the quadrillions.
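If you want a rough feel for where that estimate comes from, here’s a quick back-of-envelope sketch. The 130 trillion figure is Google’s own from late 2016; the average links-per-page number is purely an illustrative assumption.

```python
# Back-of-envelope estimate of how many links Google would have to score
# individually. The page count is Google's own late-2016 figure; the average
# outbound links per page is an illustrative assumption, not a published stat.
pages_indexed = 130_000_000_000_000      # ~130 trillion pages
avg_outbound_links_per_page = 50         # assumption: rough average per page

total_links = pages_indexed * avg_outbound_links_per_page
print(f"{total_links:,} links")          # 6,500,000,000,000,000 -> ~6.5 quadrillion
```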
I’ll spare you any more of the math, but the computing power needed to rank and index links at that granularity would be impossible for Google to build or afford. They already run in excess of 3 million servers around the world just to deliver search results.
Could you imagine how many they would need to do that kind of link analysis?
This is what I mean by people ascribing to Google magical powers that exceed the laws of the known universe.
Google works by associating a “PageRank” value with each page in the index. That calculation has hundreds of factors attached to it… At a page level.
Pages that don’t score well on those ranking factors don’t have much PageRank value, and therefore the outbound links on those pages aren’t terribly useful.
The value of a link you receive from a page is a transfer of some of that page’s PageRank (commonly referred to as Link Juice) to your site.
This is a staggering technical achievement at a page level; trying to determine that at a LINK level is simply beyond comprehension.
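For a concrete sense of what that page-level calculation involves, here’s a minimal sketch of the classic PageRank iteration on a toy link graph. Google’s production system obviously folds in far more signals and runs at a scale nothing like this, but it shows where the value lives: at the page, with each outbound link passing on a share of it.

```python
# Minimal sketch of the classic PageRank iteration on a toy link graph.
# "links" maps each page to the pages it links out to. Each iteration,
# a page keeps a small base amount of rank and passes the rest out in
# equal shares across its outbound links.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {page: 1.0 / len(pages) for page in pages}

    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / len(pages) for page in pages}
        for page, outbound in links.items():
            if not outbound:
                continue
            share = damping * rank[page] / len(outbound)
            for target in outbound:
                new_rank[target] += share
        rank = new_rank
    return rank

toy_graph = {
    "home":  ["about", "blog"],
    "about": ["home"],
    "blog":  ["home", "about"],
}
print(pagerank(toy_graph))  # "home" ends up with the most PageRank
```

The point of the sketch is simply that the expensive calculation happens per page; a link’s value falls out of the page it sits on, not from re-scoring every individual link against its own set of engagement metrics.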
The funny thing is, there’s already so much genuinely unbelievable stuff happening in Google search that making up fantasies of your own is just stupid.
For example, large parts of the Google ranking algorithms are now based on AI – the engineers set some basic rules and describe the desirable outcomes, and the algorithms adjust themselves to achieve them.
One comment from a prominent Google search staffer was that there probably isn’t a single person anywhere in the world who really understands all of the moving parts of the Google Search algorithm.
Again, let that sink in for a minute – Google acknowledges that there probably isn’t anyone in the world who knows precisely how the entire ranking system works.
Depending on your personal opinions on AI and technology, this might excite you or scare you when you consider how vital Google is to our everyday lives – there are over a billion searches a day just from people looking for health-related answers.
With the sheer magnitude of the achievement that is Google’s search algorithm, why do people persist in trying to assign it magical powers?
I guess the easy answer would be to say that most things we don’t fundamentally understand could best be described as magic, but that’s lame.
The best takeaway from this post is that it’s not really possible to identify all the elements that make up the search rankings anymore – whenever you see people writing long blog posts about the 200+ ranking factors, they are writing clickbait.
So if Google engineers struggle to understand all the aspects of the search algorithm, what chance do we have?
That question gets raised with me all the time.
The simple answer is you have no chance and you shouldn’t try. In fact, that’s the wrong way of even looking at the problem.
Google is a prediction engine – it tries to interpret what you’re requesting, figure out the context, and deliver back a set of results ranked by the probability that each page will answer your question.
With that in mind, your goal is to find the high-level patterns in what Google deems to be a high-probability result. Your content creation process needs to be viewed through two lenses:
- what is the best possible content I can create that serves the person reading it
- what does Google like to see in terms of page mechanics to rank content as having a high probability of answering a user’s request successfully
Those two things are obviously closely related – Google values high-quality content that answers questions successfully for its search users, but there are also some mechanical elements (on-page SEO) that are important.
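To make the “page mechanics” lens a bit more concrete, here’s a small sketch of the kind of basic on-page checks you could script for yourself. The specific elements and thresholds below are common-sense assumptions for illustration, not anything Google has published.

```python
# Sketch of basic on-page checks. The elements and length thresholds are
# illustrative assumptions for this example, not published Google requirements.
from bs4 import BeautifulSoup  # pip install beautifulsoup4

def basic_on_page_checks(html: str) -> dict:
    soup = BeautifulSoup(html, "html.parser")

    title = soup.find("title")
    title_text = title.get_text(strip=True) if title else ""
    meta_desc = soup.find("meta", attrs={"name": "description"})
    h1_tags = soup.find_all("h1")

    return {
        "has_title": bool(title_text),
        "title_under_60_chars": 0 < len(title_text) <= 60,
        "has_meta_description": bool(meta_desc and meta_desc.get("content")),
        "exactly_one_h1": len(h1_tags) == 1,
        "images_missing_alt": sum(1 for img in soup.find_all("img") if not img.get("alt")),
    }

sample = "<html><head><title>Example page</title></head><body><h1>Hello</h1></body></html>"
print(basic_on_page_checks(sample))
```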
Your job is to figure that out, understand the probabilities and adjust your content accordingly.
Google’s search algorithm is incredibly powerful and one of the most complex things ever created by humans, but it isn’t sorcery.