So this chart looked at the percentage of positions that had a consistent URL. But I was also interested in knowing: if a position doesn't have a consistent URL, how many URLs cycle through it? Is it two that are constantly switching, or a larger number?
It turns out that when you look at all these SERP positions, across all 40,000 SERPs over 14 days, most positions actually hosted four URLs or more, which is an incredibly large number.
So while there were some stable ones with just one URL, the ones that fluctuated fluctuated a lot over this period.
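The measurement described above can be sketched as a simple aggregation: for each (keyword, position) slot, count the distinct URLs that appeared there across the daily snapshots. This is a minimal illustration with made-up data, not the study's actual pipeline; the keywords, URLs, and the four-URL threshold label are assumptions for demonstration.

```python
from collections import defaultdict

# Hypothetical daily SERP snapshots: (day, keyword, position, url).
# The real study tracked 40,000 SERPs over 14 days.
snapshots = [
    (1, "running shoes", 1, "example.com/a"),
    (2, "running shoes", 1, "example.com/b"),
    (3, "running shoes", 1, "example.com/c"),
    (4, "running shoes", 1, "example.com/d"),
    (1, "running shoes", 2, "example.com/x"),
    (2, "running shoes", 2, "example.com/x"),
]

# Collect the set of distinct URLs seen at each (keyword, position) slot.
urls_per_slot = defaultdict(set)
for day, keyword, position, url in snapshots:
    urls_per_slot[(keyword, position)].add(url)

# A slot is "stable" if a single URL held it for the whole period; slots
# hosting four or more URLs were the majority case in the study.
for slot, urls in sorted(urls_per_slot.items()):
    if len(urls) == 1:
        label = "stable"
    elif len(urls) >= 4:
        label = "4+ URLs"
    else:
        label = "fluctuating"
    print(slot, len(urls), label)
```

With this toy data, position 1 for "running shoes" hosted four distinct URLs while position 2 stayed stable, mirroring the mix of stable and highly volatile slots described above.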
Why so much change?
So why so many changes? Well, there are a few answers, some more obvious, some less obvious.
I think the most obvious one is that Google said, about five years ago, that it was making on average seven algorithm updates per day.
So, in addition to the big updates we hear about, there are always all these little ones, and we can only assume that number has increased since then. All these daily changes, with Google rolling out tests that only affect a small percentage of SERPs, are obviously going to cause a lot of fluctuation. Sites also change.
If you believe SEO works, then you believe that changing things on a website changes its rankings, and people change their websites all the time. In any SERP, there's a good chance that one of the URLs, or the site it sits on, has changed today. Maybe the internal links have changed. Maybe it picked up an external link somewhere. Perhaps some anchor text has been updated.
Maybe new content has been added, or a new product. There will be all these little changes. If the results are very close in ranking ability, you might expect them to shift up or down slightly over this period.
Lastly, and I think this is perhaps the most controversial thing, we’ve been hearing a lot recently about the United States v. Google case and some types of proprietary data that perhaps give Google an unfair advantage.
Much of what has emerged concerns how Google might use user data to inform search results. There have been experiments in the past, run by people like Rand Fishkin, showing that you can influence rankings in real time using user behavior: ask everyone in a large room to go and click on the result in position 2 or 3 on a SERP, and that result will move up, that kind of thing.
So maybe some of this reactive data filters through and influences the SERPs in real time. We don’t know, but it’s possible.