In May 2019, Google announced an update. For the average user, it didn’t make many waves. But for SEO professionals, the change felt seismic.
Google launched this update, a new and improved version of its Googlebot web crawler, at its annual I/O developer conference. Unlike previous iterations, which were tied to an outdated rendering engine, the new Googlebot is designed to be “evergreen”: it stays current with the latest stable version of Chromium, the open-source engine behind Google’s Chrome browser. However many bells and whistles get added to or subtracted from the browser, the crawler will keep pace, rendering modern webpages and picking up the SEO-rich data they contain.
While some SEO enthusiasts are heralding this change as the end of the need for complicated technical workarounds to get higher page rankings, others are more skeptical of how comprehensive this change will really be.
A Whole New World — or Is It?
On its surface, the new Googlebot seems like a breath of fresh air for content creators everywhere. If Google’s software does a better job of crawling and indexing what you create, you can ideally spend less time contorting your site to work within the system and more time creating high-quality content that will drive readers to your page. But considering Google’s admission that Googlebot still can’t see everything on a page, professionals in this space have to remember that their technical SEO work is by no means done.
While this update should mean SEOs and webmasters can refocus their energies in this way, it doesn’t mean technical SEO and frequent auditing of Google’s indexing should be abandoned altogether. After all, that’s what technical SEO is all about: helping search engines interpret and index your web content properly so all of the work you put in doesn’t go to waste.
The Future of SEO
Some SEO experts believe Google is advanced enough that technical SEO is no longer the priority it once was. But practicing technical SEO and understanding the way Google crawls your content remain necessities, even as the technology advances. You might be able to deprioritize certain aspects of the process, but you shouldn’t forgo any of them entirely.
With that in mind, here is what SEO experts need to do in this post-update world to keep their page rankings high and their attention to optimization sharp:
- Review your log files.
Whether you’re working on SEO for your own page or consulting for a business, request access to the site’s log files, which record every request made to the server, including Googlebot’s visits. Regularly checking these can help you see when and how Googlebot is crawling your site. Is it missing any vital pages or spending unnecessary time on unimportant pages?
Reviewing your log files makes it possible to understand how Google views your specific site’s content — not just how you think it should based on an update or industry trend.
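As a concrete starting point, here is a minimal Python sketch of that kind of log review. It assumes a standard combined-format access log (the file path is a placeholder) and simply tallies which URLs Googlebot requests:

```python
import re
from collections import Counter

# Placeholder path; point this at your server's access log.
LOG_PATH = "access.log"

# Combined log format: host, identity, user, [timestamp],
# "METHOD path PROTOCOL", status, bytes, "referrer", "user-agent"
LINE_RE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

hits = Counter()
with open(LOG_PATH) as f:
    for line in f:
        m = LINE_RE.match(line)
        # Googlebot names itself in the user-agent string. For a rigorous
        # audit you would also verify the requester's IP via reverse DNS,
        # since the string alone can be spoofed.
        if m and "Googlebot" in m.group("agent"):
            hits[m.group("path")] += 1

# The 20 URLs Googlebot requests most often.
for path, count in hits.most_common(20):
    print(f"{count:6d}  {path}")
```

If your most important pages are missing from the output, or thin utility pages dominate it, that’s a cue to look at crawl budget and internal linking.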
- Track your rankings.
Keywords are a good place to start when tracking your site visibility, but they’re not the only consideration. Look at the average ranking for key pages across a large volume of relevant terms, and consider how the rank status of each correlates with the topics you are optimizing on your own site.
Don’t stop there, though. Look at how features you add to your own pages may or may not affect Googlebot’s ability to crawl them and, therefore, their ranking. For example, if a page lost rank after you installed a new feature but all other content stayed the same, it might be that Google can no longer read the page the way it did before. You can also use analytics tools like heat maps and speed analyzers to identify your site’s strengths and weaknesses and then address them.
Tracking your rankings can be tricky, as they fluctuate for a host of reasons. But being aware of these fluctuations — and drawing conclusions about what causes them — can help you understand your own rankings much better and drive your future SEO decisions.
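To make the “average ranking for key pages” idea concrete, here is a small Python sketch. The file name and column names are assumptions about a generic rank-tracker export, so adapt them to whatever tool you use:

```python
import csv
from collections import defaultdict

# Collect every tracked position for each URL.
positions = defaultdict(list)

with open("rankings.csv", newline="") as f:
    # Assumed columns: keyword, url, position
    for row in csv.DictReader(f):
        positions[row["url"]].append(int(row["position"]))

# Sort pages by mean position (closer to 1 is stronger) and report
# how many keywords each page ranks for.
for url, ranks in sorted(positions.items(),
                         key=lambda kv: sum(kv[1]) / len(kv[1])):
    avg = sum(ranks) / len(ranks)
    print(f"{avg:5.1f}  ({len(ranks):3d} keywords)  {url}")
```

Re-running a report like this after a site change, such as the new page feature described above, makes it easier to see whether the change correlates with a shift in average position.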
- Don’t stop practicing technical SEO.
Your site doesn’t have to be “technically perfect.” If that were your standard, you’d only be chasing an ideal that very few sites ever reach. But this recent update can breed overconfidence, even false confidence, that you no longer need to worry about optimizing your infrastructure. While an evergreen Googlebot should alleviate some concerns about content being crawlable, technical SEO is still just as important as ever if you want your page visibility to be the best it can be.
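One way to keep the habit lightweight is a small recurring check. The sketch below assumes the third-party requests library and uses placeholder URLs; it flags pages that stop returning 200 or that appear to carry a noindex directive:

```python
import requests

# Placeholder list; feed this your own key pages.
PAGES = [
    "https://example.com/",
    "https://example.com/blog/",
]

for url in PAGES:
    resp = requests.get(url, timeout=10)
    problems = []
    if resp.status_code != 200:
        problems.append(f"status {resp.status_code}")
    # A crude string check. A production audit would parse the HTML and
    # inspect the X-Robots-Tag response header instead.
    if "noindex" in resp.text.lower():
        problems.append("possible noindex directive")
    print(url, "->", "; ".join(problems) if problems else "OK")
```

It’s deliberately rough, but even a check this simple, run on a schedule, catches the most common crawlability regressions early.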
Ultimately, great content is only as great as its ability to be discovered, crawled, and indexed. That’s why all of your SEO campaigns should be rooted in some form of technical analysis, including regular inspections to ensure your content investment has the highest chance for success.
Don’t get too distracted by the Googlebot headlines — the updates are exciting, but they aren’t reinventing the SEO game completely. You still need to sharpen every tool in your arsenal if you want to reach or keep the most coveted spot at the top of Google’s search page.
Source: https://www.themarketingscope.com/is-technical-seo-dead/