The Search Console URL Parameter Management Tool Will Be Retired Soon! What Alternatives?


The URL parameter management tool, launched in 2009, will be removed from Search Console on April 26, 2022. It will not be replaced by another tool. Explanations and alternatives.

Why remove the URL Parameter Management Tool from Search Console?

According to Google, the URL parameter management tool is no longer relevant, since its crawler is now able to detect unnecessary parameters automatically.

According to the search engine, only about 1% of the parameter configurations currently set up across all sites are still useful for managing the crawl of parameterized URLs.

What was the point of this tool?

Overview of the URL parameter management tool in Google Search Console

The tool made it possible to:
  • Force Google not to crawl URL parameters that resulted in duplicate URLs
  • Force Google to crawl only certain URL parameters
  • Prevent Google from crawling pages with session IDs

What alternatives to the URL parameter management tool for managing Google’s crawl?

1- The robots.txt file

Although Google indicates that webmasters will have nothing to do once the tool is effectively removed in April 2022, nothing prevents SEOs and webmasters from playing it safe by optimizing their robots.txt file with “Allow” directives to selectively allow crawling and “Disallow” directives to block it.

The robots.txt file, located at the root of websites (that have one), is the essential file to optimize when you want to control the crawl of Googlebot (and other search engine crawlers) and its crawl budget.

“Disallow:” directives can thus still be placed in the robots.txt to tell Google not to crawl pages containing specific URL parameters.

Here is an example of a simple rule that can be placed in the robots.txt to prevent Googlebot from crawling any URL containing, for example, the parameter “filter=”:

User-agent: Googlebot
Disallow: /*filter=*

The stars here mean that whatever comes before or after “filter=” in the URL, as long as the URL contains this parameter, it must not be crawled.

Conversely, it is also possible to use the “Allow” directive to tell Google that you want it to keep crawling certain parameters that it might otherwise decide, on its own, to stop crawling once the URL parameter management tool is removed.
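As a minimal sketch, combining the two directives in a single Googlebot group could look like the following (the “sessionid=” and “lang=” parameters are purely illustrative, not taken from any real site):

User-agent: Googlebot
# Block any URL containing the illustrative “sessionid=” parameter
Disallow: /*sessionid=
# Keep crawling URLs that use the illustrative “lang=” parameter
Allow: /*lang=

Note that when an Allow and a Disallow rule both match the same URL, Google applies the most specific (longest) matching pattern, so it is worth verifying each rule in the robots.txt tester mentioned further below.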

2- Hreflang tags for URLs with parameters associated with language variants

If you use URL parameters to distinguish between pages translated into different languages, you can, if this is not already the case, set up hreflang tags to ensure that Google continues to crawl the different variants.
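As a purely illustrative sketch (the domain and the “lang=” parameter are invented for the example), hreflang annotations for parameter-based language versions could be declared in the <head> of each variant like this:

<!-- Hypothetical example: language versions distinguished by a "lang=" URL parameter -->
<link rel="alternate" hreflang="en" href="https://www.example.com/product?lang=en" />
<link rel="alternate" hreflang="fr" href="https://www.example.com/product?lang=fr" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/product" />

Each language version should list itself and all of its alternates, and the same annotations need to appear on every variant (they can also be declared in an XML sitemap instead of the HTML).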

How to check that Googlebot crawls the right URLs?

To make sure that your new (or existing) directives work as intended after the removal of the URL parameters tool, you can first test each URL of your choice in the robots.txt testing tool integrated into Search Console.

Then, for a broader, more macro-level check, the ideal is to use SEO log analysis software, which will show you clearly where Googlebot is crawling and where it is not (or no longer) crawling, so that you can adjust your directives accordingly.
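If you do not have dedicated log analysis software, a quick manual check on raw server access logs can already give an indication. The sketch below assumes a standard combined log format and a log file at /var/log/nginx/access.log, and reuses the illustrative “filter=” parameter from the robots.txt example above; adapt the path and parameter to your own setup:

# Count the parameterized URLs most requested by Googlebot (combined log format assumed)
grep "Googlebot" /var/log/nginx/access.log | grep "filter=" | awk '{print $7}' | sort | uniq -c | sort -rn | head

Keep in mind that user agents can be spoofed, so properly verifying Googlebot requires a reverse DNS lookup, which dedicated log analysis tools usually handle for you.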
