Forum Posts

sihab seo
Jul 28, 2022
In Welcome to WJZD RADIO DETROIT
Controlling parameters

In the past, Google had a URL Parameters tool in Google Search Console where you could choose how to treat different parameters based on whether or not they changed the page content. The tool was deprecated in early 2022. Here's what Google had to say about it:

"When the URL Parameters tool launched in 2009 in Search Console's predecessor, Webmaster Tools, the internet was a much wilder place than it is today. SessionID parameters were very common, CMSes had trouble organizing parameters, and browsers often broke links. With the URL Parameters tool, site owners had granular control over how Google crawled their site by specifying how certain parameters affect the content on their site. Over the years, Google became much better at guessing which parameters are useful on a site and which are —plainly put— useless. In fact, only about 1% of the parameter configurations currently specified in the URL Parameters tool are useful for crawling. Due to the low value of the tool both for Google and Search Console users, we're deprecating the URL Parameters tool in 1 month."

While it's not mentioned, I suspect that some users might have been hurting themselves with the tool. I ran into this in the past, where someone put in a wrong setting that said the content did not change when it actually did. This knocked a few hundred thousand pages out of the index for that site. Whoops!

You can let Google crawl and figure out how to handle the parameters for you, but you also have some controls you can leverage. Let's look at your options.

Canonical tags

A canonical tag can help consolidate signals to a chosen URL, but it requires each additional version of a page to be crawled. As I mentioned earlier, Google may make adjustments as it recognizes patterns, and these canonicalized URLs may be crawled less over time. This is what I'd opt for by default. But if a site has a ton of issues and parameters are out of control, I may look at some of the other options. (There's a sample canonical tag at the end of this post.)

Noindex

A noindex meta robots tag removes a page from the index. It still requires the page to be crawled, though again it may be crawled less over time. If you need signals to consolidate to other pages, avoid using noindex. (A sample noindex tag is also at the end of this post.)

Blocking in robots.txt

Blocking parameters in robots.txt means the pages may still get indexed, although they're not likely to show in normal searches. The problem is that these pages won't be crawled and so won't consolidate signals. If you want to consolidate signals, avoid blocking the parameters. (A robots.txt sketch is at the end of this post.)

Site Audit

When setting up a project in Site Audit, there's a toggle in the crawl settings called "Remove URL Parameters" that you can use to ignore any URLs with parameters. You can also exclude parameterized URLs in the crawl setup using pattern matching.

[Image: Blocking a parameter in Site Audit.]

SIDENOTE. Fun fact: We only count the canonicalized version of pages toward your crawl credits.

Final thoughts

Just to summarize: URL parameters have a lot of different use cases, and they may or may not cause issues for your site. Everything is situational. Message me on Twitter if you have any questions.
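
To make the canonical option concrete, here's a minimal sketch of a canonical tag as it would sit in the <head> of a parameterized page. The example.com URLs and the ?sort=price parameter are made-up placeholders:

<!-- On https://example.com/shoes/?sort=price, point search engines at the parameter-free URL -->
<link rel="canonical" href="https://example.com/shoes/" />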
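
Likewise, a minimal sketch of the noindex meta robots tag described above, placed in the <head> of the page you want dropped from the index:

<!-- Tells search engines not to index this page; it still has to be crawled for this tag to be seen -->
<meta name="robots" content="noindex" />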
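
And for robots.txt blocking, a sketch might look like the lines below. The sessionid parameter name is purely illustrative, and the * wildcard is supported by Google's crawler. Keep in mind that blocked URLs can still end up indexed, just not crawled:

User-agent: *
# Block any URL where sessionid appears as a query parameter (illustrative only)
Disallow: /*?sessionid=
Disallow: /*&sessionid=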