How can I remove parameters from the GSC URL blocking tool?
-
Hello Mozzers
My client's previous SEO company went ahead and blindly blocked a number of parameters using the GSC URL blocking tool. This has caused Google to stop crawling many pages on my client's website, and I am not sure how to remove these blocked parameters so the pages can be crawled and reindexed by Google.
The crawl setting is set to "Let Googlebot decide," but there has still been a drop in the number of pages being crawled. Can someone please share their experience and help me delete these blocked parameters from GSC's URL blocking tool?
Thank you Mozzers!
-
Hi Vincent,
My short answer is: don't let Googlebot decide. Tell Googlebot explicitly which parameters should and should not create new pages. Any time you have indexation problems with parameters, set each one yourself rather than leaving it on the default.
Do a site: search for a handful of these URLs with parameters to double-check whether the drop in crawled pages is actually coming from these pages or from something else. If it is these pages, you can quickly get them back into the index with the "Fetch as Googlebot" tool: once you have Google fetch a URL, you have the option of submitting it to the index.
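For example, if one of the blocked parameters were called sessionid (a made-up name here; substitute your client's actual parameter names and domain), queries like these would show whether those URLs are still in Google's index:

    site:example.com inurl:sessionid
    site:example.com/category/ inurl:sessionid=

If those queries return few or no results for URLs you know exist, the parameter blocking is the likely culprit.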
(If it turns out the drop in crawled pages is coming from something else, a good way to figure out which pages are affected is to create multiple XML sitemaps organized by site section; then, when Google reports how many of your URLs are in its index, you know immediately which section of the site is affected. There's a post on this that is really old, but still incredibly useful here.)
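As a rough sketch of that setup (all URLs below are placeholders), a sitemap index pointing at one child sitemap per site section might look like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One child sitemap per site section, so Google's per-sitemap
           indexation counts map cleanly onto sections of the site -->
      <sitemap>
        <loc>https://www.example.com/sitemap-products.xml</loc>
      </sitemap>
      <sitemap>
        <loc>https://www.example.com/sitemap-blog.xml</loc>
      </sitemap>
      <sitemap>
        <loc>https://www.example.com/sitemap-category-pages.xml</loc>
      </sitemap>
    </sitemapindex>

Because each child sitemap corresponds to one section, the "URLs submitted" vs. "URLs indexed" numbers Google reports per sitemap tell you exactly where indexation is dropping.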
Double-check that these URLs with parameters are in the XML sitemap, and that you have a number of internal links on prominent pages pointing to them. Even if those links can only be temporary, they will really help the process.
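If there are too many URLs to eyeball, here's a minimal Python sketch to check them in bulk (the sitemap URL and the test URLs are placeholders; point it at your client's real sitemap and real parameterized URLs):

    import urllib.request
    import xml.etree.ElementTree as ET

    # Placeholder sitemap URL -- swap in the client's real sitemap
    SITEMAP_URL = "https://www.example.com/sitemap.xml"
    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    # Parameterized URLs you expect Google to crawl (placeholders)
    urls_to_check = [
        "https://www.example.com/widgets?color=red",
        "https://www.example.com/widgets?sort=price",
    ]

    # Download and parse the sitemap
    with urllib.request.urlopen(SITEMAP_URL) as response:
        tree = ET.parse(response)

    # Collect every <loc> entry listed in the sitemap
    in_sitemap = {loc.text.strip() for loc in tree.findall(".//sm:loc", NS)}

    # Flag any URL that is missing from the sitemap
    for url in urls_to_check:
        print(("OK      " if url in in_sitemap else "MISSING ") + url)

Anything flagged MISSING should be added to the sitemap before you ask Google to recrawl.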
Hope this helps!
Kristina