100K Webmaster Central Not Found Links?
-
http://screencast.com/t/KLPVGTzM I just logged into our Webmaster Central account to find that it shows 100k links that are not found. After searching through all of them, they all appear to be from our search bar, with no results. Are we doing something wrong here?
-
Ya, I read through that article yesterday and see that it recommends the same setting the Yoast plugin should already be applying. I never did get a response, though, to confirm whether there is something missing.
For now, I plan on adding this to the robots.txt file and seeing what results I get.
Do you know the time frame it takes for the updates to show in GWT? Will this update within a few weeks, or would it take longer than that?
Thanks for all the help!
BJ
-
Hello BJ.
The robots.txt file must be on your server, in the document root.
Here is information about how to configure robots.txt
Note that it does have a warning at the end about how you could possibly lose some link juice, but that is probably a much smaller problem than the one you are trying to fix.
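As a minimal sketch (assuming the search pages live under a /search/ path - adjust the path to match your own URLs), the relevant robots.txt entry looks like this:

```
User-agent: *
Disallow: /search/
```

It must be a plain text file named robots.txt, sitting in the document root as noted above, e.g. http://www.example.com/robots.txt.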
Nothing is perfect, and with the rate that Google changes its mind, who knows what the right thing to do is this month.
Once you have edited robots.txt, you don't need to do anything.
- except I just had a thought: how to get Google to remove those items from your Webmaster Tools. I think you should be able to tell them to purge those entries from GWT. Set it so you can see 500 to a page and then just cycle through and mark them as fixed.
-
Sorry to open this back up after a month, but in adding this to the robots.txt file, is there something that needs to be done within the code of the site? Or can I simply update the robots.txt file within Google Webmaster Tools?
I was hoping to get a response from Yoast on his blog post; it seems there were a number of questions similar to mine, but he never addressed them.
Thanks,
BJ
-
We all know nothing lasts forever.
A code change can do all kinds of things.
Things that were important are sometimes less important, or not important at all.
Sometimes yesterday's advice is no longer true.
If you make a change, or even if you make no change, but the crawler or the indexer changes, then we can be surprised at the results.
While working on this other thread:
http://www.seomoz.org/q/is-no-follow-ing-a-folder-influences-also-its-subfolders#post-74287
I did a test and checked my logs. A nofollow meta tag and a nofollow link do not stop the crawlers from following. What they do (we think) is stop PageRank from being passed. That is all they do.
That is why the robots.txt file is the only way to tell the crawlers to stop following down a tree (until there is another way).
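To illustrate how a per-crawler robots.txt section works, here is a minimal sketch using Python's standard-library robots.txt parser (the rogerbot rules and the /search/ path are assumptions for the example, not your actual file):

```python
import urllib.robotparser

# A robots.txt that blocks one crawler from the /search/ tree
# while leaving every other crawler unrestricted.
rules = """\
User-agent: rogerbot
Disallow: /search/

User-agent: *
Disallow:
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# rogerbot is told to stay out of the search tree...
print(parser.can_fetch("rogerbot", "http://example.com/search/candy"))   # False
# ...but other crawlers fall through to the unrestricted section.
print(parser.can_fetch("Googlebot", "http://example.com/search/candy"))  # True
```

Note this only models the "stop crawling" behavior; it says nothing about nofollow, which (as above) is a PageRank hint, not a crawling barrier.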
-
Ok, I've posted a question on the Yoast.com blog to see what other options we might have. Thanks for the help!
-
It is because Roger ignores those META tags.
Also, Google often ignores them too.
The robots.txt file is a much better option for those crawlers.
There are some crawlers that ignore the robots file too, but you have no control over them unless you can put their IPs in the firewall or add code to ignore all of their requests.
-
Ok, I just did a little more research into this, to see how Yoast was handling it within the plugin, and came across this article: http://yoast.com/example-robots-txt-wordpress/
In the article he states that this is already included within the plugin on search pages.
I just confirmed this by doing a search on my site and looking at the code: http://www.discountqueens.com/?s=candy
So this has always been in place. Why would the 100K not found links still be showing up?
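For reference, the kind of markup the plugin adds to search result pages is a meta robots tag, presumably something like this (check your own page source - the exact attributes are an assumption here):

```html
<meta name="robots" content="noindex,follow">
</meta>
```

A meta tag can only be seen after the crawler has already fetched the page, which would explain why crawl errors can keep piling up in GWT even with noindex in place.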
-
We didn't have these errors showing up previously, so that's why I was really suspicious. Also, we have Joost De Valk's SEO plugin installed on our site, and I thought there was an option to turn off the searches from being indexed?
-
Just to support Alan Gray's response, I'll say it's very important to block crawlers from your site search, because it not only throws errors (bots try to guess what to put in a search box), but any search results that get into the index will cause content conflicts, dilute ranking values, and, worst case, create the false impression that you have a lot of very thin / near-duplicate content pages.
-
The search bar results are good for searchers but not for search engines. You can stop all search engines and Roger (the SEOmoz crawler) from going into those pages by adding an entry to your robots.txt file. Roger only responds to his own section of the robots file, so anything you make global will not work for him.
User-agent: rogerbot
Disallow: /search/*
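Since WordPress search results live at URLs like /?s=term (as in the /?s=candy example above), a rule matching that pattern would look something like the following. This is an assumption based on that URL structure - verify it against the URLs actually reported in GWT before relying on it:

```
User-agent: rogerbot
Disallow: /?s=
```

Plain Disallow rules are prefix matches against the path and query string, so no trailing wildcard is needed here.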