100K Webmaster Central Not Found Links?
-
http://screencast.com/t/KLPVGTzM I just logged into our Webmaster Central account and found that it shows 100k links that are not found. After looking through them, they all appear to come from our search bar and lead to pages with no results. Are we doing something wrong here?
-
Yes, I read through that article yesterday and see that they recommend the same setting the Yoast plugin should already be applying. I never got a response from him, though, to see if something is missing.
For now, I plan on adding this to the robots.txt file and seeing what results I get.
Do you know how long it takes for the updates to show up in GWT? Will this update within a few weeks, or would it take longer than that?
Thanks for all the help!
BJ
-
Hello BJ.
The robots.txt file must be on your server, in the document root.
Here is information about how to configure robots.txt.
Note that it does have a warning at the end about how you could possibly lose some link juice, but that is probably a much smaller problem than the one you are trying to fix.
Nothing is perfect, and with the rate at which Google changes its mind, who knows what the right thing to do is this month.
Once you have edited robots.txt, you don't need to do anything else.
Except, I just had a thought about how to get Google to remove those items from your Webmaster Tools: you should be able to tell them to purge those entries from GWT. Set the view to 500 per page, then just cycle through and mark them as fixed.
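As a rough sketch (assuming the site runs WordPress and the internal search uses the default ?s= parameter), the entry you would add to robots.txt could look like this:
User-agent: *
Disallow: /?s=
Disallow: /search/
Adjust the paths to match however your search URLs are actually built; the point is simply to keep compliant crawlers out of the search result pages.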
-
Sorry to open this back up after a month. In adding this to the robots.txt file, is there something that needs to be done within the code of the site, or can I simply update the robots.txt file within Google Webmaster Tools?
I was hoping to get a response from Yoast on his blog post; there were a number of questions similar to mine, but he never addressed them.
Thanks,
BJ
-
We all know nothing lasts forever.
A code change can do all kinds of things.
Things that were important are sometimes less important, or not important at all.
Sometimes yesterday's advice is no longer true.
If you make a change, or even if you make no change but the crawler or the indexer changes, we can be surprised by the results.
While working on this other thread:
http://www.seomoz.org/q/is-no-follow-ing-a-folder-influences-also-its-subfolders#post-74287
I did a test and checked my logs. A nofollow meta tag and a nofollow link do not stop the crawlers from following. What they do (we think) is withhold PageRank from the link. That is all they do.
That is why the robots.txt file is the only way to tell the crawlers to stop following down a tree. (until there is another way)
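To illustrate (the /private/ folder name here is just a placeholder): a single Disallow line in robots.txt blocks that folder and everything beneath it:
User-agent: *
Disallow: /private/
That covers /private/, /private/sub/, /private/sub/page.html and so on, which is exactly the tree-wide blocking the meta tags cannot give you.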
-
OK, I've posted a question on the Yoast.com blog to see what other options we might have. Thanks for the help!
-
It is because Roger ignores those META tags.
Also, Google often ignores them too.
The robots.txt file is a much better option for those crawlers.
There are some crawlers that ignore the robots file too, but you have no control over them unless you can put their IPs in the firewall or add code to ignore all of their requests.
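If you do want to shut out a misbehaving crawler at the server level, a minimal sketch (assuming an Apache server with mod_rewrite enabled; "BadBot" is a placeholder user-agent string) would be something like this in .htaccess:
# Return 403 Forbidden to a specific user-agent
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} BadBot [NC]
RewriteRule .* - [F,L]
Blocking by IP in the firewall works the same way in principle, just lower in the stack.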
-
OK, I just did a little more research into this to see how Yoast was handling it within the plugin, and came across this article: http://yoast.com/example-robots-txt-wordpress/
In the article he states that a noindex directive is already added by the plugin on search pages.
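Assuming default plugin settings (I'm paraphrasing, since the exact snippet can vary by version), the tag it outputs on search result pages looks something like this:
<meta name="robots" content="noindex,follow" />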
I just confirmed this by running that search on my site and looking at the source code: http://www.discountqueens.com/?s=candy
So this has always been in place. Why would I still have the 100K not-found links showing up?
-
We didn't have these errors showing up previously, which is why I was really suspicious. Also, we have Joost de Valk's SEO plugin installed on our site, and I thought there was an option to stop the searches from being indexed.
-
Just to support Alan Gray's response: it's very important to block crawlers from your site search, not only because it throws errors (bots try to guess what to put in a search box), but also because any search results that get into the index will cause content conflicts, dilute ranking values and, worst case, create the false impression that you have a lot of very thin or near-duplicate content pages.
-
The search bar results are good for searchers but not for search engines. You can stop all search engines and Roger (the SEOmoz crawler) from going into those pages by adding an entry to your robots.txt file. Roger only responds to his own section of the robots file, so anything you make global will not work for him.
User-agent: rogerbot
Disallow: /search/*
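So, assuming the search URLs live under /search/ as in that snippet (swap in /?s= if that is how your search URLs are formed), a file that covers both the search engines and Roger would need the rule in both sections, for example:
User-agent: *
Disallow: /search/*

User-agent: rogerbot
Disallow: /search/*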