100K Webmaster Central Not Found Links?
-
http://screencast.com/t/KLPVGTzM I just logged into our Webmaster Central account to find that it shows 100k links that are not found. After searching through all of them, they all appear to be from our search bar, with no results. Are we doing something wrong here?
-
Ya, I read through that article yesterday & see that they recommend the same setting the Yoast plugin should already be applying. Although I never got a response from them to see if there is something missing.
For now, I plan on adding this to the robots.txt file & seeing what results I get.
Do you know the time frame it takes for the updates to show in GWT? Will this update within a few weeks, or would it take longer than that?
Thanks for all the help!
BJ
-
Hello BJ.
The robots.txt file must be on your server, in the document root.
Here is information about how to configure robots.txt
Note that it does have a warning at the end about how you could possibly lose some link juice, but that is probably a much smaller problem than the one you are trying to fix.
Nothing is perfect, and with the rate that google changes its mind, who knows what is the right thing to do this month.
Once you have edited robots.txt, you don't need to do anything.
- except I just had a thought: how to get Google to remove those items from your Webmaster Tools. You should be able to tell it to purge those entries from GWT. Set it so you can see 500 to a page, then just cycle through and mark them fixed.
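If it helps, here is a minimal robots.txt sketch for blocking WordPress internal search results (assuming the default `?s=` search parameter; adjust the patterns if your search URLs look different):

```text
User-agent: *
Disallow: /?s=
Disallow: /search/
```

Crawlers that honor robots.txt match these rules as URL prefixes, so a URL like `/?s=candy` falls under `Disallow: /?s=`.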
-
Sorry to open this back up after a month. In adding this to the robots.txt file, is there something that needs to be done within the code of the site? Or can I simply update the robots.txt file within Google Webmaster Tools?
I was hoping to get a response from Yoast on his blog post; it seems there were a number of questions similar to mine, but he never addressed them.
Thanks,
BJ
-
We all know nothing lasts forever.
A code change can do all kinds of things.
Things that were important are sometimes less important, or not important at all.
Sometimes yesterday's advice is no longer true.
If you make a change, or even if you make no change, but the crawler or the indexer changes, then we can be surprised at the results.
While working on this other thread:
http://www.seomoz.org/q/is-no-follow-ing-a-folder-influences-also-its-subfolders#post-74287
I did a test and checked my logs. A nofollow meta tag and a nofollow link do not stop the crawlers from following. What they do (we think) is stop the passing of PageRank. That is all they do.
That is why the robots.txt file is the only way to tell the crawlers to stop following down a tree. (until there is another way)
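For what it's worth, you can sanity-check a robots.txt rule before deploying it with Python's standard-library parser (a quick sketch; the `/?s=` pattern and URLs are just examples for a WordPress-style site search):

```python
# Quick check that a Disallow rule actually blocks WordPress search URLs.
# urllib.robotparser is part of the Python standard library.
import urllib.robotparser

rules = """
User-agent: *
Disallow: /?s=
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Internal search results are blocked for all user agents...
print(rp.can_fetch("*", "http://www.discountqueens.com/?s=candy"))  # False
# ...while normal pages remain crawlable.
print(rp.can_fetch("*", "http://www.discountqueens.com/about/"))    # True
```

Note this only tells you what well-behaved crawlers should do; as mentioned, some crawlers ignore the file entirely.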
-
Ok, I've posted a question on the Yoast.com blog to see what other options we might have. Thanks for the help!
-
It is because Roger ignores those META tags.
Also, google often ignores them too.
The robots.txt file is a much better option for those crawlers.
There are some crawlers that ignore the robots file too, but you have no control over them unless you can put their IPs in the firewall or add code to ignore all of their requests.
-
Ok, I just did a little more research into this, to see how Yoast was handling this within the plugin & came across this article: http://yoast.com/example-robots-txt-wordpress/
In the article he states that this is already included within the plugin on search pages:
I just confirmed this by doing this search on my site & looking at the code: http://www.discountqueens.com/?s=candy
So this has always been in place. Why would I still have the 100K not found links showing up?
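For reference, what the plugin puts on search result pages is a robots meta tag; viewing source on a search URL like the one above should show something along these lines (the exact attribute values may vary by plugin version):

```text
<meta name="robots" content="noindex,follow" />
```

One thing worth noting: a noindex tag keeps the pages out of the index, but Google still has to crawl each URL to see the tag, so crawl errors from bad search URLs can still pile up in GWT. Only robots.txt stops the crawling itself.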
-
We didn't have these errors showing up previously, so that's why I was really suspicious. Also, we have Joost de Valk's SEO plugin installed on our site & I thought there was an option to prevent the searches from being indexed?
-
Just to support Alan Gray's response, I'll say it's very important to block crawlers from your site search, not only because it throws errors (bots try to guess what to put in a search box), but also because any search results that get into the index will cause content conflicts, dilute ranking values, and, worst case, create the false impression that you have a lot of very thin / near-duplicate content pages.
-
The search bar results are good for searchers but not for search engines. You can stop all search engines and Roger (the SEOmoz crawler) from going into those pages by adding an entry to your robots.txt file. Roger only responds to his own section of the robots file, so anything you make global will not work for him.
User-agent: rogerbot
Disallow: /search/*
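Since Roger only reads his own section, a file that needs to cover both Google and Roger has to repeat the rule in each section. A sketch (and note that this site's search URLs look like `/?s=candy`, so a `?s=` pattern may be needed alongside, or instead of, `/search/*`):

```text
User-agent: rogerbot
Disallow: /search/*
Disallow: /?s=

User-agent: *
Disallow: /search/*
Disallow: /?s=
```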