Internal linking question
-
Hi there. Are all internal links listed in GWMT actually indexed?
-
Jonnygeekuk,
If GWT is telling you it is "aware" (whether indexed or not) of URLs that you do not want indexed, and you have either blocked them in robots.txt, added a noindex robots meta tag or X-Robots-Tag header, or set the pages to return a 404 or 410 HTTP status, it wouldn't hurt to use the URL removal tool to remove those pages from the index, just to be sure.
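To make the logic above concrete, here's a small hypothetical helper (the function name and parameters are made up for illustration, not a real API): a URL is a safe candidate for the removal tool once Google has already been told, by one of those mechanisms, not to index it.

```python
def removal_tool_safe(status_code, blocked_by_robots=False, noindex=False):
    """A URL is a safe candidate for the URL removal tool when Google is
    already being told not to index it: it returns 404/410, is disallowed
    in robots.txt, or carries a noindex meta tag / X-Robots-Tag header."""
    return status_code in (404, 410) or blocked_by_robots or noindex

# A page that 404s: safe to request removal.
print(removal_tool_safe(404))                          # True
# A live page blocked in robots.txt: also safe.
print(removal_tool_safe(200, blocked_by_robots=True))  # True
# A live, crawlable, indexable page: removal would only be temporary.
print(removal_tool_safe(200))                          # False
```

The point of the last case is that without one of those blocks in place, the removal tool only hides the page; it can come back once the removal expires.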
-
So, sounds like you're looking for a list of indexed pages? Will this tool help?
http://www.intavant.com/tools/google-indexed-pages-extractor/
-
I'm sorry it's taken me so long to get back to you on this. You said you're using the removal tool in Google Webmaster Tools?
I want to be certain you're not using the link disavow tool as a removal tool; is that correct?
"Google updates its entire index regularly. When we crawl the web, we automatically find new pages, remove outdated links, and reflect updates to existing pages, keeping the Google index fresh and as up-to-date as possible.
If outdated pages from your site appear in the search results, ensure that the pages return a status of either 404 (not found) or 410 (gone) in the header. These status codes tell Googlebot that the requested URL isn't valid. Some servers are misconfigured to return a status of 200 (Successful) for pages that don't exist, which tells Googlebot that the requested URLs are valid and should be indexed. If a page returns a true 404 error via the http headers, anyone can remove it from the Google index using the webpage removal request tool. Outdated pages that don't return true 404 errors usually fall out of our index naturally when other pages stop linking to them."
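The 200-vs-404 distinction in that help text lives in the HTTP response headers, regardless of what the page body says. Here's a self-contained sketch using only Python's standard library and a throwaway local server (paths and port are made up for the demo):

```python
import http.server
import threading
from http.client import HTTPConnection

class Handler(http.server.BaseHTTPRequestHandler):
    # Only /valid-page "exists"; every other path returns a true 404
    # in the HTTP headers, which is what makes the removal tool usable.
    def do_HEAD(self):
        self.send_response(200 if self.path == "/valid-page" else 404)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo quiet

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

def status_of(path):
    """Return the HTTP status code for a HEAD request to the demo server."""
    conn = HTTPConnection("127.0.0.1", server.server_address[1])
    conn.request("HEAD", path)
    status = conn.getresponse().status
    conn.close()
    return status

live = status_of("/valid-page")  # 200: a real page
gone = status_of("/old-page")    # 404: eligible for the removal tool
print(live, gone)
server.shutdown()
```

A misconfigured server would answer 200 for `/old-page` too (a "soft 404"), which is exactly the case the help text warns tells Googlebot the URL is valid and should stay indexed.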
"
Reincluding content in search
"Content removed using the URL removal tool will not appear in search results for a minimum of 90 days or until the content has been removed from the Google index. However, if you've updated robots.txt, added meta tags, or password-protected content to prevent it being crawled, the content should naturally have dropped out of our index, and you shouldn't need to worry about it reappearing after 90 days. You can reinclude your content at any time during the 90-day period by following the steps below.
Reinclude content:
- On the Webmaster Tools Home page, click the site you want.
- In the left-hand menu, click Optimization, and then click Remove URLs.
- Select the Removed content tab, and then click Reinclude next to the content you want to reinclude in the Google index.
Pending requests are usually processed within 3-5 business days."
-
Hi Chris, Thomas
Thanks for taking the time to reply.
Essentially, the reason I'm asking is that the site in question recently became heavily over-indexed when search filter pages and the like got into the index, which left a ton of thin content indexed. We've since noindexed these pages, but they are taking time to drop off, so we're helping things along with the removal tool in GWMT. A lot of these pages are hidden and difficult to find in the main index, but Index Status says we still have over 7k pages indexed when we really should have fewer than 2k. A site: command reveals about 9k, but only 600 are listed and those are all valid pages. Basically, we're trying to find the URLs to remove, and we noticed that a lot of them are listed in the Internal Links tab in GWMT. I just wondered whether it was advisable to remove these too, in addition to the 2.5k we have already removed.
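One way to hunt down removal candidates in that situation is to diff the URLs Google reports (e.g. exported from the Internal Links tab or scraped from a site: query) against your own canonical list of valid pages. A minimal sketch, with entirely hypothetical sample data standing in for those exports:

```python
# Hypothetical sample data: in practice, export these lists from GWMT
# (Internal Links / Index Status) and from your own sitemap.
reported_urls = {
    "http://example.com/shoes/",
    "http://example.com/shoes/?filter=red&sort=price",  # thin filter page
    "http://example.com/about/",
    "http://example.com/shoes/?sessionid=abc123",       # session junk
}
valid_urls = {
    "http://example.com/shoes/",
    "http://example.com/about/",
}

# URLs Google knows about that are not legitimate pages:
removal_candidates = sorted(reported_urls - valid_urls)
for url in removal_candidates:
    print(url)
```

Everything in the first set but not the second is a candidate for noindexing and, once blocked, for the removal tool.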
-
Hi Johnny, I agree with what Chris stated above, if you're looking for someone to confirm that. You also want to make sure you don't have more than 100 to 150 internal links on any one page; going far beyond that can hurt how well Google indexes the site.
I also use a tool for building internal links, if that is what you are speaking of: http://scribecontent.com. You can use it not only on WordPress but on all sites. I have found it extremely useful, but be cautious about how many links you build internally so that you don't create a page that can't be indexed correctly.
http://www.distilled.net/u/search-engine-basics/#crawling
I hope I've been of help,
Thomas
-
Hey JonnyG,
Be sure not to confuse links with URLs. Essentially, a link is a clickable element on a web page that, when clicked, takes the user to another URL. A URL is an address (not clickable). A web page is the resource that exists at a URL.
Anyway, the Internal Links tab shows how many links exist on your site that can take you to other pages on your site. However, if you click on the Health | Index Status tab, you'll get choices to see Basic and Advanced info on your indexed URLs. In the Advanced tab, you'll see the total number of pages Google has indexed on your site. Google's Webmaster Tools Help has a page on Index Status for more info.
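The distinction matters when reading the Internal Links numbers, because one URL can be the target of many links. A small illustration using Python's standard html.parser (the sample HTML is made up):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects every href found in an <a> tag on the page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical page: three links, but only two distinct URLs.
html = """
<p><a href="/shoes/">Shoes</a>
   <a href="/about/">About</a>
   <a href="/shoes/">More shoes</a></p>
"""

parser = LinkCollector()
parser.feed(html)
print(len(parser.links))       # number of links on the page: 3
print(len(set(parser.links)))  # number of distinct URLs they point to: 2
```

So a large count in the Internal Links tab doesn't by itself mean that many URLs exist, let alone that they're all indexed.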