Webpages look like they have been de-indexed
-
Hi there,
My webpages seem to have been de-indexed. I have no PageRank any more for my webpages: my homepage, which was a PR4, is now showing N/A, and lots of my rankings have dropped. What checks should I be making to confirm that this is the case?
Kind Regards
-
No problem Gary, good luck.
Matt.
-
Thanks for your help Matt, will keep you posted on the progress of this.
Kind Regards
-
Hi Gary,
I would assume that this is the root of the problem. Google themselves have said the following:
"If you've experienced a change in ranking which you suspect may be more than a simple algorithm change, there are other things you may want to investigate as possible causes, such as a major change to your site's content, content management system, or server architecture. For example, a site may not rank well if your server stops serving pages to Googlebot, or if you've changed the URLs for a large portion of your site's pages. This article has a list of other potential reasons your site may not be doing well in search."
Once you have addressed the issues, go into Webmaster Tools and select your website. Then click on 'Health', then 'Fetch as Google'. Fetch your homepage and then click on 'Submit URL and Linked Pages to Index'.
Only do this once you have rectified your server issues though.
Matt.
-
Hi Matt,
I have just checked Webmaster Tools and saw that there are over 900 server errors. It does seem likely that this is the reason why all of these pages have dropped and no page has any PR. What do you think?
Is there a way of submitting all of these at once?
Kind Regards
-
Hi Gary,
Yes, that sounds like a server problem. If you are getting a lot of 500 errors then Google will punish your website for it, as they will assume that it isn't functioning correctly.
Take a look into the pages that are getting 500 errors and have a word with your web hosting company to see if they can look into the root of the problem with the server.
Once you have managed to sort out the issues with your server, go into Google Webmaster Tools and ask Google to fetch your site using Googlebot, then if there are no errors, re-submit the site to be crawled and indexed.
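If it helps, you can re-test the flagged URLs yourself before going back to the host. Below is a minimal sketch in Python using only the standard library; the user-agent string and the idea of feeding it a plain list of URLs exported from Webmaster Tools are my assumptions, not a prescribed workflow:

```python
# Rough sketch: re-check a list of URLs for 5xx responses.
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def fetch_status(url, timeout=10):
    """Return the HTTP status code for a URL, or None if unreachable."""
    req = Request(url, headers={"User-Agent": "status-checker"})
    try:
        with urlopen(req, timeout=timeout) as resp:
            return resp.status
    except HTTPError as e:
        return e.code      # urllib raises HTTPError for 4xx/5xx responses
    except URLError:
        return None        # DNS failure, connection refused, timeout, etc.

def is_server_error(code):
    """True for any 5xx status code."""
    return code is not None and 500 <= code < 600

def find_server_errors(urls):
    """Return the subset of URLs currently answering with a 5xx status."""
    return [u for u in urls if is_server_error(fetch_status(u))]
```

Running `find_server_errors` over the same list again after the host has fixed things is a quick way to confirm the errors are gone before you re-submit to Google.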
Hope this helps.
Matt.
-
Hi Matt,
It all happened the other day. I just received a notice that SEOmoz has crawled my website and noticed that there are literally hundreds of 500 errors on my webpages. Could this be the reason why there is no PR and all my rankings have dropped?
Kind Regards
-
Hi Gary,
This could be for a number of reasons, bearing in mind that there have been a shed load of new algorithm updates from Google recently! The first step to take is to try to identify the rough date that you started to lose rankings.
This way you will be able to understand which update may have affected your website. Go into your Google Analytics and have a look at the organic search traffic coming to your website; if it is showing a dip around the end (24th) of April, then you may have been hit by the Penguin update (like a lot of us).
The Penguin update punished a lot of websites for their linking methods, i.e. having a lot of the same anchor text on links back to your website, having links in the footer areas of websites, duplicating links across single domains many times, linking from link-network websites, etc. If you think that your website may have any links like these then you may be at risk.
Take a look at your Google Webmaster Tools as well as http://www.opensiteexplorer.org/ and take a deeper look into your linking profile to see who is linking to you and how they are linking. If you start to see a pattern emerging, contact the sites and see if you can get those bad links changed.
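To spot that kind of pattern quickly, you can tally the anchor texts in a links export. Here is a rough sketch; the CSV layout, the "Anchor Text" column name, and the 25% threshold are all illustrative assumptions rather than the actual Open Site Explorer export format:

```python
# Sketch: find anchor texts that dominate a backlink profile.
# Column names and the threshold are assumptions, not a real export spec.
import csv
from collections import Counter

def anchor_text_distribution(csv_path, anchor_col="Anchor Text"):
    """Count how often each anchor text appears in a backlink export."""
    counts = Counter()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[row[anchor_col].strip().lower()] += 1
    return counts

def suspicious_anchors(counts, threshold=0.25):
    """Return anchors that make up more than `threshold` of all links."""
    total = sum(counts.values())
    return {a: n for a, n in counts.items() if total and n / total > threshold}
```

A natural profile is usually dominated by brand and URL anchors; if one exact-match commercial phrase accounts for a large share of links, that is the kind of pattern Penguin targeted.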
If you think that everything in terms of links looks good then let me know and I will try to help find what else it could be.
Matt.
Related Questions
-
Index Bloat: Canonicalize, Redirect or Delete URLs?
I was doing some simple on-page recommendations for a client and realized that they have a bit of a website bloat problem. They are an ecommerce shoe store, and for one product there can be 10+ URLs. For example, this is what ONE product looks like:
example.com/products/shoename-color1
example.com/products/shoename-color2
example.com/collections/style/products/shoename-color1
example.com/collections/style/products/shoename-color2
example.com/collections/adifferentstyle/products/shoename-color1
example.com/collections/adifferentstyle/products/shoename-color2
example.com/collections/shop-latest-styles/products/shoename-color1
example.com/collections/shop-latest-styles/products/shoename-color2
example.com/collections/all/products/shoename-color1
example.com/collections/all/products/shoename-color2
...and so on... all for the same shoe. They have about 20-30 shoes altogether, and some come in 4-5 colors. This has caused some major bloat on their site and, I assume, some confusion for the search engine. That said, I'm trying to figure out the best way to tackle this from an SEO perspective. Here's where I've gotten to so far: is it better to canonicalize all URLs, referencing back to one "main" one; to delete all bloat pages and re-link everything to the main one(s); or to 301 redirect the bloat URLs back to the "main" one(s)? Or is there another option that I haven't considered? Thanks!
Intermediate & Advanced SEO | | AJTSEO0 -
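Whichever of the three options is chosen, the collection URLs in the example above collapse to their product URL mechanically, which makes it easy to generate a canonical or 301 map in bulk. A sketch under that assumption (the URL pattern is taken from the question; the helper names are hypothetical):

```python
# Sketch: map each /collections/<x>/products/<slug> URL to its bare
# /products/<slug> equivalent, e.g. to build a rel=canonical or 301 map.
import re

def canonical_product_url(url):
    """Strip any /collections/<segment> prefix that precedes /products/."""
    return re.sub(r"/collections/[^/]+(?=/products/)", "", url)

def build_redirect_map(urls):
    """Pair every bloat URL with its canonical target, skipping URLs
    that are already canonical."""
    return {u: canonical_product_url(u)
            for u in urls if canonical_product_url(u) != u}
```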
No Index thousands of thin content pages?
Hello all! I'm working on a site that features a service marketed to community leaders that allows the citizens of that community log 311 type issues such as potholes, broken streetlights, etc. The "marketing" front of the site is 10-12 pages of content to be optimized for the community leader searchers however, as you can imagine there are thousands and thousands of pages of one or two line complaints such as, "There is a pothole on Main St. and 3rd." These complaint pages are not about the service, and I'm thinking not helpful to my end goal of gaining awareness of the service through search for the community leaders. Community leaders are searching for "311 request service", not "potholes on main street". Should all of these "complaint" pages be NOINDEX'd? What if there are a number of quality links pointing to the complaint pages? Do I have to worry about losing Domain Authority if I do NOINDEX them? Thanks for any input. Ken
Intermediate & Advanced SEO | | KenSchaefer0 -
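If the complaint pages do get noindexed, one low-risk pattern is to emit the robots meta tag from the page template based on the URL, using "noindex, follow" rather than "noindex, nofollow" so that any equity from links pointing at those pages can still flow through them. A sketch; the "/requests/" prefix is a hypothetical stand-in for wherever the complaint pages actually live:

```python
# Sketch: choose a robots meta tag per page type. The "/requests/"
# prefix is a hypothetical placeholder for the complaint-page URLs.
def robots_meta(path):
    """Return the robots meta tag appropriate for a given URL path."""
    if path.startswith("/requests/"):
        # Thin complaint pages: keep them out of the index but let
        # crawlers follow their links so link equity still flows.
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'
```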
Fetch as Google -- Does not result in pages getting indexed
I run an exotic pet website which currently lists several species of reptiles. It has done well in SERPs for the first couple of species, but I am continuing to add new ones, and each brings the task of getting ranked, so I need to figure out the best process.
We just released our 4th species, "reticulated pythons", about 2 weeks ago. I made these pages public and in Webmaster Tools did a "Fetch as Google" with index page and child pages for this page: http://www.morphmarket.com/c/reptiles/pythons/reticulated-pythons/index
While Google immediately indexed the index page, it did not really index the couple of dozen pages linked from this page, despite me checking the option to crawl child pages. I know this in two ways. First, in Google Webmaster Tools, if I look at Search Analytics and Pages filtered by "retic", there are only 2 listed; this at least tells me it's not showing these pages to users. More directly, if I do a Google search for "site:morphmarket.com/c/reptiles/pythons/reticulated-pythons", there are only 7 pages indexed.
More details: I've tested at least one of these URLs with the robots checker and they are not blocked. The canonical values look right. I have not monkeyed really with Crawl URL Parameters. I do NOT have these pages listed in my sitemap, but in my experience Google didn't care a lot about that; I previously had about 100 pages there and Google didn't index some of them for more than 1 year. Google has indexed "105k" pages from my site, so it is very happy to index, apparently just not the pages I want (that large value is due to permutations of search parameters, something I think I've since improved with canonical, robots, etc). I may have some nofollow links to the same URLs, but NOT on this page, so assuming nofollow has only local effects, this shouldn't matter.
Any advice on what could be going wrong here? I really want Google to index the top couple of links on this page (home, index, stores, calculator) as well as the couple dozen gene/tag links below.
Intermediate & Advanced SEO | | jplehmann0 -
How to do Country specific indexing ?
We are a business that operates in South East Asian countries and has medical professionals listed in Thailand, the Philippines and Indonesia. When I go to Google Philippines and check, I can see indexing of pages from all countries and no Philippines pages. The Philippines is where we launched recently. How can I tell Google Philippines to give more priority to pages from the Philippines and not from other countries? Can someone help?
Intermediate & Advanced SEO | | ozil0 -
HTTPS pages - To meta no-index or not to meta no-index?
I am working on a client's site at the moment and I noticed that both HTTP and HTTPS versions of certain pages are indexed by Google and both show in the SERPS when you search for the content of these pages. I just wanted to get various opinions on whether HTTPS pages should have a meta no-index tag through an htaccess rule or whether they should be left as is.
Intermediate & Advanced SEO | | Jamie.Stevens0 -
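For reference, the htaccess approach mentioned in the question usually looks something like the following on Apache 2.4+ with mod_headers. This is a sketch of the mechanism being discussed, not a recommendation either way; a rel=canonical from each HTTPS page to its HTTP twin is the other commonly used fix for this duplicate-indexing situation:

```apache
# Sketch (Apache 2.4+, mod_headers): send a noindex header only on
# HTTPS responses, leaving the page HTML itself untouched.
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow" "expr=%{HTTPS} == 'on'"
</IfModule>
```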
Huge Google index on E-commerce site
Hi Guys, I got a question which I can't understand. I'm working on an e-commerce site which recently got a CMS update, including URL updates.
We did a lot of 301s on the old URLs (around 3,000-4,000 I guess) and submitted a new sitemap (around 12,000 URLs, of which 10,500 are indexed). The strange thing is: when I check the indexing status in Webmaster Tools, Google tells me there are over 98,000 URLs indexed.
Doing a site:domainx.com search, Google tells me there are 111,000 URLs indexed. Another strange thing, which another forum member describes here: Cache date has been reverted. And next to that, old URLs (which have had a 301 for about a month now) keep showing up in the index. Does anyone know what I could do to solve the problem?
Intermediate & Advanced SEO | | ssiebn70 -
Indexing/Sitemap - I must be wrong
Hi All, I would guess that a great number of us new to SEO (or not) share some simple beliefs in relation to Google indexing and sitemaps, and as such get confused by what Webmaster Tools shows us. It would be great if someone with experience/knowledge could clear this up once and for all 🙂
Common beliefs:
1. Google will crawl your site from the top down, following each link and recursively repeating the process until it bottoms out/becomes cyclic.
2. A sitemap can be provided that outlines the definitive structure of the site, and is especially useful for links that may not be easily discovered via crawling.
3. In Google's Webmaster Tools, in the sitemap section, the number of pages indexed shows the number of pages in your sitemap that Google considers worthwhile indexing.
4. If you place a rel="canonical" tag on every page pointing to the definitive version, you will avoid duplicate content and aid Google in its indexing endeavour.
These preconceptions seem fair, but must be flawed. Our site has 1,417 pages as listed in our sitemap. Google's tools tell us there are no issues with this sitemap, but a mere 44 are indexed! We submit 2,716 images (because we create all our own images for products) and a disappointing zero are indexed. Under Health -> Index Status in WM Tools, we apparently have 4,169 pages indexed; I tend to assume these are old pages that now yield a 404 if they are visited. It could be that Google's indexed figure of 44 means "pages indexed by virtue of your sitemap, i.e. we didn't find them by crawling, so thanks for that", but despite trawling through Google's help, I don't really get that feeling. This is basic stuff, but I suspect a great number of us struggle to understand the disparity between our expectations and what WM Tools yields, and we go on to either ignore an important problem or waste time on non-issues. Can anyone shine a light on this once and for all?
If you are interested, our map looks like this: http://www.1010direct.com/Sitemap.xml
Many thanks, Paul
Intermediate & Advanced SEO | | fretts0 -
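As a sanity check on the sitemap side of the question above, it is easy to pull the URL list out of a sitemap file and compare its length against what Webmaster Tools or a site: query reports. A minimal sketch, assuming a plain urlset file rather than a sitemap index; the namespace below is the standard sitemaps.org one:

```python
# Sketch: list the URLs a sitemap actually declares, so the count can
# be compared with what Webmaster Tools reports as indexed.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract every <loc> entry from a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]
```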
Should you stop indexing of short lived pages?
In my site there will be a lot of pages that have a short life span of about a week, as they are items on sale. Should I nofollow the links, meaning the site has a few hundred pages, or allow indexing and have thousands, but then have lots of links to pages that no longer exist? If allowing indexing, I would of course make sure the expired page links do not error and instead send visitors to a similarly relevant page. But which is best for me with the search engines? I would like the option of loads of links and loads of content pages, but not if it is detrimental. Thanks
Intermediate & Advanced SEO | | barney30120