Huge Google index on E-commerce site
-
Hi Guys,
Referring back to my original post, I'd first like to thank you all for the advice.
We implemented canonical URLs across the site and blocked some URLs with robots.txt, and the site has already gone from 100,000+ URLs indexed to 87,000 URLs indexed in GWT.
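For readers following along, a minimal sketch of what such a canonical setup can look like (the paths here are hypothetical examples, not the actual URLs from this site):

```html
<!-- In the <head> of each duplicate or parameter URL,
     point search engines at the preferred version -->
<link rel="canonical" href="http://www.example.com/category/blue-widget" />
```

Every sortable or filterable variant of a product page can then carry the same canonical, so Google consolidates them into one indexed URL.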
My question: is there a way to speed this up?
I do know about the way to remove URLs from the index (with a noindex directive or a robots.txt condition), but this is very labour-intensive. I was hoping you guys might have a solution for this.
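One caveat worth noting here (this is general robots behaviour, not specific to this site): a robots.txt Disallow only stops crawling; it does not remove URLs that are already indexed, and it prevents Googlebot from ever seeing a noindex on those pages. To get pages dropped faster, they usually need to stay crawlable while serving a noindex directive, for example:

```html
<!-- Meta tag version, placed in the page <head> -->
<meta name="robots" content="noindex, follow">
```

For non-HTML resources such as PDFs, the equivalent is an `X-Robots-Tag: noindex` HTTP response header. Once Google has recrawled and dropped the pages, a robots.txt block can be reinstated if desired.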
-
Hi,
A few weeks later, the index is now down to 63,000 URLs, so that's a good thing.
Another weird thing is the following.
There's an old URL still in the index. When I visit it, it redirects me to the new URL, which is good. The cache date is two weeks ago, but Google still shows the old URL.
How is this possible? The 301 redirect has been in place since April 2013.
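For anyone checking their own setup, a redirect like the one described would typically look something like this on Apache (the paths are hypothetical):

```apache
# .htaccess — permanently redirect the old URL to the new one
Redirect 301 /old-product-page http://www.example.com/new-product-page
```

A 301 tells Google the move is permanent, but the old URL only drops out of the index after Googlebot recrawls it and processes the redirect, which can lag well behind the cache date shown in the SERP.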
-
Hi Allen Jarosz!
Thanks for your reply.
I've actually done all the things you mentioned over the last few weeks. The site is fully indexed, but the main problem is that over 85,000 URLs are indexed while the site only consists of 13,000 URLs.
So the main question is whether I can speed things up in one way or another to get those 70,000+ URLs deindexed. Are there any options besides noindex, robots.txt and removing some URLs? Because right now it's just a matter of waiting.
It looks like we're heading in the right direction when you check the image.
-
SSiebn,
I have had some success in speeding things up, but only to a point.
Google Webmaster Tools is a GREAT tool that Google fortunately lets us use, and it's free! I'm sure you probably already use the service, but I have found a few ways to use the tools to improve the crawl rate:

1. Block the spiders from crawling any pages you don't want indexed, for instance your backend files. This lets more crawl time be spent on the pages you do want indexed.
2. Ensure your pages link to each other within the site, so crawlers can flow from page to page (no dead ends).
3. Use "Fetch as Google" in WMT. You are allowed up to 10 fetches that also follow linked pages; once crawled, you may submit the results to the Google index, with up to 500 individual URL submissions. It may be beneficial to submit your main categories via "Fetch as Google".
4. Check your "Crawl Rate" setting to ensure you have chosen "Let Google optimize for my site (recommended)".
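Allen's second point, no dead ends, can be checked programmatically. Here is a small sketch (illustrative only; `pages` and `links` would come from your own crawl data, and the URLs below are made up) that finds pages unreachable from the homepage by following internal links:

```python
from collections import deque

def unreachable_pages(pages, links, start):
    """Return pages that can't be reached from `start` via internal links.

    pages: set of all known URLs on the site
    links: dict mapping each URL to the URLs it links to
    start: the homepage, where crawlers usually begin
    """
    seen = {start}
    queue = deque([start])
    while queue:
        url = queue.popleft()
        for target in links.get(url, ()):
            if target in pages and target not in seen:
                seen.add(target)
                queue.append(target)
    return pages - seen

# Hypothetical three-page site where /orphan has no inbound links
site = {"/", "/category", "/orphan"}
graph = {"/": ["/category"], "/category": ["/"]}
print(unreachable_pages(site, graph, "/"))  # -> {'/orphan'}
```

Any URL this returns is one that crawlers can only discover via the sitemap, so it is worth linking it from a category or hub page.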