Recovering from index problem (Take two)
-
Hi all. This is my second pass at the problem. Thank you for your responses before, I think I'm narrowing it down!
Below is my original message. Afterwards, I've added some update info.
For a while, we've been working on http://thewilddeckcompany.co.uk/. Everything was going swimmingly, and we had a top 5 ranking for the term 'bird hides' for this page - http://thewilddeckcompany.co.uk/products/bird-hides.
Then disaster struck! The client added a link with a faulty parameter in the Joomla back end that caused a bunch of duplicate content issues. Before this happened, all the site's 19 pages were indexed. Now it's just a handful, including the faulty URL (thewilddeckcompany.co.uk/index.php?id=13).
This shows the issue pretty clearly.
I've removed the link, redirected the bad URL, updated the sitemap and got some new links pointing at the site to resolve the problem. Yet almost two months later, the bad URL is still showing in the SERPs and the indexing problem is still there.
UPDATE
OK, since then I've blocked the faulty parameter in the robots.txt file. Now that page has disappeared, but the right one - http://thewilddeckcompany.co.uk/products/bird-hides - has not been indexed. It's been like this for several weeks.
Any ideas would be much appreciated!
-
Thank you all, this is brilliant.
-
Your problem is with the robots.txt file. You are blocking the URL
thewilddeckcompany.co.uk/index.php?id=13
That URL 301 redirects to the correct URL of
http://thewilddeckcompany.co.uk/products/bird-hides
Google cannot "see" the 301 redirect from the old "bad" URLs to the new "good" URL.
You have to let Google crawl the old URLs and see the 301 redirects so that it knows how things need to forward.
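To make the mechanics concrete, here is a small sketch using Python's standard-library robots.txt parser, fed a hypothetical rule like the one described in this thread. A crawler that honours this file never requests the old URL at all, so it never gets the chance to observe the 301 sitting on it.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt resembling the one described above:
# it blocks the faulty parameterised URL outright.
robots_txt = """\
User-agent: *
Disallow: /index.php?id=13
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# The faulty URL is off-limits, so the 301 on it is invisible to the bot...
blocked = not rp.can_fetch("Googlebot", "http://thewilddeckcompany.co.uk/index.php?id=13")
print(blocked)  # True

# ...while the "good" URL remains crawlable.
allowed = rp.can_fetch("Googlebot", "http://thewilddeckcompany.co.uk/products/bird-hides")
print(allowed)  # True
```

Remove the Disallow line and `can_fetch` returns True for the old URL too, which is the state you want: the bot fetches it, receives the 301, and updates the index accordingly.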
I would do this for all the duplicate pages: make sure they 301 to the correct pages, and do not put the "bad" pages in robots.txt, otherwise the index will not be updated.
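For reference, a minimal sketch of the two pieces involved, assuming an Apache host with mod_rewrite (the usual Joomla setup) and the single faulty id=13 parameter from this thread; your actual rules will differ:

```apache
# --- .htaccess: 301 the faulty parameterised URL to the canonical page.
# The trailing "?" on the target drops the query string from the redirect.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^id=13$
RewriteRule ^index\.php$ http://thewilddeckcompany.co.uk/products/bird-hides? [R=301,L]

# --- robots.txt: no Disallow for the old URL, so the 301 stays crawlable.
User-agent: *
Disallow:
```

With this in place Googlebot can still request the old URL, sees the 301, and consolidates the pages over its next few crawls.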
Something separate to check. We have seen Google taking a while to acknowledge some of our 301s. Go into your GWT and look at your duplicate title reports. You may see the old and new URLs showing as duplicates, even with the 301s in place. We had to set up a self-canonicalizing link on the "good" pages to help get that cleaned up.
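A self-canonicalizing link of the kind described above is just a one-line addition to the head of the "good" page, pointing at its own preferred URL (sketched here for the page from this thread):

```html
<!-- In the <head> of http://thewilddeckcompany.co.uk/products/bird-hides itself -->
<link rel="canonical" href="http://thewilddeckcompany.co.uk/products/bird-hides" />
```

This tells Google which URL should receive the consolidated ranking signals, even while duplicate parameterised URLs are still being recrawled.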
-
Blink-SEO
Jonathan is correct to try a Fetch as Google in WMT for the URLs you need re-indexed. (Note, that is not really the purpose of a Fetch as Google, but sometimes it works.)
I would also resubmit the sitemap now that you have blocked the offending URL with robots.txt. It is likely the resubmission will help you the quickest, IMO.
Best,
Robert
-
It sounds like you just need to wait for Google to recrawl your robots.txt file. I saw this error in the serps:
www.thewilddeckcompany.co.uk/products/timber-water...
A description for this result is not available because of this site's robots.txt – learn more.
So it is clear that the robots.txt file has not been updated with the changes since the mistake was made. Try fetching as Googlebot within Webmaster Tools, but it may take a little time to update. At least it would seem that the robots.txt error is still the cause of the problem; you just need to wait a little longer.