Certain Product Pages Not Indexing
-
Hey All,
We discovered an issue where new product pages on our site were not getting indexed because a "noindex" tag was inadvertently being added to the <head> section when those pages were created.
We removed the noindex tag in late April, and some of the pages that had not previously been indexed are now showing up, but others are still not getting indexed. I'd appreciate some help figuring out why that could be.
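For anyone auditing for the same issue: a noindex directive can come from either a robots meta tag or an X-Robots-Tag HTTP header, and it's worth checking both before assuming the pages are clean. Here's a rough Python sketch (not from this thread, just an illustration) that scans a page's HTML and response headers for either form:

```python
import re

def noindex_directives(html, headers=None):
    """Return a list of places a noindex directive was found, if any.

    Checks both the robots meta tag in the HTML and the X-Robots-Tag
    HTTP response header, since either one blocks indexing.
    """
    found = []
    # Find <meta ... name="robots" ...> tags (attribute order can vary)
    meta_tags = re.findall(
        r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.IGNORECASE)
    for tag in meta_tags:
        if 'noindex' in tag.lower():
            found.append('meta robots tag')
    # Check the HTTP header form as well
    header = (headers or {}).get('X-Robots-Tag', '')
    if 'noindex' in header.lower():
        found.append('X-Robots-Tag header')
    return found

# Example: a page whose template still injects the tag
page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(noindex_directives(page))                          # ['meta robots tag']
print(noindex_directives('<html><head></head></html>'))  # []
```

You'd feed this the HTML and headers from whatever fetch library you use; the point is just to confirm the directive is really gone everywhere before waiting on Google.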
Here is an example of a page that was not in the index but is now showing after removal of noindex:
http://www.cloud9living.com/san-diego/gaslamp-quarter-food-tour
And here is an example of a page that is still not showing in the index:
http://www.cloud9living.com/atlanta/race-a-ferrari
UPDATE: The above page is now showing after I manually submitted it in WMT. I had previously submitted another page about a month ago and it still didn't index, so I thought manual submission was a dead end. However, it just so happens that the above URL recently had its Page Title and H1 updated to something more specific and less duplicative, so I am currently running a test to see if that's the problem with these pages not indexing. Will update this soon.
Any suggestions? Thanks!
-
Significantly changing the Page Title and H1 is working. Second page now indexing after not indexing for some time. Probably shoulda thought of that a long time ago but that noindex tag sidetracked me!
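Since duplicative titles turned out to be the culprit, here's a quick way to spot pages in a catalog that share the same Page Title (a sketch only; the URLs and titles below are made up for illustration):

```python
from collections import Counter

def duplicate_titles(pages):
    """Given a {url: title} mapping, return titles shared by more than one page."""
    counts = Counter(pages.values())
    return {title: n for title, n in counts.items() if n > 1}

# Hypothetical catalog data, just to show the shape of the output
pages = {
    '/atlanta/race-a-ferrari': 'Ferrari Driving Experience | Cloud 9 Living',
    '/los-angeles/race-a-ferrari': 'Ferrari Driving Experience | Cloud 9 Living',
    '/san-diego/food-tour': 'Gaslamp Quarter Food Tour | Cloud 9 Living',
}
print(duplicate_titles(pages))
# {'Ferrari Driving Experience | Cloud 9 Living': 2}
```

The same check works for H1s if you swap in those values; any title appearing more than once is a candidate for the kind of rewrite that unstuck these pages.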
-
It's hard to say, and I admit that I would have also expected them to reindex the pages by now.
A while back I was working for a client who accidentally turned on noindex, nofollow site-wide in their WordPress SEO by Yoast plugin. I didn't catch it for a week, and after I turned it off it took an additional 3 weeks before a single page of the site was reindexed. Granted, this was a low-ranking site and probably wasn't high on Google's priority list, but it took much longer to recover than I'd hoped.
Unfortunately I think you just have to wait it out. Just keep doing what you're doing, creating new content, etc. Maybe if you build a new link to the page, Google will recrawl it then?
-
Good point. But why then would they continue to not index a month after manual submission?
-
If the page was set to "noindex" for a long time, Google may have flagged the page as such and chosen to skip over it when it was crawling your site.
-
Also, historically indexation happened very quickly on this site (less than 24 hours) so that's why I think something else is afoot here. And it has been like 6 weeks... which I don't think makes sense for a site with this level of domain authority.
-
Hey Bradley,
Thanks for the response. Yes, I had manually fetched a few of these pages about a month back and that didn't change indexation, so I thought it was a dead end. However, one I tried again this morning suddenly indexed, and it also happened to have had its Page Title and H1 tag changed to be significantly more unique than they were previously, so I am wondering if that is the problem. I am currently running a test with another page that I manually submitted a month ago without updating the Page Title/H1; I have now resubmitted it with the changed info.
We'll see if that does the trick.
Will let you know.
-
You can ask Google to crawl your page: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1352276
Ask Google to crawl a page or site:
- On the Webmaster Tools Home page, click the site you want.
- On the Dashboard, under Health, click Fetch as Google.
- In the text box, type the path to the page you want to check.
- In the dropdown list, select Web. (You can select another type of page, but currently we only accept submissions for our Web Search index.)
- Click Fetch. Google will fetch the URL you requested. It may take up to 10 or 15 minutes for Fetch status to be updated.
- Once you see a Fetch status of "Successful", click Submit to Index, and then click one of the following:
- To submit the individual URL to Google's index, select URL and click Submit. You can submit up to 500 URLs a week in this way.
- To submit the URL and all pages linked from it, click URL and all linked pages. You can submit up to 10 of these requests a month.
It's probably just Google slowly making their way around to re-crawling these pages. I would fetch the page, and just wait a little while longer.