Feedback needed on possible solutions to resolve indexing on ecommerce site
-
I’ve included the scenario and two proposed fixes I’m considering. I’d appreciate any feedback on which fix people feel is better and why, and/or any potential issues that could be caused by these fixes. Thank you!
Scenario of the Problem

I’m working on an ecommerce website (built on Magento) that is having trouble getting product pages indexed by Google (and other search engines). Certain pages, like the one I’ve included below, aren’t being indexed. I believe this is because of the way the site is configured in terms of internal linking. The site structure forces certain pages to be linked very deeply, so the only way for Googlebot to reach these pages is through a pagination page (such as www.acme.com/page?p=3). In addition, the link on the pagination page sits very low in the source; generally there are more than 125 links on the page ahead of it.
One of the Pages that Google isn’t indexing: http://www.getpaper.com/find-paper/engineering-paper/bond-20-lb/430-20-lb-laser-bond-22-x-650-1-roll.html
This page is linked from http://www.getpaper.com/find-paper/engineering-paper/bond-20-lb?p=5, where it is the 147th link in the source code.
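As a quick way to verify how deep a given link sits in a page's source, a small parser like the sketch below could count anchors in source order (the sample HTML in the usage example is illustrative, not taken from the live site):

```python
from html.parser import HTMLParser

class LinkPositionFinder(HTMLParser):
    """Records the 1-based source-order position of each href on a page."""

    def __init__(self):
        super().__init__()
        self.positions = {}
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # Count every <a> that carries an href, in document order.
        if tag == "a":
            href = dict(attrs).get("href")
            if href is not None:
                self.count += 1
                # Keep only the first occurrence of a given href.
                self.positions.setdefault(href, self.count)

def link_position(html_source, target_href):
    """Return the ordinal position of target_href among the page's links, or None."""
    parser = LinkPositionFinder()
    parser.feed(html_source)
    return parser.positions.get(target_href)
```

Running this against the saved source of the paginated page would confirm (or update) the link count, for example: `link_position(page_source, "/find-paper/engineering-paper/bond-20-lb/430-20-lb-laser-bond-22-x-650-1-roll.html")`.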
Potential Fixes

Fix One: Add `<nav>` tags to the template so that search engines will spend less time crawling the navigation and will get to the deeper pages, such as the one mentioned above. Note: the `<nav>` element is part of HTML5; however, the Magento site this is built on does not use HTML5.
Fix Two: Revise the templates and CSS so that the main navigation and the sidebar navigation appear at the bottom of the page's source rather than the top. This would put the links to the product pages ahead of the navigation links in the source code.
-
Thanks Matthew. While I am aware of duplicate content on this site, I wasn't aware it was specific to some of the pages that aren't being indexed. I will do more research on this!
-
Hey,
It looks like you might have a duplicate content problem contributing here. For instance, you linked to: http://www.getpaper.com/find-paper/engineering-paper/bond-20-lb/430-20-lb-laser-bond-22-x-650-1-roll.html
And there is this duplicated page, which doesn't have the category directory structure in the URL:
http://www.getpaper.com/430-20-lb-laser-bond-22-x-650-1-roll.html
That duplicated page is indexed by Google. It also looks like the duplicated page is what is listed in your XML sitemap, not the page you have linked to from the paginated pages.
In spot checking some of the other product pages, it looks like there is a similar issue going on. I'd recommend altering your XML sitemap to reference the URL you want indexed. Or, since it looks like Google has already indexed the pages on your XML sitemap (some of them, at least), you may want to use the URLs that have been indexed (the ones without the category structure) instead of the URLs with the category structure.
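To spot-check the sitemap systematically, the `<loc>` entries could be extracted and compared against the category-structured URLs the site actually links to. A minimal sketch with the standard library (the URLs in the usage example are hypothetical):

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace, in ElementTree's Clark notation.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(sitemap_xml):
    """Return the <loc> URLs from a sitemap document, in file order."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def mismatched_urls(sitemap_xml, preferred_urls):
    """URLs listed in the sitemap that are not among the URLs you want indexed."""
    preferred = set(preferred_urls)
    return [url for url in sitemap_urls(sitemap_xml) if url not in preferred]
```

Any URL returned by `mismatched_urls` is one where the sitemap and the internal links disagree, i.e. a candidate for the duplicate-content issue described above.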
In terms of your possible fixes, I think fix one makes more sense. The more direct links you can add to deeper pages of your site, the better. On fix two, moving the sidebar and header to the bottom of the source and controlling the layout with CSS can present problems in various browsers; in my experience, it is usually more pain than gain.
I hope that helps. Thanks!
Matthew