Curious: has anyone ever had over half of their indexed pages drop on an e-commerce site?
-
In a year we went from around 300k indexed pages to just over 100k, according to GWT. Could this be a duplicate content issue, lost links, spam, aged links, or all of the above? Either way, an audit is in order. Thanks!
Chris
-
Awesome, thanks! I love Nashville; I went to school there. :)
-
By phone it's 615-678-5464, and by email it's lesley@dh42.com.
-
What's the best way to reach you, L?
Thanks,
C
-
Sure. The platform I use is PrestaShop. It lets you store a short description of the manufacturer or brand in a centralized area of the shop. I just create a new tab on the product page and pull that content in programmatically. So you might type up a 300-word bio about the manufacturer, or use what is on their Wikipedia page, and then have that load on all of the pages for their products. You can also put it in a text box so it is not obviously visible.
I generally try to add another tab as well. It is kind of a pain, but I type up 5-10 different blurbs like "Our Return Policy," "Why Buy From Us," or "Our Price Guarantee," and have the page choose one randomly at render time. That way the content is always changing as well. Similar to this: http://screencast.com/t/schHrJjk. It is just content to water down the feed content and give the page a chance to rank.
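The random-tab idea, in rough pseudo-terms. This is a Python sketch, not actual PrestaShop/Smarty code, and the blurbs are made up; in practice you would write 5-10 of your own and wire this into your template layer:

```python
import random

# Hypothetical filler blurbs -- write your own set of 5-10.
FILLER_TABS = {
    "Our Return Policy": "Returns accepted within 30 days of purchase...",
    "Why Buy From Us": "Same-day shipping and a real human on the phone...",
    "Our Price Guarantee": "Find it cheaper elsewhere and we'll match it...",
}

def random_filler_tab():
    """Pick one filler tab at render time so each page load varies."""
    title = random.choice(list(FILLER_TABS))
    return title, FILLER_TABS[title]

title, body = random_filler_tab()
```

The point is only that the choice happens at render time, so the same product page serves slightly different content across crawls.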
-
OK, any chance you can give a dummies' guide for that? Lol. I kind of follow for the most part. Thanks, very very helpful, L.
C
-
thank you!
C
-
There is another way, too. One thing I have used to rank sites with content issues like this is to create a couple of tabs on the product pages and fill them out programmatically, say an "About {$manufacturer_name}" tab and an "Our Return Policy" tab.
What you are trying to do is water down the content that is creating the duplication. This will often work and bring the pages back into the index and ranking again.
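A rough way to sanity-check the "watering down" is to measure what fraction of a page's text is unique to it rather than shared with the feed description. A Python sketch with made-up sample text:

```python
def unique_ratio(page_text: str, feed_text: str) -> float:
    """Fraction of the page's words that do not appear in the shared feed copy."""
    page_words = page_text.lower().split()
    feed_words = set(feed_text.lower().split())
    if not page_words:
        return 0.0
    unique = [w for w in page_words if w not in feed_words]
    return len(unique) / len(page_words)

feed = "Standard widget, blue, 3 inches."
page = "Standard widget, blue, 3 inches. Our hands-on review and sizing notes."
ratio = unique_ratio(page, feed)  # higher means more of the page is your own copy
```

There is no magic threshold, but a page that is 90%+ feed copy is exactly the kind of page that tends to fall out of the index.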
-
Christian,
Here are your choices:
1. Rewrite the content so it is unique to your site.
OR, if that is not scalable because you have so many pages then:
2. Noindex most of those pages and allow indexation of only the ones that you have time/budget to rewrite.
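Mechanically, option 2 is just a meta robots tag in the head of each page you want out of the index. A minimal sketch, assuming you can template per page (the `rewritten` flag is hypothetical, standing in for however you track which pages have unique copy):

```python
def robots_meta(rewritten: bool) -> str:
    """Index pages whose copy has been rewritten; noindex the rest.

    Keeping 'follow' means link equity still flows through noindexed pages.
    """
    directive = "index,follow" if rewritten else "noindex,follow"
    return f'<meta name="robots" content="{directive}">'
```

As pages get rewritten, you flip them back to indexable and they can return to the index.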
Yes, duplicate content is pretty rampant in eCommerce, which is precisely why Google handles it by choosing a canonical version and not ranking most of the others. They're not going to "ban" or "penalize" you, but ultimately the result is the same: no rankings = no traffic.
-
Well, it looks like dupe content is a big issue, which I am sure is pretty common in the e-commerce environment. I'm a bit fresh to e-commerce SEO, as my background is more with services. I assume a stopover at the Google Webmaster forum will provide some insight? Thanks, Lesley.
Christian
-
It could be due to any of those reasons, including others like content quality. Do you have unique product descriptions for all 300k+ pages?
-
I have seen it happen several times. Are you using a feed for your product description data? It could be that a competitor has started to outrank you with the same description data and you have been dropped from the index.
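If you want to gauge how widespread the shared-feed copy is on your own site, one quick check is to fingerprint every description and count collisions. A Python sketch (the sample descriptions are made up; in practice you'd feed it an export of your catalog):

```python
import hashlib
from collections import Counter

def duplicate_count(descriptions):
    """Count how many descriptions are repeats of another one."""
    fingerprints = Counter(
        hashlib.md5(d.strip().lower().encode("utf-8")).hexdigest()
        for d in descriptions
    )
    return sum(n - 1 for n in fingerprints.values() if n > 1)

descs = [
    "Feed copy for blue widget.",
    "Feed copy for blue widget.",   # same feed text reused on another page
    "Our own hand-written blurb.",
]
dupes = duplicate_count(descs)  # one page reuses another's description here
```

Note this only catches duplication inside your own site; it won't tell you whether a competitor is running the same feed, but it's a cheap first pass before a full audit.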