Curious: has anyone ever had over half of their indexed pages drop on an e-commerce site?
-
In a year we went from around 300k indexed pages to just over 100k according to GWT. Could this be a duplicate content issue, lost links, spam, aged links, or all of the above? Either way, an audit is in order. Thanks!
Chris
-
Awesome, and thanks! I love Nashville. Went to school there :)
-
By phone it is 615-678-5464, by email it is lesley@dh42.com
-
What's the best way to reach you, L?
thx,
C
-
Sure. The platform I use is PrestaShop. It lets you put a short description about the manufacturer or brand in a centralized area of the shop. I just create a new tab on the page and pull that content in programmatically. So you might type up a 300-word bio about the manufacturer, or use what is on their Wikipedia page, and have that load on all of the pages for their products. You can also put it in a text box so it is not obviously seen.
I generally try to add another tab as well. It is kind of a pain, but I type up 5-10 different blocks like "Our Return Policy," "Why Buy From Us," or "Our Price Guarantee," and have the page choose one randomly at render time. That way the content is always changing too. Similar to this: http://screencast.com/t/schHrJjk. It is just content to water down the feed content and give the page a chance to rank.
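As a rough sketch of that render-time rotation (a generic Python illustration, not PrestaShop's actual API; the block names and the `stable` option are my own assumptions): keep a handful of prewritten filler blocks and pick one per page view. Keying the choice off the product ID instead of pure randomness would keep each page's content stable between crawls, if that's preferred.

```python
import random

# Hypothetical prewritten filler blocks (the "5-10 different things" above).
FILLER_BLOCKS = [
    "Our Return Policy: ...",
    "Why Buy From Us: ...",
    "Our Price Guarantee: ...",
]

def pick_tab_content(product_id: int, stable: bool = False) -> str:
    """Pick one filler block for a product-page tab.

    stable=False reproduces the "choose one randomly at render time"
    behavior described above; stable=True keys the choice off the
    product ID so the same page always shows the same block.
    """
    if stable:
        return FILLER_BLOCKS[product_id % len(FILLER_BLOCKS)]
    return random.choice(FILLER_BLOCKS)
```

In a real shop this would live in the template layer; the sketch only shows the selection logic.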
-
OK. Any chance you can give a dummies' guide for that? lol. I kinda follow for the most part. Thanks, very very helpful, L.
C
-
thank you!
C
-
There is another way too. One thing I have used to rank sites with content issues like this is to create a couple of tabs on the product pages and programmatically fill them out, say an "About {$manufacturer_name}" tab and an "Our Return Policy" tab.
What you are trying to do is water down the content that is creating the duplicate. This will often work and bring the pages back into the index and ranking again.
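A minimal sketch of that programmatic fill (Python for illustration; the real thing would use PrestaShop's Smarty templates, and the bio source and function names here are assumptions): store one bio per manufacturer centrally and build the tab from it on every product page.

```python
# Hypothetical bios keyed by manufacturer; in practice these would come
# from a database table or a summary of the manufacturer's Wikipedia page.
MANUFACTURER_BIOS = {
    "Acme": "Acme has been making widgets since 1952 ...",
}

def build_about_tab(manufacturer_name: str) -> dict:
    """Fill an 'About {manufacturer}' tab from a shared bio, mirroring
    a Smarty-style {$manufacturer_name} placeholder."""
    bio = MANUFACTURER_BIOS.get(manufacturer_name, "")
    return {
        "title": f"About {manufacturer_name}",
        "body": bio,
    }
```

The point is that one centrally maintained bio dilutes the feed description on every product page that references that manufacturer.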
-
Christian,
Here are your choices:
1. Rewrite the content so it is unique to your site.
OR, if that is not scalable because you have so many pages then:
2. Noindex most of those pages and allow indexation of only the ones that you have time/budget to rewrite.
Yes, duplicate content is pretty rampant in eCommerce, which is precisely why Google handles it by choosing a canonical version and not ranking most of the others. They're not going to "ban" or "penalize" you, but ultimately the result is the same: no rankings = no traffic.
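Option 2 above can be sketched as a simple template decision (a hypothetical helper, not any particular platform's API): emit a noindex robots meta tag on product pages whose descriptions are still feed duplicates, and a normal index tag on the ones that have been rewritten.

```python
def robots_meta(has_unique_description: bool) -> str:
    """Return the robots meta tag for a product page: allow indexing
    only for pages with a rewritten, unique description; noindex the
    rest until there is time/budget to rewrite them."""
    if has_unique_description:
        return '<meta name="robots" content="index, follow">'
    return '<meta name="robots" content="noindex, follow">'
```

Keeping `follow` on the noindexed pages lets link equity still flow through them while they sit out of the index.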
-
Well, it looks like dupe content is a big issue, which I'm sure is pretty common in the e-commerce environment. I'm a bit fresh to e-commerce SEO, as my background is more with services. I assume a stopover at the Google Webmaster forum will provide some insight? Thanks, Lesley.
Christian
-
It could be due to any of those reasons, as well as others like content quality. Do you have unique product descriptions for all 300k+ pages?
-
I have seen it happen several times. Are you using a feed for your product description data? It could be that a competitor has started to outrank you with the same description data and you have been dropped from the index.