Curious: has anyone ever had over half of their indexed pages drop on an e-commerce site?
-
In a year it went from around 300k indexed pages to just over 100k, according to GWT. Could this be a duplicate content issue, lost links, spam, aged links, or all of the above? Either way, an audit is in order. Thanks!
Chris
-
Awesome, thanks! I love Nashville. Went to school there. :)
-
By phone it's 615-678-5464; by email, lesley@dh42.com.
-
What's the best way to reach you, L?
Thx,
C
-
Sure. The platform I use is PrestaShop. It lets you put a short description about the manufacturer or brand in a centralized area of the shop. I just create a new tab on the product page and pull that content in programmatically. So you might type up a 300-word bio about the manufacturer, or use what is on their Wikipedia page, and then have that load on all of the pages for their products. You can put it in a text box so it is not obviously seen, as well.
I also generally try to add another tab. It is kind of a pain, but I type up 5-10 different blocks like "Our Return Policy," "Why buy from us," or "Our price guarantee," and have the page choose one randomly at render time. That way the content is always changing as well. Similar to this: http://screencast.com/t/schHrJjk. It is just content to water down the feed content and give the pages a chance to rank.
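To make the render-time rotation concrete, here is a minimal sketch in plain PHP. It is not PrestaShop's actual hook or template API, and the block texts are made-up placeholders; it just shows the random-selection idea:

```php
<?php
// Minimal sketch (plain PHP, not PrestaShop's real hook/template API).
// Pick one canned content block at random each time the product page
// renders, so the tab content varies between page loads.

$blocks = [
    'Our Return Policy: items can be returned within 30 days of delivery...',
    'Why buy from us: free shipping, live support, and a 1-year warranty...',
    'Our price guarantee: find it cheaper elsewhere and we will match it...',
];

// array_rand() returns a random key, so the choice changes on every render.
$tabContent = $blocks[array_rand($blocks)];

echo '<div class="tab-pane">' . htmlspecialchars($tabContent) . '</div>';
```

In PrestaShop itself you would do the selection in a module or controller and pass the result to the product template, but the logic is the same.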
-
OK, any chance you can give a dummies guide for that? Lol. I kinda follow for the most part. Thanks, very, very helpful, L.
C
-
Thank you!
C
-
There is another way too. One thing I have used to rank sites with content issues like this is to create a couple of tabs on the product pages and programmatically fill them out. Say, an "About {$manufacturer_name}" tab and an "Our Return Policy" tab.
What you are trying to do is water down the content that is creating the duplication. This will often work and bring the pages back into the index and ranking again.
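A minimal sketch of that tab fill, again in plain PHP rather than PrestaShop's real module API (the bio lookup and names are hypothetical):

```php
<?php
// Minimal sketch (plain PHP; PrestaShop's real module/hook API differs).
// Fill an "About {manufacturer}" tab from a bio written once per brand,
// so every product page for that brand carries some copy unique to you.

function renderManufacturerTab(string $name, array $bios): string
{
    // $bios is a hypothetical map of manufacturer name => short bio text.
    $bio = $bios[$name] ?? '';
    if ($bio === '') {
        return ''; // skip the tab until a bio has been written
    }
    return '<div class="tab-pane"><h3>About ' . htmlspecialchars($name) . '</h3>'
         . '<p>' . htmlspecialchars($bio) . '</p></div>';
}

echo renderManufacturerTab('Acme Widgets', [
    'Acme Widgets' => 'Founded in 1962, Acme Widgets makes...',
]);
```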
-
Christian,
Here are your choices:
1. Rewrite the content so it is unique to your site.
Or, if that is not scalable because you have so many pages, then:
2. Noindex most of those pages and allow indexation of only the ones you have the time/budget to rewrite (one way to emit the tag is sketched below).
Yes, duplicate content is pretty rampant in eCommerce, which is precisely why Google has to handle it by choosing a canonical version and not ranking most of the others. They're not going to "ban" or "penalize" you, but ultimately the result is the same: no rankings = no traffic.
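For option 2, a hedged illustration, assuming you track a per-page flag for whether the description has been rewritten (the flag and its lookup are hypothetical, not part of any platform API):

```php
<?php
// Sketch only: emit a robots noindex tag in the <head> of product pages
// whose descriptions are still duplicated feed copy. $hasUniqueCopy is a
// hypothetical flag you would maintain in your own catalog data.

$hasUniqueCopy = false; // e.g., looked up per product ID

if (!$hasUniqueCopy) {
    // noindex,follow keeps the page out of the index while still
    // letting crawlers follow its links.
    echo '<meta name="robots" content="noindex,follow">' . "\n";
}
```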
-
Well, it looks like dupe content is a big issue, which I am sure is pretty common in the e-commerce environment. I'm a bit fresh to e-commerce SEO, as my background is more in services. I assume a stopover at the Google Webmaster forum will provide some insight? Thanks, Lesley.
Christian
-
It could be due to any of those reasons, or to others like content quality. Do you have unique product descriptions for all 300k+ pages?
-
I have seen it happen several times. Are you using a feed for your product description data? It could be that a competitor has started to outrank you with the same description data and you have been dropped from the index.