Curious, anyone ever had over half of their indexed links drop on an e-commerce site?
-
In a year we went from around 300k indexed pages to under 100k according to GWT. Could this be a duplicate content issue, lost links, spam, aged links, or all of the above? Either way, an audit is in order. Thanks!
Chris
-
Awesome, and thanks! I love Nashville. Went to school there :)
-
By phone it is 615-678-5464, by email it is lesley@dh42.com
-
What's the best way to reach you, L?
thx,
C
-
Sure. The platform I use is PrestaShop. It lets you put a short description about the manufacturer or the brand in a centralized area in the shop. I just create a new tab on the page and draw that content in programmatically. So you might type up a 300-word bio about the manufacturer, or use what is on their Wikipedia page, and then have that load on all of the pages for their products. You can put it in a text box so it is not obviously seen as well.
I generally try to add another tab as well. It is kind of a pain, but I type up 5-10 different blocks like "Our Return Policy", "Why Buy From Us", or "Our Price Guarantee" and have the page choose one randomly at render time, so the content is always changing. Similar to this: http://screencast.com/t/schHrJjk It is just content to water down the feed content and give the page a chance to rank.
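The rotating-tab idea can be sketched in a few lines. This is just an illustration in Python (the titles and copy below are placeholders, and in PrestaShop itself this would live in a Smarty template or module, not Python):

```python
import random

# Hypothetical supplementary blocks; the real copy would be longer
# write-ups stored in the shop's back office.
CONTENT_BLOCKS = [
    ("Our Return Policy", "Returns are accepted within 30 days of delivery."),
    ("Why Buy From Us", "Every order ships same day from our own warehouse."),
    ("Our Price Guarantee", "Find it cheaper elsewhere and we will match it."),
]

def pick_supplementary_tab():
    """Choose one block at render time so the page copy varies between crawls."""
    return random.choice(CONTENT_BLOCKS)

title, body = pick_supplementary_tab()
```

Each request then renders a different supplementary tab, which is what keeps the page content from being a byte-for-byte copy of the feed.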
-
OK. Any chance you can extend a dummies' guide for that? lol. I kinda follow for the most part. Thanks, very very helpful, L.
C
-
thank you!
C
-
There is another way too. One thing I have used to rank sites with content issues like this is to create a couple of tabs on the product pages and programmatically fill them out. Say an "About {$manufacturer_name}" tab and an "Our Return Policy" tab.
What you are trying to do is water down the content that is creating the duplicate. This will often work and bring the pages back into the index and ranking again.
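As a rough sketch of filling those tabs programmatically (the bios, product fields, and policy text here are made-up placeholders; in PrestaShop the equivalent logic would sit in a Smarty template or module):

```python
# Hypothetical manufacturer bios, keyed by name; in practice these would
# come from the shop's database or a CMS block.
MANUFACTURER_BIOS = {
    "Acme": "Acme has built widgets since 1952 and backs every part it sells.",
}

RETURN_POLICY = "Items may be returned within 30 days of delivery."

def build_tabs(product):
    """Return extra tab content, keyed by tab title, for one product page."""
    tabs = {"Our Return Policy": RETURN_POLICY}
    bio = MANUFACTURER_BIOS.get(product["manufacturer"])
    if bio:
        tabs[f"About {product['manufacturer']}"] = bio
    return tabs

tabs = build_tabs({"name": "Widget", "manufacturer": "Acme"})
```

Every product from the same manufacturer then shares the bio, but each page gains copy that the duplicated feed description does not have.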
-
Christian,
Here are your choices:
1. Rewrite the content so it is unique to your site.
OR, if that is not scalable because you have so many pages then:
2. Noindex most of those pages and allow indexation of only the ones that you have time/budget to rewrite.
Yes, duplicate content is pretty rampant in eCommerce, which is precisely why Google handles it by choosing a canonical version and not ranking most of the others. They're not going to "ban" or "penalize" you, but ultimately the result is the same: no rankings = no traffic.
-
Well, it looks like dupe content is a big issue, which I am sure is pretty common in the e-commerce environment. I'm a bit fresh to e-commerce SEO, as my background is more with services. I assume a stopover at the Google Webmaster forum will provide some insight? Thanks, Lesley.
Christian
-
It could be due to any of those reasons, including others like content quality. Do you have unique product descriptions for all 300k+ pages?
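One rough way to audit uniqueness at that scale is to hash normalized descriptions and count collisions. A minimal sketch with made-up sample data (a real audit would stream the 300k descriptions from the catalog export):

```python
import hashlib
from collections import Counter

def duplicate_description_count(pages):
    """Count pages whose normalized description appears more than once."""
    counts = Counter(
        hashlib.sha1(p["description"].strip().lower().encode("utf-8")).hexdigest()
        for p in pages
    )
    return sum(n for n in counts.values() if n > 1)

# Made-up sample pages; the middle one normalizes to a duplicate.
pages = [
    {"url": "/widget-a", "description": "Same feed text"},
    {"url": "/widget-b", "description": " same feed text "},
    {"url": "/widget-c", "description": "Hand-written unique copy"},
]
```

If a large share of pages collide, thin or duplicated descriptions are a likely suspect for the deindexation.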
-
I have seen it happen several times. Are you using a feed for your product description data? It could be an issue where a competitor has started to outrank you with the same description data and you have been dropped from the index.