Can internal duplicate content cause issues?
-
Hello all Mozzers - has anyone used NitroSell? We use them only because their inventory connects to our EPOS system, but because they have thousands of 301s on our domain we are getting duplicate content: different sizes of products (we sell womenswear) are creating separate URLs, so we are duplicating both content and URLs. I'm curious whether anyone has experienced similar problems that have affected their SERPs?
Best wishes,
Chris
-
Yes! In the past I always tried to work around such issues, but in the last few months I have had good experiences contacting the product developers directly. In most cases they respond and improve their software. Have a nice day. Sunny regards from Germany, Sebastian
-
Hi Sebastian - yes, we are very aware of this issue. We need to flag it back up with the e-commerce engine as a matter of urgency! Many thanks for your assistance.
-
You have to ask yourself whether there are enough searches for "XY size Z".
If not, I would consolidate the link power into one page per product.
But as you say, it is bad if the listed size is out of stock. Is it not possible for the visitor to choose their size in a pull-down menu?
If you cannot change the size on the product page, you already have a problem.
Imagine me searching for "product XY" and landing on "XS" instead of my size (L). I am lost.
-
Thank you for your response, Sebastian. The issue is that the duplicate URLs are the same product in different sizes, so I don't know whether it would be wise to add a canonical to these pages, as Google might index a product in a size that is out of stock. I am not sure why NitroSell is built like this!
-
Internal duplicate content can hurt your rankings. You waste link power on the copies that should be promoting the original.
If your software generates variations, I would insert a rel=canonical on all those variations pointing to the original.
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139394
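To illustrate the idea, here is a minimal sketch of building that tag. It assumes, hypothetically, that the size variants live at suffixed URLs like `/womens-dress-size-xs` (NitroSell's actual URL scheme may differ); each variant page would then carry this `<link>` in its `<head>`, pointing back to the base product URL:

```python
import re

def canonical_link_tag(variant_url: str) -> str:
    """Build a rel=canonical <link> tag for a size-variant product URL.

    Hypothetical URL scheme: variants append a size suffix such as
    "-size-xs" to the base product URL, e.g.
    /womens-dress-size-xs -> /womens-dress
    """
    # Strip the size suffix to recover the base (canonical) product URL
    base_url = re.sub(r"-size-(xs|s|m|l|xl)$", "", variant_url)
    return f'<link rel="canonical" href="{base_url}" />'

print(canonical_link_tag("/womens-dress-size-xs"))
# -> <link rel="canonical" href="/womens-dress" />
```

With that tag in place, Google should consolidate the variants' signals onto the one base URL rather than indexing each size separately.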
-
No, I haven't used it.
NitroSell "markets its world-class eCommerce solutions primarily through a worldwide network of reseller partners..."
How about asking their support what they can do for you in this case? Maybe you have misconfigured the app, or the product has bugs.
Maybe you have URL parameters (/womenshat?xy parameter12-12433) that show up as duplicate content, or you have different pictures on the same URL + parameter with no content, etc.
In general it is very hard to give you advice with the given info.
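If parameters are the culprit, a quick way to see whether several parameterised URLs collapse to the same page is to normalize them. A rough sketch using Python's standard library (the parameter names here are made up for illustration):

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

# Parameters assumed to vary only presentation, not content (hypothetical names)
IGNORED_PARAMS = {"size", "utm_source", "sessionid"}

def normalize(url: str) -> str:
    """Strip presentation-only query parameters so duplicate
    variants of the same product map to one canonical URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in IGNORED_PARAMS]
    # Drop the fragment; rebuild with only the content-bearing parameters
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))

print(normalize("http://example.com/womenshat?size=xs&colour=red"))
# -> http://example.com/womenshat?colour=red
```

If two crawled URLs normalize to the same string, they are duplicates that should share one canonical; Google Search Console's URL Parameters settings (or rel=canonical) can then tell Google to ignore the presentation-only parameters.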