Mass Duplicate Content
-
Hi guys
Now that the full crawl is complete I've found the following:
http://www.trespass.co.uk/mens-onslow-02022
http://www.trespass.co.uk/mens-moora-01816
http://www.trespass.co.uk/site/writeReview?ProductID=1816
http://www.trespass.co.uk/site/writeReview?ProductID=2022
The duplicate content on the first 2 is easily fixed by writing unique product descriptions for each product (a lot of hours needed, but still an easy fix).
The last 2 are review pages for each product, which are all identical except for the main h1 text. My thinking is to add noindex and nofollow to all of these review pages? The site will be moving to Magento very soon and there's still a lot of work to do.
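Something like this in the head of each review page template is what I have in mind (a sketch only, not yet implemented on the site):

```html
<!-- Added to the <head> of every /site/writeReview page -->
<meta name="robots" content="noindex, nofollow">
```

That would keep the pages usable for visitors while asking search engines not to index them or follow their links, and it can be carried over to the Magento templates after the migration.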
If anyone has any other suggestions or can spot any other issues, it's appreciated.
Kind regards
Robert
-
Thanks
That's what I had in mind.
-
Hi Raphael
If you look in the code you'll see this is actually the h1 tag and not an image. Not sure why it does that. I've had a similar issue before with WordPress themes that appear to show header tags as images when you right-click.
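For reference, one common pattern that produces this effect is CSS image replacement, where the heading text is pushed off-screen and a background image is shown in its place, so right-clicking hits the image rather than the text (a hypothetical sketch, not the actual theme code):

```html
<h1 class="review-title">Review Mens Onslow</h1>
<style>
  /* The text is shifted off-screen; the background image is what
     you see and right-click, but crawlers still read the h1 text. */
  .review-title {
    background: url("review-title.png") no-repeat;
    text-indent: -9999px;
    overflow: hidden;
  }
</style>
```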
-
For the review submission pages, I'd include an extra meta tag element like:
<meta name="robots" content="noindex, nofollow, noodp">
This tells Google not to index the page, not to follow links on the page, and to use your meta description in the search engine result instead of an Open Directory Project description.
-
On each review page there are images reading "Review [Product name]". What I can suggest is to change these images to text, or at least add a unique alt attribute to each one, so they won't all be the same.
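For example, instead of a shared image heading, each page could carry real product-specific text, with an alt attribute on any image that stays (element names and filenames here are assumptions, not taken from the site):

```html
<!-- Before: the same image heading repeated on every review page -->
<img src="review-heading.png">

<!-- After: real, product-specific heading text, plus alt text if an image is kept -->
<h1>Review Mens Moora</h1>
<img src="review-heading.png" alt="Review Mens Moora">
```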