Duplicate Content
-
I am currently trying to figure out how to fix this: our subcategory pages are being flagged for duplicate content. We list items on these pages, usually showing 12 at a time, and users can click through to the next page of items. Here is an example of what I am seeing.
I'm just not sure how I would go about fixing this... Thanks for any help!
-
Place that code between the head tags on each of the specific category pages to implement a canonical link ^.^ Hope it helps!
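The snippet this answer refers to didn't survive here; a canonical link tag generally looks like this (the URL below is a hypothetical placeholder, not from the original post):

```html
<!-- Goes inside the <head> of each paginated subcategory page -->
<!-- Placeholder URL: point href at the category page you want indexed -->
<link rel="canonical" href="https://www.example.com/category/widgets/" />
```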
-
Good point, but I've seen products within a category still get indexed using this technique, and that's without any other links to them on the site.
-
I don't think there is an ideal solution to pagination problems, but here are a few things to get you started.
Your exact question - http://www.seomoz.org/blog/how-to-deal-with-pagination-duplicate-content-issues
A farewell to pagination - http://www.seomoz.org/blog/whiteboard-friday-a-farewell-to-pagination
Best practices - http://www.seomoz.org/blog/pagination-best-practices-for-seo-user-experience
And then you might want to start thinking more about the tech behind it and what your users can handle: infinite scrolling, tabbed products, things I'm too lazy to think of.
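For paginated series specifically, the markup-level option discussed in those posts was `rel="prev"`/`rel="next"` links; a sketch with placeholder URLs (not from the original thread):

```html
<!-- In the <head> of page 2 of a hypothetical paginated category -->
<!-- Placeholder URLs for illustration only -->
<link rel="prev" href="https://www.example.com/category/widgets/?page=1" />
<link rel="next" href="https://www.example.com/category/widgets/?page=3" />
```

(Note that Google has since said it no longer uses these links as an indexing signal, so treat this as advice from the era of the thread.)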
-
But that isn't any good for getting the other items indexed properly, is it?
-
Put canonical tags on the duplicate pages pointing back to the original category page.
This tells Google which one is the original, "canonical" version of the page.