Duplicate Content Issue with ?p=contactus URLs
-
Hello fellow Moz'rs!
I'll get straight to the point here -
The issue, shown in the attached image, is that every URL ending in /blog/category/name has a duplicate at /blog/category/name/?p=contactus. It's also worth noting that the ?p=contactus URLs are not in the SERPs, but they were crawled by SEOmoz, and they are live, duplicate pages.
We are using Pinnacle Cart. Is there a way to just stop crawlers from reaching the ?p=contactus URLs?
Thank you all and happy rankings,
James
-
I have used disallow rules for parameters for all kinds of things; you can set one up and test it in Webmaster Tools under "crawler access" before you implement it on your site, to confirm it's done properly. There have been a few times I've had to tweak it to get exactly what I wanted.
I'd test with the "/" in front of it as one option - again, just in testing, to see if there's any difference in results.
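Just as a reference point, here's a rough sketch of the kind of rule I'd paste into the tester (the exact pattern is my assumption from the URLs you described, so confirm it against a live /blog/category/name/?p=contactus address before rolling it out):
User-agent: *
Disallow: /*?p=contactus
Googlebot understands the * wildcard in Disallow lines, so the tester should show the ?p=contactus versions blocked while the clean category URLs stay crawlable.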
Since Google has already gotten hold of these URLs, it'll take some time to see results, but don't be discouraged; they have to re-crawl and figure it out.
Good luck
-
I have done some deeper digging, and since Google has already uncovered the *?p=contactus URLs, it seems like a canonical would be the optimal route. However, there are about 300 pages like this, and tagging them all would drive the IT department insane.
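(For context, my understanding of the canonical route is that each /blog/category/name/?p=contactus page would carry a tag in its head pointing back at the parameter-free URL, something like the line below, with example.com standing in for our real domain:)
<link rel="canonical" href="http://www.example.com/blog/category/name/" />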
I have also looked into using Google Webmaster Tools' URL parameter settings. I would love to hear if this has worked for anyone, and if so, would the parameter I set to ignore be *?p=contactus?
Thanks so much again!
-
I like using robots.txt for this kind of thing, mostly because our homegrown CMS limits our options. However, if these pages have already been indexed, then a disallow limits the outcome. Ideally, a disallow or noindex or any of those is done in advance so Google doesn't get its hands on the pages; doing it after the fact can take some time for Google to figure it out and put the pieces together. Can your site manage a canonical for this?
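If you do end up going the noindex route on those parameter pages instead, it's just a meta tag in the head of each ?p=contactus page, along the lines of the sketch below (assuming your cart lets you edit that template, and keeping links followed so the pages still pass equity):
<meta name="robots" content="noindex, follow" />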
-
I believe you can just add this to your robots.txt file and you will be alright:
Disallow: /*?p=contactus
Related Questions
-
Crawl Diagnostics: Duplicate Content Issues
The Moz crawl diagnostic is showing that I have some duplicate content issues on my site. For the most part, these are variations of the same product that are listed individually (i.e size/color). What would be the best way to deal with this? Choose one variation of the product and add a canonical tag? Thanks
Technical SEO | inhouseseo
-
Tags, Categories, & Duplicate Content
Looking for some advice on a duplicate content issue we're having that definitely isn't unique to us. See, we are allowing all our tag and category pages, as well as our blog pagination, to be indexed and followed, but Moz is detecting all of that as duplicate content, which is obvious since it is the same content that is on our blog posts. We've decided in the past to keep these pages the way they are, as it hasn't seemed to hurt us specifically and we hoped it would help our overall ranking. We haven't seen positive or negative signals either way, just the warnings from Moz. We are wondering if we should noindex these pages and whether that could cause a positive change, but we're worried it might cause a big negative change as well. Have you confronted this issue? What did you decide and what were the results? Thanks in advance!
Technical SEO | bradhodson
-
How do I get rid of duplicate content
I have a site that is new but I managed to get it to page one. Now when I scan it on SEO Moz I see that I have duplicate content. Ex: www.mysite.com, www.mysite.com/index and www.mysite.com/ How do I fix this without jeopardizing my SERPS ranking? Any tips?
Technical SEO | bronxpad
-
Duplicate Page Content
I've got several pages of similar products that Google has listed as duplicate content. I have them all set up with rel="prev" and rel="next" tags telling Google that they are part of a group, but they've still got them listed as duplicates. Is there something else I should do for these pages, or is that just a shortcoming of Google's webmaster tools? One of the pages: http://www.jaaronwoodcountertops.com/wood-countertop-gallery/walnut-countertop-9.html
Technical SEO | JAARON
-
Pages with different content and meta description marked as duplicate content
I am running into an issue where I have pages with completely different body and meta description but they are still being marked as having the same content (Duplicate Page Content error). What am I missing here? Examples: http://www.wallstreetoasis.com/forums/what-to-expect-in-the-summer-internship and http://www.wallstreetoasis.com/blog/something-ventured, http://www.wallstreetoasis.com/forums/im-in-the-long-run and http://www.wallstreetoasis.com/image/jhjpeg
Technical SEO | WallStreetOasis.com
-
Duplicate Content on Multinational Sites?
Hi SEOmozers! Tried finding a solution to this all morning but can't, so just going to spell it out and hope someone can help me! Pretty simple: my client has one site, www.domain.com, UK-hosted and targeting the UK market. They want to launch www.domain.us, US-hosted and targeting the US market. They don't want to set up a simple redirect because a) the .com is UK-hosted and b) there are a number of regional spelling changes that need to be made. However, most of the content on domain.com applies to the US market and they want to copy it onto the new website. Are there ways to get around any duplicate content issues that will arise here? Or is the only answer to simply create completely unique content for the new site? Any help much appreciated! Thanks
Technical SEO | Coolpink
-
Are RSS Feeds deemed duplicate content?
If a website content management system includes built-in feeds of different categories that the client can choose from, does that put them at risk of duplicate content if their categories are the same as another client's feed? These feeds appear on templated home page designs by default. Just trying to figure out how big of an issue these feeds are in terms of duplicate content across clients' sites. Should I be concerned? Obviously, there's other content on the home page besides the feed, and I have not really seen negative effects, but could it be impacting results?
Technical SEO | KyleNeuberger
-
Question about duplicate content within my site
Hi. New here to SEOmoz and also somewhat new to SEO in general. A friend has asked me to help do some onsite SEO for their company's website. The company uses Drupal Content Management System. They have a couple product pages that contain a tabbed section for features, accessories, etc. When they built their tabs, they used a Drupal module called Quicktabs, by which each individual tab is created as a separate page and then pulled into the tabs from those pages. So, in essence, you now have instances of repeated content. 1) the page used to create the tab, and 2) the tab that displays on the product page. My question is, how should I handle the pages that were used to create the tabs? Should I make them NOINDEX? Thank you for your advice in advance.
Technical SEO | aprilm-189040