Database-driven content producing false duplicate content errors
-
How do I stop the Moz crawler from creating false duplicate content errors? I have yet to submit my website to the Google crawler because I am waiting to fix all my site optimization issues.
Example: contactus.aspx?propid=200, contactus.aspx?propid=201... these are the same page but with some old URL parameters stuck on them. How do I get Moz and Google not to consider these duplicates? I have looked at http://moz.com/learn/seo/duplicate-content with respect to
rel="canonical"
and I think I am just confused.
Nick
-
All of you guys rock! I have never been involved in a community that has had the right answers every time... I used the rel="canonical" tag on all my static pages such as directions, policies, contact, etc., and it removed all the parameters, thereby eliminating them from standing out in the Moz crawl. I feel like an idiot for not knowing about this HTML tag and its importance. My Moz crawl now looks so much better.
When I said old URL parameters, I just meant a few seconds old: the user is on property.aspx?property=1, and when they move to a static page such as contact, directions, or policy, we now have another page called contact.aspx?property=1. With 150 properties times 10 static pages, I basically just created 150 duplicate content errors for the contact page alone, because contact.aspx?property=1 through contact.aspx?property=150 are all the same page... I am sure this has killed my SEO. SO THAT PROBLEM IS NOW FIXED!!
NOW, to revisit what zenstorageunits says about URL rewriting: there are many different ways to do it in .NET, but Miketek, I would not have to create subdirectories because it is all done in the code... they are more like virtual directories...
zenstorageunits, or anyone else for that matter: is it worth it for me to hire somebody to create a URL rewrite app that can change the following:
http://www.destinationbigbear.com/property_detail.aspx?propid=202 to
http://www.destinationbigbear.com/big-bear-cabin-rentals/a-true-cabin/details
and
http://www.destinationbigbear.com/property_photos.aspx?propid=202 to
http://www.destinationbigbear.com/big-bear-cabin-rentals/a-true-cabin/photos
See, every one of my 150 cabins has these pages: info, photos, calendar, video, reviews, rates... and they all have unique cabin names... so it is basically 150 cabins x 6 pages = 900 unique pages with unique content, but really only 6 pages dynamically changed by 150 cabins.
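As a rough sketch of what such a rewrite could look like without a full custom app, assuming the site runs on IIS with the URL Rewrite module installed: a static rewrite map in web.config can map each friendly URL to the real .aspx page. The two map entries below just mirror the example URLs above; in practice the full map would presumably be generated from the cabin database.

<configuration>
  <system.webServer>
    <rewrite>
      <rewriteMaps>
        <!-- One entry per friendly URL; values point at the real .aspx pages -->
        <rewriteMap name="CabinUrls">
          <add key="/big-bear-cabin-rentals/a-true-cabin/details" value="/property_detail.aspx?propid=202" />
          <add key="/big-bear-cabin-rentals/a-true-cabin/photos" value="/property_photos.aspx?propid=202" />
        </rewriteMap>
      </rewriteMaps>
      <rules>
        <!-- If the requested path appears in the map, serve the mapped .aspx URL internally -->
        <rule name="Cabin friendly URLs" stopProcessing="true">
          <match url=".*" />
          <conditions>
            <add input="{CabinUrls:{REQUEST_URI}}" pattern="(.+)" />
          </conditions>
          <action type="Rewrite" url="{C:1}" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>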
I have been able to dynamically change all the page titles for every one of these 900 database-driven pages, such as
Big-Bear-Cabin | A True Cabin Photos or Big-Bear-Cabin | A True Cabin Calendar and so on.
-
Hi Nick,
I think you've gotten some good tips here - I'd agree with Prestashop that the preferred solution would be to find where these parameters are being included in links to this page and remove them.
Failing that, zenstorageunits's advice to use rel="canonical" would be my recommendation - or a 301 redirect from the URLs that include parameters back to the core URL would work.
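As a rough illustration of that 301 option, assuming the site runs on IIS with the URL Rewrite module and that propid carries no meaning on the static pages (the page names in the pattern are only examples), a rule along these lines in web.config would permanently redirect the parameterized URLs back to the clean ones:

<!-- Inside <system.webServer><rewrite><rules> in web.config -->
<rule name="Drop propid from static pages" stopProcessing="true">
  <match url="^(contactus|directions|policies)\.aspx$" />
  <conditions>
    <!-- Only fire when a propid query parameter is present -->
    <add input="{QUERY_STRING}" pattern="(^|&amp;)propid=" />
  </conditions>
  <!-- 301 to the same page with the query string stripped -->
  <action type="Redirect" url="{R:1}.aspx" appendQueryString="false" redirectType="Permanent" />
</rule>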
I wouldn't convert these parameters to subdirectories unless they are integral to the way your site works and pull up unique content - you called them "old parameters," so it sounds like they're not supposed to be there, and this probably isn't a case where you'd want to make that conversion.
Failing the above, you could utilize the Google Webmaster Tools "URL Parameters" interface to tell Googlebot to ignore these parameters.
Overall, your best course of action is to find and remove the links that include the parameters.
I'd also add that the Moz crawl report is highly sensitive to "duplicate content," and I often find it flags issues as high/medium priority that are not actually going to have a significant impact on the site. You have to take the crawl report with a grain of salt - while duplicate content can be a serious issue for some sites (ecommerce retailers, for example, with duplication issues across a wide catalog of products), in most cases it has minimal impact and isn't something I'd hold up your site launch for.
Best of Luck,
Mike
-
I agree with zenstorageunits about using rel=canonical, but one thing I would like to point out is that Moz does not create false errors. It is a simple crawler, unlike Google. Google will actually try to follow links that people have used before and that show up in your analytics files; Moz uses no logic like that, it just jumps from page to page. If it is picking up a page with a query string like that, then there is a link to it somewhere on your site. I would find those links and take them off.
-
You have a few options. One thing I would look into is doing some URL rewriting to change
contactus.aspx?propid=200
to
contactus/propid/200
Look at http://msdn.microsoft.com/en-us/library/ms972974.aspx for how to do that for IIS.
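If the IIS URL Rewrite module is available, a minimal sketch of such a rule in web.config (the rule name and pattern here are only illustrative) might look like this:

<!-- Inside <system.webServer><rewrite><rules> in web.config -->
<rule name="Contact property rewrite" stopProcessing="true">
  <!-- /contactus/propid/200 is served internally by contactus.aspx?propid=200 -->
  <match url="^contactus/propid/([0-9]+)$" />
  <action type="Rewrite" url="contactus.aspx?propid={R:1}" />
</rule>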
A better option, I think, if you need to keep the parameters the way they are, is to use the rel canonical tag; look at the Moz article
http://moz.com/blog/rel-confused-answers-to-your-rel-canonical-questions
but basically you would need to add something like this to your contactus.aspx page (replace example.com with your website URL).
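A minimal sketch of that tag, assuming the parameter-free contactus.aspx is the version you want treated as canonical:

<!-- Placed in the <head> of contactus.aspx; swap example.com for your own domain -->
<link rel="canonical" href="http://www.example.com/contactus.aspx" />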
This suggests to the crawler, whether Google's or Moz's, that those parameterized URLs should be associated with the contactus.aspx page.