Database driven content producing false duplicate content errors
-
How do I stop the Moz crawler from creating false duplicate content errors? I have not yet submitted my website to Google's crawler because I am waiting to fix all of my site optimization issues.
Example: contactus.aspx?propid=200, contactus.aspx?propid=201... these are the same page, just with some old URL parameters stuck on them. How do I get Moz and Google not to consider them duplicates? I have looked at http://moz.com/learn/seo/duplicate-content with respect to
rel="canonical"
and I think I am just confused.
Nick
-
All of you guys rock! I have never been involved in a community that has had the right answers every time... I used the rel="canonical" tag on all my static pages such as directions, policies, contact, etc., and it removed all the parameters, thereby eliminating them from standing out in the Moz crawl. I feel like an idiot for not knowing about this HTML tag and its importance. My Moz crawl now looks so much better.
When I said old URL parameters, I just meant a few seconds old: the user is on property.aspx?property=1, and when they move to a static page such as contact, directions, or policy, we end up with another URL like contact.aspx?property=1. With 150 properties times 10 static pages, I basically created 150 duplicate content errors for the contact page alone, because contact.aspx?property=1 through contact.aspx?property=150 are all the same page. I am sure this has killed my SEO. SO THAT PROBLEM IS NOW FIXED!!
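For anyone else on ASP.NET Web Forms with the same setup, here is a minimal sketch of one way to do it from the code-behind instead of hard-coding the tag on each page (it assumes a <head runat="server"> in the page or master page; the class name is just a placeholder):

using System;
using System.Web;
using System.Web.UI;
using System.Web.UI.HtmlControls;

public partial class Contact : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Canonical points at the current URL with the query string stripped,
        // so contact.aspx?property=1 ... contact.aspx?property=150 all declare
        // plain contact.aspx as the preferred version.
        HtmlLink canonical = new HtmlLink();
        canonical.Href = Request.Url.GetLeftPart(UriPartial.Path);
        canonical.Attributes["rel"] = "canonical";
        Page.Header.Controls.Add(canonical);
    }
}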
NOW, to revisit what zenstorageunits says about URL rewriting: there are many different ways to do it in .NET, but Miketek, I would not have to create physical subdirectories because it is all done in code... they are more like virtual directories...
zenstorageunits, or anyone else for that matter: is it worth it for me to hire somebody to create a URL rewrite app that can change the following:
http://www.destinationbigbear.com/property_detail.aspx?propid=202 to
http://www.destinationbigbear.com/big-bear-cabin-rentals/a-true-cabin/details
and
http://www.destinationbigbear.com/property_photos.aspx?propid=202 to
http://www.destinationbigbear.com/big-bear-cabin-rentals/a-true-cabin/photos
See, every one of my 150 cabins has these pages: info, photos, calendar, video, reviews, rates... and they all have unique cabin names. So it is basically 150 cabins x 6 pages = 900 unique pages with unique content, but really only 6 pages dynamically populated for 150 cabins.
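To make the question concrete, here is a rough sketch of what I imagine the rules would need to do, using the IIS URL Rewrite module in web.config (the cabin slug below is just one illustrative entry, not my real data - I'd still want someone who knows this stuff to build it out properly):

<!-- Sketch only: map friendly cabin URLs onto the existing .aspx pages (goes inside <system.webServer>) -->
<rewrite>
  <rewriteMaps>
    <rewriteMap name="CabinIds">
      <add key="a-true-cabin" value="202" />
      <!-- ...one entry per cabin slug... -->
    </rewriteMap>
  </rewriteMaps>
  <rules>
    <rule name="CabinDetails" stopProcessing="true">
      <match url="^big-bear-cabin-rentals/([^/]+)/details/?$" />
      <action type="Rewrite" url="property_detail.aspx?propid={CabinIds:{R:1}}" />
    </rule>
    <rule name="CabinPhotos" stopProcessing="true">
      <match url="^big-bear-cabin-rentals/([^/]+)/photos/?$" />
      <action type="Rewrite" url="property_photos.aspx?propid={CabinIds:{R:1}}" />
    </rule>
  </rules>
</rewrite>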
I have been able to dynamically change the page titles for every one of these 900 database-driven pages, such as
"Big-Bear-Cabin | A True Cabin Photos" or "Big-Bear-Cabin | A True Cabin Calendar," and so on.
-
Hi Nick,
I think you've gotten some good tips here - I'd agree with Prestashop that the preferred solution would be to find where these parameters are being included in links to this page and remove them.
Failing that, zenstorageunits's advice to use rel="canonical" would be my recommendation - or a 301 redirect from the URLs that include parameters back to the core URL would work.
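If you go the 301 route on an ASP.NET page, a minimal sketch of the idea in the page's code-behind might look something like this (it assumes .NET 4.0+ for RedirectPermanent, and the parameter name is taken from your example):

protected void Page_Load(object sender, EventArgs e)
{
    // If the request carries the stray parameter, permanently redirect (301)
    // back to the clean URL without the query string.
    if (!string.IsNullOrEmpty(Request.QueryString["propid"]))
    {
        Response.RedirectPermanent(Request.Url.AbsolutePath);
    }
}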
I wouldn't convert these parameters to subdirectories unless they are integral to the way your site works and pull up unique content. You called them "old parameters," so it sounds like they're not supposed to be there - probably not a case where converting them to subdirectories makes sense.
Failing the above, you could utilize the Google Webmaster Tools "URL Parameters" interface to tell Googlebot to ignore these parameters.
Overall, your best course of action is to find and remove the links that include the parameters.
I'd also add that the Moz crawl report is highly sensitive to "duplicate content," and I often find it flags issues as high/medium priority that are not actually going to have a significant impact on the site. You have to take the crawl report with a grain of salt - while duplicate content can be a serious issue for some sites (e-commerce retailers, for example, with duplication across a wide catalog of products), in most cases it has minimal impact and isn't something I'd hold up your site launch for.
Best of Luck,
Mike
-
I agree with zenstorageunits about using rel=canonical, but one thing I would like to point out is that Moz does not create false errors. It is a simple crawler, not like Google. Google will actually try to follow links that people have used before and that show up in your analytics data; Moz uses no logic like that, it just jumps from page to page. If it is picking up a page with a query string like that, then there is a link to it somewhere on your site. I would find those links and take them off.
-
You have a few options. One thing I would look into is doing some URL rewriting to change
contactus.aspx?propid=200
to
contactus/propid/200
Look at http://msdn.microsoft.com/en-us/library/ms972974.aspx for how to do that on IIS.
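One way to wire that up, if the IIS URL Rewrite module is available on your server, is a web.config rule roughly like this (the rule name and pattern are just an illustration):

<!-- Illustrative only: rewrites /contactus/propid/200 to /contactus.aspx?propid=200 -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="ContactPropId" stopProcessing="true">
        <match url="^contactus/propid/([0-9]+)/?$" />
        <action type="Rewrite" url="contactus.aspx?propid={R:1}" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>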
A better option, I think, if you need to keep the parameters the way they are, is to use the rel=canonical tag; see the Moz article
http://moz.com/blog/rel-confused-answers-to-your-rel-canonical-questions
but basically you would need to add something like this to your contactus.aspx page (replace example.com with your website URL):
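<link rel="canonical" href="http://www.example.com/contactus.aspx" />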
This suggests to crawlers like Google or Moz that those parameterized URLs should all be associated with the contactus.aspx page.