Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
-
We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for:
Course (starter, main, salad, etc)
Cooking Method (fry, bake, boil, steam, etc)
Preparation Time (Under 30 min, 30 min to 1 hour, Over 1 hour)
Here are some examples of how URLs may look when searching for a recipe:
find-a-recipe.php?course=starter
find-a-recipe.php?course=main&preperation-time=30min+to+1+hour
find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour
There is also pagination of search results, so the URL could also have the variable "start", e.g.
find-a-recipe.php?course=salad&start=30
There can be any combination of these variables, meaning there are hundreds of possible search results URL variations.
This all works well on the site, however it gives multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz.
I've searched online and found several possible solutions for this, such as:
- Setting canonical tag
- Adding these URL variables to Google Webmasters to tell Google to ignore them
- Changing the Title tag in the head dynamically based on what URL variables are present
However I am not sure which of these would be best.
As far as I can tell the canonical tag should be used when you have the same page available at two separate URLs, but this isn't the case here as the search results are always different.
Adding these URL variables to Google Webmasters won't fix the problem in other search engines, and we will presumably continue to get these errors in our SEOmoz crawl reports.
Changing the title tag each time can lead to very long title tags, and it doesn't address the problem of duplicate page content.
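For what it's worth, the dynamic-title option could be kept short by building the title only from whichever filters are actually set. A rough PHP sketch (the parameter names come from the example URLs above; the site-name suffix and label wording are placeholders):

```php
<?php
// Sketch: build a short, unique <title> from whichever filters are set.
// Parameter names match the example URLs above; the suffix is a placeholder.
$parts = array();

if (!empty($_GET['course'])) {
    $parts[] = ucfirst($_GET['course']) . ' recipes';
}
if (!empty($_GET['cooking-method'])) {
    $parts[] = 'method: ' . $_GET['cooking-method'];
}
if (!empty($_GET['preperation-time'])) {
    $parts[] = 'ready in ' . $_GET['preperation-time'];
}

$title = $parts ? implode(', ', $parts) : 'Find a recipe';
echo '<title>' . htmlspecialchars($title) . ' | Our Recipe Site</title>';
```

This keeps each combination's title unique without it ever growing longer than the number of filters in use, though it admittedly still doesn't address duplicate page content.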
I had hoped there would be a standard solution for problems like this, as I imagine others will have come across this before, but I cannot find the ideal solution.
Any help would be much appreciated.
Kind Regards
-
I agree that you shouldn't try to get search pages indexed.
Google loves clean URLs, and "?course=starter" is not really a clean URL. One of the reasons Google doesn't like this is that it's not good for the user experience: nobody is going to remember a URL with ? and = in it.
My recommendation would definitely be to change the structure of the website so that every recipe has its own page, and these recipes are sorted into categories. You can still keep the search function so that people are able to search and find their recipes. Trying to get these search pages higher in Google may help in some ways, but I think it will serve you better to focus on getting pages with a clean URL structure higher in Google.
This should also improve your CTR (Click Through Rate), because people trust websites more if they understand the URL.
About the duplicate content issues: search pages are not really meant to be indexed by search engines, in my experience. I always no-index the search pages because of the possible duplicate content issues, and because they are custom pages which appear based on what a user does. Not something I would want to get found on in the search engines.
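For reference, no-indexing a search page just means emitting a standard robots meta tag on those templates. A minimal sketch ("noindex,follow" keeps the page out of the index while still letting crawlers follow its links to the individual recipes):

```html
<!-- On search-result templates only: keep the page out of the index
     but still let crawlers follow links through to the recipes. -->
<meta name="robots" content="noindex,follow">
```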
I hope I helped, if anything is unclear or you need more advice or have a different question than please let me know.
-
I read all the replies and question is still unanswered.
I have the exact same problem with WordPress when using nextpage tag to split a single post into multiple pages.
The WordPress SEO plugin from Yoast doesn't handle this. If I create one blog post split across 3 more pages, the URL structure looks like the following
With the WordPress SEO plugin, I can set the Title and Description just ONCE (not for all 4 pages). This results in duplicate title tags.
Possible Solution 1: Re-write the Title using wp_title if the page number is >= 2. Appending the page number to the Title could fix this.
Possible Solution 2: Add www.example.com/post-link/ as the canonical link in the Advanced tab of WordPress SEO
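Solution 1 can be done with a small filter in the theme's functions.php. A hedged sketch (the function name is illustrative; wp_title and get_query_var('page') are standard WordPress APIs, and 'page' holds the sub-page number for posts split with the nextpage tag):

```php
// In the theme's functions.php: append the page number to the <title>
// for pages 2+ of a post split with the <!--nextpage--> tag.
function my_paged_title( $title, $sep ) {
    $page = (int) get_query_var( 'page' ); // sub-page of a paginated post
    if ( $page >= 2 ) {
        $title .= " $sep Page $page";
    }
    return $title;
}
add_filter( 'wp_title', 'my_paged_title', 10, 2 );
```

Each sub-page then gets a distinct title like "My Post » Page 2", which should clear the duplicate-title warnings.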
-
smaavie, are you going to pick one of these fine answers to mark this Q&A as "Answered"?
-
A few of the page titles in my blog campaign have duplicate title warnings!
We publish Photoshop tutorials, creating a new article for each new technique, but I can't change the titles because traffic and social votes would be affected.
Changing a WordPress slug would cause 404 errors on URLs that already have medium or high PA. Before using the All in One SEO plugin, I didn't know we should use short URLs, so I used the same title in the URL, and I don't want to change it.
Any solution?
-
As this question is rather old but still is marked as "unanswered" I take the liberty to post an answer this late which I hope not only you will benefit from but all other webmasters/SEOs with similar issues.
First of all: duplicate pages, and the duplicate title tags etc. that come with them, are of course meant to be taken seriously, but there are no easy fixes in my opinion, especially not if your design and database set-up is causing large amounts of duplicated content.
But is it a big problem Google-wise? I have my doubts based on conflicting signs and indications given by Google Webmaster Tools and the SERPs in general.
An example I just dug up for you: one of the big players in the field of recipes (allrecipes.com) has 5,000+ search results indexed by Google, ALL with duplicate title tags, which would bring up all the red flags in the Dashboard.
But based on Google Trends for searches, allrecipes.com is still outperforming its closest competitors. Their search-result URLs are unique but all have the same title tag... so 5,000+ duplicate title tags is probably not really a problem from their perspective.
What to do then?
Although your website seems to have been designed with quite a few potential problems built into its core, I would personally hesitate to spend a lot of resources on fixing it, especially if your traffic from Google is not taking a bashing.
Eventually your website will be in need for a design re-do and perhaps a change of content management system/database system.
Plan ahead and make sure that you will be able to control this issue in your next version.
It could, for instance, be by having all search results appear more like individual pages with individual URLs. With a little bit of effort you could make each search result unique, with a unique title tag and URL, and thereby bring more traffic to your site.
Best of luck with your efforts
Jens Peter
-
Every site you monitor should have a keyword distribution sheet in Excel.
Each line will have the URL, meta description, title tag, and H1 tag, with (LEN) character counts shown for the meta details, and a further column showing the keywords targeted for each page.
With this you have a way to monitor each page in a more direct visual way and avoid duplication, especially titles and meta descriptions.
-
The best way to fix the issue is to address it at the server level - that is, page-level creation and URLs.
The canonical link is good, but is really a tier-three fix.
Starting at the root is best. You will want to ensure you have:
1. A logical taxonomy, which is a breakdown of the core topic into sub-categories for classification purposes
2. A logical way to tag categories and entities with meaningful tags, or search based on title, content (tags or keys work well - a programmer should be able to help with this)
3. Rewriting the URLs, as was mentioned, so that any URLs exposed are always exact URLs and not ones using variables or queries
4. 301 redirecting the appropriate core query URLs to the new URLs, and implementing internal links to the new URLs to reinforce that content and show search engines that it is a priority.
5. Continue to run reports regularly and monitor the amount of duplicate content.
-
Solution:
1. Download the "All in One SEO" plugin.
2. Go to the plugin settings and check these options:
- Use noindex for categories
- Use noindex for tags
- Use noindex for search (!)
- Use noindex for archives
That will prevent duplicate content issues if you use WordPress.
-
Setting canonical tags would be the way I would go, but make sure you have good SEO on the rest of the site for the recipes etc.
-
Take Lonnie's advice. Install Yoast. Use the rel=next tags that the software inserts for you automatically. Yoast will fix it all.
Another WP plugin is called htaccess control, and it is also used for this same purpose... It's a little simpler than the Yoast plugin, and if you already have an SEO plugin you like -- or, worse yet, it is built into your theme, like Thesis...
Just go with htaccess control. It is simple to use and your problems will be solved in minutes.
-
- Setting canonical tag: you should already be doing this whether it's a problem here or not; as outlined in the SEOmoz tools, you may be missing out on link juice.
I agree with this guy, but I would like to add, why do you want google to crawl your searchable index? Aren't all of your recipes found on your site already by picking categories from a menu?
-
The best way to handle this is via the URL Parameters setting in Google Webmaster Tools or a robots.txt file.
Google added this functionality to handle the exact issues you're describing, so there's no need to drastically change functionality, which would likely require editing core files in your CMS.
If you click on URL Parameters under Site Configuration in Google Webmaster Tools, you will find a list of queries, and for each one there are options available that instruct Google as to how to handle these pages.
To do this:
1. Click Edit for the parameter you'd like to configure (i.e. course, cooking, etc).
2. In the dropdown menu, select "Yes: Changes, reorders, or narrows page content."
3. Choose the option that best describes how the parameter affects the page content.
4. Choose how Googlebot should crawl these pages.
- I usually choose "Let Googlebot decide", as it's Google you're trying to please ;). I've designed and optimized several eCommerce stores with multiple parameters, and this option handles the crawling and indexing of these pages correctly 99% of the time. If you still experience duplicate content issues after editing these settings, simply choose the Ignore option.
Dynamic websites are very common these days, and this tool is designed by Google specifically to handle parameters in the best possible way and allow Google to understand the URL structure of your site. The "don't have dynamic URLs" solution isn't a solution at all, as many modern functionalities rely on dynamic URLs, such as layered navigation in Magento or other eCommerce platforms. How do you suggest filtering products by price, size, color, etc. without creating dynamic URLs? These functionalities IMPROVE user experience and navigation. The text in the address bar isn't always the important factor when a user is navigating a site.
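The robots.txt option mentioned at the top of this answer is blunter, but it works in every engine, not just Google. A sketch assuming the URL pattern from the question (note this blocks crawling, not indexing, so URLs that are already indexed may linger for a while):

```
# Block crawling of all parameterised search-result URLs.
User-agent: *
Disallow: /find-a-recipe.php?
```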
Don't overthink it.
Take advantage of the functionality and only de-index pages that are causing duplicate content problems. If you notice specific dynamic URLs are appearing in SERPs too often, then create a 301 redirect from that dynamic URL to a landing page with a more user-friendly URL.
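That last step can be done with a redirect rule, for example in an Apache .htaccess file (the salad URL and the /salad-recipes/ landing page here are purely illustrative):

```apache
# 301 a specific dynamic URL that keeps ranking to a friendlier landing page.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^course=salad$
RewriteRule ^find-a-recipe\.php$ /salad-recipes/? [R=301,L]
```

The trailing ? in the target strips the old query string from the redirected URL.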
Hope this helps.
Anthony
-
I agree it is best to get the individual pages indexed. Don't have dynamic pages. Instead, come up with categories that make sense and have them indexed.
-
This is something that I've been working on lately. I've been successfully avoiding duplicate content by using canonical linking; however, this has not solved the duplicate titles or the duplicate meta descriptions. If you are using a normal (static) website to post your content as single pages manually, then your only concern would be the search pages.
I've switched 100% to the WordPress blog platform for two reasons.
- Google loves them better
- Easier to control content
I've been very successful at avoiding duplicate content except for three areas, but I do have the solution to repair them as well, and I'm currently taking on this task.
The 3 areas of concern are:
- Duplicate Titles
- Duplicate meta descriptions
- Scrapers snatching my unique content and making it their own.
The 3 solutions are: (WordPress platform)
- Duplicate Titles are caused by pagination: next/previous links or page #'s at the bottom of each page.
Although WordPress hasn't included this function within the core of its platform yet, WordPress SEO by Yoast (plugin) automatically adds the new syntax suggested by Google.
Enter rel="next" and rel="prev"
Now, as it goes with these things, Google has just posted the solution. They've asked us to add rel="next" and rel="prev" to paginated archives, so that they can distinguish them as a series and, quote: send users to the most relevant page/URL - typically the first page of the series.
The above syntax will solve our pagination duplicate titles and search paginations. The plugin also adds tag terms at the end of the title for each page. This makes the Title unique.
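For concreteness, on page 2 of a three-page post, the head would carry links like these (the URLs are illustrative, following the permalink structure mentioned earlier in the thread):

```html
<!-- In the <head> of page 2 of a 3-page series -->
<link rel="prev" href="http://www.example.com/post-link/">
<link rel="next" href="http://www.example.com/post-link/3/">
```

Page 1 carries only rel="next", and the last page only rel="prev", so the engines can walk the series in order.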
-
Now the above also tells Google that page #1 holds the canonical Title and meta description for all paginations, so your meta description is now accurate and safe. The plugin also has an advanced feature which allows you to provide a different description per page, other than what the page actually states. Making this slight change makes all the difference.
-
The next problem is theft, or copyright infringement of my content. My unique content has been scraped and posted without my permission; however, now we can use another rel= syntax to point the article back to the original owner.
rel="author" and rel="me" in WP and other platforms
You can allow people to use your content; the rel="me" tells search engines who the unique content really belongs to, and the rel="author" points to me as well.
This attribute allows you to tell Google who you are as an author and what articles you write. Google has indicated that they believe the authority of an author may even be weighted more heavily than traditional on page metrics, like page or domain authority. As Matt Cutts stated at SMX West, “The concept is that if an author is trustworthy, why does it matter what site the article appears on?”. Author authority also has implications for the impending Panda 2.2 update, which will affect the sites that steal content from other sites to post on their own. If Google sees the same article on 10 different sites, and 1 of those sites clearly identifies an author, marked up with the "rel=author" attribute, which site do you think Google is going to rank?
This is the extent of my research on the above and so far its working well. I hope the above helps for you too.
Cheers!
-
Personally, I believe it's best practice to have user-friendly URLs rather than search-generated ones. Google favours this and so do the users. It may be a lot more work to implement, but in my experience (having a site with a lot of categories and posts) it was well worth it.
-
Thanks Keri,
Our current experience is that search results from our site are showing up in Google results, sometimes quite high.
So, I'm reluctant to change anything too drastically - "if it ain't broke, don't fix it". But ... maybe we could get slightly higher rankings if we made some minor alterations?
Is there any 'best practice' guidance I could look at to learn more about this specific issue?
Thanks for your help.
David
-
I think Baptiste is referring to Google's preference for not including search results in their search results, as the URL in the example appeared to be a search result.
-
Hello Baptiste,
I'm keen to know more about why you believe we would get penalised for this. What, specifically, should we seek to avoid in order to avoid the penalty?
Thanks for your help
David
-
I noticed this question is still listed as unanswered. Did you come up with a solution you can share with us, and any information about how well it worked? Or are you still looking for advice? Would be great if you could pop back in with an update. Thanks!
-
I manage websites specialising in holiday rentals, so the search pages are very powerful; however, I only use these for customer experience. For my SEO I create pages based on the areas, types of properties, and specific searches, e.g. villas in Florida.
I think when building websites you must always have two outlooks: users & SEO.
-
Why do you think this? Is it part of Google's terms of service?
-
I hope you don't create links from visitors' searches, like find-a-recipe.php?course=salad&q=tomatoes, as you would get penalised!
-
Sure, I understand where you're coming from. I still think there's no easy solution to this, but maybe someone else will have some interesting suggestions.
What I was suggesting in my first reply above is pretty much in line with what Baptiste is saying below. Google used to be very tough on people trying to index search results pages, and that's why I personally would try going a slightly different way.
Cheers!
-
I would suggest making indexable pages for courses; the rest of the parameters are rather user-oriented and - I think - not useful for SEO. This means separating the search script from the browse pages.
This means making find-a-recipe.php, which acts as the search engine, forbidden to robots. Instead, you should have a category browser using only the course (I suppose no recipe has multiple courses?). You would have URLs like:
/recipes/ => all recipes, paginated
/recipes/starter/ => all starter recipes, paginated
/recipes/starter/fry/ => fried starter recipes - but you should check the search volume of those expressions, like "fried starter recipes". If you have a very small volume of recipes, wait until every subpage of /recipes/starter/ has at least 5 recipes.
The goal here is to make your recipe pages easy to index, with a strong focus on the course type. Although the course may not be the best root category for recipes, this should be a good way to make your site SEO friendly.
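This URL scheme could be wired up with Apache rewrite rules so the existing find-a-recipe.php keeps doing the work behind the scenes. A sketch (the patterns and parameter names are assumptions based on the URLs in this thread; the more specific rule must come first):

```apache
RewriteEngine On
# /recipes/starter/fry/ -> find-a-recipe.php?course=starter&cooking-method=fry
RewriteRule ^recipes/([a-z-]+)/([a-z-]+)/?$ find-a-recipe.php?course=$1&cooking-method=$2 [L]
# /recipes/starter/ -> find-a-recipe.php?course=starter
RewriteRule ^recipes/([a-z-]+)/?$ find-a-recipe.php?course=$1 [L]
```

Visitors and crawlers then only ever see the clean /recipes/... URLs, while the raw find-a-recipe.php URLs can stay blocked in robots.txt.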
-
Thanks Nemek,
I appreciate your answer.
However, as the site owner my instinct is to seek to get as many pages as possible indexed, so I'd like to get further advice about this before I take action.
The search results pages on our site often mirror what people are specifically searching for in Google, so we'd love our results pages to be highly ranked so as to help these people find what they want, quickly.
Does anyone else have an opinion on the best way forward for us?
Thanks in advance.
-
Technically, the search engines don't want to crawl other "search results". Personally, I would try to get individual pages and category pages indexed, while avoiding trying to index and "canonicalize" search result pages.