Is there an easier way, at the server level, to prevent duplicate page content?
-
I know that using either a 301 or a 302 will fix the problem of duplicate page content. My question is: is there an easier way of preventing duplicate page content when it's an issue with the URL? For example:
URL: http://example.com
My guess, as it says here, is that it's a configuration issue on the server.
If anyone has some pointers on how to prevent this from occurring, it would be greatly appreciated.
-
I have seen tons of duplicate content errors in the SEOmoz report. The pages I have are the same, but the sidebar ads and other elements are dynamic based on the store they come from, so we send the store name as a query string: http://www.appymall.com/apps/numberland-learn-numbers-with-montessori%20&store=Appy-back-2-school. If you look at the source code, we defined the canonical URL, yet the system is still calling these pages duplicates. Can you help address this issue? What are we doing wrong? I did check the on-page keyword tool and it shows a green check next to
Canonical URL Tag Usage
-
Thanks. I did the server-level change and it works great: the pages are resolving canonically with no problems, and the changes have been reflected in Google's and Bing's webmaster data since the 24th. Only, one other thing happened at that same time: my site lurched downward another notch.
This is what usually happens when I do something that's been recommended by SEOs.
-
I've never done one of these, so I will Google how to do it. I'm waiting to find out what type of server it is.
-
Yes, a 301-redirect is almost always a server-level directive. It's not a tag or HTML element. You can create one with code (in the header of the page), but that's typically harder and only for special cases.
-
Okay, so the code variant will depend on the type of server?
-
If that's the case Dr. Pete, that saves me from having to add the tag to 51 pages. I already have one on the homepage. Thank you.
-
As long as the tactic you use returns a proper 301, no one method is really better than another. Ryan's approach works perfectly well for Apache-hosted sites.
-
In most cases, I don't find sitewide canonical tags to really be necessary, but if they're done right, they can't hurt. The trick is that people often screw them up (and bad canonicals can be really bad). I do like one on the home-page, because it sweeps up all the weird variants that are so common for home pages.
-
Robert, check this article out re: FrontPage and .htaccess.
FrontPage is an HTML editor that helps you build a site. Apache is a server the site can run on. It sounds like you have both.
You'll want to edit the .htaccess file in the root folder of your website, wherever the file for your homepage sits.
-
Make sure you have a space after the second quotation:
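For illustration (a hypothetical tag, not the exact one posted), the space in question sits between the closing quote of the href value and the slash:
<link rel="canonical" href="http://www.example.com/page.html" />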
-
Thank you for expounding on this issue; I thought it fit.
-
Dr. Peter, thank you for clarifying this. I do see the R=301 now but I didn't see it before.
That's what I figured. Is there a preferred 301 code to use?
Yes, I will be sure to use it internally as well. I can see where that would be a mess. Thank you again for sharing your expertise.
-
I'm still getting a bad canonical problem, even with every page having a rel="canonical". It even shows up in SEOmoz's stats, with it indexing 300+ pages when there are only 180-odd. Trouble is, the .htaccess file says "FrontPage", not Apache. Would your .htaccess thingy for Apache work there? And is it the .htaccess that's in the URL's folder with the rest of the site's regular files, or one that's in a parent folder?
-
Hi Dr. Pete, would correcting the current issue with a 301 and adding rel=canonical tags to each page be the best option? My thinking is that any future duplicate content issues (not caused by this issue) would then be avoided.
-
Sorry, I just read this again: the 301 will fix the URL issue site-wide.
-
Expounding is what I do. Other people use different words for it...
-
Just to clarify, the rewrite that Ryan is proposing IS a 301-redirect (see the "R=301") - it's just one way to implement it. Done right, it can be used sitewide.
It's perfectly viable to also use canonicals (and I definitely think they're great to have on the home-page, for example), but I think the 301 is more standard practice here. It's best for search crawlers AND visitors to see your canonical URL (www vs. non-www, whichever you choose). That leads people to link to the "proper" version, bookmark it, promote it on social, etc.
Make sure, too, to use the canonical version internally. It's amazing how often people 301-redirect to "www." but then link to the non-www version internally, or vice versa. Consistent signals are important.
-
Thank you again SEOKeith, I understand what has to be done; I just wanted to make sure I was clear on it. Yes, the rel=canonical tag will reflect whichever page I'm adding it to. Since I didn't get the errors for it, I never added it to my other sites, so now I have to add it to all of them. Fun...
-
I recommend you update each page; note that the rel=canonical tag will be different for each page, as shown in the example below. And 50 pages should take you less than 15 minutes.
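For example (hypothetical page URLs, swap in your own), each page's tag points at that page's own preferred address:
<!-- in the <head> of the About page -->
<link rel="canonical" href="http://www.example.com/about-us.html" />
<!-- in the <head> of the Services page -->
<link rel="canonical" href="http://www.example.com/services.html" />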
-
SEOKeith, the problem is site-wide, all 52 pages. I was hoping to solve the problem on the server and avoid coding each page. But from what I'm gathering, even if I use the 301 redirect I should still add the rel="canonical" on each page to avoid scraping. This tells the search engines that this page is the only version to index and crawl.
Lol, sorry I didn't recognize the acronym. Yes, I have a site that is through WordPress and one that is through Joomla. The one that I'm having issues with is not through a CMS, though.
-
Brian, it's the same thing, just a different method; both are 301s.
No, it would not cover the issue site-wide, only for the home page.
CMS = Content Management System (an example would be WordPress or Drupal).
You should still add the rel="canonical" site-wide (on each page).
Does that all make sense?
-
Thank you SEOKeith, what would be the difference between using a 301 in the .htaccess versus the <IfModule mod_rewrite.c> code Ryan suggested?
Also if I use the 301 redirect in the .htaccess would it cover this issue site wide?
Okay, so the space needs to be there.
No, I don't use a CMS.
-
Brian, if you 301 example.com to www.example.com, that will get rid of the duplicate URL issue server-side (this will resolve your current duplicate content issue).
Additionally, I recommend you add the rel=canonical tag; it will prevent other potential duplicate content issues that may arise and is considered good practice to implement.
The tag looks correct; note the space after the domain in quotes.
Are you using a CMS?
-
Thank you SEOKeith! I definitely want to make sure I don't use any bad practices to fix this issue and thank you for clarifying that about the code.
So if I apply the code and fix the issue on the server by making the URL www.example.com,
I would then add the rel=canonical tag to prevent scraping.
Would this be the correct URL to put in the tag?
-
The rewrite rule above is not bad practice; it will fix the issue with your URLs.
However, it is also good practice to use the rel=canonical tag on your site to prevent any other duplicate content issues.
In short, the rel=canonical tag tells Google which URL you wish to use, preventing Google from thinking you have duplicate content if multiple URLs exist for the same page.
-
Thank you Keri, that's what I'm thinking but I want to make sure. Thank you for messaging Dr. Pete, I hope maybe he can expound on this.
-
Generally, if you can fix it with code, that tends to be a bit better than the canonical tag, from my understanding. I've emailed Dr. Pete and asked him to contribute to this thread as well, as he's an expert on canonical tags.
-
Thanks a lot guys this is some great information. Let me get this straight.
Is solving this issue with the code below a bad practice?
<ifmodule mod_rewrite.c="">RewriteEngine on</ifmodule>
RewriteCond %{HTTP_HOST} !^www. [NC]
RewriteRule ^ http://www.%{HTTP_HOST}%{REQUEST_URI} [L,R=301]If it's not a bad practice and I implement the code to stop the issue, you are saying I should still use a rel=canonical tag to prevent scraping?
-
You should set up the correct canonicalization rewrites at the server level with IIS or .htaccess (not sure which one you have). If you know what type of server you are on, then you can find all the correct rewrites (www vs. non-www, lowercase, trailing slash, etc.).
For example, here is a great post if you have IIS: http://www.seomoz.org/blog/what-every-seo-should-know-about-iis
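If you're on Apache instead, a rough .htaccess sketch of one of those rewrites (the trailing-slash redirect) might look like the lines below. This is an illustrative sketch with assumed defaults (mod_rewrite enabled, extensionless page URLs), not something from that post, so test it before relying on it:
<IfModule mod_rewrite.c>
RewriteEngine on
# Add a missing trailing slash to extensionless URLs, e.g. /contact -> /contact/
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !/$
RewriteCond %{REQUEST_URI} !\.[A-Za-z0-9]+$
RewriteRule ^ http://%{HTTP_HOST}%{REQUEST_URI}/ [L,R=301]
</IfModule>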
And you should also use rel=canonical tags.
-
I always use rel="canonical"
-
Rel Canonical is considered a best practice in SEO, so you should just always include it in your pages, even if they're the only copy of the content you know of. It will help prevent any scrapers from stealing your content down the road.
And re: your question, you're sorta right. Technically speaking, what we're doing with that .htaccess code is 301-redirecting every URL, either to the www or non-www version. So say you go with my method: anyone going to http://example.com just gets 301'd over to http://www.example.com.
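If you want to sanity-check the redirect once it's in place, one quick way (assuming curl is available; the second URL is just a hypothetical inner page) is to request the non-www address and confirm you get a 301 response with a Location header pointing at the www version:
curl -I http://example.com/
curl -I http://example.com/some-page.html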
-
Thank you Ryan, that is exactly what I expected the problem to be but really couldn't figure out how to address it or solve it. You explained it very well and I appreciate the suggested code to use as well. I should be able to figure it out from here.
Thank you again!
-
Thank you Brent and kjay. Take a look at Ryan's answer; I think that is what I was shooting for. If I can eliminate the problem of an ambiguous URL at the server level, then I will not need rel="canonical" or a 301/302. What do you guys think?
-
Personally I would do the following:
- Set rel="canonical" as Brent says below
- 301 redirect to the preferred URL, so if you are using www.example.com, redirect example.com; that way, if anyone points links at example.com, "most" of the juice will pass over (this will probably fix the issue you have posted about)
- Set the preferred URL in Google Webmaster Tools
If you are using a CMS like WordPress, rel="canonical" will probably already be taken care of for your website; you can check this by viewing the source or using SEOmoz's on-page keyword optimization tool.
-
Actually, in cases like your example above, it's more an issue of an ambiguous URL than actual duplicate content.
The thing to do in the example above is to choose which version of your site you want to always use (with www or without), and then set your server accordingly. In Apache, this means using your .htaccess file.
If you decide to always display www (my preferred way), then this should be in your .htaccess:
<ifmodule mod_rewrite.c="">RewriteEngine on</ifmodule>
RewriteCond %{HTTP_HOST} !^www. [NC]
RewriteRule ^ http://www.%{HTTP_HOST}%{REQUEST_URI} [L,R=301]If you want your URLS to not use www:
<ifmodule mod_rewrite.c="">RewriteEngine on</ifmodule>
RewriteCond %{HTTP_HOST} ^www.(.+)$ [NC]
RewriteRule ^ http://%1%{REQUEST_URI} [L,R=301] -
You should definitely set up your site canonicalization, and you should also utilize rel=canonical tags to help distinguish which page is the actual page.
For example, if you want to identify that www.example.com is the correct URL, then you would use the following:
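The tag sits in the <head> of the page; an illustrative version (exact formatting may vary) would be:
<link rel="canonical" href="http://www.example.com/" />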