How to set up a redirect from one subfolder to another to avoid duplicate content.
-
Hello All,
I have a WordPress site that Moz says has duplicate content.
http://deltaforcepi.com/latest-news/page/3
http://deltaforcepi.com/category/latest-news/page/3
So I set up an addition to the .htaccess file . . .
Here is the redirect code to move from one folder to another:
RewriteRule ^category/latest-news/(.*)$ /latest-news/$1 [R=301,NC,L]
What did I do wrong? I am not proficient in .htaccess files.
-
Thank you, I did not think it would have been an issue either, but the customer did not like seeing that on the report and wanted it fixed. I will look into how to set up a robots.txt file to take care of this.
Michael
-
Google has always said, and very recently repeated, that internal duplicate content is not an issue; Google will simply decide which version of the content is best to return in results.
If you are concerned you have a few options instead of what you are doing.
Use a meta noindex tag so that Google does not index the page. If you can't do that because of WordPress, then this can be set externally using the X-Robots-Tag HTTP header.
https://developers.google.com/webmasters/control-crawl-index/docs/robots_meta_tag
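As a rough illustration of the X-Robots-Tag approach, a snippet like this could go in .htaccess (this is a sketch, not tested on your site; it assumes mod_headers and mod_setenvif are enabled, and the path is just an example matching the duplicate archive above):

```apache
<IfModule mod_headers.c>
  # Flag requests for the duplicate category archive
  SetEnvIf Request_URI "^/category/latest-news/" NOINDEX_DUP
  # Tell search engines not to index those responses,
  # while still following the links on them
  Header set X-Robots-Tag "noindex, follow" env=NOINDEX_DUP
</IfModule>
```

Unlike a robots.txt block, this lets Google crawl the pages and see the noindex, so they drop out of the index rather than just going uncrawled.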
Hope that helps. Personally I would keep the page up, because Google is used to dealing with WordPress and would have to punish thousands of sites if this were really an issue.
-
Nope, that didn't do it. I see that WordPress creates duplicate content from having an archive of posts made on the website. Maybe I can have a robots.txt file that tells crawlers not to crawl that directory???
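For what it's worth, a robots.txt rule for that would look something like this (a sketch assuming the duplicate archive lives under /category/latest-news/, as in the URLs above):

```apache
# robots.txt at the site root
User-agent: *
Disallow: /category/latest-news/
```

One caveat: robots.txt only blocks crawling, not indexing, so a blocked URL can still appear in the index if other pages link to it. A noindex directive is the more reliable fix for duplicate-content reports.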
-
I believe it should be-
RewriteRule ^category/latest-news/(.*)$ http://yourdomain.com/latest-news/$1 [R=301,NC,L]
Try that and see if it doesn't fix it for you. (Replace 'yourdomain.com' with your real domain of course.)
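If the rule still has no effect, it may be that mod_rewrite was never switched on, or that WordPress's own rewrite rules are catching the request first. A minimal complete block might look like this (a sketch; yourdomain.com is a placeholder):

```apache
# Enable the rewrite engine before any RewriteRule
RewriteEngine On
# 301-redirect the category archive to the plain archive,
# preserving anything after the folder name
RewriteRule ^category/latest-news/(.*)$ http://yourdomain.com/latest-news/$1 [R=301,NC,L]
```

In a WordPress .htaccess, custom redirects like this generally need to sit above the `# BEGIN WordPress` block, otherwise WordPress's catch-all rule sends the request to index.php before your rule ever runs.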