How to set up a redirect from one subfolder to another to avoid duplicate content.
-
Hello All,
I have a WordPress site that Moz says has duplicate content.
http://deltaforcepi.com/latest-news/page/3
http://deltaforcepi.com/category/latest-news/page/3
So I set up an addition to the .htaccess file with redirect code to move from one folder to the other:
RewriteRule ^category/latest-news/(.*)$ /latest-news/$1 [R=301,NC,L]
What did I do wrong? I am not proficient in .htaccess files.
-
Thank you, I did not think it would have been an issue either, but the customer did not like seeing that on the report and wanted it fixed. I will look into how to set up a robots.txt file to take care of this.
Michael
-
Google has always said, and very recently repeated, that internal duplicate content is not an issue; Google will simply decide which version of the content is best to return in results.
If you are concerned, you have a few options other than what you are doing.
Use a meta noindex tag so that Google does not index those pages. If you can't do that within WordPress, it can be set externally using the X-Robots-Tag HTTP header.
https://developers.google.com/webmasters/control-crawl-index/docs/robots_meta_tag
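If the header route is easier than editing templates, a hedged .htaccess sketch of the X-Robots-Tag approach (this assumes Apache with mod_headers and mod_setenvif available, and that the duplicate archive lives under /category/latest-news/):

```apache
# Sketch only: send "X-Robots-Tag: noindex" for the duplicate category
# archive so search engines drop it from the index but still follow links.
<IfModule mod_setenvif.c>
    SetEnvIf Request_URI "^/category/latest-news/" NOINDEX_ARCHIVE
</IfModule>
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, follow" env=NOINDEX_ARCHIVE
</IfModule>
```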
Hope that helps. Personally I would keep the page up, because Google is used to dealing with WordPress and would be punishing thousands of sites if this were really an issue.
-
Nope, that didn't do it. I see that WordPress creates duplicate content by keeping an archive of posts made on the website. Maybe I can set up a robots.txt file that keeps crawlers out of that directory?
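For reference, a minimal robots.txt sketch along those lines (assuming the duplicate archive sits under /category/latest-news/; note that robots.txt only blocks crawling, it does not remove URLs that are already indexed):

```
User-agent: *
Disallow: /category/latest-news/
```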
-
I believe it should be:
RewriteRule ^category/latest-news/(.*)$ http://yourdomain.com/latest-news/$1 [R=301,NC,L]
Try that and see if it doesn't fix it for you. (Replace 'yourdomain.com' with your real domain of course.)
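If the single rule still has no effect, one common cause is a missing RewriteEngine On line, or the rule sitting below WordPress's own catch-all rewrite rules in .htaccess. A hedged sketch of the complete block, placed above the # BEGIN WordPress section ('yourdomain.com' is a placeholder):

```apache
<IfModule mod_rewrite.c>
    RewriteEngine On
    # 301-redirect the category archive to the plain archive,
    # preserving any sub-path such as page/3
    RewriteRule ^category/latest-news/(.*)$ http://yourdomain.com/latest-news/$1 [R=301,NC,L]
</IfModule>
```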
Related Questions
-
In WordPress, getting marked as duplicate content for tags
Moz is marking 11 high-priority items for duplicate content. Just switched to WordPress and publishing articles for the site, but only have a few. The problem is on the tag pages. Since there aren't very many articles, when you go to the tag pages each lists only one or two articles, and hence there are pages with duplicate content. Most of the articles have the same tags / categories. Perhaps I'm using too many tags and categories? I'm using about 7 tags and around 2 categories for each post / event. I've read the solution is using canonical tags, but am a little confused on which page I should use for the tag, and then I believe I need to point the duplicate pages to the correct page. For example, I have two events that are for dances and both have the same tags. So when you visit site.com/tags/dance or site.com/events, both pages have the same articles listed. Which page do I select as having the original content? Does it matter? Does that make sense? Someone was also saying I could use the Yoast plugin to fix it, but I'm not really seeing anything in the Yoast tools. I also see 301 redirects mentioned as a solution, but the tag pages will be changing as we add new articles and they have a purpose, so I'm not really seeing that as a solution.
Web Design | | limited70 -
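For what it's worth, the canonical-tag fix this question mentions would look roughly like the snippet below in the duplicate page's head section (the URL is a placeholder; whichever page is chosen as the primary version goes in href):

```html
<!-- Sketch only: on site.com/tags/dance, declare the page you picked
     as the primary version so crawlers fold the duplicates together -->
<link rel="canonical" href="http://site.com/events" />
```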
One Page Guide vs. Multiple Individual Pages
Howdy, Mozzers! I am having a battle with my inner self regarding how to structure a resources section for our website. We're building out several pieces of content that are meant to be educational for our clients, and I'm having trouble deciding how to lay out the content structure. We could either lay out all eight short sections on a single page, or create individual pages for each section. The goal is obviously to attract new potential clients by targeting the terms that they may be searching for in an information-gathering stage. Here's my dilemma...
With the single-page guide, it would be nice because it will have a lot of content (and of course, keywords) to be picked up by the SERPs, but I worry that it is going to be a bit crammed (because of the eight sections) for the user. The individual pages would be much better organized and you can target more specific keywords, but I worry that they may get flagged for light content, as some pages may have as little as a 150-word description. I have always been mindful of writing copy for searchers over spiders, but now I'm at a more technical crossroads as far as potentially getting dinged for not having robust content on each page. Here's where you come in...
What do you think is the better of the two options? I like the idea of having the multiple pages because of the ability to home in on a keyword and the clean, organized feel, but I worry about the lack of content (and possibly losing out on long-tail opportunities). I'd love to hear your thoughts. Please and thank you. Ready annnnnnnnnnnnd GO!
Web Design | | jpretz0 -
Question re. crawlable textual content
I have a client who is struggling to fit crawlable textual content on their pages. I'm wondering if we can add a "Learn More..." feature that works as a mouse-over pop-up. When a page visitor runs their cursor over the link or button, a window bubble pops up and textual content about the page shows. Not knowing much about code: can text in this format be crawled by search engines and count as unique and relevant content? Thanks, Dino
Web Design | | Dino640 -
Page Content
What is the minimum amount of content a page should have to be SEO-friendly? What is the maximum amount of content a page should have to be SEO-friendly?
Web Design | | bronxpad0 -
Do I need to redirect soft 404s that I got from Google Webmaster Tools?
Hi guys, I got almost 1000+ soft 404s from GWT. All of the soft 404s produce a 200 HTTP status code, but the URLs are something like the following: http://www.example.com/search/house-for-rent (query used: house for rent) http://www.example.com/search/-----------rent (query used:-------rent) There are no listings that match these queries, and there is an advanced search that is visible on these pages. Here are my questions: 1. Do I need to redirect each page to its appropriate landing page? 2. Do I need to add a user sitemap or a list of URLs where they can search for other properties? Any suggestions would help. 🙂
Web Design | | esiow20130 -
How will engines deal with duplicate head elements e.g. title or canonicals?
Obviously duplicate content is never a good thing... on separate URLs. The question is, how will the engines deal with duplicate meta tags on the same page? Example head tag: <title>Example Title - #1</title> <title>Example Title - #2</title> My assumption is that Google (and others) will take the first instance of the tag, such that "Example Title - #1" and canonical = "http://www.example.com" would be considered for ranking purposes while the others are disregarded. My assumption is based on how search engines deal with duplicate links on a page. Is this a correct assumption? We're building a CMS-like service that will allow our SEO team to change head tag content on the fly. The easiest solution, from a dev perspective, is to simply place new/updated content above the preexisting elements. I'm trying to validate/invalidate the approach. Thanks in advance.
Web Design | | PCampolo0 -
Getting tons of duplicate content and title errors on my ASP.NET shopping cart; is there a way to resolve this?
The problem I am having is that the web crawlers are seeing all my category pages as the same page, thus creating duplicate content and duplicate title errors. At this time I have 270 of these critical errors to deal with. Here is an example: http://www.baysidejewelry.com/category/1-necklaces.aspx http://www.baysidejewelry.com/category/1-necklaces.aspx?pageindex=1 http://www.baysidejewelry.com/category/1-necklaces.aspx?pageindex=2 http://www.baysidejewelry.com/category/1-necklaces.aspx?pageindex=3 http://www.baysidejewelry.com/category/1-necklaces.aspx?pageindex=4 All of these pages are seen as the exact same page by the crawlers. Because these pages are generated from a SQL database, I don't have a way I know of to fix it.
Web Design | | bsj20020 -
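One common fix for parameter-only duplicates like these pageindex URLs is a server-rendered rel=canonical tag; a hedged HTML sketch (whether to consolidate every paginated variant onto the base URL, or keep them distinct, is a judgment call for the site):

```html
<!-- Sketch only: rendered in the head of every ?pageindex= variant,
     declaring the base category URL as the canonical version -->
<link rel="canonical" href="http://www.baysidejewelry.com/category/1-necklaces.aspx" />
```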
Duplicate content.
Hi there... we're dealing with a duplicate content mess. We're a franchisor (www.kitchensolvers.com), and each of our franchises has its own landing page. The trouble is, the way the landing pages are set up, it's causing all the links available at the national level of the website to be re-indexed each time for every franchise! We don't have an in-house developer, and I was wondering if anyone else has had similar issues and could point me in the right direction.
Web Design | | tafkat0