Solution for duplicate content not working
-
I'm getting a duplicate content error for:
http://www.website.com/default.htm
I searched the Q&A for a solution and found:
Access the .htaccess file and add this line:
redirect 301 /default.htm http://www.website.com
I added the redirect to my .htaccess and then got the following error from Google when trying to access the http://www.website.com/default.htm page:
"This webpage has a redirect loop. The webpage at http://www.webpage.com/ has resulted in too many redirects. Clearing your cookies for this site or allowing third-party cookies may fix the problem. If not, it is possibly a server configuration issue and not a problem with your computer."
"Error 310 (net::ERR_TOO_MANY_REDIRECTS): There were too many redirects."
How can I correct this?
Thanks
-
Hi Joseph
The info you sent me looked fine. Have you checked whether users on other browsers or operating systems are seeing the same issue? Also, have you cleared your cookies, as the warning message advises?
If you upload an .htaccess file containing just that one redirect, do you still get the same issue? If so, it suggests a server configuration problem rather than anything in your file. In that case it may be worth talking to whoever manages the server within your organisation, or, if there is no one, contacting your web host.
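As a sketch, a stripped-down test file (using the placeholder domain from the question) would contain nothing but the single redirect:
<code>Redirect 301 /default.htm http://www.website.com/</code>
If the loop disappears with that file in place, something else in the original .htaccess (or in the server config) is interacting with the redirect.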
-
It won't harm to try this; add it before all your other redirects. Matching against %{THE_REQUEST} means the rule only fires when the browser explicitly asked for /default.htm, not when the server internally serves default.htm for /, which is what causes a loop:
<code>Options +FollowSymLinks
RewriteEngine On
RewriteBase /
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /default\.htm\ HTTP/
RewriteRule ^default\.htm$ http://www.website.com/ [R=301,L]</code>
-
PM sent...
-
Hi Lewis,
I tried recreating the file as you said, but no luck yet. I sent you a PM with my .htaccess info; maybe you can see something I don't.
-
Do you mind posting the URL, or PMing it to me?
-
OK, I tried it and got a 200 OK.
-
Please check your HTTP headers: request http://www.website.com and see what it serves you, a 200 OK or a 301 to /default.htm.
-
Is there anything else in the .htaccess file that may be causing the loop?
If the file looks correct, I would try copying and pasting the text into a new .htaccess file, deleting the original, and seeing if the problem remains.
I have fixed a random issue with .htaccess before by effectively recreating the file.
-
Hi Wissam,
I'm not sure; there is no redirect set up for /default.htm in my .htaccess.
I tried adding the RedirectMatch 301 and got back the same error message as before...
-
Hi Joseph,
First, investigate what is redirecting the root to /default.htm: is it the .htaccess or a script?
Try using:
RedirectMatch 301 /default.htm http://www.website.com
Please follow up.
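If you do use RedirectMatch, an anchored pattern is safer; this is only a sketch with the thread's placeholder domain, and the ^ and $ anchors stop the rule from also matching URLs that merely contain the string (such as /old/default.htm):
<code># Redirect exactly /default.htm, and nothing else, to the root.
RedirectMatch 301 ^/default\.htm$ http://www.website.com/</code>
Note that if the server itself serves default.htm for the root URL (for example via DirectoryIndex), any plain redirect of /default.htm will still loop, and a RewriteCond against %{THE_REQUEST} is needed instead.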
Related Questions
-
Duplicate content and canonicalization confusion
Hello, the pages http://bit.ly/1b48Lmp and http://bit.ly/1BuJkUR have the same content, and their canonicals refer to the pages themselves. Yet they rank in search engines. Is it because they have been targeted to different geographical locations? If so, the content is still the same. Please help me clear up this confusion. Regards
Technical SEO | IM_Learner0
-
Duplicate Content Question (E-Commerce Site)
Hi All,

I have a page that ranks well for the keyword “refurbished Xbox 360”. The ranking page is an eCommerce product details page for a particular XBOX 360 system that we do not currently have in stock. (Currently, we do not remove a product details page from the website, even if it sells out; as we bring similar items into inventory, e.g. more XBOX 360s, new additional pages are created for them.)

Long story short, given this way of doing things, we have now accumulated 79 “refurbished XBOX 360” product details pages across the website that currently, or at some point in time, reflected some version of a refurbished XBOX 360 in our inventory. From an SEO standpoint, it’s clear that we have a serious duplicate content problem with all of these nearly identical XBOX 360 product pages. Management is beginning to question why our latest, in-stock, XBOX 360 product pages aren't ranking and why this stale, out-of-stock, XBOX 360 product page still is.

We are in obvious need of a better process for retiring old, irrelevant (product) content and eliminating duplicate content, but the question remains: how exactly is Google choosing to rank this one versus the others, since they are primarily duplicate pages? Has Google simply determined this one to be the original? What would be the best practice approach to solving a problem like this from an SEO standpoint: 301 redirect all out-of-stock pages to in-stock pages, or remove the irrelevant page?

Any thoughts or recommendations would be greatly appreciated.

Justin
Technical SEO | JustinGeeks0
-
Duplicate content issues, I am running into challenges and am looking for suggestions for solutions. Please help.
So I have a number of pages on my real estate site that display the same listings, even when parsed down by specific features, and I don't want these to come across as duplicate content pages. Here are a few examples:

http://luxuryhomehunt.com/homes-for-sale/lake-mary/hanover-woods.html?feature=waterfront
http://luxuryhomehunt.com/homes-for-sale/lake-mary/hanover-woods.html

This happens to be a waterfront community, so all the homes are located along the waterfront. I could use a canonical tag, but not every community is like this, and I want the parsed-down feature pages to get indexed. Here is another example that is a little different:

http://luxuryhomehunt.com/homes-for-sale/winter-park/bear-gully-bay.html
http://luxuryhomehunt.com/homes-for-sale/winter-park/bear-gully-bay.html?feature=without-pool
http://luxuryhomehunt.com/homes-for-sale/winter-park/bear-gully-bay.html?feature=4-bedrooms
http://luxuryhomehunt.com/homes-for-sale/winter-park/bear-gully-bay.html?feature=waterfront

So all the listings in this community happen to have 4 bedrooms, no pool, and are waterfront, meaning that they display for each of the parsed-down categories. I can possibly set something up so that if the listings are the same it uses a canonical pointing at the main page URL, but in the next case it's not so simple. In this next neighborhood there are 48 total listings, as seen at http://luxuryhomehunt.com/homes-for-sale/windermere/isleworth.html, and being a higher-end neighborhood, 47 of the 48 listings are considered "traditional listings"; while that is not exactly all of them, it is 99%.

Any recommendations are greatly appreciated.
Technical SEO | Jdubin0
-
Pages with different content and meta description marked as duplicate content
I am running into an issue where I have pages with completely different bodies and meta descriptions, but they are still being marked as having the same content (Duplicate Page Content error). What am I missing here? Examples:

http://www.wallstreetoasis.com/forums/what-to-expect-in-the-summer-internship
and
http://www.wallstreetoasis.com/blog/something-ventured
http://www.wallstreetoasis.com/forums/im-in-the-long-run
and
http://www.wallstreetoasis.com/image/jhjpeg
Technical SEO | WallStreetOasis.com0
-
Similar Content vs Duplicate Content
We have articles written for how to set up POP3 and IMAP. The topics are technically different, but the settings within them are very similar, and thus the initial content was similar. SEOmoz reports these pages as duplicate content. It's not optimal for our users to have them merged into one page. What is the best way to handle similar content while not getting tagged for duplicate content?
Technical SEO | Izoox0
-
Mod Rewrite question to prevent duplicate content
Hi, I'm having problems with a mod_rewrite issue and duplicate content. On my website I have:

1. Website.com
2. Website.com/directory
3. Website.com/directory/Sub_directory_more_stuff_here

Both #1 and #2 are the same page (I can't change this). #3 is different pages. How can I use mod_rewrite to make #2 redirect to #1, so I don't have duplicate content, while #3 still works?
Technical SEO | kat20
-
Snippets on every page considered duplicate content?
If I create a page that pulls 10 snippets of information from various external sites, would that content be considered duplicate content? If I link to the source, would it be recommended to use a "nofollow" tag?
Technical SEO | nicole.healthline0
-
The Bible and Duplicate Content
We have our complete set of scriptures online, including the Bible at http://lds.org/scriptures. Users can browse to any of the volumes of scriptures. We've improved the user experience by allowing users to link to specific verses in context, which will scroll to and highlight the linked verse. However, this creates a significant amount of duplicate content. For example, these links:

http://lds.org/scriptures/nt/james/1.5
http://lds.org/scriptures/nt/james/1.5-10
http://lds.org/scriptures/nt/james/1

All of those will link to the same chapter in the book of James, yet the first two will highlight verse 5 and verses 5-10 respectively. This is a good user experience, because in other sections of our site and on blogs throughout the world, webmasters link to specific verses so the reader can see the verse in context of the rest of the chapter. Another Bible site has separate HTML pages for each verse individually and tends to outrank us because of this (and possibly some other reasons) for long-tail chapter/verse queries. However, our tests indicated that the current version is preferred by users.

We have a sitemap ready to publish which includes a URL for every chapter/verse. We hope this will improve indexing of some of the more popular verses. However, Googlebot is going to see some duplicate content as it crawls that sitemap! So the question is: is the sitemap a good idea, given that we can't revert to including each chapter/verse on its own unique page? We are also going to recommend that we create unique titles for each of the verses and pass a portion of the text from the verse into the meta description. Will this perhaps be enough to satisfy Googlebot that the pages are in fact unique? They certainly are from a user perspective.

Thanks all for taking the time!
Technical SEO | LDS-SEO0