Duplicate Page Content
-
Hi, within my campaigns I get an error ("crawl errors found") that says duplicate page content was found; it finds the same content on the home pages below. Are these seen as two different pages? And how can I correct these errors when they are just one page?
-
Sorry for the delay in replying, but thanks a lot for the info. This site is hosted on GoDaddy; can all of the above be done via their website management tools?
-
Hey Kevin
This is a common problem and is easily solved with a bit of URL rewriting.
Seeing that you are using PHP pages, I will assume you are on a Linux/Apache web server, so you need to create a .htaccess file. This is a configuration file that is processed by the web server and lets you tap into additional server functionality.
RewriteEngine On
Options +FollowSymLinks
RewriteCond %{THE_REQUEST} ^.*/Home_Page.php
RewriteRule ^(.*)Home_Page.php$ http://www.poolstar.net/$1 [R=301,L]

It also looks like you are returning the same content at the www and non-www versions of the site, so you want to pick one and stick with it. A quick look at Open Site Explorer shows more links to the www version, so I would go with that, but you are only looking at 2 links to the www version and 1 to the non-www version, so if you have some other strong reason to use the domain without the www then don't worry about this too much (just tweak the below accordingly).

RewriteCond %{HTTP_HOST} ^poolstar.net
RewriteRule (.*) http://www.poolstar.net/$1 [R=301,L]
The full thing should look like the following:
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} ^poolstar.net [NC]
RewriteRule ^(.*)$ http://www.poolstar.net/$1 [L,R=301]
RewriteCond %{THE_REQUEST} ^.*/Home_Page.php
RewriteRule ^(.*)Home_Page.php$ http://www.poolstar.net/$1 [R=301,L]
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index.php\ HTTP/
RewriteRule ^index.php$ http://www.poolstar.net/ [R=301,L]
Some servers are a little different, and you may or may not need the FollowSymLinks or RewriteBase lines. Try the above; if that does not work, give me a shout and I'm happy to help.
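If the block above throws a 500 error on your host, a stripped-down variant using only the rewrite lines can be worth trying. This is just a sketch, assuming the same poolstar.net URLs as above and that mod_rewrite is available; some shared hosts do not permit the Options directive in .htaccess files and will error if it is present:

# Stripped-down variant: no Options or RewriteBase directives
RewriteEngine On

# Send the non-www domain to the www version
RewriteCond %{HTTP_HOST} ^poolstar.net [NC]
RewriteRule ^(.*)$ http://www.poolstar.net/$1 [R=301,L]

# Send direct requests for Home_Page.php back to the homepage
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /Home_Page.php\ HTTP/ [NC]
RewriteRule ^Home_Page.php$ http://www.poolstar.net/ [R=301,L]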
Hope it helps
Marcus
-
From now on, link to your homepage as http://poolstar.net/ and set a 301 redirect in place from http://poolstar.net/Home_page.php to http://poolstar.net/.
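For example, a minimal .htaccess sketch of that redirect (assuming an Apache host with mod_alias; match the file name casing your server actually uses) could look like this:

# Permanent (301) redirect from the duplicate home page URL to the root
Redirect 301 /Home_page.php http://poolstar.net/

If mod_rewrite rules like the ones in Marcus's answer are already in place, the equivalent RewriteRule there does the same job, so you only need one of the two approaches.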
Not that it is any of my business, but I would really recommend having someone redesign your website. It looks rather 1998 at the moment, which is most likely costing you new customers and, possibly (after Google Panda), making it harder for you to rank well in the search engines as well.
Related Questions
-
Duplicate Page Content and Titles from Weebly Blog
Is anyone familiar with Weebly who can offer some suggestions? I ran crawl diagnostics on my site and have some high-priority issues that appear to stem from Weebly blog posts. There are several of them, and it appears that each post is being counted as "page content" on the main blog feed and then again when it is tagged to a category. I hope this makes sense; I am new to SEO and this is really confusing. Thanks!
Technical SEO | CRMI
-
Duplicate pages on wordpress
I am doing SEO on a site running on WordPress, and it has duplicates of all pages and categories at domain.com/site/. However, as it got crawled I saw that all domain.com/ pages have rel=canonical pointing to the main page (does that mean something?). The thing is, I will fix the permalink structure, and I think WP automatically redirects if it is changed from /?page_id= to /%category%/%postname%/ or /%postname%/. Isn't there something I'm missing? The second problem is a forum. After a crawl it found over 5k errors and over 5k warnings. Those are: duplicate page content, duplicate page title, overly-dynamic URLs, missing meta descriptions, and title element too long. All of those come from domain.com/forum/ (fortunately, there are no domain.com/site/forum duplicates). What could be an easy solution to this?
Technical SEO | OVJ
-
Results Pages Duplication - What to do?
Hi all, I run a large, well-established hotel site which fills a specific niche. Last February we went through a redesign which implemented pagination and lots of PHP/SQL wizardry. This has left us, however, with a bit of a duplication problem which I'll try my best to explain! Imagine Hotel 1 has a pool as well as a hot tub. This means that Hotel 1 will be in the search results of both 'Hotels with Pools' and 'Hotels with Hot Tubs', with exactly the same copy, affiliate link and thumbnail picture in the search results. Now imagine this issue occurring hundreds of times across the site and you have our problem, especially since this is a Panda-hit site. We've tried to keep any duplicate content away from our landing pages with some success, but it's all those pesky PHP paginated pages that are doing us in (e.g. Hotels/Page-2/?classifications[]263=73491&classifcations[]742=24742 and so on). I'm thinking that we should either a) completely noindex all of the PHP search results or b) move over to a JavaScript platform. Which would you guys recommend? Or is there another solution which I'm overlooking? Any help most appreciated!
Technical SEO | dooberry
-
How do I get rid of duplicate content
I have a site that is new, but I managed to get it to page one. Now when I scan it in SEOmoz I see that I have duplicate content. Ex: www.mysite.com, www.mysite.com/index and www.mysite.com/. How do I fix this without jeopardizing my SERP rankings? Any tips?
Technical SEO | bronxpad
-
Google inconsistent in display of meta content vs page content?
Our e-comm site includes more than 250 brand pages - large image, some fluffy text, maybe a video, links to categories for that brand, etc. In many cases, Google publishes our page title and description in their search results. However, in some cases, Google instead publishes our H1 and the aforementioned fluffy page content. We want our page content to read well, be descriptive of the brand and appropriate for the audience. We want our meta titles and descriptions brief and likely to attract CTR from qualified shoppers. I'm finding this difficult to manage when Google pulls from two different areas inconsistently. So my question... Is there a way to ensure Google only utilizes our title/description for our listings?
Technical SEO | websurfer
-
Mitigating duplicate page content on dynamic sites such as social networks and blogs.
Hello, I recently did an SEOmoz crawl for a client site. As is typical, the most common errors were duplicate page title and duplicate content. The client site is a custom social network for researchers. Most of the pages showing as duplicates are simple variations of each user's profile, such as comment sections, friends pages, and events. So my question is: how can we limit duplicate content errors for a complex site like this? I already know about the rel canonical tag and rel next tag, but I'm not sure if either of these will do the job. Also, I don't want to lose potential links/link juice for good pages. Are there ways of using the "noindex" tag in batches? For instance: noindex all URLs containing this character? Or do most CMSs allow this to be done systematically? Anyone with experience doing SEO for a custom social network or forum, please advise. Thanks!!!
Technical SEO | BPIAnalytics
-
Duplicate content?
I have a question regarding a warning that I got on one of my websites; it says duplicate content. I'm using canonical URLs and am also blocking Google from the pages that you are warning me about. The pages are not indexed by Google, so why do I get the warnings? Thanks for the great SEO tools!
Technical SEO | bnbjbbkb
-
Solution for duplicate content not working
I'm getting a duplicate content error for: http://www.website.com and http://www.website.com/default.htm. I searched the Q&A for the solution and found: access the .htaccess file and add this line: redirect 301 /default.htm http://www.website.com. I added the redirect to my .htaccess and then got the following error from Google when trying to access the http://www.website.com/default.htm page: "This webpage has a redirect loop. The webpage at http://www.webpage.com/ has resulted in too many redirects. Clearing your cookies for this site or allowing third-party cookies may fix the problem. If not, it is possibly a server configuration issue and not a problem with your computer." "Error 310 (net::ERR_TOO_MANY_REDIRECTS): There were too many redirects." How can I correct this? Thanks
Technical SEO | Joeuspe