Duplicate content with same URL?
-
SEOmoz is saying that I have duplicate content on:
The only difference I see in the URL is that the "content.asp" is capitalized in the second URL.
Should I be worried about this or is this an issue with the SEOmoz crawl?
Thanks for any help.
Mike
-
I am not using a rewrite rule yet -- I was asking if there is one that would resolve this issue.
-
Are you specifying the URL rewrite rule at the page level, or in your .htaccess? I had a similar issue once on a WordPress Multisite install that was rewriting
example.com/site2 -> site2.com
And:
example.com/site3 -> site3.com

The issue wasn't "real" in that users' browsers were being moved to the preferred URLs specified in the HTTP headers, but our crawl tests were a nightmare of non-existent files, much like yours. rel="canonical" will help in that case to avoid penalties, but won't do any favors for PageRank or indexation. I believe our developers created some additional page-level rewrites to deal with the phantom pages created in the crawl, but alas, I'm not sure of the details.
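For reference, the subdirectory-to-domain redirect looked something like this (a rough sketch from memory, assuming Apache mod_rewrite in .htaccess; "site2.com" stands in for the real destination domain):

```apache
RewriteEngine On
# Permanently redirect example.com/site2/... to site2.com/...
RewriteRule ^site2(/.*)?$ http://site2.com$1 [R=301,L]
```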
You might post in a new thread or reach out to Chris Abernethy directly; he's far savvier with PHP than I am.
-
I have a similar problem, and I couldn't see a solution on the site that your link refers to. Maybe you can help?
In both SEOmoz reports and GWT I get duplicate meta descriptions and/or duplicate title tags on pages that do not physically (or logically) exist. I'm not talking about dynamically generated URLs. What I see is for a given page, several other appended pages that have no relationship to the first, like this:
/realpage1.php/anotherrealpage1.html
/realpage1.php/adifferentrealpage2.html
/realpage1.php/anotherrealpage3.php
/realpage1.php/directory/realpage4.html

Perhaps related to this issue, I discovered that if a trailing slash is entered after any URL typed into the browser (other than the home page), our custom 404 page appears, but with no CSS styling or active JavaScript.
I have been wondering if a rewrite rule that eliminates trailing slashes would work, but then it would never display a sub-directory's default index page, right?
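For what it's worth, the rule I've been considering looks something like this (untested, and assuming Apache mod_rewrite in .htaccess; the `!-d` condition is meant to leave real directories, and therefore their default index pages, alone):

```apache
RewriteEngine On
# Only strip the trailing slash when the request is NOT a real directory,
# so /somedir/ still serves that directory's index page
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+)/$ /$1 [R=301,L]
```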
I've searched all over for some help with this, to no avail. Any help will be much appreciated.
-
Modern search engines won't penalize you for this, but you may lose link juice if your content lives at multiple URLs and each is receiving links. Best practice is to set up a few simple Apache mod_rewrite rules in your .htaccess for basic URL display issues (enforce a trailing slash, redirect to or away from www, etc.), as well as to declare your preferred URL in the HTML of each page using the handy rel="canonical" tag.
Here's a great tutorial on how to force lower-case URLs, written by a fellow Mozzer (props, Chris! It's how I learned), and here are 10 other useful mod_rewrite rules to add to your repertoire.
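For the simple one-page case in the original question, a single redirect rule can normalize the capitalization without touching anything else (a minimal sketch assuming Apache mod_rewrite in .htaccess; "Content.asp" stands in for whatever mixed-case variant the crawler is finding):

```apache
RewriteEngine On
# 301-redirect the capitalized variant to the lowercase canonical URL
RewriteRule ^Content\.asp$ /content.asp [R=301,L]
```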
-
You sir are a gentleman and a scholar.
Thanks for your help Matt.
-
Use canonicalization to resolve this common duplicate content issue.
You need to place the canonical tag pointing to your preferred URL.
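In practice that means adding something like the following to the head of the page, so both the lowercase and capitalized URLs point search engines at one preferred version ("www.example.com/content.asp" is a placeholder for your actual preferred URL):

```html
<link rel="canonical" href="http://www.example.com/content.asp" />
```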
See this SEOmoz guide on how to do it:
http://www.seomoz.org/learn-seo/duplicate-content
Also see the Rel="canonical" guide; it actually uses the example of capitalization and one page appearing as three to search engines.
Hope this helps!