Duplicate content issue in SEOmoz campaign.
-
Hi,
We are running a campaign for a website in SEOmoz.
We get a duplicate content warning:
http://www.oursite.com and http://www.oursite.com/ are being seen as two different URLs.
The only difference between the two URLs is the trailing slash at the end of the second one.
Why is this happening? I was aware of www vs. non-www, but I had never heard of an issue related to the trailing slash.
Thanks for your help!
-
Nice to know this. I had the same issue appear out of nowhere a week or two back. It took a while, but I found the same .htaccess quick fix. It seems as if it was an SEOmoz algorithm change. Roger the robot works his butt off in the basement, or so SEOmoz tells me, so he must have found some free time to switch up the campaign-errors algorithm.
-
#removes trailing slash
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{HTTP_HOST} !^.localhost$ [NC]
RewriteRule ^(.+)/$ http://%{HTTP_HOST}/$1 [R=301,L]

The code segment above removes trailing slashes (note the / before $1 in the rewrite target; without it, in per-directory .htaccess context the host and path get glued together). The second code block does the redirection from non-www to www.
In order for the above code to work, you will need to add two lines above it, telling Apache to enable the rewrite module:

Options +FollowSymlinks
RewriteEngine on

So, with that combined, here is how your .htaccess file should look:
Options +FollowSymlinks
RewriteEngine on

#removes trailing slash
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{HTTP_HOST} !^.localhost$ [NC]
RewriteRule ^(.+)/$ http://%{HTTP_HOST}/$1 [R=301,L]

If you are using a non-Apache server such as IIS, let me know and I will send you the configuration for it.
Kind regards
Bojan
-
So, this code:
#removes trailing slash
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{HTTP_HOST} !^.localhost$ [NC]
RewriteRule ^(.+)/$ http://%{HTTP_HOST}/$1 [R=301,L]

and this code:
# REDIRECT to canonical url
RewriteCond %{HTTP_HOST} ^mysite.com [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]

should both be added to our .htaccess?
Or is it just one of the two code blocks?
Thanks for your time!
-
From a search engine's perspective, http://www.oursite.com and http://www.oursite.com/ are not the same thing.
This problem, however, is easily fixed. If you are using Apache, it's a matter of a simple config file modification.
You can refer to http://www.seomoz.org/q/how-can-i-prevent-duplicate-content-between-www-page-com-and-www-page-com for more information.
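The canonical-host half of the fix works the same way. This Python sketch is illustrative only (it emulates the intent of the `RewriteCond %{HTTP_HOST}` / `RewriteRule` pair quoted elsewhere in this thread, with mysite.com as the stand-in domain):

```python
import re

def canonical_host_redirect(host, path):
    """Emulate:
       RewriteCond %{HTTP_HOST} ^mysite.com [NC]
       RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]

    Returns the 301 target when the host is the bare (non-www)
    domain, or None when the host is already canonical."""
    if re.match(r'^mysite\.com', host, re.IGNORECASE):
        return 'http://www.mysite.com/' + path
    return None  # host already has www -> no redirect

# A request to the bare domain is 301'd to the www version:
print(canonical_host_redirect('mysite.com', 'page.html'))
# The www host is left alone:
print(canonical_host_redirect('www.mysite.com', 'page.html'))
```

In the real .htaccess, both rule sets can live in the same file; Apache evaluates them in order, so a visitor hitting http://mysite.com/page/ can be walked to the single canonical URL across redirects.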
Kind regards
Bojan