Another deduplication question.
-
Where an existing website has duplicate content issues - specifically the www vs. non-www kind - what is the most effective way to inform searchers and spiders that there is only one page?
I have a site where the ecommerce software (Shopfitter 4) allows a fair bit of metadata to be inserted into each product page, but after a couple of attempts to deduplicate some pages I am uncertain which is the most effective way to ensure that the www-related duplication is eliminated site-wide - if there is such a solution.
I have to own up to having looked at
.htaccess
301 redirects
Webmaster Tools
and become increasingly bamboozled by the conflicting advice as to which is the most effective way, or combination of ways, to get rid of this problem. Too old to learn new tricks, I reckon.
Your help and clarification would be appreciated as this may help head off more fruitless work.
-
No. The rewrite rule will apply to all URLs, so you don't need to list each page.
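To illustrate why a single rule covers every page, here is a minimal Python sketch of the same host-canonicalisation logic. The function name and example URLs are illustrative only - this is not Apache code, just a model of what the redirect rule does to each incoming request:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize_host(url, canonical_host="www.example.com"):
    """Rewrite any non-www URL on the site to its www equivalent,
    preserving the path and query string - mirroring what one
    site-wide 301 RewriteRule does for every request."""
    parts = urlsplit(url)
    if parts.netloc == canonical_host:
        return url  # already canonical, no redirect needed
    return urlunsplit((parts.scheme, canonical_host, parts.path,
                       parts.query, parts.fragment))

# Every URL on the site is handled by the same rule:
print(canonicalize_host("http://example.com/product/42?ref=mail"))
# http://www.example.com/product/42?ref=mail
```

The key point: the rule keys off the host header, not the path, so one rule redirects every page.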
-
Quick tip:
Usually you can just contact your hosting company and ask them to set up the 301 redirect for you if you feel uneasy tampering with code on the server.
/ G
-
BTW, my answer is for an Apache server - don't use it if you're on a Microsoft (IIS) server.
-
Hi again!
Here we go:
Just add the following to your .htaccess file (replace example.com with your own domain):
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule (.*) http://www.example.com/$1 [R=301,L]
This should do the trick for the whole site.
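If you preferred the non-www host as your canonical version instead, the mirror-image rule would look like this - a sketch assuming Apache with mod_rewrite enabled, again with example.com standing in for your own domain:

```apache
RewriteEngine On
# Send any www request to the bare domain with a permanent (301) redirect
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule (.*) http://example.com/$1 [R=301,L]
```

Whichever direction you choose, pick one version and redirect the other to it consistently across the whole site.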
/ Gustav
-
Blimey Gustav - that was quick. In the .htaccess, do you need to specify each separate page URL, or is there a way of setting it site-wide?
Many thanks for taking the time to answer.
Ray
-
Hi there!
Use a 301 redirect - you can do this in the .htaccess file.
Submit an XML sitemap to Google Webmaster Tools with the correct address (with www).
You will soon be rid of the duplicated pages if you do this.
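For reference, a minimal sitemap listing the canonical (www) URLs might look like the following - the URLs are illustrative; list your own pages, always in the www form you chose:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Always list the canonical www form of each URL -->
  <url>
    <loc>http://www.example.com/</loc>
  </url>
  <url>
    <loc>http://www.example.com/product-page.html</loc>
  </url>
</urlset>
```

Submitting the sitemap with consistent www URLs reinforces the redirect, so Google drops the non-www duplicates faster.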
Best
/ Gustav