Another deduplication question.
-
Where an existing website has duplicate content issues - specifically the www vs. non-www type - what is the most effective way to tell searchers and spiders that there is only one page?
I have a site where the ecommerce software (Shopfitter 4) allows a fair bit of meta data to be inserted into each product page, but after a couple of attempts to deduplicate some pages I am uncertain which is the most effective way to ensure that the www-related duplication is eliminated sitewide - if there is such a solution.
I have to own up to having looked at
.htaccess
301 redirects
webmaster tools
and become increasingly bamboozled by the conflicting advice as to which way, or combination of ways, is the most effective for getting rid of this problem. Too old to learn new tricks, I reckon.
Your help and clarification would be appreciated as this may help head off more fruitless work.
-
No - the rewrite rule will apply to all URLs, so there's no need to list each page. For example, a request for example.com/any/page will be redirected to www.example.com/any/page automatically.
-
Quick tip:
Usually you can just contact your hosting company and ask them to set up the 301 redirect for you if you feel uneasy tampering with code on the server.
/ G
-
BTW, my answer is for an Apache server... don't use it if you're on a Microsoft (IIS) server.
-
Hi again!
Here we go:
Just add the following to your .htaccess file:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule (.*) http://www.example.com/$1 [R=301,L]
(Replace example.com with your own domain.) This should do the trick for the whole site.
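If you'd rather not hard-code the domain name, a host-agnostic variant does the same job - this is just a sketch, assuming mod_rewrite is enabled and that the www version is the one you want as canonical:
# Redirect any host that doesn't already start with "www." to the www version
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule (.*) http://www.%{HTTP_HOST}/$1 [R=301,L]
You can sanity-check either version by requesting a non-www URL (in a browser, or with curl -I) and confirming you get a 301 response whose Location header points at the www address.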
/ Gustav
-
Blimey Gustav - that was quick. In the .htaccess do you need to specify each separate page URL, or is there a way of setting it sitewide?
Many thanks for taking the time to answer.
Ray
-
Hi there!
Use a 301 redirect - you can do this in the .htaccess file.
Submit an XML sitemap to Google Webmaster Tools with the correct address (with www) - a minimal example is sketched below.
You will soon be rid of the duplicated pages if you do this.
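For reference, a sitemap file is just an XML list of your canonical (www) URLs. A minimal sketch, with example.com and the product path standing in as placeholders for your own pages:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page, always using the www host -->
  <url>
    <loc>http://www.example.com/</loc>
  </url>
  <url>
    <loc>http://www.example.com/products/example-product.html</loc>
  </url>
</urlset>
Most ecommerce platforms (or a sitemap generator) can produce this for you - the point is that every <loc> should use the www address so Google sees one consistent canonical host.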
Best
/ Gustav