How do I best handle duplicate content on an IIS site using 301 redirects?
-
The crawl report for a site indicates the existence of both www and non-www content, which I am aware is duplicate. However, only the www pages are indexed**, which is throwing me off. There are no 'noindex' tags on the non-www pages, nothing in robots.txt blocks them, and I can't find a sitemap. I believe 301 redirects from the non-www pages to their www equivalents are what is in order. Is this accurate?
I believe the site runs classic ASP on IIS, as the pages end in .asp (ASP.NET pages would end in .aspx), though the platform is not very familiar to me. There are multiple versions of the homepage, including 'index.html' and 'default.asp', and meta refresh tags are being used to point to 'default.asp'.
What has been done:
1. I set the preferred domain to 'www' in Google's Webmaster Tools, as most links already point to www.
2. The WordPress blog, which sits in a /blog subdirectory, has been set with rel="canonical" tags pointing to the www versions of its pages (see the example tag below).
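For reference, the tag such a plugin outputs in the <head> of each blog page looks something like this (the post URL is a hypothetical example, not the site's actual slug):

    <link rel="canonical" href="http://www.site.org/blog/sample-post/" />

Note the trailing slash; that detail comes up again in the question below.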
What I have asked the programmer to do:
1. Add 301 redirects from the non-www pages to the www pages (a sketch of an IIS rewrite rule is below).
2. Set all versions of the homepage to redirect to www.site.org using 301 redirects as opposed to meta refresh tags.
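On IIS 7 or later, both of these can live in the site's web.config via the URL Rewrite module. A minimal sketch, assuming that module is installed and with www.site.org standing in for the real canonical host (rule names and patterns are illustrative, not the site's actual config):

    <rewrite>
      <rules>
        <!-- 301 the alternate homepage files straight to the root, replacing the meta refresh -->
        <rule name="Homepage variants" stopProcessing="true">
          <match url="^(index\.html|default\.asp)$" />
          <action type="Redirect" url="http://www.site.org/" redirectType="Permanent" />
        </rule>
        <!-- 301 any request on the bare domain to the same path on www -->
        <rule name="Non-www to www" stopProcessing="true">
          <match url="(.*)" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^site\.org$" />
          </conditions>
          <action type="Redirect" url="http://www.site.org/{R:1}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>

This sits inside <system.webServer>. Putting the homepage rule first means a request for site.org/default.asp reaches www.site.org/ in a single hop rather than chaining two 301s, and because IIS serves default.asp as the default document for '/', the root request itself never re-triggers the rule.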
Have all bases been covered correctly?
One more concern: I notice the canonical tags in the source code of the blog use a trailing slash. Will this create a problem of inconsistency? (And why is rel="canonical" the standard for WordPress SEO plugins while 301 redirects are preferred for SEO?)
Thanks a million!
**To clarify regarding the indexation of non-www pages: a search for 'site:site.org -inurl:www' returns only 7 non-www pages, all of which are blog pages without content (they return code 200, not 404; they may have been deleted or moved, which is perhaps another 301 redirect issue).
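Once the 301s are live, they can be spot-checked from the command line with curl (the path here is a placeholder):

    curl -I http://site.org/somepage.asp

A correct response starts with "HTTP/1.1 301 Moved Permanently" and carries a Location header pointing at the www equivalent; the seven thin blog pages noted above should then stop answering 200.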
-
I realized the question was a bit wordy and disorganized, so I reworded it and posted it here: http://moz.com/community/q/what-s-my-best-strategy-for-duplicate-content-if-only-www-pages-are-indexed
Totally answered!
Related Questions
-
Redirect closed shop to main shop, or keep the domain and content alive and use it for link building?
Hello, we used to have two shops selling our products: a small shop with a small selection of only our best quality products (domain smallshop.com), and a big shop with everything (bigshop.com). It used to make sense (without going into full detail), but it's not relevant anymore, so we decided to stop maintaining the small shop because it was time-consuming and not worth it. There are some really good links pointing to smallshop.com, and the content is original (the product descriptions are different between the two shops). So far, we have just switched the "add to cart" button on the small shop into a link to the same product on the big shop, and added links from the small shop to the big shop on category pages as well. So the question is: in your opinion, is it better to do that, keeping the small shop and its content alive and using it to build links to our big shop, or to do 301 redirects and shut the small shop down completely? Thanks for your opinion!
Intermediate & Advanced SEO | Colage
-
Can I use duplicate content in different US cities without hurting SEO?
So, I have major concerns with this plan. My company has hundreds of facilities located all over the country, and each facility has its own website. We have a third-party company working to build a content strategy for us. What they came up with is to create a bank of content specific to each service line; if/when any facility offers that service, they upload the content for that service line to that facility's website. So in theory, you might have 10-12 websites, all in different cities, with the same content for a service. They claim: "Google is smart, it knows the content is all from the same company, and because it's in different local markets, it will still rank." My contention is that duplicate content is duplicate content, and unless it is localized, Google is going to prioritize one page of it and the rest will get very little exposure in the rankings no matter where you are. I could be wrong, but I want to be sure we aren't shooting ourselves in the foot with this strategy, because it is a major undertaking and too important to go off in the wrong direction. SEO experts, your help is genuinely appreciated!
Intermediate & Advanced SEO | MJTrevens
-
Pages that did NOT 301 redirect to the new site
Hi, is there a tool out there that can tell me which pages did NOT 301 redirect to the new site? I need something better than going into google.com and typing site:oldsite.com to see whether each page is still indexed and not 301 redirecting. I'm not sure if Screaming Frog can do that. Thanks.
Intermediate & Advanced SEO | ggpaul562
-
Handling duplicate content, whilst making both rank well
Hey MOZperts, I run a marketplace called Zibbet.com and we have 1000s of individual stores within our marketplace. We are about to launch a new initiative giving all sellers their own stand-alone websites. URL structure:
Marketplace URL: http://www.zibbet.com/pillowlink
Stand-alone site URL: http://pillowlink.zibbet.com (doesn't work yet)
Essentially, their stand-alone website is a duplicate of their marketplace store: same items (item title, description), same seller bios, same shop introduction content etc., but with a different layout. You can scroll down and see a preview of the different pages (if that helps you visualize what we're doing) here. My questions:
1. My desire is for both the seller's marketplace store and their stand-alone website to have good rankings in the SERPs. Is this possible?
2. Do we need to add any tags (e.g. rel="canonical") to one of these so that we're not penalized for duplicate content? If so, which one?
3. Can we just change the meta data structure of the stand-alone websites to skirt around the duplicate content issue?
Keen to hear your thoughts and if you have any suggestions for how we can handle this best. Thanks in advance!
Intermediate & Advanced SEO | relientmark
-
Should I redirect images when I migrate my site
We are about to migrate a large website with a fair few images (20,000). At the moment we include images in the sitemap.xml so they are indexed by Google and drive traffic (though I'm not sure how I can find out how much). Current image slugs are like:
http://website.com/assets/images/a2/65680/thumbnails/638x425-crop.jpg?1402460458
Like on the old site, images on the new website will also have unreadable cache slugs, like:
http://website.com/site_media/media/cache/ce/7a/ce7aeffb1e5bdfc8d4288885c52de8e3.jpg
All content pages on the new site will have the same slugs as on the old site. Should I go through the trouble of redirecting all these images?
Intermediate & Advanced SEO | ArchMedia
-
It appears that Googlebot Mobile will look for mobile redirects from the desktop site, but still use the SEO from the desktop site.
Is the above statement correct? I've read that it's better to have different SEO titles and descriptions for mobile sites, as users search differently on mobile devices. I've also read it's good to build links, keep text content on mobile sites etc. to get the mobile site to rank. If I choose not to have titles and descriptions on my mobile site, will Google just rank our desktop version and then redirect a user on a mobile device to our mobile site, or should I be adding titles and descriptions to the mobile site? Thanks so much for any help!
Intermediate & Advanced SEO | DCochrane
-
Being proactive about content duplication
So we all know that duplicate content is bad for SEO. I was just thinking: whenever I post new content to a blog, website page, etc., there should be something I can do to tell Google (in fact, all search engines) that I just created and posted this content to the web, and that I am the original source, so that if anyone else copies it, they get penalised and not me. Would appreciate your answers. 🙂 Regards,
Intermediate & Advanced SEO | TopGearMedia
-
Duplicate page content
There have been over 300 pages on our client's site with duplicate page content. Before we embark on a programming solution to this with canonical tags, our developers are requesting the list of originating sites/links/sources for these odd URLs. How can we find a list of the originating URLs? If you can provide a list of originating sources, that would be helpful. For example, the following pages are showing (as a sample) as duplicate content:
www.crittenton.com/Video/View.aspx?id=87&VideoID=11
www.crittenton.com/Video/View.aspx?id=87&VideoID=12
www.crittenton.com/Video/View.aspx?id=87&VideoID=15
www.crittenton.com/Video/View.aspx?id=87&VideoID=2
"How did you get all those duplicate URLs? I have tried to Google the 'contact us', 'news', and 'video' pages, but I didn't get all those duplicate pages. The page id=87 on most of the duplicate pages is not supposed to be there. I was wondering how the visitors got to all those duplicate pages. Please advise."
Note: the CMS does not create this type of hybrid URL. We are as curious as you as to where/why/how these are being created. Thanks.
Intermediate & Advanced SEO | dlemieux