Duplicate Content?
-
My site has been archiving our newsletters since 2001. It's been helpful because our site visitors can search a database for ideas from those newsletters. (There are hundreds of pages with similar titles: archive1-Jan2000, archive2-Feb2000, archive3-Mar2000, etc.)
But I see they are being marked as "similar content," even though the actual page content is not the same. Could this adversely affect SEO? And if so, how can I correct it?
Would a separate folder of archived pages with a "nofollow robot" solve this issue? And would my site visitors still be able to search within the site with a nofollow robot?
-
Cool. No worries
Stack Overflow has always been awesome in helping me with my IIS rules and such.
If you Google: site:stackoverflow.com apache redirect
You will see MANY examples of how to set up 301 redirects, including redirecting from non-www to www pages, etc.
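To sketch the most common case mentioned above: assuming an Apache server with mod_rewrite enabled, a non-www to www 301 redirect in .htaccess looks roughly like this (adjust the domain and test carefully before relying on it):

```apache
# Enable mod_rewrite (the module must be available on the server)
RewriteEngine On
# Match requests where the Host header is the bare (non-www) domain
RewriteCond %{HTTP_HOST} ^sundayschoolnetwork\.com$ [NC]
# Permanently (301) redirect to the www version, preserving the path
RewriteRule ^(.*)$ http://www.sundayschoolnetwork.com/$1 [R=301,L]
```

The `[R=301]` flag is what makes the redirect permanent, so search engines consolidate link equity onto the www version rather than treating it as temporary.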
Hope this helps.
Mike
-
Yes, in Google Webmaster Tools... sorry. And it's Apache.
Thank you!
-
Google Analytics or Google Webmaster Tools? You will need to do that in Webmaster Tools.
That is a bummer they are having issues with your 301 redirects. If you know whether you are using Apache, IIS, etc. for your backend, you could post the code you are using in a new question, and hopefully someone in the SEOmoz community can help. Otherwise, there are Apache and IIS forums where you can post and get some great results and/or examples to base your redirects on, too.
Good luck Sarah! I hope you get your site in shape and back on page 1!!!
Mike
-
Hi Mike,
Thank you. Changing all the titles is a huge task; there are hundreds and hundreds of pages. I think I'll put them in a folder and mark the page link to that folder with a nofollow. As for the canonicalization of the two names, I have marked one of them as the preferred one in Google Analytics. But I have a much greater problem than that: I have several domain names on the same server that all point to the one domain (same files and folders). I have been attempting to get my server techs to do a 301 redirect so that only http://www.sundayschoolnetwork.com displays in a browser. However, every time they attempt to do it, part or all of my site stops working correctly.
-
You can go back and fix all of your old title tags, making them unique, like Newsletter Archive | Month Year | Sunday School Network, which will get rid of your errors and provide a better user experience. This approach will allow you to target specific keywords on each page for ranking in Google. When you have the same title across multiple pages, the assumption is that the content is either the same or very similar.
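For example, a unique title tag for one of the archive pages from the crawl report might look like this (the month and year would come from the newsletter being titled):

```html
<title>Newsletter Archive | April 2010 | Sunday School Network</title>
```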
I noticed you have a canonical issue, where you can access your site via http://sundayschoolnetwork.com as well as http://www.sundayschoolnetwork.com.
The issue with this is that you have 44 relatively important links from external websites pointing to the non-www version (http://sundayschoolnetwork.com)... which means you are splitting your potential link equity between two versions of the site instead of one. There are many ways you can fix this.
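One of those ways, sketched here, is a rel=canonical link element in the head of every page pointing at the preferred www URL (a 301 redirect at the server level is the other common approach, and generally the stronger one):

```html
<!-- Tell search engines the www version is the preferred one -->
<link rel="canonical" href="http://www.sundayschoolnetwork.com/" />
```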
As for why you are not ranking as well, it could be the market became more competitive for the keywords you were originally using. It could be that your site content does not reflect the keywords you are targeting. It could be lots of things.
Like I said in my previous post, the nofollow tells crawlers not to follow the internal and external links on those pages; however, the pages will still get indexed. This means you will still have duplicate titles appearing in results. The way to remove them from the results is the noindex directive, which will eventually drop them from the index so you no longer have competing title tags.
If you fix your title tags, you do not need to worry about the nofollow or noindex directives.
That is about all I can help with, without knowing any additional information.
The only other thing I can suggest is to read the SEOmoz Beginner's Guide to SEO - which will help a TON!
I hope that helps.
Mike
-
Thank you. I'm going to do that!
-
Hi Mike,
That was fast. I copied some of the report from the SEOmoz "Crawl Diagnostics." Some pages do have the same titles, which were an addition after many years. The early newsletters I didn't even title, so they have a "default title" of the URL.
I happened on SEOmoz, because I am trying to figure out why after so many years of having been on the first or second page of Google search results, we are lucky to show up on page 10 or deeper, if at all.
So I'm trying out SEOmoz to see if this will help us get back on top!
The Sunday School Teacher's Network Newsletter - Great ideas for children's ministry!
http://sundayschoolnetwork.com/archive13_Apr10.html 1 18 1 The Sunday School Teacher's Network Newsletter - Great ideas for children's ministry!
http://sundayschoolnetwork.com/archive13_Apr11.html 1 18 1 The Sunday School Teacher's Network Newsletter - Great ideas for children's ministry!
http://sundayschoolnetwork.com/archive13_Apr12.html 1 18 1 http://sundayschoolnetwork.com/archive13_Feb06.html
http://sundayschoolnetwork.com/archive13_Feb06.html 1 18 1 http://sundayschoolnetwork.com/archive13_Feb07.html
http://sundayschoolnetwork.com/archive13_Feb07.html 1 18 1 The Sunday School Teacher's Network Newsletter - Great ideas for children's ministry!
http://sundayschoolnetwork.com/archive14_Apr08.html 1 18 1 The Sunday School Teacher's Network Newsletter - Great ideas for children's ministry!
http://sundayschoolnetwork.com/archive14_Apr09.html 1 18 1 The Sunday School Teacher's Network Newsletter - Great ideas for children's ministry!
http://sundayschoolnetwork.com/archive14_Apr11.html 1 18 1 The Sunday School Teacher's Network Newsletter - Great ideas for children's ministry!
http://sundayschoolnetwork.com/archive14_Apr12.html 1 18 1 http://sundayschoolnetwork.com/archive14_Feb06.html
-
Hi Sarah,
If the titles are different and the page content is different, I do not understand why you would be getting any errors.
What tool are you using that is giving you the "similar content" message?
Your site visitors will still be able to search your site with nofollow in place, because nofollow is simply a directive telling search engines to not follow the internal and external links on your page.
The noindex directive tells Google to not index the content on the selected pages.
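As a sketch of how those two directives look in practice, each is a meta tag placed in a page's head (you would normally use one or the other, not both, on a given page):

```html
<!-- noindex: keep this page out of the search index -->
<meta name="robots" content="noindex">

<!-- nofollow: index the page, but do not follow its links -->
<meta name="robots" content="nofollow">
```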
If you can provide me with the name of the tool you are receiving the "similar content" message from and/or provide me with your website address I could take a look into things further.
... long story short, if your titles are unique and your content is unique, you should not have to worry about duplicate content.
Hope this helps,
Mike
-
The best way to go is to put all your newsletters in one folder and disallow that folder in your robots.txt.
rel=nofollow and robots.txt are only read by crawlers like Googlebot; your visitors won't be affected and will still be able to navigate and search the archives without a problem.
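A minimal robots.txt sketch of that approach (the /newsletters/ folder name here is hypothetical; substitute whatever folder you move the archives into):

```
User-agent: *
Disallow: /newsletters/
```

Note that this blocks crawling of the folder; pages that are already indexed and have external links pointing at them may take some time to drop out of results.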