How to handle duplicate content with Bible verses
-
I have a friend who runs a site with Bible verses and different people's thoughts or feelings on them. Since I'm an SEO, he came to me with questions, and a duplicate content red flag popped up in my head. My clients all generate their own content, so I'm not familiar with this world. Since Bible verses appear all over the place, is there a way to address this from an SEO standpoint to avoid duplicate content issues? Thanks in advance.
-
Thanks so much everyone. Not only was it all very helpful, it was very fast. Thanks and have a great day.
-
Hi,
When quoting sources, we use the blockquote tag.
Information here: http://www.w3schools.com/tags/tag_blockquote.asp
It tells Google, 'Yes, I know this isn't original; it comes from here.'
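For example, a quoted verse could be marked up something like this (the verse text and the cite URL are only placeholders):
<blockquote cite="https://example.com/john-3-16">
  <p>For God so loved the world...</p>
</blockquote>
The optional cite attribute is meant to point to the source of the quotation.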
Best of luck,
Amelia
-
Your friend has to give Google a reason to send searchers to his website instead of the many well-known authority sites that have the same verses. Is he providing a lot of unique commentary, or are the pages mostly Bible verses that can be found on thousands of other sites? If it's the former, then he should be okay. If it's the latter, he may want to focus on adding more unique content.
-
Hi Jeremy,
Welcome to the Moz community!
That's an interesting position to be in. On one hand, Bible verses do not change and you may want to include the verse on your page. However, that copy is most certainly around the web and can easily be identified as duplicate content.
One strategy could be:
- Make each verse its own page; if the verse is lengthy, try to break it up across multiple pages.
- On each page, expand upon the verse:
  - Explain in plain language what information from the verse is important to the reader.
  - Include an interpretation of the verse and its key components.
  - Explain how the verse is applicable through real-world examples and experience.
You can probably see the theme here: expand upon the non-original content with original content through interpretation and experience. You definitely want the original content to outweigh the duplicated verse on each page.
If you're only trying to create an archive of the verses, then you may want to create a special area of your site that includes the original verses and have it noindexed from the search engines to prevent any duplicate content penalties.
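If you do go the noindex route, a minimal sketch of what that looks like (assuming you can edit the head of those archive page templates) is a robots meta tag:
<meta name="robots" content="noindex, follow">
This asks search engines not to index the page while still allowing them to crawl and follow its links.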
-
Related Questions
-
Site architecture, inner link strategy and duplicate or thin content HELP :)
OK, can I just say I love that Moz exists! I am still very new to this whole website stuff. I've had a site for about 2 years that I have redesigned several times. It has been published this entire time as I made changes, but I am now ready to create amazing content for my niche.
The trouble is my target audience is in a very focused niche and my site is really only about one topic: life insurance for military families. I'm a military spouse who happens to be an experienced life insurance agent offering plans to active duty service members and their spouses, as well as veterans and retirees. So really I have three niches within a niche. I'm REALLY struggling with how to set up my site architecture. My site is basically fresh, so it's a good time to get it hammered down as best as possible with my limited knowledge.
Might I also add this is a very competitive space. My competitors are big, established brands who offer life insurance, along with unaffiliated informational sites like military.com or the VA benefits site. The people in my niche rarely actually search for life insurance because they think they are all set by the military. When they do search, the queries are very short, which is common as this niche lives in a world of acronyms. I'm going to have to get real creative to see if there are any long-tail keywords I can use for supporting posts, but I think my best route is to attempt to rank for the short one- to three-keyword phrases this niche uses while searching.
Given my expertise on the subject, I am able to write long 1,000-5,000 word content on the matter that will also point out some considerations my competitors don't really cover. My challenge is I can't see how this can be broken into subtopics without having thin supporting content. It's my understanding that I should create these in order to interlink and have a shot at ranking. In thinking about my topic, I feel like the supporting posts can only be so long. Furthermore, my three niches within my small overall niche search for short but different keywords.
It seems I am struggling to put it all into words, so let me stop here with a question: is it bad to have one category on a website? If not, I feel like this would solve my dilemma in making a good site map and content plan. It is possible to split my main topic into three categories, but I heard somewhere you shouldn't interlink posts from different categories. The problem is, if I don't, it's not ideal for the user experience, as the topics really aren't that different. For example, a military member might be researching his/her own life insurance and be curious about their spouse's coverage. In order to satisfy this user's experience and increase time on my site, I should link to where they can find more depth on their spouse's coverage, which would be in a different category. Is this still acceptable since it's really not a different subject?
Intermediate & Advanced SEO | insuretheheroes.com
-
Duplicating content from manufacturer for client site and using canonical reference.
We manage content for many clients in the same industry, and many of them wish to keep their customers on their individualized websites (understandably). In order to do this, we have duplicated content in part from the manufacturers' pages for several "models" on the clients' sites. We have put in a canonical reference at the start of the content directing back to the manufacturer's page where we duplicated some of the content. We have only done a handful of pages while we figure out this potential canonical issue. So, my questions are: Is this necessary? Does this hurt, help, or do nothing SEO-wise for the ranking of the site? Thanks!
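For reference, a cross-domain canonical of the kind described here is normally a single link element in the head of the duplicating page, pointing at the preferred source. The snippet below is only an illustrative sketch with a placeholder URL, not the asker's actual markup:
<link rel="canonical" href="https://manufacturer.example.com/models/example-model/">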
Intermediate & Advanced SEO | moz1admin
-
Duplicate content - how to diagnose duplicate content from another domain before publishing pages?
Hi, 🙂 My company has a new distributor contract, and we are starting to sell products on our own webshop. Biotechnology is the industry in question, with over 1,000 products. Writing product descriptions from scratch would take many hours, so the plan is to rewrite them. With permission from our contractors, we will import their product descriptions onto our webshop, but I am concerned about being penalized by Google for duplicate content. If we rewrite them we should be fine, I guess, but how can we be sure? Is there any good tool for comparing only text (because I don't want to publish the pages just to compare URLs)? What else should we be aware of besides checking the product descriptions for duplicate content? Duplicate content is a big issue for all of us, and I hope these answers will be helpful for many of us. Keep up the hard work, and thank you very much for your answers. Cheers, Dusan
Intermediate & Advanced SEO | Chemometec
-
Best Way to Fix Duplicate Content Issues on a Blog If URLs Are Set to "No-Index"
Greetings Moz Community: I recently purchased a SEMrush subscription and used it to run a site audit. The audit detected 168 duplicate content issues, mostly relating to blog post tags. I suspect these issues may be due to canonical tags not being set up correctly. My developer claims that since these blog URLs are set to "no-index", these issues do not need to be corrected. My instinct would be to avoid any risk of potential duplicate content and to set up canonicalization correctly. In addition, even if these pages are set to "no-index", they are passing PageRank. Furthermore, I don't know why a reputable company like SEMrush would flag these as errors if in fact they are not errors. So my question is: do we need to do anything with the error pages if they are already set to "no-index"? Incidentally, the site URL is www.nyc-officespace-leader.com. I am attaching a copy of the SEMrush audit. Thanks, Alan
Intermediate & Advanced SEO | Kingalan1
-
About using robots.txt to resolve duplicate content
I have trouble with duplicate content and titles. I have tried many ways to resolve them, but because of the site's code I still have the problem, so I have decided to use robots.txt to block the duplicate content. The first question: how do I write a robots.txt rule to block all URLs like these?
http://vietnamfoodtour.com/foodcourses/Cooking-School/
http://vietnamfoodtour.com/foodcourses/Cooking-Class/
User-agent: *
Disallow: /foodcourses
(Is that right?) And the parameter URLs:
http://vietnamfoodtour.com/?mod=vietnamfood&page=2
http://vietnamfoodtour.com/?mod=vietnamfood&page=3
http://vietnamfoodtour.com/?mod=vietnamfood&page=4
User-agent: *
Disallow: /?mod=vietnamfood
(Is that right? I have a folder that contains the module; could I use Disallow: /module/* ?) The second question: which takes priority, robots.txt or the meta robots tag, if I use robots.txt to block a URL but that URL's meta robots tag is "index, follow"?
Intermediate & Advanced SEO | magician
-
Duplicate content throughout multiple URLs dilemma
We have a website with lots of categories, and the problem is that some subcategories have identical content on them. So, is it enough to just add different text to those problematic subcategories, or do we need to use a "canonical" tag pointing to the main category? The same dilemma applies to our search system and duplicate content. For example, the "/category/sports" URL would have similar-to-identical content to the "/search/sports" and "/search/sports-fitness/" URLs. Ranking well is important for all the different categories and subcategories, and also for individual search keywords. So, the question is: how do we make these pages unique/different enough so that all of them can rank well? We would love to hear advice on how this can be solved using different methods and how each would affect our rankings, and also when we actually need to use a "canonical" tag versus when a 301 redirect is better. Thanks!
Intermediate & Advanced SEO | versliukai
-
How best to handle (legitimate) duplicate content?
Hi everyone, I'd appreciate any thoughts on this (bit long, sorry). I am working on 3 sites selling the same thing... the main difference between each site is physical location/target market area (think North, South, West as an example). Now, say these 3 sites all sell Blue Widgets, and thus all on-page optimisation has been done for this keyword. These 3 sites are now effectively duplicates of each other - well, the Blue Widgets page is at least - and whilst there are no 'errors' in Webmaster Tools, I'm pretty sure they ought to be ranking better than they are (good PA, DA, mR etc.). The sites share the same template/look and feel too AND are accessed via the same IP - just for good measure 🙂
So - to questions/thoughts:
1 - Is it enough to try and get creative with on-page changes to try and 'de-dupe' them? Kinda tricky with the Blue Widgets example - how many ways can you say that? I could focus on the geographical element a bit more, but I would like to rank well for Blue Widgets generally.
2 - I could, I guess, noindex/nofollow the Blue Widgets page on 2 of the sites, though that seems a bit drastic (or block them in robots.txt).
3 - I could even link (via internal navigation) sites 2 and 3 to site 1's Blue Widgets page and thus make 2 of the Blue Widgets pages redundant?
4 - Is there anything HTML-coding-wise I could do to pull in site 1's content to sites 2 and 3, without cloaking or anything nasty like that?
I think 1 is the first thing to do. Anything else? Many thanks.
Intermediate & Advanced SEO | Capote
-
Having a hard time with duplicate page content
I'm having a hard time redirecting website.com/ to website.com. The crawl report shows both versions as duplicate content. Here is my .htaccess:
RewriteEngine On
RewriteBase /
#Rewrite bare to www
RewriteCond %{HTTP_HOST} ^mywebsite.com
RewriteRule ^(([^/]+/)*)index.php$ http://www.mywebsite.com/$1 [R=301,L]
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.*)$ $1.php [NC,L]
RewriteCond %{HTTP_HOST} !^.localhost$ [NC]
RewriteRule ^(.+)/$ http://%{HTTP_HOST}$1 [R=301,L]
I added the last 2 lines after seeing a Q&A here, but I don't think it has helped.
Intermediate & Advanced SEO | cgman