Marking our content as original when the rel=author tag can't be applied
-
Hello,
Can anyone tell me if it is possible to protect text-type content without the rel=author tag?
We host a business listing site where, apart from the general contact information, we have also started writing unique, original descriptions of 800+ characters for the suppliers. We expect these pages to attract visits, so rankings should improve.
My issue is that this is a very competitive business, and content scraping is an everyday practice. Naturally, I would like to keep my original content, or at least mark it as mine for Google.
The easiest way would be the author tag, but the problem is that I do not want our names and photos attached to this content: on the one hand, we are not recognized content providers in our own right (no bios or anything), and on the other, we write content for every sort of business, so extra links to our other content might not help readers find what they want. I also doubt that a photo of me would increase the CTR from the SERP. :)
What we currently do is submit every major piece of fresh content through URL submission in WMT, hoping that being indexed first might help. We only have a handful per day, no more than 10.
Yes, I could perhaps use absolute links, but that is not feasible in every case. As for DMCA, our programmer says that what you can see on the internet, you can basically own.
So finally, I do not mind our content being stolen, as I can't possibly prevent it. I do, however, want our original content to be recognized as ours by Google, even after it has been stolen.
(Best of all would be an 'author tag for business', connected to our business Google+ page, but I am not aware that the function can be used this way.)
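For reference, the markup for the two existing patterns is just a link element in the head. The Google+ URLs below are placeholders for our own profiles, and I don't know whether the publisher version carries any per-article authorship weight:

```html
<!-- rel=author: ties a page to an individual's Google+ profile (placeholder URL) -->
<link rel="author" href="https://plus.google.com/1234567890/posts">

<!-- rel=publisher: ties the site to a brand's Google+ page (placeholder URL) -->
<link rel="publisher" href="https://plus.google.com/0987654321/">
```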
Thank you in advance to everyone who shares their thoughts on the topic.
-
Hi Mat,
You have provided some great clarification, thank you.
Now, I only have one thing to be sorted out:
Should I add the DMCA protection badge to every page with unique content as soon as it is created?
My dilemma is this: let's say I find my content somewhere else and submit a takedown request through DMCA. How is it going to be proven whose site owns the original content?
I am definitely not an expert in this, but it could easily happen that my page is not the older one: the page my content was copied to may simply have replaced its previous content, while the page itself is older than my relevant page.
Thank you
-
Hi András,
I think you are confused about what rel=author actually does. It can help as part of the picture that shows Google who the originator of content is, but it doesn't assert ownership in the way you seem to be suggesting. I'll come back to that, but let me address another point first:
"as our programmer says, what you can see on the internet, you can basically own."
This is plainly wrong. I agree that whatever you see on the internet can be stolen, but that is not the same as owning it, and international law backs that up.
If you have valuable content that is likely to get stolen, then you need to do two things:
1. Ensure that search engines find your copy first and see you as the originator
2. Police it
#1 you seem to be doing already. Manual submission via Webmaster Tools sounds painful to me, but it will do the job. Tweet it, link it, ping it, etc. Do what you can to establish "this was here" early and to get Google to index it.
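If manual submission gets tedious, one lightweight way to automate the "ping it" step is Google's sitemap ping endpoint: request it with your sitemap URL every time fresh content goes live. A minimal sketch (the sitemap URL is a placeholder for your own):

```python
from urllib.parse import urlencode

def build_sitemap_ping(sitemap_url: str) -> str:
    """Build the Google sitemap-ping URL for a given sitemap."""
    return "http://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

ping = build_sitemap_ping("http://www.example.com/sitemap.xml")
# Fetching this URL (e.g. with urllib.request.urlopen) asks Google to recrawl the sitemap.
```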
Part of that same picture is being seen as trustworthy. Get those high-authority citations, ensure your content is always unique, etc.
However, #2 is about you taking responsibility for your content. It's yours, you own it, and there are no internet police, so it is up to you. Try a service like Copyscape, or just use Google Alerts to let you know when people steal your stuff. When they do, hit them with a takedown notice, send the same to their host, domain registrar, etc., then follow it up with a DMCA request.
This will stop a lot of it. It will also make it a pain in the bum for some of the others (if it is more hassle to steal from you than from someone else, they will steal from someone else!). It also starts undermining the trust in their sites: if Google sees frequent DMCA requests about particular domains, it helps build that picture. If you see them stealing other people's content, let the other victims know as well and encourage them to do the same.
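If you want to roll your own monitoring alongside Copyscape and Google Alerts, a crude but effective trick is to pick one distinctive sentence from each article as a fingerprint and check suspect pages for it, ignoring whitespace and case so reflowed markup doesn't hide a match. A sketch (how you fetch the suspect pages is up to you; the sample text is made up):

```python
import re

def normalize(text: str) -> str:
    """Lowercase and collapse all whitespace so markup reflow doesn't hide a match."""
    return re.sub(r"\s+", " ", text).lower().strip()

def contains_copy(page_text: str, fingerprint: str) -> bool:
    """True if the page contains the fingerprint sentence verbatim (modulo whitespace/case)."""
    return normalize(fingerprint) in normalize(page_text)

# A scraped page that reflowed the line breaks still matches:
stolen = "Our   widgets are\nhand-forged in small batches."
print(contains_copy(stolen, "our widgets are hand-forged in small batches"))  # True
```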
Related Questions
-
Reusing content on different ccTLDs
We have a client with many international locations, each of which has their own ccTLD domain and website. Eg company-name.com, company-name.com.au, company-name.co.uk, company-name.fr, etc. Each domain/website only targets their own country, and the SEO aim is for each site to only rank well within their own country. We work for an individual country's operations, and the international head office wants to re-use our content on other countries' websites. While there would likely be some optimisation of the content for each region, there may be cases where it is re-used identically. We are concerned that this will cause duplicate content issues. I've read that the separate ccTLDs should indicate to search engines that content is aimed at the different locations - is this sufficient or should we be doing anything extra to avoid duplicate content penalties? Or should we argue that they simply must not do this at all and develop unique content for each? Thanks Julian
Content Development | Bc.agency
-
Duplicate Blog Content
Hey Mozzers! I have a client who is a dentist, with multiple offices, and wants to use an identical blog post (including images, alt text, text, tags, everything pretty much) from one of his office's websites on his other office's website, to save time and mimic the success of the original content for his other office. Everything I've researched says this is a HUGE no-no, but I'd love to hear if anyone else has tried to do something like this and if they were successful in doing so (implementing rel=canonical or 301?). Also, if he is the owner of both sites and they both receive low traffic, will Google even notice? My biggest worry is that if I posted the content on his other site, identically, it would dilute the visibility of the original post, which has surpassed and is continuing to surpass our organic search goals... The main goal, though, is to drive traffic to BOTH sites via organic search using the same content. Would love to hear everyone's opinions on whether this is possible or unrealistic... Thanks! -D
Content Development | Derrald
-
Similar Content
I'm in the process of launching two new websites (redesign / rebrand) - one website represents the manufacturer while a second website represents the retail side of the manufacturer (same company essentially but two different brands). The sites have co-existed historically without worry of cannibalizing each other's traffic, but I want to make sure in this redesign that we're all set. I'm curious if anyone has any recommendations on how to handle two sites with different branding but VERY similar content (product descriptions are essentially the same). I was thinking it could be smart to just nofollow the content on the manufacturer's site, since we're mostly trying to drive traffic to the consumer-facing retail side anyway, but would love to hear from the experts! Thank you.
Content Development | TheBatesMillStore
-
What if your content is getting social shares but no links?
Suppose you have a weekly blog article and sometimes your articles earn social shares (e.g. 23 +1's on Google Plus on one article but normally 3-5 social shares). One out of 10 earns an organic link from a random blog. Would you continue publishing these blog posts?
Content Development | ProjectLabs
-
Modifying Content to Avoid Duplicate Content Issues
We are planning to leverage specific posts from a US-based blog for our own Canadian blog (with permission, of course) but are aware that this can cause duplicate content issues. We're willing to re-write as much or as little as we must from the initial blog posts to avoid duplicate content issues but I have no idea just how much we will need to re-write. Is there some guideline for this (e.g., 25% of content must be re-written)? I've been unable to find anything. Thank you in advance!
Content Development | QueenSt
-
Ecommerce site content upgrade timescale.
I have been upgrading my site's content and structure, and I have been wondering how long I should wait for a traffic increase before concluding it has been a failure and trying a new plan of attack?
Content Development | mark_baird
-
On page content and PDF - Dup?
Hi, We are writing a useful article which we want to put on our site, but we also want to offer it as a PDF which people can download - will this be classed as duplicate content?
Content Development | jj3434
-
Duplicate content because of tag
Dear Expert, I use several tags in each post (www.rumahapp.com); for example, when I post something about a new game release I use these tags: publisher name, game rating and game genre. The problem is that SEOmoz crawled my page and said that I have duplicate content between http://www.rumahapp.com/tag/com2us/ and http://www.rumahapp.com/tag/homerun-battle-2. How do I solve this? Does this mean I shouldn't use tags in my posts? Thank you
Content Development | Gundud