Marking our content as original when the rel=author tag can't be applied
-
Hello,
Can anyone tell me whether it is possible to protect text-type content without the rel=author tag?
We host a business listing site where, in addition to general contact information, we have started to write unique, original descriptions of 800+ characters for our suppliers. We expect these pages to attract visits, so rankings should improve.
My issue is that this is a very competitive business, and content scraping is an everyday practice. Naturally, I would like to keep my content original, or at least have Google recognize it as mine.
The easiest way would be the author tag, but the problem is that I do not want our names and photos attached to this content. On the one hand, we are not established content authors in our own right (no bios or anything similar); on the other, we write content for every sort of business, so adding links to our other content would not help readers find what they want. I also really doubt that a photo of me would increase the CTR from the SERPs. :)
What we currently do is submit every major piece of fresh content through URL submission in WMT, hoping that being indexed first will help. We only have a handful of these per day, never more than ten.
Yes, I could perhaps use absolute links, but that is not feasible in all cases. As for the DMCA, our programmer says that whatever you can see on the internet, you can basically own.
So, in the end, I do not mind our content being stolen, since I can't possibly prevent it. What I do want is for Google to recognize the original content as ours, even after it has been stolen.
(Ideally there would be an 'author tag for business' connected to our business's Google+ page, but I am not aware that the feature can be used this way.)
Thank you in advance to everyone who shares their thoughts on the topic.
-
Hi Mat,
You have provided some great clarification, thank you.
Now, I only have one thing left to sort out:
Should I add the DMCA protection badges to all pages with unique content as soon as they are created?
My dilemma is this: suppose I find my content somewhere else and submit a takedown request through the DMCA. How will it be proven whose site owns the original content?
I am definitely not an expert in this, but it could easily happen that my page is not the older one: the page hosting my content may simply have replaced its previous content, while the page itself is older than my relevant page.
Thank you
-
Hi András,
I think you are confused about what rel=author actually does. It can help as part of the picture that shows Google who the originator of content is, but it doesn't assert ownership in the way you seem to be suggesting. I'll come back to that, but let me address another point first:
"as our programmer says, what you can see on the internet, that you can basically own."
This is plainly wrong. I would agree that whatever you see on the internet can simply be stolen. However, that is not the same as owning it, and international law backs that up.
If you have valuable content that is likely to get stolen, then you need to do two things:
1. Ensure that search engines find your copy first and see you as the originator
2. Police it
#1 you seem to be doing already. Manual submission via Webmaster Tools sounds painful to me, but it will do the job. Tweet it, link it, ping it, etc. Do what you can to establish "this was here first" early and get Google to index it.
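One lightweight way to take the manual step out of that is a sitemap ping: Google has historically accepted a simple GET request telling it a sitemap has changed. A minimal sketch in Python; the sitemap URL is a hypothetical placeholder, and you would substitute your own:

```python
from urllib.parse import urlencode
from urllib.request import urlopen

# Hypothetical sitemap URL used only for illustration.
SITEMAP_URL = "http://www.example.com/sitemap.xml"

def build_ping_url(sitemap_url):
    """Build the Google sitemap-ping URL, URL-encoding the sitemap address."""
    return "https://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

def ping_google(sitemap_url):
    """Notify Google that the sitemap has fresh URLs. Returns True on HTTP 200."""
    with urlopen(build_ping_url(sitemap_url)) as resp:
        return resp.status == 200
```

You would call `ping_google(SITEMAP_URL)` right after publishing each new supplier page, so the fresh URL is discovered as early as possible.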
Part of that same picture is being seen as trustworthy: earn those high-authority citations, ensure your content is always unique, and so on.
However, #2 is about taking responsibility for your content. It's yours, you own it, and there are no internet police, so it is up to you. Try a service like Copyscape, or use Google Alerts to find out when people steal your material. When they do, hit them with a takedown notice, send the same to their hosts, domain registrar, etc., then follow it up with a DMCA request.
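If you want to automate part of that monitoring yourself, one simple approach is a "fingerprint" check: pick a distinctive sentence from each of your pages and test suspect URLs for it. A minimal sketch; the fingerprint phrase and the fetched URL are illustrative assumptions, not anything prescribed above:

```python
import re
from urllib.request import urlopen

def normalize(text):
    """Collapse whitespace and lowercase so formatting changes don't hide a match."""
    return re.sub(r"\s+", " ", text).lower().strip()

def contains_fingerprint(page_text, fingerprint):
    """True if the page text contains the fingerprint phrase,
    ignoring case and whitespace differences. Does not strip HTML tags,
    so use a plain phrase that would survive inside markup."""
    return normalize(fingerprint) in normalize(page_text)

def check_url(url, fingerprint):
    """Fetch a suspect page and test it for the fingerprint (network call)."""
    html = urlopen(url).read().decode("utf-8", errors="replace")
    return contains_fingerprint(html, fingerprint)
```

A hit from `check_url` is only a starting point for a manual look, not proof of theft, but it turns "search for your own sentences" into something you can run on a schedule.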
This will stop a lot of it. It will also make it a pain in the bum for some of the others (if it is more hassle to steal from you than from someone else, they will steal from someone else!). It also starts undermining the trust in their sites: if Google sees frequent DMCA requests about particular domains, that helps build the picture. If you see them stealing other people's content, let the other victims know as well and encourage them to do the same.