If Google Authorship is used on every page of your website, will it be penalized?
-
Hey all,
I've noticed that a lot of companies implement Google Authorship on all pages of their website, i.e., landing pages, home pages, and subpages. I'm wondering if this will be penalized, since these aren't typical authored pieces of content like blog posts, articles, or press releases.
I'm curious because I'm about to set up Google Authorship and I don't want it to be set up incorrectly for the future. Is it okay to tie authorship to every page (home page, subpages) and not just actual authored content (blogs, articles, press releases), or will the site be penalized if I do that?
Thanks and much appreciated!
-
I actually don't think it's all right to use Authorship or Publisher on every page, and it's not what Google intends. Check out their blog post on this:
http://googlewebmastercentral.blogspot.co.uk/2013/08/relauthor-frequently-asked-advanced.html
Specifically they say "Authorship annotation is useful to searchers because it signals that a page conveys a real person’s perspective or analysis on a topic. Since property listings and product pages are less perspective/analysis oriented, we discourage using authorship in these cases. However, an article about products that provides helpful commentary, such as, “Camera X vs. Camera Y: Faceoff in the Arizona Desert” could have authorship."
So while, at this time, using Authorship on non-article pages such as product pages is 'unlikely' to get you Google slapped, you are going against their direct advice, and that advice often gets built into the algorithm once they notice something being abused.
You are right that many sites are using this on every page, and for now it can give you an advantage: if not in higher rankings, then in a more visible results listing that may improve CTR. However, as I said, once Google sees this being abused they will attempt to stop the practice and make sure it is used for rich content pages only.
Publisher is different: ideally, Google wants rel=publisher to point from your homepage to your business Google+ page. Authorship and Publisher are two different things, and Google treats them as separate; a rough sketch of where each belongs is below.
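To illustrate the split, here is a minimal sketch of the two tags in context. The Google+ page and profile URLs are placeholders, not real IDs:

```html
<!-- Sitewide, in the <head>: rel=publisher links the site to its business Google+ page -->
<link rel="publisher" href="https://plus.google.com/+ExampleBusinessPage" />

<!-- Only on authored content (blog posts, articles, press releases):
     rel=author links the byline to the writer's personal Google+ profile -->
<a href="https://plus.google.com/112345678901234567890" rel="author">Jane Doe</a>
```

You can plug any URL into Google's Rich Snippets Testing Tool to check whether either tag is being picked up.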
Hope this helps. Basically, if what you are doing on your site doesn't benefit your site's users, then you are right to question it.
-
Yes, you are right on both counts. I think there will come a time when Google will display the brand icon in place of an image for pages that are marked up with rel=publisher. I can see it pulling through for the sites I manage when I plug them into the Rich Snippet testing tool, but Google is not yet displaying those images.
Good luck! It sounds like you've got a good idea of what pages should use what type of authorship.
Dana
-
Hey Dana,
Thanks for the response. So basically what you're saying is that if it isn't in the immediate authored content area, it should be rel=publisher (such as <link href="https://plus.google.com/104609087715575652977" rel="publisher" /> in the <head> section). However, rel=author should be on authored content like blogs, articles, and press releases, shown as "Authored by [Individual Name]" with the name linking to the personal Google+ profile. Right?
Also, with rel=author / rel=publisher, only rel=author translates into the Google+ headshot profile picture in search results, while rel=publisher lists the company's Google+ profile information?
Thanks again for all the responses!
-
There are times when rel=publisher is more appropriate than rel=author, a product page on an e-commerce site for example. Will a site be penalized for establishing authorship on every page? Absolutely not. In fact, I think that is what Google intends for people to do.
The problem right now is that there is such mass confusion over rel=author and rel=publisher and how to use them properly that you see lots of sites using rel=author where they should be using rel=publisher. Because Google has done such a poor job of articulating how and where to implement these things, I can't imagine them penalizing sites for using one when they should be using the other. Although, I suppose, strange things have happened.
I do think the intention with authorship, and with structured data markup generally, is that webmasters implement all the appropriate tags and markup on every page of their site; something like the sketch below is the kind of combination I mean for a product page.
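As a hypothetical example only (the Google+ page URL and the product details are placeholders), a product page might combine rel=publisher with schema.org microdata like this:

```html
<head>
  <!-- rel=publisher ties the page to the business Google+ page -->
  <link rel="publisher" href="https://plus.google.com/+ExampleBusinessPage" />
</head>
<body>
  <!-- schema.org Product microdata describes the page's content -->
  <div itemscope itemtype="http://schema.org/Product">
    <span itemprop="name">Camera X</span>
    by <span itemprop="brand">ExampleBrand</span>
  </div>
</body>
```

Note there is no rel=author here: per Google's advice quoted above, product pages get publisher and structured data, not authorship.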
Hope that's helpful!
Dana