If Google Authorship is used for every page of your website, will it be penalized?
-
Hey all,
I've noticed a lot of companies implement Google Authorship on all pages of their website, i.e., landing pages, home pages, and sub-pages. I'm wondering if this will be penalized, since those aren't typical authored pieces of content like blogs, articles, press releases, etc.
I'm curious because I'm about to set up Google Authorship and I don't want it configured incorrectly going forward. Is it okay to tie every page (home page, sub-pages) to authorship, and not just actual authored content (blogs, articles, press releases), or will the site get penalized if that occurs?
Thanks and much appreciated!
-
I actually don't think it is alright to use Authorship or Publisher on every page, and this is not what Google intends. Check out their blog post on this:
http://googlewebmastercentral.blogspot.co.uk/2013/08/relauthor-frequently-asked-advanced.html
Specifically they say "Authorship annotation is useful to searchers because it signals that a page conveys a real person’s perspective or analysis on a topic. Since property listings and product pages are less perspective/analysis oriented, we discourage using authorship in these cases. However, an article about products that provides helpful commentary, such as, “Camera X vs. Camera Y: Faceoff in the Arizona Desert” could have authorship."
So while at this time using Authorship on non-article pages such as product pages is 'unlikely' to get you Google-slapped, you are going against their direct advice - and that advice often gets baked into the algorithm once Google notices something being abused.
You are right that many sites are using this on every page, and as of now it will give you an advantage - if not in higher rankings, then in a more visible results listing that may have an improved CTR. However, as I said, once Google sees this being abused they will attempt to stop the practice and make sure it is used for rich content pages only.
Publisher is different: ideally Google wants it linking from the homepage to the business's Google+ page. Author and Publisher are two different things, and Google treats them separately.
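To make the distinction concrete, here's a minimal sketch - the Google+ IDs are placeholders, so swap in your own:

```html
<!-- Site-wide, typically in the <head> of every page:
     ties the whole site to the business's Google+ page -->
<link rel="publisher" href="https://plus.google.com/YOUR_BUSINESS_PAGE_ID" />

<!-- Only on authored content (blog posts, articles, press releases):
     ties that piece to the writer's personal Google+ profile -->
<a href="https://plus.google.com/YOUR_PERSONAL_PROFILE_ID" rel="author">Jane Smith</a>
```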
Hope this helps - basically, if what you are doing on your site doesn't benefit your users, then you are right to question it.
-
Yes, you are right on both counts. I think there will come a time when Google will display the brand icon in place of an author image for pages that are marked up with rel=publisher. I can see that pulling through for the sites I manage when I plug them into the Rich Snippets testing tool; Google, however, is not yet displaying those images.
Good luck! It sounds like you've got a good idea of what pages should use what type of authorship.
Dana
-
Hey Dana,
Thanks for the response. So basically what you're saying is that if it isn't in the immediate authored content area, it should be rel=publisher - for example, <link href="https://plus.google.com/104609087715575652977" rel="publisher" /> in the <head> section. However, rel=author should be on authored content like blogs, articles, and press releases, shown as "Authored by [Individual Name]" with the name linking to the personal Google+ profile. Right?
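In other words, on a blog post I'm picturing a byline something like this (the profile ID here is just a placeholder):

```html
<!-- On an authored piece, e.g. a blog post byline -->
Authored by <a href="https://plus.google.com/YOUR_PERSONAL_PROFILE_ID"
               rel="author">[Individual Name]</a>
```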
Also, when implementing rel=author/rel=publisher, only rel=author translates into the Google+ headshot profile picture in search results, while rel=publisher will list the company's Google+ profile information?
Thanks again for all the responses!
-
There are times when rel=publisher is more appropriate than rel=author - a product page on an e-commerce site, for example. Will a site be penalized for establishing authorship on every page? Absolutely not. In fact, I think that is what Google intends for people to do.
The problem right now is that there is such mass confusion over rel=author and rel=publisher and how to use them properly that you see lots of sites using rel=author where they should be using rel=publisher. Because Google has done such a poor job of articulating how and where to implement these things, I can't imagine them penalizing sites for using one when they should be using the other. Although, I suppose, stranger things have happened.
I do think that the intention with authorship, and also with structured data markup, is that webmasters implement all the appropriate tags and markup on every page of their site.
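For example, a product page might carry rel=publisher plus schema.org product markup rather than rel=author - this is just an illustrative sketch with placeholder IDs and values:

```html
<head>
  <link rel="publisher" href="https://plus.google.com/YOUR_BUSINESS_PAGE_ID" />
</head>
<body>
  <!-- schema.org microdata for the product itself -->
  <div itemscope itemtype="http://schema.org/Product">
    <span itemprop="name">Camera X</span>
    <span itemprop="offers" itemscope itemtype="http://schema.org/Offer">
      <span itemprop="price">499.00</span>
      <meta itemprop="priceCurrency" content="USD" />
    </span>
  </div>
</body>
```

You can check how markup like this is being read by running the page through the Rich Snippets testing tool mentioned above.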
Hope that's helpful!
Dana