"Authorship is not working for this webpage" Can a company G+ page be both Publisher AND Author?
-
When using the Google Structured Data Testing Tool I get a message saying:
**Authorship Testing Result** - Authorship is not working for this webpage.
Here are the results of the data for the page http://www.webjobz.com/jobs/
Authorship Email Verification
Please enter a Google+ profile to see if the author has successfully verified an email address on the domain www.webjobz.com to establish authorship for this webpage.
Email verification has not established authorship for this webpage.
Email address on the webjobz.com domain has been verified on this profile: Yes
Public contributor-to link from Google+ profile to webjobz.com: Yes
Automatically detected author name on webpage: Not Found

Publisher
Publisher markup is verified for this page.
Linked Google+ page: https://plus.google.com/106894524985345373271

Question - Can this company Google+ account "Webjobz" be both the publisher AND the author?
Can I use https://plus.google.com/106894524985345373271 as the author of this and all other pages on our site?
-
Hi Matt - Publisher is for brands, while Authorship is for people.

You should put the rel=publisher code only on your website's most important page (usually the home page), and establish a two-way link between that page and your brand's business page on Google+.

You should verify Authorship for each one of your bloggers, as well as for authors of other content on your website where it is appropriate to have an author byline. You cannot use your business page on Google+ to establish Authorship. Each author must link their personal Google+ profile to your website (under the contributor section of that profile).

It looks like you have a multi-author site, so you can link your website content back to each author's Google+ profile using either the email method or rel=author markup. I hope that helps!
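A minimal sketch of the two-way setup described above. The publisher URL is the Google+ page from the question; the author profile ID and byline name are hypothetical placeholders:

```html
<!-- Home page only: link the site to the brand's Google+ page -->
<link rel="publisher" href="https://plus.google.com/106894524985345373271"/>

<!-- On each article page: byline linking to that author's PERSONAL profile
     (profile ID and name are placeholders) -->
<a rel="author" href="https://plus.google.com/AUTHOR_PROFILE_ID">Author Name</a>
```

For the two-way check to pass, each author's Google+ profile must also list the site under its "Contributor to" section.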
-
I found this... is this true?
Matthew Barby, SEO and Social Media Strategist at Wow Internet - Oct 03, 2012
Hi Guys,
I just wanted to add to what I said previously: having dug a bit deeper into this, I have found some information that could go against my earlier answer.
You can use the rel=publisher tag alongside the rel=author tag on a webpage. With this in mind, it would be a good idea to have rel=publisher present on every page of your website so that you are eligible for Google's Direct Connect feature.
I found this out when looking through a Google forum, where one of Google's Webmaster Trends Analysts confirmed it by saying:
Hi guys,
Just a few short comments: it's fine to have both a link rel=publisher and authorship markup on the same page. The rel=publisher confirms that your website is the publisher of that Google+ Page; the authorship markup confirms that you (your personal profile) are the author of the content on that page. This markup can be used independently, since the meanings are slightly different. The issue with the Rich Snippets testing tool flagging this as an error is a bug on our side and should be resolved soon (sorry about the confusion caused by that!).
Cheers, John
Just thought I would share this with you all as it might help.
Matt.
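Per John's comment above, a page head carrying both links might look like this (a sketch only; the author profile ID is a placeholder, and the publisher URL is the Google+ page from the question):

```html
<head>
  <!-- Site-level: which Google+ page publishes this site -->
  <link rel="publisher" href="https://plus.google.com/106894524985345373271"/>
  <!-- Page-level: which personal Google+ profile authored this content -->
  <link rel="author" href="https://plus.google.com/AUTHOR_PROFILE_ID"/>
</head>
```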
Related Questions
-
Do you know of a tool that checks all the scripts running on a page and can diagnose scripts that can harm our SEO?
Hi, do you know of a tool that checks all the scripts that are running on the page and can diagnose scripts that can harm our SEO? Thanks, Roy
Intermediate & Advanced SEO | kadut0
-
Huge spike in "access denied" in search console
Hey guys, we have seen a huge spike in "Access Denied" status in Google Search Console for our website, and I have no idea why that would be the case. Can anyone shed some light on what is going on, or point me in the direction of an SEO specialist we can pay to fix the issue? Thanks
Intermediate & Advanced SEO | fbchris0
-
Syntax: 'canonical' vs "canonical" (apostrophes or quotes) - does it matter?
I have been working on a site, and all the tools I've used (Screaming Frog & Moz Bar) recognize the canonical, but does Google? This is the only site I've worked on that uses apostrophes: `<link rel='canonical' href='https://www.example.com'/>`. It's apostrophes vs. quotes. Could this difference in syntax be causing the canonical not to be recognized? `<link rel="canonical" href="https://www.example.com"/>`
Intermediate & Advanced SEO | ccox10
-
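For what it's worth, the HTML spec treats single-quoted and double-quoted attribute values as equivalent, so the quote character alone should not stop a canonical from being recognized. Both of these are valid (example.com as a placeholder):

```html
<link rel='canonical' href='https://www.example.com/'/>
<link rel="canonical" href="https://www.example.com/"/>
```

A missing space between attributes (e.g. `rel="canonical"href=`), on the other hand, is a genuine syntax problem worth ruling out.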
Dilemma about "images" folder in robots.txt
Hi, hope you're doing well. I am sure you guys are aware that Google has updated their webmaster technical guidelines saying that users should allow access to their CSS and JavaScript files where possible. It used to be that Google rendered web pages as text only; now it claims it can read CSS and JavaScript. According to their own terms, not allowing access to CSS files can result in sub-optimal rankings: "Disallowing crawling of Javascript or CSS files in your site's robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings." (http://googlewebmastercentral.blogspot.com/2014/10/updating-our-technical-webmaster.html)

We have allowed access to our CSS files, and Googlebot now sees our webpages more like a normal user would (tested in GWT). Anyhow, this is my dilemma, and I am sure a lot of other users might be facing the same situation. Like any e-commerce website, we have a lot of images. Our CSS files used to be inside our images folder, so I have allowed access to that. Here's the robots.txt: http://www.modbargains.com/robots.txt

Right now we are blocking the images folder, as it is very large, very heavy, and some of the images are very high-res. We feel that Googlebot might spend almost all of its time trying to crawl that images folder and might not have enough time to crawl other important pages, not to mention the very heavy server load on Google's side and ours. We do have good, high-quality original pictures, and we feel that we are losing potential rankings by blocking images.

I was thinking of allowing ONLY the Google image bot access to it, but I still fear that Google might spend a lot of time doing that. I was wondering whether Google decides, say, to spend 10 minutes on the Google image bot and 20 minutes on the Google mobile bot, or whether it has separate "time spending" allocations for each of its bot types. I want to unblock the images folder for the Google image bot only, but at the same time I fear that it might drastically hamper indexing of our important pages, because of having tons and tons of images. Any advice, recommendations, suggestions, technical guidance, or plan of action? Pretty sure I answered my own question, but I need confirmation from an expert that I am right to allow only Google Image access to my images folder.

Sincerely, Shaleen Shah
Intermediate & Advanced SEO | Modbargains1
-
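One hedged way to do what the question describes: Google's crawlers follow the most specific matching user-agent group in robots.txt, so a dedicated Googlebot-Image group can open the folder while the general group keeps blocking it (the `/images/` path below is assumed from the question):

```text
User-agent: *
Disallow: /images/

# Googlebot-Image matches this group instead of the "*" group above,
# so it may crawl the images folder while other crawlers stay out.
User-agent: Googlebot-Image
Allow: /images/
```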
Why does old "Free" site ranks better than new "Optimized" site?
My client has a "free" site he set up years ago - www.montclairbariatricsurgery.com (we'll call this the old site) - that consistently outranks his current "optimized" (new) website - http://www.njbariatricsurgery.com/. The client doesn't want to get rid of his old site, which is now a competitor, because it ranks so much better, but he's invested so much in the new site with no results. A bit of background: we recently discovered the content on the new site was a direct copy of content on the old site. We had all copy on the new site rewritten back in April. The domain of the new site was changed on July 8th from www.Bariatrx.com to what you see now - www.njbariatricsurgery.com. Any insight you can provide would be greatly appreciated!
Intermediate & Advanced SEO | WhatUpHud0
-
Date of page first indexed or age of a page?
Hi, does anyone know of any ways or tools to find out when a page was first indexed/cached by Google? I remember a while back, around 2009, I had a Firefox plugin which could check this and gave you an exact date; maybe this has changed since, and I don't remember the plugin. Or any recommendations on finding the age of a page (not domain) for a website? This is for competitor research, not my own website. Cheers, Paul
Intermediate & Advanced SEO | MBASydney0
-
How to redirect a URL in .htaccess when "Redirect 301" doesn't work
I have an odd page URL, generated by a link from an external website. It has %5Cu0026size=27.4KB%5Cu0026p=dell%20printers%20uk%5Cu0026oid=333302b6be58eaa914fbc7de45b23926%5Cu0026ni=21%5Cu0026no=24%5Cu0026tab=organic%5Cu0026sigi=11p3eqh65%5Cu0026tt=Dell%205210n%20A4%20Mono%20Laser%20Printer%20from%20Printer%20Experts%5Cu0026u=fb appended after a .jpg image URL, and I can't redirect it using Redirect 301 in .htaccess to the proper image URL as I usually do with other not-found URLs. E.g. /15985.jpg%5Cu0026size=27.4KB%5Cu0026p=dell%20printers%20uk%5Cu0026oid=333302b6be58eaa914fbc7de45b23926%5Cu0026ni=21%5Cu0026no=24%5Cu0026tab=organic%5Cu0026sigi=11p3eqh65%5Cu0026tt=Dell%205210n%20A4%20Mono%20Laser%20Printer%20from%20Printer%20Experts%5Cu0026u=fb should redirect to just /15985.jpg.
Intermediate & Advanced SEO | Status0
-
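The mod_alias `Redirect` directive matches literal URL paths, which is why it struggles with these percent-encoded `%5Cu0026...` tails. A mod_rewrite sketch (untested; assumes the junk is appended to the path after `.jpg` and that mod_rewrite is enabled) would strip everything after the image extension instead of trying to match it:

```apache
RewriteEngine On
# Hypothetical rule: if the path is a .jpg URL with trailing junk,
# 301-redirect to just the image path. Test before deploying.
RewriteRule ^(.+\.jpg).+$ /$1 [R=301,L]
```

Note that mod_rewrite operates on the percent-decoded path (so `%5C` arrives as a backslash); the catch-all `.+` after the extension avoids having to match those characters literally.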
Can a home page penalty cause a drop in rankings for all pages?
All my main keywords have dropped out of the SERPs. Could it be that the home page (the strongest page) has been devalued, and therefore the 'link juice' that used to spread throughout the site is no longer doing so? Would this cause all other pages to drop? I just can't understand how all my pages have lost rankings. The site is still indexed, so there's no problem there.
Intermediate & Advanced SEO | SamCUK0