How do you achieve Google Authorship verification on a site with no clearly defined authors?
-
Google Authorship seems to be the current buzz topic in SEO. It seems perfect for people who write lots of articles or blog posts, but what about sites where the main focus isn't articles, e.g. e-commerce sites?
Can the website as a whole get verified?
-
Hi Paul
Thanks for your response. That makes a lot of sense, thanks for the clarification!
-
In this case, you'd verify the site with rel=publisher connected to the business's Google+ page instead of rel=author.
Unfortunately, rel=publisher won't give you the rich-snippet advantage in search or the other benefits of rel=author at this point, but most expect that Google will implement those linkages soon. So it's worth doing now, so you're right on top of it when Google does take it further.
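As a rough sketch, the publisher link is a single tag in the site's `<head>`, pointing at the business's Google+ page (the page ID below is a made-up placeholder):

```html
<!-- Site-wide, in the <head> of every page; the numeric Google+ page ID is a placeholder -->
<link rel="publisher" href="https://plus.google.com/104729101047291010" />
```

Where rel=author ties an individual article to a person's profile, rel=publisher ties the whole site to the business's page, which is why it suits sites with no named authors.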
There's no point trying to "fake" a rel=author connection - it won't help your site anyway.
That help?
Paul.
Related Questions
-
Google Mobile site crawl returns poorer results on 100% responsive site
Has anyone experienced an issue where Google Mobile site crawl returns poorer results than their Desktop site crawl on a 100% responsive website that passes all Google Mobile tests?
Intermediate & Advanced SEO | MFCommunications
-
How long will old pages stay in Google's cache index? We have a new site that is two months old, but we are seeing old pages even though we used 301 redirects.
Two months ago we launched a new website (same domain) and implemented 301 redirects for all of the pages. Two months later we are still seeing old pages in Google's cache index. So how long should I tell the client it will take for them all to be removed from search?
Intermediate & Advanced SEO | Liamis
-
Completely redesigned website - set up new site in Google Webmaster Tools, or keep existing?
Hi - our company just completely redesigned our website and went from a static HTML site to a PHP-based site, so every single URL has changed (around 1500 pages). I put the same verification code into the new site and re-verified, but now Google is listing tons and tons of 404s. Some of them are really old pages that haven't existed in a long time; it would literally be impossible to create all the redirects for the 404s it's pulling. Question - when completely changing a site like this, should I have created a whole new Search Console? Or did I do the right thing by using the existing one?
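For the redirects themselves, one way to avoid hand-writing rules for ~1500 changed URLs is to generate them from a mapping. This is a minimal sketch, assuming Apache-style `Redirect 301` rules and hypothetical example paths:

```python
# Sketch: bulk-generate 301 redirect rules from an old-URL -> new-URL mapping,
# rather than writing ~1500 .htaccess lines by hand.
# All paths below are hypothetical examples.
url_map = {
    "/about-us.html": "/about",
    "/products/widget.html": "/products/widget",
    "/contact.html": "/contact",
}

def htaccess_rules(mapping):
    """Emit one Apache 'Redirect 301' line per old/new URL pair."""
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())
    )

print(htaccess_rules(url_map))
```

In practice the mapping could be exported from the old CMS or built from the 404 report, so only URLs that actually receive traffic need entries.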
Intermediate & Advanced SEO | Jenny
-
Understanding the levels in my site
How can I figure out which pages are on the same level on my site? I created an automatic sitemap with online software, but it doesn't tell me that abc page is on the 1st level, xyz page is on the 2nd level, etc., and I have a hard time figuring out if my main menu is on the same level as my drop-down menu, as it is visible on the same page. Is there any way to figure out which pages are on the same level?
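A page's "level" is usually its click depth from the homepage, which can be computed with a breadth-first search over the internal-link graph. A minimal sketch, using a hypothetical link graph (in practice you'd build it from a crawl):

```python
from collections import deque

# Sketch: compute each page's level (click depth from the homepage) by
# breadth-first search over an internal-link graph.
# The graph below is a hypothetical example.
links = {
    "/": ["/shop", "/blog"],
    "/shop": ["/shop/shoes", "/shop/hats"],
    "/blog": ["/blog/post-1"],
    "/shop/shoes": [],
    "/shop/hats": [],
    "/blog/post-1": [],
}

def page_levels(graph, home="/"):
    """Return {url: depth}, where the homepage is level 0."""
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in graph.get(page, []):
            if target not in depth:  # first visit = shortest click path
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

levels = page_levels(links)
```

Note this measures link depth, not menu position: anything linked from a site-wide menu (main or drop-down) is one click from every page, so those pages end up on the same level.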
Intermediate & Advanced SEO | seoanalytics
-
Site rankings dropped from ~15 to 500+ but Google says we were not penalized
2 months ago my site was ranking about 15 for my main KWs in the UK (my main market). On July 28 we dropped suddenly to ~500, or not in the top 500 at all, in the UK for the 3 main keywords. However, these same terms are still ranking in other markets, and we are doing okay in the UK for other terms targeted to the same pages. I cleaned up my links/content and sent a reconsideration request to Google. I then looked closer at my links and realized Google may be counting my affiliate links as if they were mine, so thought maybe that was the problem. I sent Google another reconsideration request and they wrote back the letter pasted below. Any ideas about what happened or how I can get my rankings back? This definitely doesn't seem like it was just a simple algorithm change. There were no major changes done to our site, and we are still ranking in other markets. Dear site owner or webmaster,
Intermediate & Advanced SEO | theLotter
We received a request from a site owner to reconsider your site for compliance with Google's Webmaster Guidelines.
We reviewed your site and found no manual actions by the webspam team that might affect your site's ranking in Google. There's no need to file a reconsideration request for your site, because any ranking issues you may be experiencing are not related to a manual action taken by the webspam team.
Of course, there may be other issues with your site that affect your site's ranking. Google's computers determine the order of our search results using a series of formulas known as algorithms. We make hundreds of changes to our search algorithms each year, and we employ more than 200 different signals when ranking pages. As our algorithms change and as the web (including your site) changes, some fluctuation in ranking can happen as we make updates to present the best results to our users.
If you've experienced a change in ranking which you suspect may be more than a simple algorithm change, there are other things you may want to investigate as possible causes, such as a major change to your site's content, content management system, or server architecture. For example, a site may not rank well if your server stops serving pages to Googlebot, or if you've changed the URLs for a large portion of your site's pages. This article has a list of other potential reasons your site may not be doing well in search.
If you're still unable to resolve your issue, please see our Webmaster Help Forum for support.
Sincerely,
Google Search Quality Team
-
Is it safe to not have a sitemap if Google is already crawling my site every 5-10 min?
I work on a large news site that is constantly being crawled by Google; Googlebot is hitting the homepage every 5-10 minutes. We are in the process of moving to a new CMS, which has left our sitemap nonfunctional. Since we are getting crawled so often, I've met resistance from an overwhelmed development team that does not see creating sitemaps as a priority. My question is, are they right? What are some reasons I can give to support my claim that creating an XML sitemap will improve crawl efficiency and indexing if new stories already appear in Google SERPs within 10-15 minutes of publication? Is there a way to quantify what the difference would be if we added a sitemap?
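One argument for the development team is that a sitemap is cheap to produce: it is just a small XML file listing URLs and last-modified dates, which gives Googlebot an explicit change signal instead of relying on re-crawling everything. A minimal sketch with Python's standard library, using placeholder URLs:

```python
import xml.etree.ElementTree as ET

# Sketch: generate a minimal XML sitemap from (url, lastmod) pairs.
# The URLs and dates below are placeholders; a real build would pull
# them from the CMS's article table.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """Return a sitemap XML string for a list of (loc, lastmod) pairs."""
    urlset = ET.Element("urlset", xmlns=NS)
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

xml_out = build_sitemap([
    ("https://example.com/story-1", "2013-08-01"),
    ("https://example.com/story-2", "2013-08-02"),
])
```

If this can be wired into the CMS's publish hook, the sitemap stays current with no ongoing effort, which undercuts the "not a priority" objection.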
Intermediate & Advanced SEO | BostonWright
-
Do I have to tell GWT the site moved to a subdirectory on another internal site?
I am moving content from one site to another and redirecting the DNS from www.oldsite.com to www.newsite.com/old-site. I have put the 301 in place, but do I also have to tell Webmaster Tools to change the old site to the new domain? We still want the old domain name to answer and redirect to www.newsite.com/old-site. Thanks
Intermediate & Advanced SEO | GeorgeLaRochelle
-
Random Google?
In 2008 we performed an experiment which showed some seemingly random behaviour by Google (indexation, caching, PageRank distribution). Today I put the results together, analysed the data we had, and got some strange results which hint at the possibility that Google purposely throws in a deviation from normal behaviour here and there. Do you think Google randomises its algorithm to prevent reverse engineering and enable chance discoveries, or is it all a big load-balancing act which produces quasi-random behaviour?
Intermediate & Advanced SEO | Dan-Petrovic