Google authorship and multiple sites with multiple authors
-
Hi guys :).
I'm asking for your help - basically, I'd like to know the best way to set all of this up.
I have two main (e-commerce) sites and a few other big web properties.
What I'd like to know is whether it's OK to link the main sites to my real G+ account and use alias G+ accounts for the other web properties, or whether that would count as a kind of spamming.
The thing is, I use one G+ account for those e-commerce sites, and I wouldn't necessarily want the other web properties linked to the same G+ account, as they aren't really related.
I do hope I was clear. Any insight would be appreciated. Thanks.
-
To clarify:
Assume the OP's parent company is "Acme Widgets" and has a Google+ Business page for "Acme Widgets". He has established rel="publisher" between this business profile and the "Acme Widgets" main company website.
Now, 6 months later, Acme Widgets opens a new website -- "The Widget Training Site" -- which is a separate site but falls under the "Acme Widgets" umbrella.
Can the OP also add "The Widget Training Site" to the "Acme Widgets" Google+ page? I've tried to do something similar, and I can only link one website to one Google+ page.
Or is there a way to add multiple sites to one Google+ business page?
Thanks for your insight!
-
Thank you, Russ, for the references, and thank you very much, Kyle - that does help a lot!
-
Hi Igor,
I think you may be mixing up rel="publisher" and rel="author" markup here.
By implementing the Google Plus rel="publisher" link on your sites, you establish your brand with Google. Use this link to point to your business's G+ page, which should be unique to each business (including each of the two e-commerce sites). (For reference: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1708844)
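As a rough sketch of what that looks like (the domain and Google+ page ID below are hypothetical placeholders, not the OP's real details), the publisher link usually sits in the `<head>` of every page on the site:

```html
<!-- In the <head> of acmewidgets.com; the long number is a
     hypothetical Google+ page ID -->
<link rel="publisher" href="https://plus.google.com/112233445566778899000" />
```

The Google+ business page should also list the site in its own profile, so the association is confirmed in both directions.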
Additionally, you can set up a rel="author" link between the sites you contribute to and own and your personal G+ account, to establish yourself as the author. (For reference: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=2539557)
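A minimal sketch of the author side, again with a placeholder profile ID: the on-page byline links to the personal Google+ profile, and the profile's "Contributor to" section links back to the site:

```html
<!-- On each article page; the number is a hypothetical
     Google+ profile ID -->
<a href="https://plus.google.com/110011001100110011001?rel=author">
  Igor on Google+
</a>
```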
Thanks,
K
-
I'm not sure personally, but you should reach out to either Mark Traphagen or AJ Kohn who are Google+ / Authorship experts.
Related Questions
-
Duplicate content - multiple sites hosted on same server with same IP address
We have three sites hosted on the same server with the same IP address. For SEO reasons (to avoid duplicate content) we need to redirect the IP address to the site - but there are three different sites. If we use rel="canonical" on the websites, those tags will be duplicated too, as the websites are mirrored versions of the sites at the IP address, e.g. www.domainname.com/product-page and 23.34.45.99/product-page. What's the best way to solve these duplicate content issues in this case? Many thanks!
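A hedged sketch of the usual answer (the domain here is the placeholder from the question): make every canonical an absolute URL pointing at the domain version. Then the tag is not "duplicated" in any harmful sense - the copy served at the IP address declares the domain copy as authoritative instead of canonicalizing to itself:

```html
<!-- Served identically at www.domainname.com/product-page and at
     23.34.45.99/product-page; because the href is absolute, both
     copies point at the single canonical (domain) version. -->
<link rel="canonical" href="http://www.domainname.com/product-page" />
```

A server-level 301 redirect from the bare IP to the hostname would be the more robust complement, but the absolute canonical alone addresses the duplicate-content signal.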
White Hat / Black Hat SEO | Jade
-
How would you optimize a new site?
Hi guys, I'm here to ask for your personal opinions. We know that the way to rank a site is to create authoritative content that interests people. But what would you do to increase the ranking of your site, or maybe of a blog post? Leaving your link in blog comments seems dangerous nowadays. Is social media the only way to go? Trying to get people to write about you? What else can be done?
White Hat / Black Hat SEO | andzon
-
The wrath of Google's Hummingbird, a big problem, but no quick solution?
One of our websites has been wrongfully tagged for a penalty and has literally disappeared from Google. After lots of research, it seems the reason was a ton of spammy backlinks and irrelevant anchor text. I have disavowed the links, but the results are still not rebounding. Any idea how long the wrath of the Google gods will last?
White Hat / Black Hat SEO | Mouneeb
-
Do legitimately earned links from unrelated sites help or hurt?
We have a few charity events coming up that have offered to link back to our homepage. While we do genuinely like the charities we are going to sponsor, I'm not sure how those links will look SEO-wise. For example, one is for the local high school basketball team and another is for a Pediatric Care Mud Run. To a human, these links make perfect sense, but to a robot, I'm not sure it differentiates these links from spam or some other negative link. Granted, I understand that a small percentage of links probably won't do anything either way, but I'd like to ignore that for the purposes of my question. All things being equal, do links such as these help or hurt? Thanks for your time and insight, Ruben
White Hat / Black Hat SEO | KempRugeLawGroup
-
Failed microsites that negatively affect main site: should I just redirect them all?
While they are great domain names, I suspect my 7 microsites are considered spammy and resulted in a filter on my main e-commerce site, blocking the important keywords from showing up in search. Should I consider them a sunk cost and redirect them all to my main e-commerce site, or is there any reason that would make things worse? I've fixed just about everything I can think of in response to Panda and Penguin, before which we were on the first page for everything. That includes adding hundreds of pages of unique and relevant content, in the form of buyers' guides and on e-commerce category pages -- resolving issues of thin content. Then I hid URL parameters in Ajax, sped up the site significantly, started generating new links... nothing... I have tons of new keywords for other categories, but I still clearly have that filter on those few important head keywords. The anchor text on the microsites leading to the main site is typically not exact-match, so I don't think that's the issue. It has to be that the sites themselves are considered spammy. My bosses are not going to like the idea because they paid for those awesome domains, but would the best idea be to redirect them to the e-commerce site?
White Hat / Black Hat SEO | ElBo913
-
Massive drop in Google traffic after upping pagecount 8-fold.
I run a book recommendation site -- Flashlight Worthy. It's a collection of original, topical book lists: "The Best Books for Healthy (Vegetarian) Babies" or "Keystone Mysteries: The Best Mystery Books Set in Pennsylvania" or "5 Books That Helped Me Discover and Love My Italian Heritage". It's been online for 4+ years. Historically, it's been made up of: a single home page, ~50 "category" pages, and ~425 "book list" pages. (Both of those numbers started out much smaller and grew over time, but have held steady for the last year or so as I've focused my time elsewhere.) On Friday, June 15 we made a pretty big change to the site -- we added a page for every author who has a book that appears on a list. This took the number of pages in our sitemap from ~500 to 4,149 overnight. If an author has more than one book on the site, the page shows every book they have on the site, such as this page: http://www.flashlightworthybooks.com/books-by/Roald-Dahl/2805 ...but the vast majority of these author pages have just one book listed, such as this page: http://www.flashlightworthybooks.com/books-by/Barbara-Kilarski/2116 Obviously we did this as an SEO play -- we figured that our content was getting ~1,000 search entries a day for such a wide variety of queries that we may as well create pages that would make natural landing pages for a broader array of queries. And it was working... 5 days after we launched the pages, they had ~100 new searches coming in from Google. (OK, it peaked at 100 and dropped down to a steady 60 or so a day within a few days, but still. And then it trailed off over the last week, dropping lower every day, as if Google realized it was repurposed content from elsewhere on our site...) Here's the problem: for the last several years the site received ~30,000 search entries a month... a little more than 1,000 a day on weekdays, a little lighter on weekends.
This ebbed and flowed a bit as Google tweaked things (Panda, for example), as we garnered fresh inbound links, as the GoodReads behemoth stole some traffic... but by and large, traffic was VERY stable. And then, on Saturday, exactly 3 weeks after we added all these pages, the bottom fell out of our search traffic. Instead of ~1,000 entries a day, we've had ~300 on Saturday and Sunday, and it looks like we'll have a similar amount today. And I know this isn't just some Analytics reporting problem, as Chartbeat is showing the same drop. As search is ~80% of my traffic, I'm VERY eager to solve this problem... So: 1. Do you think the drop is related to my upping my page count 8-fold overnight? 2. Do you think I'd climb right back into Google's good graces if I removed all the pages at once? Or just all the pages that list only one author (which would be the vast majority)? 3. Have you ever heard of a situation like this, where Google "punishes" a site for creating new pages out of existing content? Really, it's useful content -- and these pages are better "answers" for a lot of queries. When someone searches for "Nora Ephron books", it's better they land on a page of ours that pulls together the 4 books we have than on a page that happens to have just one of her books among 5 or 6 others by other authors. What else? Thanks so much, help is very appreciated. Peter
White Hat / Black Hat SEO | petestein1
Flashlight Worthy Book Recommendations
Recommending books so good, they'll keep you up past your bedtime. 😉
-
Google-backed sites' link profiles
Curious what you SEO people think of the link profiles of these (high-ranking) Google-backed UK sites: http://www.opensiteexplorer.org/domains?site=www.startupdonut.co.uk http://www.opensiteexplorer.org/domains?site=www.lawdonut.co.uk http://www.opensiteexplorer.org/domains?site=www.marketingdonut.co.uk http://www.opensiteexplorer.org/domains?site=www.itdonut.co.uk http://www.opensiteexplorer.org/domains?site=www.taxdonut.co.uk Each site has between 40k and 50k inlinks counted in OSE. However, there are relatively few linking root domains in each case: 273 for marketingdonut 216 for startupdonut 90 for lawdonut 53 for itdonut 16 for taxdonut Is there something wrong with the OSE data here? Does this imply that the average root domain linking to the taxdonut site does so with 2857 links? The sites have no significant social media stats. The sites are heavily inter-linked. Also linked from the operating business, BHP Information Solutions (tagline "Gain access to SMEs"). Is this what Google would think of as a "natural" link profile? Interestingly, they've managed to secure links on quite a few UK local authority resources pages - generally being the only commercial website on those pages.
White Hat / Black Hat SEO | seqal
-
Multiple H1 tags are OK according to developer. I have my doubts. Please advise...
Hi, My very well known and widely respected developer is using multiple H1 tags, I see - they like using them in their code, and they argue that multiple H1s conform to the HTML5 standard. They are resisting a recode to one H1 tag per page. However, I know this is clearly an issue in Bing, so I don't want to risk it with Google. Any thoughts on whether it's best to avoid multiple H1 tags for Google? Any evidence and reasoning would be great - I can then put that to my developer... Many thanks for your help, Luke
White Hat / Black Hat SEO | McTaggart
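For what it's worth, the developer's argument rests on the HTML5 document outline, which treats each sectioning element as carrying its own heading scope - a minimal illustration (headings are placeholders):

```html
<body>
  <h1>Site title</h1>
  <section>
    <h1>First topic</h1>   <!-- valid per the HTML5 outline -->
  </section>
  <section>
    <h1>Second topic</h1>  <!-- also valid HTML5, but browsers and
                                search engines never fully implemented
                                the outline algorithm, so one <h1>
                                per page remains the safer choice -->
  </section>
</body>
```

So both sides are partly right: the markup validates, but relying on the outline for how headings are interpreted is a gamble.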