Best practice for setting up an expert author contributing to multiple sites?
-
If a single author contributes to multiple sites, should each site have its own author page (tied to the same single Google+ account)? E.g. one author > one Google+ account > multiple author pages (one per site).
Or should all sites publishing his content link to a single author page/bio on one main site? E.g. one author > one Google+ account > a single author page on one site (all other sites link to this author page).
In that case, where would the 'Contributor to' link point for the additional sites he contributes to? The homepage?
Thanks!
-
Thanks for clarifying. Each site needs its own author page. I am comfortable with creating reciprocity with Google+. Thanks!
-
Federico is exactly correct. Google+ and the website/blog need to be tied together through reciprocal links. The Google+ profile needs the website's link in the "Contributor to" field in the Links section; to be more specific, this link should be the www.domainname.com/author/yourname URL. Then, on the website, the link to the Google+ profile needs the rel="author" attribute. If you're using WordPress, AuthorSure is a plugin we like for this, but there are others to try.
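For illustration, the on-page half looks roughly like this (the profile ID and author-page URL below are placeholders, not real accounts):

```
<!-- On each site's author page or post byline: link to the Google+ profile -->
<a href="https://plus.google.com/112345678901234567890" rel="author">John Smith on Google+</a>

<!-- The other half is done inside the Google+ profile itself: add each site's
     author page (e.g. http://www.domainname.com/author/yourname) to the
     "Contributor to" field under Links, one entry per site. -->
```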
Patrick McCoy
-
If an author contributes to more than one site, then on Google+ you should link to each author page on each site (within the same Google+ profile). Each site's author page should link back to his/her Google+ profile. Authorship verification needs a reciprocal link to verify "ownership".
Related Questions
-
Indexed Pages Different when I perform a "site:Google.com" site search - why?
My client has an ecommerce website with approx. 300,000 URLs (a lot of these are parameter URLs blocked from spiders via the meta robots tag). There are 9,000 "true" URLs submitted to Google Search Console, and Google says it is indexing 8,000 of them. Here's the weird part: when I do a "site:website" search in Google, it says Google is indexing 2.2 million pages on the domain, but I am unable to view past page 14 of the SERPs. It just stops showing results, and I don't even get a "the next results are duplicate results" message. What is happening? Why does Google say it is indexing 2.2 million URLs but then won't show me more than 140 of them? Thank you so much for your help. I tried looking for the answer, and I know this is the best place to ask!
Intermediate & Advanced SEO | accpar
-
Best practice for disallowing URLs with robots.txt
Hi everybody, we are currently trying to tidy up the crawl errors that appear when we crawl the site. On first viewing, we were very worried to say the least: 17,000+. But after looking closer at the report, we found that the majority of these errors were caused by bad URLs featuring:
Currency - for example: "directory/currency/switch/currency/GBP/uenc/aHR0cDovL2NlbnR1cnlzYWZldHkuY29tL3dvcmt3ZWFyP3ByaWNlPTUwLSZzdGFuZGFyZHM9NzEx/"
Color - for example: "?color=91"
Price - for example: "?price=650-700"
Order - for example: "?dir=desc&order=most_popular"
Page - for example: "?p=1&standards=704"
Login - for example: "customer/account/login/referer/aHR0cDovL2NlbnR1cnlzYWZldHkuY29tL2NhdGFsb2cvcHJvZHVjdC92aWV3L2lkLzQ1ODczLyNyZXZpZXctZm9ybQ,,/"
My question now, as a novice at working with robots.txt: what would be the best practice for disallowing URLs featuring these from being crawled? Any advice would be appreciated!
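For illustration, disallow rules along these lines would cover those patterns. The paths and parameter names below are only lifted from the example URLs above; Googlebot and Bingbot honour the * wildcard, but test in the Search Console robots.txt tester before deploying anything:

```
User-agent: *
# Faceted/query-string URLs (matches when the parameter starts the query string)
Disallow: /*?color=
Disallow: /*?price=
Disallow: /*?dir=
Disallow: /*?p=
# Currency-switch and login-referrer paths
Disallow: /*/currency/switch/
Disallow: /customer/account/login/referer/
```

Parameters that can also appear after an "&" (e.g. a price filter combined with other filters) would need extra patterns such as Disallow: /*&price=, or could be handled through the URL Parameters settings in Search Console instead.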
Intermediate & Advanced SEO | centurysafety
-
Old site penalised, we moved: shall we cut loose from the old site? It's currently 301'd to the new site.
Hi, we had a site with many bad links pointing to it (.co.uk). It was knocked from the SERPs. We tried to manually ask webmasters to remove the links, then submitted a disavow file and a reconsideration request. We moved the site to a new URL (.com) about a year ago, as the company still needed its customers to find them, and we 301 redirected the .co.uk to the .com. There are still lots of bad links pointing to the .co.uk. The questions are:
#1 Do we stop the 301 redirect from .co.uk to .com now? The .co.uk is not showing in the rankings. We could have a basic holding page on the .co.uk saying 'we have moved' (no link), or just switch it off.
#2 If we keep the .co.uk 301 to the .com, shall we upload the disavow file to the .com's Webmaster Tools or the .co.uk's Webmaster Tools? I ask this because someone else had uploaded the .co.uk's disavow list of spam links to the .com's Webmaster Tools. Is this bad?
Thanks in advance for any advice or insight!
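For reference, the disavow file itself is just a plain-text list, and it is uploaded per verified property, so the same list has to be submitted separately for the .co.uk and the .com if you want it applied to both. A minimal sketch, with placeholder domains and URLs:

```
# Disavow file - lines starting with "#" are comments
# Disavow everything from a whole domain
domain:spammy-directory.example.com
domain:link-network.example.net
# Or disavow individual URLs
http://blog.example.org/some-spammy-page.html
```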
Intermediate & Advanced SEO | SolveWebMedia
-
Merging Two Unrelated Sites into a Third Site
We have a new client interested in possibly merging 2 sites into one under the brand of a new parent company. Here's a breakdown of the scenario: BrandA.com sells a variety of B2B widget-services via their online store. BrandB.com sells a variety of B2B thing-a-majig products and services (some of them large in size) that are not sold through an online store; these are sold more consultatively via a sales team. The new parent company, BrandA-B.com, is considering combining the two sites under the new parent company domain. The widget-services and thing-a-majigs have very little similarity or purchase crossover, so interest in one doesn't make a visitor a good candidate for the other. We feel pretty confident that we can round up all the necessary pages and inbound links to do a proper transition to a new, separate third domain, though we're not in agreement that this is the best course of action. Currently the individual brand sites are fairly well known in their industry and each ranks fairly well for a variety of important terms, though there is room for improvement; each site has good links, with the exception of the new site, which has considerably fewer.
BrandA.com: DA 73, 19 years old
BrandB.com: DA 55, 18 years old
BrandA-B.com: DA 40, 1 year old
Our SEO team members have opinions on what the potential outcome(s) of this would be, but we are wondering what the community here thinks. Will combining the sites cause a dilution of the topics of the two sites and hurt rankings? Will combining the domain authority help one part of the business but hurt the other? What do you think? What would you do?
Intermediate & Advanced SEO | OPM
-
Guest Blog post best practice considering time/energy
Good morning Moz community 🙂 What do you guys think would be the best practice for a starting blogger offering guest articles to third-party blogs, when it comes to building up my own website's SEO (assuming I have a link in the guest article to my website)?
1. If I have the opportunity to post the guest article on two or more different blogs, should I go for it? -OR-
2. Only post the article on one specific blog and write a different one for the others?
In a world with unlimited resources, the latter option would prevail, but considering that it takes time to write, what would you recommend if I am trying to build my website's rankings in the SERPs? Carlos
Intermediate & Advanced SEO | 90miLLA
-
What's the best internal linking strategy for articles and on-site resources?
We recently added an education center to our site with articles and information about our products and industry. What is the best way to link to and from that content? There are two options I'm considering:
1. Link to articles from category and subcategory pages under a section called "related articles" and link back to these category and subcategory pages from the articles:
category page <<--------->> education center article
education center article <<---------->> subcategory page
2. Only link from the articles to the category and subcategory pages:
education center article ---------->> category page
education center article ---------->> subcategory page
Would #1 dilute the SEO value of the category and subcategory pages? I want to offer shoppers links to more information if they need it, but this may also take them away from the products. Has anyone tested this? Thanks!
Intermediate & Advanced SEO | pbhatt
-
Best Format for URLs on a Large Ecommerce Site?
I saw this article, http://www.distilled.net/blog/seo/common-ecommerce-technical-seo-problems/, and noticed that Geoff mentioned product URLs should be formatted in one of the following ways:
Product page: site.com/product-name
Product page: site.com/category/sub-category/product-name
However, for SEO, is there a preferred way? I understand that the top one may be better for preventing duplicate page issues, but I would imagine that the bottom would be better for conversion (maybe the user backtracks to site.com/category/sub-category/ to see other products he may be interested in). Also, I'd imagine that the top URL would not be a great way to distribute link juice, since everything would be attached to the root, right?
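For what it's worth, if the same product ends up reachable under both patterns, a rel=canonical pointing at whichever version you pick is the usual way to consolidate the duplicate. A minimal sketch, with a placeholder URL:

```
<!-- In the <head> of every variant URL for the same product (placeholder URL) -->
<link rel="canonical" href="http://site.com/category/sub-category/product-name" />
```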
Intermediate & Advanced SEO | eTundra
-
Rebuilding a site with pre-existing high authority subdomains
I'm rebuilding a real estate website with 4 subdomains that have Page Authorities between 45 and 50. Since it's a real estate website, it has 20,000+ pages of unique (listing) content per subdomain. The subdomains are structured like washington.xyzrealty.com and california.xyzrealty.com. The root domain has a ~50 Page Authority. The site is about 7 years old. My preference is to focus all of my efforts on the primary domain going forward, but I don't want to waste the power of the subdomains. I'm considering:
1. Putting blogs or community/city pages on the subdomains
2. 301 redirecting all of the existing pages to matching pages on the new root domain
3. Any other ideas??
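If option 2 is the route taken, the bulk of the work is usually a host-level pattern redirect rather than 20,000 individual rules. A rough Apache sketch, assuming (purely for illustration) that the new root-domain URLs keep matching paths under a /washington/ folder:

```
# .htaccess served for washington.xyzrealty.com (requires mod_rewrite)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^washington\.xyzrealty\.com$ [NC]
# 301 every existing listing URL to the matching path on the root domain
RewriteRule ^(.*)$ https://www.xyzrealty.com/washington/$1 [R=301,L]
```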
Intermediate & Advanced SEO | jonathanwashburn