Google rel hell
-
So apologies in advance for this question, but:
Can someone explain whether as a site we should be using the "rel author" tag or the "rel publisher" tag?
1. We don't really need to distinguish between the people who write our content.
2. We definitely do need to establish ownership of our content, as unfortunately it has been widely copied. We are spending quite a bit of time filing DMCA notices.
3. Do we need to apply either tag to every page? Or does "rel publisher" just need to be applied to the homepage to cover the rest of the site?
4. What looks better in the search results: a person's face or a company logo?
I'd prefer a face, but I understand we need to promote our brand.
Thanks
P
-
Hi Guys
Thanks, very helpful.
We are an ecommerce site and an information site. So I guess we will go with publisher on the homepage and author for content pages.
Though we have tried author on product pages, and Google displays the author photo in the results. Not sure what effect that has, but I like to see a face related to the product. (We sell information products - legal documents.) It also helps our product stand out in the results.
Thanks again.
Patrick
-
There seems to be some confusion here.
First of all, rel="publisher" is a way to connect your site to your page on Google+. This can help your page be recognized as the "official" page for your brand and make it eligible for Google Direct Connect. The rel="publisher" tag goes ONLY on your homepage. rel="publisher" will NOT get your company logo shown in the search results (at least not yet).
rel="author" is a way to link the content on your site to specific authors on Google+. This has the benefit of building up the author rank for those authors, as well as displaying author snippets in the search results. This only works with photos of people, not cartoons or logos. The author tag should go on every page where you have unique content.
This link shows you the different ways you can verify ownership:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1408986
Hope that helps!
-
1. What type of site are you promoting? An eCommerce site? A brand? A consultant? An informational site?
For me, the ultimate decision would come down to what I want the "face of the company" or website to be.
For example, I may have two or three websites and be trying to build "me" as a consultant or an expert in a particular subject matter. In that case I would definitely use "author." On the other hand, if I own an eCommerce site where I want to build a brand, the people who write the content may come and go, and I don't want the article ownership (in Google's eyes) to leave with that person. I would use "publisher" in that case.
I have seen sites that successfully mix it up too. They would use "publisher" on the catalog pages and use "author" on blog type pages.
3. I would apply this to every page with unique content.
4. To me, same as question 1. If I were a consumer, I would want to see the website brand for products or catalog data but for opinions, industry expertise, consulting, etc., I would want to see a face with the content.
My 2 cents
Related Questions
-
The particular page cannot be indexed by Google
Hello, Smart People!
On-Page Optimization | Viktoriia1805
We need help solving the problem with Google indexing.
All other pages of our website are crawled and indexed. This page also meets Google's requirements and can be indexed; however, it is still not indexed.
Robots.txt is not blocking it.
The page does not have a "nofollow" tag.
We have it in the sitemap file.
We have internal links for this page from indexed pages.
We requested indexing many times, and it is still grey.
The page was established one year ago.
We are open to any suggestions or guidance you may have. What else can we do to expedite the indexing process?
-
Tracking down rel="canonical" on Wordpress site
A rel="canonical" is being added to every page and post on my Wordpress site - not tag results, not category results. Not a major problem, right? Except that I don't know where it's coming from. I've tried tracking it down - change the theme back to a default one, turn off all the plugins - it's still there. Is it coming from .htaccess perhaps? The only issue it is causing is that it has causes me to have to turn off the canonical option in Platinum SEO as that was resulting in two identical rel=canon on each page. It doesn't seem to be causing problem but I'd like to get a better understanding of what it going on.
On-Page Optimization | robandsarahgillespie
-
Google changing my Title
I noticed today that Google is showing a different title in searches for the homepage. It is showing a title which I believe was active as of late last year or early January. The new title is in the title tag, in Google's cache, and in Moz crawls as well. Not sure why it would still be showing a different title in the search results; I wanted to see if anyone can explain the reason. Thanks
On-Page Optimization | campaigneast
-
Is there a limit to the number of duplicate pages pointing to a rel="canonical" primary?
We have a situation on twiends where a number of our 'dead' user pages have generated links for us over the years. Our options are to 404 them, 301 them to the home page, or just serve back the home page with a canonical tag. We've been 404'ing them for years, but I understand that we lose all the link juice from doing this. Correct me if I'm wrong. Our next plan would be to 301 them to the home page. That's probably the best solution, but our concern is that if a user page is only temporarily down (under review, etc.) it could be permanently removed from the index, or at least cached for a very long time. A final plan is to just serve back the home page on the old URL, with a canonical tag pointing to the home page URL. This is quick, retains most of the link juice, and allows the URL to become active again in future. The problem is that there could be hundreds of thousands of these. Q1) Is it a problem to have hundreds of thousands of URLs pointing to a primary with a rel=canonical tag? (A problem for Google?) Q2) How long does it take a canonical duplicate page to become unique in the index again if the tag is removed? Will Google recrawl it and add it back into the index? Do we need to use WMT to speed this process up? Thanks
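For illustration, the canonical approach would look roughly like this in the <head> of a retired user page (the URLs here are just examples):

<!-- Served at a dead user URL, e.g. https://twiends.com/some-old-user -->
<link rel="canonical" href="https://twiends.com/"/>

Bear in mind that rel="canonical" is a hint rather than a directive, so across hundreds of thousands of near-duplicate pages Google may ignore it on some of them; the 301 is the stronger signal if you're sure a page won't come back.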
On-Page Optimization | dsumter
-
Google PageRank has no rankings as of now - what to do?
My domain and page authority are working well right now, but my Google PageRank is not showing any ranking as we speak. What should I do now? Can some of you give me advice on this? Thank you very much in advance.
On-Page Optimization | Panoramictrip
-
Make Google reindex the website after on-page optimization
Hello Moz Community, I just finished the on-page optimization for a new project, and I would like to know how I can make Google reindex the new link structure, titles, and meta tags. Thank you!
On-Page Optimization | CosminC
-
Does Google respect User-agent rules in robots.txt?
We want to use an inline linking tool (LinkSmart) to cross-link between a few key content types on our online news site. LinkSmart uses a bot to establish the linking. The issue: there are millions of pages on our site that we don't want LinkSmart to spider and process for cross-linking. LinkSmart suggested setting a noindex tag on the pages we don't want them to process, and that we target the rule to their specific user agent. I have concerns. We don't want to inadvertently block search engine access to those millions of pages. I've seen Googlebot ignore nofollow rules set at the page level. Does it ever arbitrarily obey rules that it's been directed to ignore? Can you quantify the level of risk in setting user-agent-specific nofollow tags on pages we want search engines to crawl, but that we want LinkSmart to ignore?
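For what it's worth, robots meta tags are scoped by the name attribute, so a rule that names only one crawler does not apply to any other crawler. A rough sketch - the "linksmart" token below is an assumption; use whatever user-agent name LinkSmart actually documents:

<!-- Applies only to the named crawler; Googlebot and other search engines ignore it -->
<meta name="linksmart" content="noindex, nofollow"/>
<!-- Search engines only act on "robots" or their own recognized names, e.g. "googlebot" -->
<meta name="googlebot" content="index, follow"/>

Whether this actually works depends entirely on whether LinkSmart's bot honors meta directives addressed to its own name, so it's worth confirming that with them before relying on it.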
On-Page Optimization | lzhao
-
Should I let Google index tags?
Should I let Google index tags? What are the positives and negatives? Right now Google indexes every page, including tags... it looks like I am risking duplicate content errors. If that's true, should I just block /tag in robots.txt? Also, is it better to have as many pages as possible indexed by Google, or should it be as few as possible and as specific to the content as possible? Cheers
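For illustration, the usual alternative to a robots.txt block is a meta robots tag on the tag archive template, since a robots.txt block stops Google from crawling those pages at all (and from following the links on them). A rough sketch - where exactly you add it depends on your platform or SEO plugin:

<!-- In the <head> of tag archive pages only -->
<meta name="robots" content="noindex, follow"/>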
On-Page Optimization | DiamondJewelryEmpire