Authorship photo not showing. All checks done, but the photo still isn't appearing
-
Can someone suggest why the authorship photo is not showing? I have asked this earlier too, but did not get much response.
Site URLs
http://www.mycarhelpline.com/index.php?option=com_easyblog&view=entry&id=93&Itemid=91
http://www.mycarhelpline.com/index.php?option=com_latestnews&view=detail&n_id=479&Itemid=10
Google +
https://plus.google.com/109551624336693902828/posts
Checks completed:
-
?rel=author appended to the Google+ profile URL on the site - yes
-
Profile discovery option on in Google+ - yes
-
Contributor link in Google+ - yes
-
Email validation done - yes
-
Photo fits the required dimensions - yes
-
Rich snippet showing authorship established with photo - yes
Still, the photo has not appeared for the last six months now. Any suggestions, please?
Even when searching the name 'Gagan Modi', the photo does show in the Google search result for the Google+ profile, but the author-photo rich snippet does not show for the site.
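For reference, a minimal sketch of the on-page markup those checks assume (the Google+ profile URL is taken from this thread; the surrounding page structure is hypothetical):

```html
<!-- On each article page: one link to the author's Google+ profile.
     The ?rel=author query parameter on the URL and a rel="author"
     attribute are two ways to signal the same thing; one consistent
     method per page is enough. -->
<a href="https://plus.google.com/109551624336693902828?rel=author">Gagan Modi</a>

<!-- For verification to complete, the Google+ profile must also list
     the site under its "Contributor to" section (the reverse link). -->
```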
-
-
Great, thanks!
-
Ah, sure. Thanks!
It seems I did not receive a notification for your reply in that thread, so I could not respond there. Replying in that thread now!
-
Hi Brian,
Sure, thanks. However, we tried this earlier and will do it again. In fact, there are other sections, such as News & Reviews, that have only a single link to the Google+ profile, but unfortunately the author photo still does not show.
-
Hi Gagan - Did you see the response I left you on your original question at http://moz.com/community/q/authorship-photo-not-showing-in-for-last-6-months-now? I included a couple of screencaps of Authorship showing for select searches.
-
One thing I noticed is that you're using the rel="author" tag twice on the blogs you provided: once to link to your internal profile and once to your Google+ profile. I would try removing the tag from the internal link, leaving it only on the Google+ link, and see if that makes a difference.
-
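A sketch of what that change might look like (the internal-profile URL here is hypothetical):

```html
<!-- Before: rel="author" appears twice on the page -->
<a rel="author" href="/author/gagan-modi">Gagan Modi</a>
<a rel="author" href="https://plus.google.com/109551624336693902828">Google+</a>

<!-- After: keep rel="author" only on the Google+ link -->
<a href="/author/gagan-modi">Gagan Modi</a>
<a rel="author" href="https://plus.google.com/109551624336693902828">Google+</a>
```
-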
Hi Angelos,
Thanks for writing in.
Unfortunately, it still has not worked. The photo is still not appearing.
Clueless now!
-
Hey Gagan,
Just wanted to check whether everything worked out for you.
-
OK, I changed the photo as advised.
Keeping my fingers crossed; I'll let you know.
-
Yes, it has been showing in rich snippets for over three months.
The site gets crawled daily.
-
Hey Gagan,
OK, try using a photo with a brighter background. I guarantee it will work. Give it a day or two, too.
Don't try any tools or anything; it's that simple and 100% sure.
-
Hi Gagan,
Try using the Rich Snippets testing tool at http://www.google.com/webmasters/tools/richsnippets and let me know if you can see the photo there when you enter the sample URL. If not, there is an issue with your code setup, as the tool pulls fresh markup on each request.
If however you do see it there then it's possible that it's an issue on either your local cache or how your site is caching on Google. Are you sure the latest edits are live and crawlable?