Authorship photo not showing - all checks done, but the photo still isn't appearing
-
Can someone suggest why my authorship photo is not showing? I have asked about this earlier too but did not get much of a response.
Site URLs:
http://www.mycarhelpline.com/index.php?option=com_easyblog&view=entry&id=93&Itemid=91
http://www.mycarhelpline.com/index.php?option=com_latestnews&view=detail&n_id=479&Itemid=10
Google+ profile:
https://plus.google.com/109551624336693902828/posts
Checks I have done:
-
?rel=author appended at the end of the profile URL on the site - yes
-
Profile discovery option turned on in Google+ - yes
-
Contributor link in Google+ - yes
-
Email validation done - yes
-
Photo correctly sized - yes
-
Rich Snippets tool shows authorship established, with photo - yes
Still, the photo has not appeared for the last 6 months. Any suggestions, please?
Even when searching the name 'Gagan Modi', the photo does show in the Google search result for the Google+ profile, but the author-photo rich snippet does not show for the site.
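For reference, this is roughly what the authorship markup on an article page should look like - a sketch only, using the Google+ profile ID posted in this thread; either form on its own should be enough:

```html
<!-- Sketch: rel="author" as a <link> element in the page <head>.
     The profile ID is the one from this thread. -->
<link rel="author" href="https://plus.google.com/109551624336693902828"/>

<!-- Alternatively, a visible byline link in the article body,
     using the ?rel=author query-parameter form: -->
<a href="https://plus.google.com/109551624336693902828?rel=author">Gagan Modi</a>
```

Only one of the two forms is needed per page; having both (or several copies) can confuse Google's authorship detection.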
-
-
Great, thanks!
-
Aha, sure. Thanks!
It seems I did not receive a notification for your reply in that thread, hence I could not respond there. Responding in that thread now!
-
Hi Brian,
Sure, thanks. We tried this earlier, but will do it again. In fact, there are other sections, like News & Reviews, which have only a single link to the Google+ profile - but unfortunately the author photo still does not show there either.
-
Hi Gagan - Did you see the response I left on your original question at http://moz.com/community/q/authorship-photo-not-showing-in-for-last-6-months-now? I included a couple of screencaps of Authorship showing for select searches.
-
One thing I noticed is that you're using the rel="author" tag twice on the blog pages you provided: once linking to your internal profile page and once to your Google+ profile. I would try removing the tag from the internal link, leaving it only on the Google+ link, and see if that makes a difference.
-
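As a sketch of what that change might look like (the internal author URL below is a hypothetical placeholder; only the Google+ profile ID comes from this thread):

```html
<!-- Before: two rel="author" links on the same page, which is ambiguous -->
<a href="/author/gagan-modi" rel="author">Gagan Modi</a>
<a href="https://plus.google.com/109551624336693902828" rel="author">Google+</a>

<!-- After: rel="author" kept only on the Google+ profile link -->
<a href="/author/gagan-modi">Gagan Modi</a>
<a href="https://plus.google.com/109551624336693902828" rel="author">Google+</a>
```
-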
Hi Angelos,
Thanks for writing in,
Unfortunately, it still has not worked. The photo is still not showing.
Clueless now!
-
Hey Gagan,
Just wanted to check whether everything worked out for you.
-
OK, I changed the photo as advised.
Keeping my fingers crossed - will let you know.
-
Yes, it's been showing in the Rich Snippets tool, and for over 3 months now.
The site gets crawled daily.
-
Hey Gagan,
OK, try using a photo with a brighter background - I guarantee it will work. Give it a day or two as well.
Don't try any tools or anything; it's that simple and 100% sure.
-
Hi Gagan,
Try using the Rich Snippets testing tool at http://www.google.com/webmasters/tools/richsnippets and let me know if you can see the photo there when you enter the sample URL. If not, there is an issue with your code setup, as the tool pulls a fresh copy of the page on each request.
If, however, you do see it there, then it is possibly an issue with either your local cache or with how Google has cached your site. Are you sure the latest edits are live and crawlable?