Google+ Verification - Site Speed Optimization
-
So the Google+ badge verifies our site for Google Direct Connect. However, the JavaScript code for the badge itself adds 3 to 4 seconds to the page load time, which is a big deal.
Any ideas for a workaround?
-
Hi! You can find info about adding it to your page asynchronously here: https://developers.google.com/+/plugins/badge/.
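For reference, the approach that page documents amounts to creating the script element with `async` set, so the badge script downloads in the background instead of blocking rendering. A rough sketch follows; the `loadBadgeAsync` wrapper and its `doc` parameter are illustrative additions, not part of Google's own snippet:

```javascript
// Sketch: inject Google's badge script asynchronously so the download
// no longer blocks page rendering. In a browser, call loadBadgeAsync(document);
// the doc parameter also makes the function testable with a stub.
function loadBadgeAsync(doc) {
  var po = doc.createElement('script');
  po.type = 'text/javascript';
  po.async = true; // the key line: fetch in the background
  po.src = 'https://apis.google.com/js/plusone.js';
  var s = doc.getElementsByTagName('script')[0];
  s.parentNode.insertBefore(po, s); // place before the first existing script
  return po;
}
```

Called near the end of the body (or wrapped in an IIFE, as in Google's docs), this lets the rest of the page paint while the badge loads.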
-
Hey Slava, more specifically, I'm looking for details on how to load the Google+ badge asynchronously. I'm not highly technical, so hopefully that made sense.
-
You have to love Google
-
Try serving it from a CDN, if you can. This should speed up your website.
Related Questions
-
Technical Site Migration
Hello All- I have been in the SEO industry for about 4 years and feel fairly confident in my technical SEO; however, I am being asked to conduct a migration that is both a platform change (step 1) and then a consolidation (think of combining websites when a company is acquired), all within a 4-5 month time span. As I begin to create this multi-step checklist, I would love to hear from others who have gone through something similar, or about resources you could point me to, from more of a process/procedure standpoint. Again, I feel confident in the technical output, but I have a fairly junior team and want to execute this smoothly. Feel free to add recommendations or ask for clarification if my discussion question doesn't make sense.
Intermediate & Advanced SEO | Instructure
-
Site-wide links with optimized anchor words?
Hi Moz community, I work at a web design company. I found that my competitors have a lot of site-wide backlinks from their clients with the optimized anchor text "affordable web design by XXX". Some of the clients' websites are not even relevant to web design or the design industry, and I am sure those are dofollow links. Although I have often heard that site-wide backlinks look unnatural and spammy, why are the top-ranking companies still acquiring backlinks this way? Does Google actually say no to this? Thanks for any help and explanation. Best, Raymond
Intermediate & Advanced SEO | Raymondlee
-
Can Google penalize your site without sending you a Manual Spam Action?
I had a massive drop in traffic in mid-2013, and a slow reduction since then. It has sort of leveled off now, but it's not exactly climbing. I've never received a manual spam action. The answer to my question seems pretty obvious now that I write it out... but have you heard of anyone getting penalized without specifically receiving a warning? Thanks!
Intermediate & Advanced SEO | DavidC.
-
Why is my site not getting crawled by Google?
Hi Moz Community, I have an escort directory website that is built with AJAX. We basically followed all the recommendations, like implementing the escaped-fragment code so Google would be able to see the content. The problem is that whenever I submit my sitemap in Google Webmaster Tools, it always shows 700 pages submitted but only 12 static pages indexed. I did a site: query and only a handful of pages were indexed. Does it have anything to do with my site being on HTTPS and not HTTP? My site is under HTTPS and all my content is AJAX-based. Thanks
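For context on the escaped-fragment setup mentioned in the question above: under Google's AJAX crawling scheme (since deprecated), a `#!` URL is requested by the crawler as a `?_escaped_fragment_=` URL, and the server is expected to return a pre-rendered HTML snapshot there. A minimal sketch of that URL mapping, assuming the scheme as documented; the function name is a hypothetical illustration:

```javascript
// Sketch of the _escaped_fragment_ convention: a URL like /#!page=2 is
// requested by the crawler as /?_escaped_fragment_=page%3D2, and the server
// should respond with a pre-rendered HTML snapshot for that state.
function snapshotUrlFor(ajaxUrl) {
  var i = ajaxUrl.indexOf('#!');
  if (i === -1) return null; // not a crawlable AJAX URL under this scheme
  var fragment = ajaxUrl.slice(i + 2);
  var base = ajaxUrl.slice(0, i);
  var sep = base.indexOf('?') === -1 ? '?' : '&';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}
```

If the server does not answer those `?_escaped_fragment_=` requests with real HTML, the crawler sees empty pages, which matches the symptom of only a few static pages being indexed.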
Intermediate & Advanced SEO | en-gageinc
-
Google Penalty or Not?
One of the sites I work with got this message: "http://www.mysite: Unnatural inbound links - June 27, 2013 - Google has detected a pattern of artificial or unnatural links pointing to your site. Buying links or participating in link schemes in order to manipulate PageRank are violations of Google's Webmaster Guidelines. As a result, Google has applied a manual spam action to mysite.com/. There may be other actions on your site or parts of your site." But when I go to Manual Actions, it says: "No manual webspam actions found." So which is it??? I have been doing link removal, but now I am confused about whether I need to file a reconsideration request or not.
Intermediate & Advanced SEO | netviper
-
After receiving a "Googlebot can't access your site" message, would this stop your site from being crawled?
Hi Everyone,
A few weeks ago I received a "Googlebot can't access your site..... connection failure rate is 7.8%" message from Webmaster Tools. I have since fixed the majority of these issues, but I've noticed that all pages except the main home page now have a PageRank of N/A, while the home page still has a PageRank of 5. Have these connectivity issues reduced the PageRanks to N/A, or is it something else I'm missing? Thanks in advance.
Intermediate & Advanced SEO | AMA-DataSet
-
Googlebot vs. Google mobile bot
Hi everyone 🙂 I seriously hope you can come up with an idea for a solution to the problem below, because I am kind of stuck 😕 Situation: A client of mine has a webshop on a hosted server. The shop is built in a closed CMS, meaning that I have very limited options for changing the code: limited access to the page head, and within the CMS I can only use JavaScript and HTML. The only place I have access to a server-side language is in the root, where a Default.asp file redirects the visitor to the specific folder where the webshop is located. The webshop has two "languages"/store views: one for normal browsers and Googlebot, and one for mobile browsers and Googlebot-Mobile. In Default.asp (classic ASP) I test the user agent and redirect the user to either the main domain or the mobile sub-domain. All good, right? Unfortunately not. Now we arrive at the core of the problem. Since the mobile shop was added at a later date, Google already had most of the pages from the shop in its index, and apparently uses them as entrance pages to crawl the site with the mobile bot; hence it never sees Default.asp (or outright ignores it). This causes, as you might have guessed, a huge pile of duplicate content. Normally you would just place some user-agent detection in the page head and either throw Google a 301 or a rel=canonical, but since I only have access to JavaScript and HTML in the page head, this cannot be done. I'm running out of options quickly, so if anyone has an idea as to how on earth I get Google to index the right domains for the right devices, please feel free to comment 🙂 Any and all ideas are more than welcome.
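The user-agent split described in the question above can be sketched as a simple mapping from the UA string to the device-specific domain. The domains and the regex below are hypothetical placeholders, not the poster's actual setup:

```javascript
// Sketch: classify a user-agent string and pick the domain that should serve
// that device class. The domain names here are illustrative placeholders.
function canonicalDomainFor(userAgent) {
  var mobile = /Mobi|Android|iPhone|iPad|Googlebot-Mobile/i.test(userAgent);
  return mobile ? 'https://m.example-shop.com/' : 'https://www.example-shop.com/';
}
```

On the client, this kind of check could drive a script-inserted rel=canonical link, though whether Google honors a JavaScript-injected canonical was exactly the poster's doubt, so treat this as a sketch of the classification step only.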
Intermediate & Advanced SEO | ReneReinholdt
-
Google omitting some entries
Hi, I used this tool to test some domains: http://www.virante.com/seo-tools/duplicate-content. I have no questions about the other checks, only about the similarity check. My question is: how do I get Google not to omit some entries that are very similar to the top 1000 pages on my site? Will appreciate your answers, thanks. Suleman
Intermediate & Advanced SEO | esuleman