Fetch data for users with AJAX but show it without AJAX for Google
-
Hi,
We have a thematic footer that shows links to similar pages, relevant to the search criteria used on a page.
We want to fetch those similar-page footer links through AJAX when users search on the site, but serve the links without AJAX when Google fetches those pages. We want to do this to improve our page load time.
The link content and count will be exactly the same in both cases, whether Google or a user fetches the search pages. Will this be treated as negative by Google? Can it have any negative effect on our rankings or traffic?
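To make the setup concrete, here is a rough sketch of what we have in mind (the data source and function names are hypothetical): the same link data is rendered server-side into the footer HTML and also exposed as JSON for the AJAX path, so both audiences see identical links.

```python
# Sketch: one source of truth for the footer links, rendered two ways.
# get_similar_links() is a stand-in for our real search-index lookup.
import json
from html import escape

def get_similar_links(query: str) -> list[dict]:
    # Hypothetical stand-in for the real lookup against our search index.
    return [
        {"url": f"/search?q={query}-hotels", "text": f"{query} hotels"},
        {"url": f"/search?q={query}-deals", "text": f"{query} deals"},
    ]

def render_footer_html(query: str) -> str:
    # Server-side render: what Googlebot (and any first page load) receives.
    items = "".join(
        f'<li><a href="{escape(l["url"])}">{escape(l["text"])}</a></li>'
        for l in get_similar_links(query)
    )
    return f'<ul class="thematic-footer">{items}</ul>'

def footer_json(query: str) -> str:
    # AJAX endpoint payload: the same links, fetched after page load.
    return json.dumps(get_similar_links(query))
```

Because both paths read from the same `get_similar_links`, content and count stay identical by construction, which is the property the question hinges on.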
Regards,
-
I'm with Alan on the server side: 1 second is not really good for just a request for some links.
-
One second is a lot for a few links, or even a lot of links; maybe your server technology has problems.
But you still have the load-time problem no matter whom you are downloading the links for, the search engine or the user: they still have to be downloaded.
-
Hi Martijn,
Thanks for the quick reply.
This will cut around 1 second from the page load time.
-
If you are going to load the data for Google on page load, then you will still have that load time, so loading the links again using AJAX doesn't solve anything.
-
Cloaking is very dangerous, and the most common reason for Google to swing its axe.
If your code has anything resembling "if googlebot then ...", you are at risk.
But in this case you do have a solution that, in theory, should have no negative effect: Google has endorsed the technique of serving static content on first load and updating it with AJAX.
Let me stress what that means: serve static content on first load, then update it with AJAX. In other words, no cloaking; don't serve different content (or different code with the same-looking content) to visitors than to Googlebot.
Additionally, it is very important to please visitors by serving content to them fast, but it's just as important to serve content fast to Googlebot, since speed is a ranking factor.
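To put the "no cloaking" point in code, here is a hedged sketch of a request handler (the handler and its data are hypothetical) that deliberately ignores the user agent, so Googlebot and visitors get the same first-load HTML:

```python
# Sketch: the search page is rendered the same way for every client.
# Note what is ABSENT: no "if 'googlebot' in user_agent" branch anywhere.
def handle_search_page(query: str, user_agent: str) -> str:
    # user_agent is accepted (e.g. for logging) but never used to decide
    # what content to serve -- branching on it is what cloaking means.
    links = [f'<a href="/search?q={query}-{i}">related {i}</a>' for i in range(3)]
    footer = "<footer>" + " ".join(links) + "</footer>"
    return f"<html><body><main>results for {query}</main>{footer}</body></html>"
```

The static footer is always in the first response; client-side JavaScript is free to refresh it afterwards, but no client is ever served different content based on who it claims to be.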
-
How many seconds would the impact be for Google if you still loaded it via AJAX? It feels a bit like you're trying to fix something that ain't broken.
Related Questions
-
Can I safely assume that links between subsites on a subdirectory-based multisite will be treated as internal links within a single site by Google?
I am building a multisite network based on subdirectories (of the mainsite.com/site1 kind) where the main site is like a company site, and subsites are focused on brands or projects of that company. There will be links back and forth between the main site and the subsites, as if the subsites were just categories or pages within the main site (they are hosted in subfolders of the main domain, after all). Now, Google's John Mueller has said: "As far as their URL structure is concerned, subdirectories are no different from pages and subpages on your main site. Google will do its best to identify where sites are separate, but as the URL structure is the same as for a single site, you should assume that for SEO purposes the network will be treated as one site." This sounds fine to me, except for the part "Google will do its best to identify where sites are separate", because then, if Google establishes that my multisite structure is actually a collection of different sites, links between the subsites and the main site would be considered backlinks between my own sites, which could therefore be considered a link wheel, that is, a kind of linking structure Google doesn't like. How can I make sure that Google understands my multisite as a single site? P.S. - The reason I chose this multisite structure, instead of hosting brands in categories of the main site, is that the subdirectory-based multisite feature lets me map a TLD domain to any of my brands (subsites) whenever I choose to give that brand a more distinct profile, as if it really were a different website.
Web Design | | PabloCulebras0 -
Multiple sites using same text - how to avoid Google duplicate content penalty?
Hi Mozers, my client located in Colorado is opening a similar (but not identical) clinic in California. Will Google penalize the new California site if we use text from our website that features his Colorado office? He runs the clinic in CO and will be a partner of the clinic in CA, so the CA clinic has his "permission" to use his original text. Eventually he hopes to go national, with multiple sites utilizing essentially the same text. Will Google penalize the new CA site for plagiarism and/or duplicate content? Or is there a way to tell Google, "hey Google, this new clinic is not ripping off my text"?
Web Design | | CalamityJane770 -
Is there an issue if we show our old mobile site to Google & new site to users
Hi, We have an existing mobile site that contains interlinking in the footer and content, and a new mobile site that does not have that interlinking. We will show the existing mobile site to the Google crawler and the new mobile site to users. Will this be taken as black hat by Google? The mobile site and desktop site will have the same URL across devices and browsers. Regards
Web Design | | vivekrathore0 -
Has anyone added Structured Data Markup Server Side?
I want to add some structured data to our company's website via microdata through schema.org. I have been asked to gather all of the requirements so that it can be done server side and automated when things change. I honestly don't know where to begin, as there are many areas where it can be added. Has anyone done this server side before?
Web Design | | Sika220 -
Does Google penalize duplicate website design?
Hello, We are very close to launching five new websites, all in the same business sector. Because we would like to keep our brand intact, we are looking to use the same design on all five websites. My question is, will Google penalize the sites if they have the same design? Thank you! Best regards,
Web Design | | Tiberiu0 -
How to verify http://bizdetox.com for Google Webmaster Tools
Hey guys, I tried to make a Preferred Domain choice in Webmaster Tools, but it is not letting me save my choice because it's asking me to verify that I own http://bizdetox.com. How do I go about doing that, and what are the steps? I have already verified www.bizdetox.com.
Web Design | | BizDetox0 -
Google Penalizing Websites that Have Contact Forms at Top of Website Page?
Has anyone else heard of Google penalizing websites for having their contact forms located at the top of the website? For example http://www.austintenantadvisors.com/ Look forward to hearing other thoughts on this.
Web Design | | webestate1 -
The primary search keywords for our news release network have dropped like a rock in Google... we are not sure why.
Hi, On April 11th, a month after the Farmer update was released for U.S. users of Google, the primary keywords for ALL our sites dropped significantly in Google. I have some ideas why, but I wanted to get some second opinions too. First off, I researched whether Google did anything on April 11th... they did: they rolled the Farmer update out internationally. But that doesn't explain why our ranks did not drop in March for U.S. Google users... unless they rolled out the update based on where the domain is registered... in our case, Canada. The primary news release site is www.hotelnewsresource.com, but we have many running on the same server, e.g. www.restaurantnewsresource.com, www.travelindustrywire.com and many more. We were number 1, or had top ranks, for terms like "Hotel News", "Hotel Industry", "Hotel Financing", "Hotel Jobs", "Hotels for Sale", etc., and now, for most of these, we have dropped in a big way. It seems that Google has issued a penalty for every internal page we link to. A couple of obvious issues with the current template we use: too many links, and we intend to change that ASAP, but it has never been a problem before. The domain hotelnewsresource.com is 10 years old and still holds a PageRank of 6. Secondly, the way our news system works, it's possible to access an article from any domain in the network. E.g., I can read an article that was assigned to www.hotelnewsresource.com on www.restaurantnewsresource.com... we don't post links to the irrelevant domain, but it does sometimes get indexed. So, we are going to implement the Google source meta tag option. The bottom line is that we put too much faith in the maturity of the domain... thinking that might protect us... not the case, and it's now a big mess. Any insight you can offer would be greatly appreciated. Do you think it was Farmer, or possibly something else? Thanks, Jarrett
Web Design | | jarrett.mackay0