Idle Connection Timeout for Server Load Balancer
-
We are using Amazon Web Services for www.mastersindia.co. Please help me understand what the idle timeout is for the server load balancer on AWS.
-
Is "timeout" a response code you are getting when querying your own website in some way? Usually it means you are crawling a site too fast and it is refusing to respond (or it can't respond in time because it has too many requests).
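For background on the question above: on AWS, the load balancer's idle connection timeout defaults to 60 seconds and is configurable. A minimal sketch using boto3 (the AWS SDK for Python), assuming an Application Load Balancer; the ARN and timeout value below are placeholders, not values from the question:

```python
def idle_timeout_attributes(seconds):
    """Build the attribute payload that sets an ALB's idle timeout."""
    return [{"Key": "idle_timeout.timeout_seconds", "Value": str(seconds)}]

def set_idle_timeout(lb_arn, seconds):
    """Apply the idle timeout to a load balancer (requires AWS credentials)."""
    import boto3  # deferred import so the payload helper works without AWS set up
    client = boto3.client("elbv2")
    client.modify_load_balancer_attributes(
        LoadBalancerArn=lb_arn,
        Attributes=idle_timeout_attributes(seconds),
    )

# Example (placeholder ARN):
# set_idle_timeout("arn:aws:elasticloadbalancing:...:loadbalancer/app/my-lb/abc123", 120)
```

The same setting can also be changed from the EC2 console under the load balancer's attributes.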
Related Questions
-
Angular JS - Page Load
Our website is being built in AngularJS. We are looking at prerendering the pages, so that's all good. However, since there will be fewer server requests, what would the page load be like for search engines? Also, on the client side (browser), would there be any impact if we prerender the pages? Cheers!
Intermediate & Advanced SEO | Malika10
Alt tag for src='blank.gif' on lazy load images
I didn't find an answer on a search on this, so maybe someone here has faced this before. I am loading 20 images that are in the viewport and a bit below. The next 80 images I want to 'lazy-load'. They therefore are seen by the bot as a blank.gif file. However, I would like to get some credit for them by giving a description in the alt tag. Is that a no-no? If not, do they all have to be the same alt description since the src name is the same? I don't want to mess things up with Google by being too aggressive, but at the same time those are valid images once they are lazy loaded, so would like to get some credit for them. Thanks! Ted
Intermediate & Advanced SEO | friendoffood0
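One common pattern for the situation above is to keep the real image URL in a data-src attribute and give every image its own descriptive alt, even while the initial src points at the shared blank.gif placeholder. A purely illustrative sketch (the function name and markup conventions are assumptions, not a documented Google recommendation):

```python
def lazy_img_tag(real_src, alt, eager=False):
    """Render an <img> tag.

    Eager (above-the-fold) images load normally; deferred images share the
    blank.gif placeholder src but keep a unique, descriptive alt and carry
    the real URL in data-src for a lazy-load script to swap in.
    """
    if eager:
        return f'<img src="{real_src}" alt="{alt}">'
    return f'<img src="blank.gif" data-src="{real_src}" alt="{alt}" class="lazy">'
```

Because the alt describes the real image rather than the placeholder, each of the 80 deferred images can (and should) have a different alt text.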
How would Google reach internal pages on Zales with Lazy Load?
Hi, I encountered the following page on Zales: http://engagementring.theprestigediamondcollection.com/NewEngagementRing/NewEring.aspx As you scroll down, more items pop up (the well-known Pinterest style). Would Googlebot be able to reach the product pages? I don't assume the bot "scrolls"... Thanks
Intermediate & Advanced SEO | BeytzNet0
1 ecommerce site for several product segments, or 1 ecommerce site for each product segment?
I am currently struggling with the decision of whether to create an individual ecommerce site for each of three consumer product segments or to integrate them all under one umbrella domain. Obviously, integration under one domain makes link building easier, but I am not sure how strongly Google favors websites focused on a single topic (= product segment) in the rankings. The product segments are moderately competitive. They are not directly related, but there may be some overlap in customer demographics. Any thoughts?
Intermediate & Advanced SEO | lcourse1
Varying Internal Link Anchor Text with Each New Page Load
I'm asking for people's opinions on varying internal anchor text. Before you jump in and say, "Oh yes, varying your anchor text is always a good idea", let me explain. I'm not talking about varying anchor text on different links scattered throughout a site. We all know that is a wise thing to do for a variety of reasons that have been covered in many places. What I'm talking about is including semi-useful links below the fold and then varying the anchor text with each page load. Each time Googlebot crawls a page, it sees different anchor text for each link. That way, Googlebot is seeing, for example, 'san diego bars', 'taverns in san diego', 'san diego clubs', and 'pubs in san diego' all pointing to a San Diego bar/tavern/club/pub page. I'm wondering if there is value in this approach. Will it help a site rank well for multiple search queries? Could it potentially be better than static anchor text as it may help Google better understand the targeted page? Is it a good way to protect a large site with a huge number of internal links from Penguin? To summarize, we're talking about the impact of varying the anchor text on a single page with each page load as opposed to varying the anchor text on different pages. Thoughts?
Intermediate & Advanced SEO | RyanOD0
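The per-page-load rotation described in the question above could be sketched as follows. This is purely illustrative of the mechanic being asked about, not an endorsement of it; the function names and the seed parameter (added so the behavior is reproducible) are assumptions:

```python
import random

ANCHOR_VARIANTS = [
    "san diego bars",
    "taverns in san diego",
    "san diego clubs",
    "pubs in san diego",
]

def anchor_for_page_load(variants, seed=None):
    """Pick one anchor-text variant for this particular page render."""
    rng = random.Random(seed)  # a fresh choice on every request
    return rng.choice(variants)

def render_link(href, variants, seed=None):
    """Render the below-the-fold link with whichever variant was chosen."""
    return f'<a href="{href}">{anchor_for_page_load(variants, seed)}</a>'
```

Note that Googlebot would only ever see the variant served on the renders it happens to crawl, which is part of why opinions differ on whether this approach has any value.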
Indexing a new website with several million pages
Hello everyone, I am currently working for a huge classified website that will be released in France in September 2013. The website will have up to 10 million pages. I know the indexing of a website of such size should be done step by step, not all at once, to reduce the sandbox risk and to keep more control over it. Do you have any recommendations or good practices for such a task? Maybe some personal experience? The website will cover about 300 jobs: in all regions (= 300 * 22 pages), in all departments (= 300 * 101 pages), in all cities (= 300 * 37,000 pages). Do you think it would be wiser to index a couple of jobs at a time (for instance, 10 jobs every week), or to index level by level (for example, first jobs by region, then jobs by department, etc.)? More generally, how would you proceed to avoid penalties from Google and index the whole site as fast as possible? One more detail: we'll rely on a (big?) press follow-up and on a link-building effort that is still to be determined. Thanks for your help! Best Regards, Raphael
Intermediate & Advanced SEO | Pureshore0
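The page counts in the question above multiply out as follows; a quick sketch using the figures given there shows why the city level dominates and is the natural candidate for staged rollout:

```python
JOBS = 300
LEVELS = {"regions": 22, "departments": 101, "cities": 37_000}

def pages_per_level(jobs, levels):
    """Pages generated at each geographic level: jobs x locations."""
    return {name: jobs * count for name, count in levels.items()}

counts = pages_per_level(JOBS, LEVELS)
total = sum(counts.values())
# regions: 6,600 pages; departments: 30,300; cities: 11,100,000
```

Region and department pages together are under 37,000 pages, so releasing those levels first and staging the ~11 million city pages afterwards is one way to read the "level by level" option.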
My sitelinks have gone from mega sitelinks to several small links under my SERP result in Google. Any ideas why?
A site I have currently had the mega sitelinks in the SERP results. Recently Google updated those to the smaller four inline links under my result. Any idea what happened, or how I can correct this?
Intermediate & Advanced SEO | POSSIBLE0
Load balancing - duplicate content?
Our site switches between www1 and www2 depending on server load, so (the way I understand it, at least) we have two versions of the site. My question is whether the search engines will consider this duplicate content, and if so, what sort of impact it can have on our SEO efforts. I don't think we've been penalized (we're still ranking), but our rankings probably aren't as strong as they should be. The SERPs show a mixture of www1 and www2 content when I do a branded search. Also, when I try to use any SEO tools that involve a site crawl, I usually encounter problems. Any help is much appreciated!
Intermediate & Advanced SEO | ChrisHillfd0
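One common mitigation for mirrored hostnames like the www1/www2 setup above is a rel="canonical" tag that points every copy of a page at a single hostname. A minimal sketch, with an illustrative hostname standing in for the real one:

```python
from urllib.parse import urlsplit, urlunsplit

CANONICAL_HOST = "www.example.com"  # placeholder for the site's preferred hostname

def canonical_url(url):
    """Map www1/www2 mirror URLs onto the single canonical hostname."""
    parts = urlsplit(url)
    # Keep scheme, path, and query; replace the host and drop any fragment.
    return urlunsplit((parts.scheme, CANONICAL_HOST, parts.path, parts.query, ""))

def canonical_link_tag(url):
    """The tag to emit in <head> on both www1 and www2 renders of the page."""
    return f'<link rel="canonical" href="{canonical_url(url)}">'
```

With this in place, both server renders of a page declare the same preferred URL, so the engines can consolidate signals even while the load balancer alternates hostnames.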