Server response time: restructure the site or create a new one? SEO opinions needed.
-
Hi everyone,
The internal structure of our existing site pushes server response time to 6 seconds, which is far from Google's recommended 0.2 seconds, and makes prospects leave the site before it loads.
Now we have two options (same price):
- restructure the existing site's modules, panels, etc.
- create a new site (recommended by our developers)
Both options would carry over the same design and functionality.
I just wanted to know which option the SEO community would recommend.
-
Yes, correct - multiple CSS and JavaScript files will not affect server response time - I think Ryan was referring to page load speed.
-
Hello.
Before starting from scratch, try to optimize Drupal. There are a few simple things you can do that speed Drupal up dramatically:
- Go to the Administer » Site configuration » Performance page and enable "Aggregate and compress CSS files" and "Aggregate JavaScript files".
- On the same page, enable caching: "Cache pages for anonymous users" and "Cache blocks".
See if that helps while you track down the source of the problem.
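If you want a quick, external sanity check that aggregation actually took effect, here is a minimal sketch - not part of the original advice - assuming Python with the requests library and a placeholder URL:

```python
import re
import requests

url = "https://www.example.com/"  # placeholder - replace with your own page

html = requests.get(url).text

# Count externally referenced CSS and JS files; after aggregation both numbers should drop sharply.
stylesheets = re.findall(r'<link[^>]+rel=["\']stylesheet["\'][^>]*>', html, re.IGNORECASE)
scripts = re.findall(r'<script[^>]+src=', html, re.IGNORECASE)

print(f"Stylesheets referenced: {len(stylesheets)}")
print(f"External scripts:       {len(scripts)}")
```

Keep in mind this only reflects what the browser has to fetch (page load speed), not how long the server takes to start responding.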
-
There is one huge thing being missed here by both of you. Google PageSpeed Insights grades you on server response time. Server response time has no bearing on whether a site loads 1 CSS file or 30 CSS files. It has no bearing on how many JS files are loaded or whether their parsing is deferred. If you follow every suggestion Pingdom gives you to a T, it will not affect your server response time one bit.
The only way to affect your server response time is to reduce the processing time of your site, not the loading time in the browser. To reduce your server response time you are going to have to explore server-side caching, MySQL optimization, and things like that.
These might help to read as well:
https://support.google.com/analytics/answer/2383341?hl=en
http://www.blogaid.net/server-response-time-vs-page-load-time-for-google-crawl
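To make that distinction concrete, here is a rough sketch of how you could measure the two separately - assuming Python with the requests library and a placeholder URL, and treating time-to-first-byte as an approximation of what Google calls server response time:

```python
import time
import requests

url = "https://www.example.com/"  # placeholder - swap in the page you are testing

start = time.perf_counter()
resp = requests.get(url, stream=True)   # with stream=True this returns once the headers arrive
ttfb = time.perf_counter() - start      # ~ DNS + connect + server processing time

_ = resp.content                        # force the body download
total = time.perf_counter() - start     # transfer time; browser rendering is not included

print(f"Approx. server response time: {ttfb:.2f}s")
print(f"Full response downloaded in:  {total:.2f}s")
```

Combining or deferring CSS/JS only changes what happens after the first number - and the first number is the one the 0.2-second guideline is about.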
-
We'll make it as fast as possible! Thanks, John. We just need to figure out whether we should restructure the existing site or build it from scratch.
-
Ryan
Yes. I do not worry about the speed variations - there are too many variables in each test, e.g. which server did the test use?
My view on page speed is to forget the "times" and "time ranges" reported by the various tools. You have identified page speed as an issue, so focus on what you know you can and should fix. Don't just fix the minimum - on page speed, fix the maximum. I believe page speed is a key ranking factor.
-
John, thanks for the tool. The site has multiple CSS files for the same types of content, and too many different modules, panels and blocks for such a simple site. By the way, the Google PageSpeed test shows times ranging between 2.1 and 6.5 seconds. Have you ever seen variation like that?
-
Lesley is correct - it is important to understand the cause of the issues before you move forward.
I am not sure if you are familiar with tools.pingdom.com - it's free, so test your site there, then review the Performance tab and see what your loading problems are. Also, 0.2 of a second is best in class - if you can get below 2 seconds I would be happy with that. I'm not suggesting you shouldn't go for 0.2 - just that it is onerous and likely not time-efficient.
The positive is I have seen several times dropping a site from 6 seconds to 2 seconds gets me an uplift in rankings without doing anything else!
-
I am not familiar with Drupal. When you say you are restructuring, is that something internal to Drupal? Or does that mean you are changing the page structure of your site, like moving pages around? Or are you removing some widgets and things like that from pages?
-
It's Drupal 7. We're not redesigning, we're restructuring. Yes, the server takes too much time to generate the pages - they're dynamic.
-
Server response time is tied to two factors. The first is the DNS lookup; the second is the time it takes your server to generate a page and spit it out. Generally both of those can be improved without having to redesign your site. What is your site currently developed in? Is it constantly changing?
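If it helps, here is a hedged sketch of how you could break those two factors apart yourself - assuming Python with the pycurl library and a placeholder URL:

```python
import pycurl
from io import BytesIO

url = "https://www.example.com/"  # placeholder - replace with your own page
buf = BytesIO()

c = pycurl.Curl()
c.setopt(pycurl.URL, url)
c.setopt(pycurl.WRITEFUNCTION, buf.write)  # collect the body into a buffer
c.perform()

dns = c.getinfo(pycurl.NAMELOOKUP_TIME)                                                # DNS lookup
connect = c.getinfo(pycurl.CONNECT_TIME) - dns                                         # TCP connect
generate = c.getinfo(pycurl.STARTTRANSFER_TIME) - c.getinfo(pycurl.PRETRANSFER_TIME)   # server page generation
total = c.getinfo(pycurl.TOTAL_TIME)
c.close()

print(f"DNS lookup:             {dns:.3f}s")
print(f"TCP connect:            {connect:.3f}s")
print(f"Server page generation: {generate:.3f}s")
print(f"Total:                  {total:.3f}s")
```

If most of the total sits in the page-generation number, server-side caching and query optimization are where the time will come back, regardless of whether you restructure or rebuild.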