Slow page load times from asynchronous JavaScript
-
Our analytics show that our homepage takes, on average, ~14 seconds to load. All of the main content loads fairly quickly, but it takes a few more seconds to load the social media stuff (which is mostly asynchronous JavaScript running in the background).
The question is this: does Her-Majesty-Google take into account the amount of time it takes to load everything, including the social media stuff loaded by asynchronous JavaScript? Or does Google see that the content loads fairly quickly and not ding me for the JS that is running in the background?
-
I would check the loading times in Google Webmaster Tools.
Then I would go to some websites that measure loading times and compare your website's speed with the speeds of your competitors. If yours is significantly slower, I would worry.
Try to combine several JS files into one.
Try to substitute the social media buttons with lighter-weight ones.
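Along the same lines as combining scripts, it is worth making sure the social widgets really are loaded asynchronously so they cannot block rendering of the main content. A minimal sketch of the two common patterns (the script URLs below are hypothetical placeholders, not real widget endpoints):

```html
<!-- "async" fetches and runs the script without blocking HTML parsing;
     the src is a made-up placeholder for a real social widget script. -->
<script async src="https://example.com/social-widget.js"></script>

<!-- "defer" fetches in the background and runs only after the HTML
     has been fully parsed, preserving execution order between scripts. -->
<script defer src="https://example.com/share-buttons.js"></script>
```

Either way, the visible content can finish rendering while the widgets catch up in the background.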
Related Questions
-
How can I speed up the loading of my home page?
The speed test of my homepage is very bad. What can I do to improve it? Here is my homepage link: https://madrasatelquran.com
-
FAQ page structure
I have read in other discussions that putting all questions on an FAQ page is the way to go, and then, if a question has an answer worthy of its own page, you should abbreviate the answer and link to the page with more content. My question is: some WP templates have a little + button you can click that reveals the answer to the question. Does this hurt SEO versus having all text visible and using headers/subheaders? An example of the + button: https://fyrfyret.dk/faq/
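For context on the +/- toggle question: in most accordion templates the answer text is still present in the HTML source and merely hidden until clicked, so crawlers can read it either way. A sketch of that pattern (the class names and answer text are made up for illustration):

```html
<div class="faq-item">
  <button class="faq-question">How long is delivery?</button>
  <!-- The answer sits in the DOM and is crawlable; it is only
       visually hidden until the button is clicked. -->
  <div class="faq-answer" hidden>
    Delivery usually takes 3-5 business days.
  </div>
</div>
<script>
  // Toggle the "hidden" attribute on click; the text never leaves the markup.
  document.querySelectorAll('.faq-question').forEach(function (btn) {
    btn.addEventListener('click', function () {
      var answer = btn.nextElementSibling;
      answer.hidden = !answer.hidden;
    });
  });
</script>
```

Whether a template behaves this way can be checked by viewing the page source and searching for the answer text.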
-
Too many links per page? Double navigation on every page...
I have a client with navigation across the top of each page plus the same nav links in a sidebar on every page. Can that duplication (or the sheer number of links) on each page act as a negative ranking factor?
-
Will it upset Google if I aggregate product page reviews up into a product category page?
We have reviews on our product pages, and we are considering averaging those reviews and putting the averages on specific category pages, so that average product ratings can be displayed in search results. Each averaged category review would cover only the products within its category, and all reviews are from users of the site, with no 3rd-party reviews. For example: averaging the reviews from all of our box product pages, and listing that average review on the boxes category page. My question is, would this be doing anything wrong in the eyes of Google, and if so, how? -Derick
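For reference, the usual mechanism for getting an average rating shown in search results is structured data markup such as schema.org's AggregateRating. A hedged sketch of what category-level markup could look like (the name and numbers are invented examples; whether marking up a category this way complies with Google's guidelines is exactly the open question here):

```html
<!-- Hypothetical category-level rating markup; values are examples only. -->
<div itemscope itemtype="https://schema.org/Product">
  <span itemprop="name">Boxes</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="https://schema.org/AggregateRating">
    <span itemprop="ratingValue">4.3</span> stars, based on
    <span itemprop="reviewCount">27</span> reviews
  </div>
</div>
```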
-
New Page Not ranking?
One of this client's top keywords is "oak beams". They already rank well in the UK for other related terms like "reclaimed oak beams" at /reclaimed-oak-beams/ and "air dried oak beams" at /air-dried-oak-beams/. We have created a page at /oak-beams/, but this page ranks nowhere. Instead, the reclaimed oak beams or air dried oak beams page ranks for the term "oak beams". Any ideas why Google is swapping between those pages and not choosing the /oak-beams/ page? A few notes: the /oak-beams/ page is the newest page on the site, and yes, I know there are no links pointing to it, but there are no links pointing to the other pages either.
-
What to do about pages I have deleted?
I have been working through the dead links on my site, recreating pages with new content where it still makes sense to have them. But a few of the old URLs were just changes of title, spelling mistakes, or other ways of saying the same thing. In other words, I created a page called "areas of the UK we cover" but decided to change it to "areas covered". However, I must have created links to the old page, and now it is a dead link with a page authority of 19. I think it would be spammy to have two pages, one called "areas covered" and the other called "areas of the UK we cover". It's not a disallow in robots.txt, because the page does not exist. Please note I do not have access to the header to add code for a 301 redirect; I'm still using webs.com, though not for new sites. I also have a page called "singing telegrams london" that I changed from "singagrams london". These are two words for the same thing, but they are two very different keywords. Would it be OK to recreate this page and create content for "singagrams london"? Help is much appreciated.
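One point worth noting for this situation: a 301 redirect is normally configured on the server, not in the page's <head>, so lacking header access may not rule it out if the host allows an .htaccess file (webs.com may not, which would need checking). A sketch for an Apache host, with URL paths invented from the page names described above:

```apache
# Hypothetical .htaccess rules: permanently redirect the retired URLs
# to their replacements, passing along the old pages' authority.
Redirect 301 /areas-of-the-uk-we-cover /areas-covered
Redirect 301 /singagrams-london /singing-telegrams-london
```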
-
Too Many On-Page Links
If a page has more than 100 links, rather than splitting it up into multiple pages, is it OK to use <meta name="robots" content="noindex, follow" />? The page in question lists links to articles, so the page itself isn't that important to appear in SERPs, but the articles are the helpful content pages: www.ides.com/articles/processing/injection-molding/
-
Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for:
Course (starter, main, salad, etc)
Cooking Method (fry, bake, boil, steam, etc)
Preparation Time (Under 30 min, 30min to 1 hour, Over 1 hour)
Here are some examples of how URLs may look when searching for a recipe:
find-a-recipe.php?course=starter
find-a-recipe.php?course=main&preperation-time=30min+to+1+hour
find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour
There is also pagination of search results, so the URL could also have the variable "start", e.g. find-a-recipe.php?course=salad&start=30. There can be any combination of these variables, meaning there are hundreds of possible search results URL variations. This all works well on the site, but it gives multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz. I've searched online and found several possible solutions for this, such as:
Setting the canonical tag
Adding these URL variables to Google Webmaster Tools to tell Google to ignore them
Changing the title tag in the head dynamically based on which URL variables are present
However, I am not sure which of these would be best. As far as I can tell, the canonical tag should be used when you have the same page available at two separate URLs, but that isn't the case here, as the search results are always different. Adding these URL variables to Google Webmaster Tools won't fix the problem in other search engines, and we will presumably continue to get these errors in our SEOmoz crawl reports. Changing the title tag each time can lead to very long title tags, and it doesn't address the problem of duplicate page content. I had hoped there would be a standard solution for problems like this, as I imagine others will have come across it before, but I cannot find the ideal solution. Any help would be much appreciated. Kind Regards
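Of the options listed, the canonical tag is the one most often used for faceted search pages like this. A hedged sketch of how it could point filtered and paginated variants back at the base search page (the domain is a placeholder; whether this is appropriate depends on whether the filtered result pages should be able to rank on their own):

```html
<!-- Emitted on variants such as find-a-recipe.php?course=salad&start=30,
     telling search engines which single URL to treat as the preferred one. -->
<link rel="canonical" href="https://www.example.com/find-a-recipe.php" />
```

The duplicate-title and duplicate-content warnings generally stop counting against the canonicalized variants once crawlers honor the tag.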