Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
Is page speed important to improve SEO ranking?
-
@jasparcj
Some of the code on certain pages can be seen as suspicious by the Google crawler.
Even when it is not harmful to users, Google may interpret it differently.
It is best to review and audit that code. -
@pau4ner Thanks for your comment, and you're 100% right.
-
Nobody except Google can tell for sure, but many SEOs, myself included (I work with several websites), haven't noticed any substantial change in rankings after improving site speed. In fact, you can find many examples of slower sites ranking above much faster websites.
I do believe, though, that if your site is so slow that it impairs users' experience, you will lose rankings: not because of the speed itself, but because of the higher bounce rates it causes.
In summary (in my experience and that of many other SEO professionals), if your website already loads quite fast, improving its speed won't produce any ranking improvements. But if it is very slow, speeding it up may have a positive impact.
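If you want a quick, rough read on whether a page falls into the "very slow" category before investing in optimization, a simple timing check can help. The sketch below is only an illustration, not an SEO tool: the `classify_ttfb` thresholds are assumptions loosely based on commonly cited guidance (under roughly 0.8 s is usually called good, over roughly 1.8 s poor), and `example.com` is a placeholder URL.

```python
import time
import urllib.request

def measure_ttfb(url: str, timeout: float = 10.0) -> float:
    """Return seconds elapsed until the first response byte arrives (a rough TTFB)."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read(1)  # first byte received
    return time.perf_counter() - start

def classify_ttfb(seconds: float) -> str:
    """Bucket a measurement using assumed thresholds (~0.8 s / ~1.8 s)."""
    if seconds <= 0.8:
        return "fast"
    if seconds <= 1.8:
        return "moderate"
    return "slow"

# Example usage (requires network access):
# print(classify_ttfb(measure_ttfb("https://example.com/")))
```

Note this only captures server response time for one request; real page experience also depends on rendering, which tools like PageSpeed Insights measure more fully.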
Related Questions
-
GoogleBot still crawling HTTP/1.1 years after website moved to HTTP/2
The whole website moved to the https://www. HTTP/2 version 3 years ago. When we review log files, it is clear that, for the home page, GoogleBot continues to access it only via the HTTP/1.1 protocol.
- Robots file is correct (simply allowing all and referring to the https://www. sitemap)
- Sitemap references the https://www. pages, including the homepage
- Hosting provider has confirmed the server is correctly configured to support HTTP/2 and provided evidence of HTTP/2 access working
- 301 redirects are set up from the non-secure and non-www versions of the website, all to the https://www. version
- Not using a CDN or proxy
GSC reports the home page as correctly indexed (with the https://www. version canonicalised) but still shows the non-secure version of the website as the referring page in the Discovery section. GSC also reports the homepage as being crawled every day or so. We totally understand it can take time to update the index, but we are at a complete loss to understand why GoogleBot continues to use only HTTP/1.1 and not HTTP/2. A possibly related issue, and of course what is causing concern, is that new pages of the site seem to index and perform well in the SERP... except the home page. It never makes it to page 1 (other than for the brand name) despite rating multiples higher in terms of content, speed, etc. than other pages, which still get indexed in preference to the home page. Any thoughts, further tests, ideas, direction or anything will be much appreciated!
Technical SEO | | AKCAC1 -
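One way to double-check what protocol Googlebot is actually using is to tally protocol versions per user agent straight from the server access logs, as the question describes doing by hand. The sketch below is a minimal illustration that assumes a combined log format; real log formats vary, and the sample lines are synthetic.

```python
import re
from collections import Counter

# Matches the request section ("GET /path HTTP/1.1") and the final quoted
# user-agent field of a combined-log-format line; adjust for your log format.
LOG_RE = re.compile(r'"[A-Z]+ \S+ (HTTP/[\d.]+)".*"([^"]*)"\s*$')

def googlebot_protocols(log_lines):
    """Count HTTP protocol versions used by requests whose UA mentions Googlebot."""
    counts = Counter()
    for line in log_lines:
        m = LOG_RE.search(line)
        if m and "Googlebot" in m.group(2):
            counts[m.group(1)] += 1
    return counts

# Synthetic example lines (not real traffic):
sample = [
    '1.2.3.4 - - [01/Jan/2024:00:00:00 +0000] "GET / HTTP/1.1" 200 1234 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '1.2.3.4 - - [01/Jan/2024:00:00:01 +0000] "GET /page HTTP/2.0" 200 1234 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '5.6.7.8 - - [01/Jan/2024:00:00:02 +0000] "GET / HTTP/1.1" 200 1234 "-" "SomeOtherBot/1.0"',
]
print(googlebot_protocols(sample))  # e.g. Counter({'HTTP/1.1': 1, 'HTTP/2.0': 1})
```

Note that how HTTP/2 requests are recorded (e.g. "HTTP/2.0") depends on the server's log configuration, so it is worth confirming the server logs the protocol at all before drawing conclusions.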
What are the SEO ramifications of domain redirection?
Hi Moz Community! I was just trying to set up our global site and got this message: "Redirect detected
We have detected that the domain bhaktimarga.org redirects to prodfront-coli.bhaktimarga.mediactive-network.net. We do not recommend tracking a redirect URL. Would you like to track prodfront-coli.bhaktimarga.mediactive-network.net for this campaign instead?"
What's interesting is that when you go to the site, Bhaktimarga.org, it shows our domain in the URL bar. Is this done for performance, and does it mask the hosting provider's domain? I haven't talked to our website developers about this yet, but my main question is: does this have any SEO ramifications? Thanks so much,
Padma
SEO Tactics | | Padmagandhini0 -
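To see exactly what a domain-level redirect is doing, you can walk the redirect chain hop by hop. Below is a minimal sketch: the `fetch` callable is injected so the hop-walking logic can be demonstrated with fake data, a real version would issue HTTP requests instead, and the fake responses simply reuse the domains from the question for illustration.

```python
def resolve_chain(url, fetch, max_hops=10):
    """Follow redirects hop by hop; fetch(url) returns (status, location_or_None)."""
    chain = [url]
    for _ in range(max_hops):
        status, location = fetch(url)
        if status in (301, 302, 307, 308) and location:
            url = location
            chain.append(url)
        else:
            return chain
    raise RuntimeError("Too many redirects")

# Fake responses standing in for real HTTP requests:
fake = {
    "https://bhaktimarga.org/": (301, "https://prodfront-coli.bhaktimarga.mediactive-network.net/"),
    "https://prodfront-coli.bhaktimarga.mediactive-network.net/": (200, None),
}
chain = resolve_chain("https://bhaktimarga.org/", lambda u: fake[u])
print(" -> ".join(chain))
```

If the browser address bar still shows the original domain, the "redirect" the tool detected may be happening at the DNS/CDN or reverse-proxy layer rather than as an HTTP 301, which a hop-by-hop check like this would help reveal.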
Google keeps marking different pages as duplicates
My website has many pages like this:
mywebsite/company1/valuation
mywebsite/company2/valuation
mywebsite/company3/valuation
mywebsite/company4/valuation
...
These pages describe the valuation of each company. They were never identical, but initially I included a few generic paragraphs (what is valuation, what is a valuation model, etc.) on all of them, so parts of their content were identical. Google marked many of these pages as duplicates (in Google Search Console), so I modified their content: I removed the generic paragraphs and added information that is unique to each company. As a result, the pages are now very different from each other and share little content. Although it has been more than a month since I made these changes, Google still marks the majority of the pages as duplicates, even though it has already crawled the new, modified versions. Is there anything else I can do in this situation? Thanks
Technical SEO | | TuanDo96270 -
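While waiting for Google to re-evaluate the pages, it can be useful to sanity-check for yourself how similar two pages' text still is. The sketch below is only an illustration with made-up snippets: `difflib`'s ratio is a crude textual proxy, not what Google uses for duplicate detection.

```python
from difflib import SequenceMatcher

def similarity(text_a: str, text_b: str) -> float:
    """Rough 0..1 similarity between two page texts (whitespace-normalised)."""
    a = " ".join(text_a.split()).lower()
    b = " ".join(text_b.split()).lower()
    return SequenceMatcher(None, a, b).ratio()

# Made-up page snippets: one pair sharing boilerplate, one pair rewritten.
shared_intro = "Valuation is the process of estimating what a company is worth. "
page1_old = shared_intro + "Company One trades at a premium."
page2_old = shared_intro + "Company Two trades at a discount."
page1_new = "Company One's cash flows grew 40% last year, supporting a premium multiple."
page2_new = "Company Two carries heavy debt, which justifies its discounted valuation."

print(round(similarity(page1_old, page2_old), 2))  # high: shared boilerplate dominates
print(round(similarity(page1_new, page2_new), 2))  # lower after the rewrite
```

A drop in such a ratio at least confirms the rewrites genuinely differentiated the pages, even if Search Console lags behind.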
Collections or blog posts for Shopify ecommerce seo?
Hi, hope you guys can help as I am going down a rabbit hole with this one! We have a solid-ranking sports nutrition site and are building a new SEO keyword strategy on our Shopify-built store.
We are using collections (categories) for much of the key product-based SEO. This is because, as we understand it, Google prioritises collection/category pages over product pages. Should we then build additional collection pages to rank for secondary product search terms that could fit a collection page structure (e.g. 'vegan sports nutrition'), or should we use blog posts to do this? We have a quality blog with good unique content and reasonable domain authority, so both options are open to us.
But while the collection/category option may be best for SEO, too many collections/categories could hurt our UX. We have a very small product range (10 products), so we want to keep navigation fast and easy. Our 7 lead keyword collection pages do this already; more would risk slowing down site navigation. On the other hand, conversion rate from collection pages is historically much better than from blog pages. We have made major technical upgrades to the blog to improve this, but these are yet to be tested in anger.
So at the heart of it all: do you guys recommend favouring blog posts or collection/category pages for secondary high-sales-intent keywords? All help gratefully received - thanks!
SEO Tactics | | WP332 -
Is the NitroPack plugin black-hat SEO for speed optimization?
We are getting ready to launch our redesigned WP site and were considering the NitroPack performance optimization plugin, until some of our developers started ringing the alarm. Here is what some in the SEO community are saying about the tool: the rendering of a website built with the NitroPack plugin in page-metric test tools is based entirely on the inline CSS and JS in the HTML file, without taking into account the numerous additional CSS and JS files loaded on the page. As a result, the final metric score does not include evaluation and parsing of those CSS and JavaScript files. So what they are saying is that many websites using the NitroPack plugin never become interactive in the page-metric tools, because all interactivity comes from JavaScript and CSS execution; their "Time to Interactive" and "Speed Index" should effectively be reported as infinite. Would Google consider this black-hat SEO and start serving manual actions to sites using NitroPack? We are not ready to lose our hard-earned Google ranking. Please let me know your thoughts on the plugin. Is it simply JS and CSS lazy loading that finally works and yields fantastic results, or is it truly a black-hat attempt at gaming Google PageSpeed Insights numbers? Thank you!
On-Page Optimization | | opiates0 -
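The concern above hinges on whether the plugin serves a page whose measurable behaviour comes only from inline CSS/JS. You can check the served HTML yourself by counting inline versus external scripts and styles. This is a minimal standard-library sketch; the sample HTML is synthetic, and a real check would run it against the HTML your server actually returns.

```python
from html.parser import HTMLParser

class AssetCounter(HTMLParser):
    """Tally inline vs external <script> tags and stylesheet references."""
    def __init__(self):
        super().__init__()
        self.external_scripts = 0
        self.inline_scripts = 0
        self.external_styles = 0
        self.inline_styles = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "script":
            if attrs.get("src"):
                self.external_scripts += 1
            else:
                self.inline_scripts += 1
        elif tag == "link" and attrs.get("rel") == "stylesheet":
            self.external_styles += 1
        elif tag == "style":
            self.inline_styles += 1

# Synthetic page: one inline style block, one external and one inline script.
html = """
<html><head>
  <style>body { margin: 0; }</style>
  <script src="/app.js"></script>
  <script>console.log("inline");</script>
</head><body></body></html>
"""
counter = AssetCounter()
counter.feed(html)
print(counter.external_scripts, counter.inline_scripts, counter.inline_styles)  # 1 1 1
```

If an "optimized" page shows almost everything inline while the unoptimized page loads many external files, that matches the behaviour the critics describe and is worth testing in the speed tools directly.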
Core Web Vitals and PageSpeed Insights scores not matching
We have some URLs that are being flagged as poor in the Core Web Vitals report inside Search Console. For example, the report says that some pages have too many CLS issues. When looking into what we can do to fix them, we noticed that running the same pages through the PageSpeed Insights tool does not produce the same bad scores. This makes it hard to know what actually needs to be addressed, and we can't tell whether a change fixed the issue, because in PageSpeed Insights there is no issue in the first place. Has anyone else had similar problems? If so, have you found a way to resolve them?
On-Page Optimization | | RMATVMC0 -
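For context, Search Console's report is built from field data gathered from real visitors, while a PageSpeed Insights run also includes a single synthetic lab load, which is one common reason the two disagree. Whichever source you look at, the pass/fail buckets follow Google's published Core Web Vitals thresholds, sketched below (the cut-offs used are the publicly documented ones: LCP 2.5 s / 4 s, CLS 0.1 / 0.25, INP 200 ms / 500 ms, but verify them against current documentation).

```python
def classify_cwv(lcp_ms: float, cls: float, inp_ms: float) -> dict:
    """Bucket Core Web Vitals per Google's published good/poor thresholds."""
    def bucket(value, good, poor):
        if value <= good:
            return "good"
        if value <= poor:
            return "needs improvement"
        return "poor"
    return {
        "LCP": bucket(lcp_ms, 2500, 4000),
        "CLS": bucket(cls, 0.10, 0.25),
        "INP": bucket(inp_ms, 200, 500),
    }

print(classify_cwv(lcp_ms=3100, cls=0.31, inp_ms=180))
# {'LCP': 'needs improvement', 'CLS': 'poor', 'INP': 'good'}
```

Because the field buckets are computed over real-user sessions, a single clean lab run can coexist with a failing field bucket for the same URL.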
Address on Every page of the website for Local SEO? Good or Bad?
Is it a good idea to add the business address on every page of the website? How does Google see this, and is it good or bad for rankings?
On-Page Optimization | | Dan_Brown10 -
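A common middle ground for local SEO is to keep the address in the site-wide footer and mark it up with LocalBusiness structured data. The sketch below builds such a JSON-LD snippet with Python's standard library; the business details are placeholders, and the schema.org property names used here (`name`, `telephone`, `address`, `PostalAddress`) should be verified against schema.org for your specific business type.

```python
import json

def local_business_jsonld(name, street, city, region, postal_code, phone):
    """Build a schema.org LocalBusiness JSON-LD block for a site-wide footer."""
    data = {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": name,
        "telephone": phone,
        "address": {
            "@type": "PostalAddress",
            "streetAddress": street,
            "addressLocality": city,
            "addressRegion": region,
            "postalCode": postal_code,
        },
    }
    return '<script type="application/ld+json">' + json.dumps(data) + "</script>"

# Placeholder business details:
snippet = local_business_jsonld(
    "Example Bakery", "1 Example St", "Springfield", "IL", "62701", "+1-555-0100"
)
print(snippet)
```

Keeping the markup in one shared footer template means the address appears consistently everywhere without hand-editing each page.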
Creating New Pages Versus Improving Existing Pages
What are some things to consider or things to evaluate when deciding whether you should focus resources on creating new pages (to cover more related topics) versus improving existing pages (adding more useful information, etc.)?
On-Page Optimization | | SparkplugDigital0