Google page speed testing failures
-
When running our domain through Google's PageSpeed Insights I'm getting the error message 'An error occurred while fetching or analyzing the page' at around 65% load. I'm concerned it's affecting our organic rankings. The domain is https://www.scottscastles.com/
When testing in https://testmysite.withgoogle.com/ it is also failing at around 70% with the message 'It's taking longer than expected. You can leave this tab open and check back in a little while. We'll have your results soon.' but the results never come.
I've tried testing on a few different speed testing sites without failures (https://tools.pingdom.com, https://gtmetrix.com, https://www.webpagetest.org and a few others). We're stumped, as everything appears correct and was working but now isn't. Is this Google or us, or a combination of the two?
Any help greatly appreciated!
-
If you're considering other hosting providers to speed things up, look into duda.co. They have a great thing going from everything I've researched.
-
Hi Nigel
Thanks for your response, much appreciated. We're aware it's a mess. We're currently rebuilding the back end from scratch, which will solve the structural problems, but completion is 6 months away.
We've checked .htaccess; it's not that. We have rel=canonical on almost everything to fix the issue where pages show for the wrong URLs, etc. It's not ideal, but fixing it properly is a massive job given we just need to keep the site going for another 6 months. We want to avoid work we'll be throwing away, unless there are big wins.
Whilst communicating this to you my developer has found a fix. He said 'I've had to regenerate some of the SVGs. It looks like someone got creative with how they built the SVGs; I've now taken out all the creativity and hard-coded the data into them.'
So it sounds very site specific.
thanks again Nigel - I've now clocked Carousel Projects UK.
Mat
-
Hi imaterus
You are using a 302 directive (temporary redirect) to forward HTTP to HTTPS - this should be changed to a 301 (permanent redirect).
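As a minimal sketch, assuming the site runs on Apache with mod_rewrite enabled (adapt to whatever redirect rules already exist in the .htaccess), the permanent version of that redirect looks like this:

```apache
# Force HTTPS with a 301 (permanent) redirect rather than a 302
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

The only difference from a 302 setup is the R=301 flag, but it matters: a 301 tells Google to consolidate signals onto the HTTPS URL, while a 302 leaves them split.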
You have a lot of duplicate pages like this, specifying a non-www domain as well as a www:
https://www.scottscastles.com/large-holiday-homes/sleeping20.html
https://scottscastles.com/large-holiday-homes/sleeping20.html
Which is weird - you should have only one specified. This is part of the 944 duplicate pages problem!
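One way to collapse the non-www duplicates, again assuming an Apache .htaccess (this is a sketch; combine it with, rather than duplicate alongside, any existing HTTPS rule):

```apache
# 301-redirect bare-domain requests to the canonical www host
RewriteEngine On
RewriteCond %{HTTP_HOST} ^scottscastles\.com$ [NC]
RewriteRule ^(.*)$ https://www.scottscastles.com/$1 [L,R=301]
```

With that in place, every non-www URL permanently redirects to its www equivalent, so Google sees one canonical version of each page.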
You need someone to properly work on this site as it's something of a mess.
Apologies - but it is.
Kind Regards
Nigel
Carousel Projects UK
-
Hi imaterus
I've looked through the robots.txt and there is nothing there that would block the crawl.
# robotstxt.org/
Sitemap: https://www.scottscastles.com/sitemap.xml
User-agent: *
Disallow: /error/unsupported-browser.html
Disallow: /search/process.html
Disallow: /my-favourites/add.html
Disallow: /search/quick-search.html
Disallow: /search/property-list.html
Disallow: /my-favourites/toggle.html
Disallow: /my-favourites/remove.html
Disallow: /search/map-markers.html
Disallow: /my-favourites/clear.html
Maybe check through the .htaccess and see if there is a directive that is holding up the crawl. If PageSpeed Insights can't crawl the site, you may have a problem with Googlebot crawling as well, which could negatively impact SEO.
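As a hypothetical example of what to look for (not taken from your site), a rule like this in .htaccess would serve pages to normal browsers while blocking Google's crawler outright, which would produce exactly this kind of testing failure:

```apache
# Hypothetical: a user-agent match that denies Googlebot with a 403
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
RewriteRule .* - [F]
```

Also worth checking for rate-limiting or firewall rules at the host level that treat Google's fetchers as bot traffic.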
Regards
Nigel