Tools to check mobile speed performance
-
Hey Guys,
I'm looking at a site which has mobile versions of each page, e.g. m.domain.com.au.
Some of these pages have images that are over 1 MB.
I want to quickly identify the pages with large image file sizes. Any good tools that can do this?
Cheers.
-
Run Screaming Frog on your subdomains and check the Images tab in the report, then sort by image size and you'll find the large images.
Download Screaming Frog from here: http://www.screamingfrog.co.uk/seo-spider/
-
You can use Google PageSpeed Insights:
https://developers.google.com/speed/pagespeed/insights/
If your site is on WordPress, you can also install this plugin:
https://wordpress.org/plugins/google-pagespeed-insights/
You can also use WebPageTest, GTmetrix, or Pingdom Tools. Even a simple *nix tool like wget can make a local mirror of the site where you can see the file sizes (the command is wget -r -m http://m.domain.com.au).
Alternatively, you can use a desktop crawler (like my SEOSpyder), where you can also see image sizes in bytes.
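Once wget has made a local mirror, a short script can surface the heavy images directly. A minimal sketch in Python, assuming the mirror directory name and a 1 MB threshold (both adjustable):

```python
import os

# Common raster-image extensions; extend as needed for your site.
IMAGE_EXTS = (".jpg", ".jpeg", ".png", ".gif", ".webp")

def find_large_images(root, threshold=1_000_000):
    """Walk a local mirror (e.g. from `wget -r -m`) and yield
    (path, size_in_bytes) for image files larger than `threshold`."""
    for dirpath, _dirs, filenames in os.walk(root):
        for name in filenames:
            if name.lower().endswith(IMAGE_EXTS):
                path = os.path.join(dirpath, name)
                size = os.path.getsize(path)
                if size > threshold:
                    yield path, size

# Usage against the directory wget created, e.g.:
# for path, size in find_large_images("m.domain.com.au"):
#     print(f"{size:>10}  {path}")
```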
Related Questions
-
How Can A SubDomain Out Perform A Root Domain?
Hi guys! I have a rather strange SEO question. It may not be that strange at all, actually.

If a site has a shopping cart that is on a subdomain through a third-party shopping cart provider, can the third-party cart transfer value to the subdomain, causing the subdomain to have greater domain authority than the main site or root domain?

Another question: this subdomain, up until yesterday, blocked Google from crawling it with robots.txt, yet it has a much higher domain authority than the root domain. The root domain has a really low domain authority, despite not blocking Google from crawling it. How is this possible?

I hope these questions make sense. I am a little stumped and trying to figure out why the subdomain is outperforming the main site despite being hidden from search, if that's even the case. Please let me know if I have it all wrong!
Intermediate & Advanced SEO | Prae0
-
Link conundrum - losing nav/footer links in mobile view
Hi Moz folks! I'm currently moving a site from being hosted on www. and m. separately to a responsive single URL. The problem is, the desktop version currently has links to important landing pages in the footer (about 60), and that's not something we want to replicate on mobile (mainly because it will look pretty awful). There is no navigation menu, because the key to the homepage is converting users to a subscription, so any distraction reduces conversion rate. The footer links will continue to exist on the desktop view but, since Google's move to mobile-first indexing, presumably we lose these important homepage links to our most important pages.

So, my questions:
1. Do you think there is any SEO value in the desktop footer links?
2. Do you have any suggestions about how best to include these 60-odd links in a way that works for mobile?

Thanks!
Intermediate & Advanced SEO | d_foley0
-
E-Commerce Mobile Pagination Dilemma
Hi everybody, I'm managing the SEO for an e-commerce site with different desktop and mobile sites (meaning, not responsive). We're changing the way reviews on mobile product pages are displayed, from 'view all' to pagination (due to server load). Basically, the above-the-fold part of the page will always display the product, and below the fold there will be X reviews on each page. But here is where it gets tricky:

1. A different number of review pages will exist on mobile vs. desktop (due to a different number of reviews per page on each device), so I'm wondering what the solution is regarding canonicals. Usually every mobile page points to its desktop parallel, but now we'll have non-matching pages.
2. Users will be able to change the number of reviews displayed on each page, so the number of paginated pages will change accordingly.

I was thinking about a solution where all the reviews will be in the first page's HTML (and only X of them will be displayed on screen), and all the other paginated pages will be created dynamically (with # and won't be indexed, so basically no pagination on mobile). Does anyone think this could be seen as cloaking, or have any other thoughts?

Thanks, Sarah
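On the canonical question: for separate mobile URLs, the annotation pair Google documents is a rel="alternate" on the desktop page pointing at its mobile parallel, with a rel="canonical" on the mobile page pointing back. A sketch with placeholder URLs:

```html
<!-- On the desktop page: -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/product-x/">

<!-- On the corresponding mobile page: -->
<link rel="canonical" href="http://www.example.com/product-x/">
```

Whether each paginated review page should map one-to-one or all point back at the first page is exactly the judgment call the question raises.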
Intermediate & Advanced SEO | Don340
-
I need thoughts on how to chase a suspected Hosting Issue with Simple Helix and 524 errors, also some site speed data mixed in...
So the backstory on this project: we've been working as PPC and SEO managers with an ecommerce site (Magento Enterprise based) that crashed in April. After the issue they fired their developer and switched hosting to Simple Helix at the recommendation of the new developer. Since the change we have seen a plummeting ecommerce conversion rate, especially on weekends. Every time something seems really bad, the developer gives us a "nothing on our end causing it."

Doing more research, we found site speed in GA was reporting crazy numbers of 25+ seconds for page loads. When we asked, Simple Helix told us it was "Baidu spiders" crawling the site causing the slowdown. I knew that wasn't the issue. In all of this the developer keeps reporting back to the site owner that there is no way it is hosting. The developer finally admitted the site could be slowing down from a DoS attack or some other form of probing, so they installed Cloudflare. Since then the site has been very fast, and we haven't seen turbulence in the GA site speed data. What we have seen, though, is the appearance of 524 and 522 errors in Search Console.

Does anyone have experience with Cloudflare: are those types of errors common? Any other thoughts on what might be causing them and what that means for the servers? The developer reports back that Simple Helix has had no issues during this time. This has been a super frustrating project and we've tried a lot of different tests, but there is really abnormal conversion data, as I said, especially during peak times on the weekend. Any ideas of what to chase would be appreciated.
Intermediate & Advanced SEO | BCutrer0
-
Some Tools Not Recognizing Meta Tags
I am analyzing a site with several thousand pages, checking the headers, meta tags, and other on-page factors. I noticed that the spider tool on SEO Book (http://tools.seobook.com/general/spider-test) does not seem to recognize the meta tags for various pages. However, other tools, including Moz, do seem to recognize them. I wouldn't normally be concerned with why one tool isn't picking up the tags, but the site suffered a large traffic loss and we're still trying to figure out what remaining issues need to be addressed. Also, many of those pages once ranked in Google and now cannot be found unless you do a site: search. Is it possible that something is blocking certain tools or crawlers from reading the tags while others can read them easily? This would seem very strange to me, but the above is what I've witnessed recently. Your suggestions and feedback are appreciated, especially as this site continues to battle Panda.
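One way to check what a bare-bones crawler actually sees is to run the page's HTML through a minimal parser and compare the result with what the fancier tools report. A sketch using Python's stdlib html.parser (the sample markup is illustrative):

```python
from html.parser import HTMLParser

class MetaExtractor(HTMLParser):
    """Collect the <title> text and <meta name=...> tags,
    roughly the way a naive crawler would."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.metas = {}
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and "name" in attrs:
            self.metas[attrs["name"].lower()] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

page = ('<head><title>Example page</title>'
        '<meta name="description" content="A test description"></head>')
parser = MetaExtractor()
parser.feed(page)
# parser.title is "Example page"; parser.metas["description"] is "A test description"
```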
Intermediate & Advanced SEO | ABK7170
-
Showing Duplicate Content in Webmaster Tools.
About 6 weeks ago we completely redid our entire site. The developer put in 302 redirects, and we were showing thousands of duplicate meta descriptions and titles. I had the redirects changed to 301s. For a few weeks the duplicates slowly went down, and now they are right back to where they started. Isn't the point of 301 redirects to show Google that content has permanently been moved? Why is it not picking this up? I knew it would take some time, but I am right where I started after a month.
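For reference, a permanent redirect in Apache looks like the fragment below (paths are placeholders). Note that a bare `Redirect` with no status code defaults to 302, which is one way a "fixed" site can quietly keep serving temporary redirects:

```apache
# .htaccess -- explicit 301 so Google treats the move as permanent
Redirect 301 /old-page https://www.example.com/new-page
```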
Intermediate & Advanced SEO | EcommerceSite0
-
How to get around Google Removal tool not removing redirected and 404 pages? Or if you don't know the anchor text?
Hello! I can't get squat for an answer in the GWT forums. Should have brought this problem here first...

The Google Removal Tool doesn't work when the original page you're trying to get recached redirects to another site. Google still reads the page as being okay, so there is no way for me to get the cache reset, since I don't know what text was previously on the page. For example, this:
http://0creditbalancetransfer.com/article375451_influencial_search_results_for_.htm
redirects to this:
http://abacusmortgageloans.com/GuaranteedPersonaLoanCKBK.htm?hop=duc01996
I don't even know what was on the first page, and since it redirects, I have no way of telling Google to recache the page. It's almost as if the site got deindexed and they put in a redirect.

Then there is crap like this:
http://aniga.x90x.net/index.php?q=Recuperacion+Discos+Fujitsu+www.articulo.org/articulo/182/recuperacion_de_disco_duro_recuperar_datos_discos_duros_ii.html
No links to my site are on there, yet Google's indexed links say that the page is linking to me. It isn't, but because I don't know how the page changed text-wise, I can't get the page recached.

The tool also doesn't work when a page 404s. Google still reads the page as being active, but it isn't. What are my options? I literally have hundreds of such URLs. Thanks!
Intermediate & Advanced SEO | SeanGodier0
-
Google bot vs google mobile bot
Hi everyone 🙂 I seriously hope you can come up with an idea for a solution to the problem below, because I am kinda stuck 😕

Situation: a client of mine has a webshop on a hosted server. The shop is made in a closed CMS, meaning that I have very limited options for changing the code: limited access to the page head, and within the CMS I can only use JavaScript and HTML. The only place I have access to a server-side language is in the root, where a Default.asp file redirects the visitor to the specific folder where the webshop is located. The webshop has 2 "languages"/store views: one for normal browsers and Googlebot, and one for mobile browsers and the Google mobile bot. In the Default.asp (classic ASP) I do a test for user agent and redirect the user to either the main domain or the mobile subdomain. All good, right?

Unfortunately not. Now we arrive at the core of the problem. Since the mobile shop was added at a later date, Google already had most of the pages from the shop in its index, and apparently uses them as entrance pages to crawl the site with the mobile bot. Hence it never sees the Default.asp (or outright ignores it), and this causes, as you might have guessed, a huge pile of duplicate content.

Normally you would just place some user-agent detection in the page head and either throw Google a 301 or a rel-canonical. But since I only have access to JavaScript and HTML in the page head, this cannot be done. I'm kinda running out of options quickly, so if anyone has an idea as to how the BEEP! I get Google to index the right domains for the right devices, please feel free to comment. 🙂 Any and all ideas are more than welcome.
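For what it's worth, the user-agent split in the Default.asp boils down to logic like the sketch below, written in Python rather than classic ASP just to show the shape; the URLs and the pattern list are illustrative (a production list would be longer):

```python
import re

# Deliberately short, illustrative pattern list.
MOBILE_RE = re.compile(r"Mobile|Android|iPhone|iPad", re.I)

def pick_store(user_agent,
               desktop_url="http://www.example-shop.com/",
               mobile_url="http://m.example-shop.com/"):
    """Return the store-view URL a given user agent should be sent to."""
    if MOBILE_RE.search(user_agent or ""):
        return mobile_url
    return desktop_url
```

The catch described above still applies: this only runs when a visitor actually hits the root, so URLs Google enters directly never pass through it.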
Intermediate & Advanced SEO | ReneReinholdt0