Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Finding issues on GTmetrix speed test and Google speed test
-
Hey, I have tested my website https://www.socprollect-mea.com/ on GTmetrix. When I tested in the morning it showed a page load time of 3.8 seconds, at noon it showed 2 seconds, and when I checked later it was fluctuating on the Google PageSpeed test as well. What kind of error is this, and does anyone have a solution for this issue?
-
If you get 3.8 seconds in the morning and 2 seconds in the evening, it could mean that either the tool testing your website is "busy" or the server your website runs on is "busy". A server under higher load takes a bit more time to finish serving the website.
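One way to see this variance for yourself is to time the page fetch several times across the day and compare the fastest and slowest runs. Below is a minimal sketch of that idea using only the Python standard library; note that a plain HTTP fetch only measures download time, while GTmetrix and PageSpeed also measure rendering, so the absolute numbers will differ.

```python
# Sketch: fetch a page a few times and report the timing spread.
# A plain HTTP fetch measures download time only, not rendering.
import time
import urllib.request

def fetch_time(url: str) -> float:
    """Return the seconds taken to download the page body once."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=30) as resp:
        resp.read()
    return time.perf_counter() - start

def variance_check(url: str, runs: int = 3) -> tuple[float, float]:
    """Fetch `runs` times and return (fastest, slowest) timings."""
    timings = sorted(fetch_time(url) for _ in range(runs))
    return timings[0], timings[-1]
```

If the slowest run is much slower than the fastest at certain times of day, the server (or its shared host) is likely under load at those times rather than the testing tool misbehaving.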
Related Questions
-
Does Google Understand H2 As Subtitle?
I use some HTML5 tags in my custom template. I implemented the <header class="entry-header-outer"> Flavour & Chidinma – 40 Yrs 40 Yrs by Flavour & Chidinma </header> HTML code. The h1 tag serves as the title, while the h2 tag serves as the subtitle of the post. Take a look at it here: https://xclusiveloaded.com/flavour-chidinma-40-yrs/ I want to know if it's OK or if I should remove the h2 tag. Guys, what are your thoughts?
On-Page Optimization | | Kingsmart4 -
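For reference, the markup pattern that question describes looks roughly like this (a sketch; the class name and text are taken from the question, and nothing here implies Google treats the h2 as a formal "subtitle" rather than an ordinary second-level heading):

```html
<!-- h1 carries the post title; the h2 inside the same header
     carries the subtitle. The class name is illustrative. -->
<header class="entry-header-outer">
  <h1>Flavour &amp; Chidinma – 40 Yrs</h1>
  <h2>40 Yrs by Flavour &amp; Chidinma</h2>
</header>
```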
Google Webmaster Guideline Change: Human-Readable list of links
In the revised webmaster guidelines, google says "[...] Provide a sitemap file with links that point to the important pages on your site. Also provide a page with a human-readable list of links to these pages (sometimes called a site index or site map page)." (Source: https://support.google.com/webmasters/answer/35769?hl=en) I guess what they mean by this is something like this: http://www.ziolko.de/sitemap.html Still, I wonder why they say that. Just to ensure that every page on a site is linked and consequently findable by humans (and crawlers - but isn't the XML sitemap for those and gives even better information)? Should not a good navigation already lead to every page? What is the benefit of a link-list-page, assuming you have an XML sitemap? For a big site, a link-list is bound to look somewhat cluttered and its usefulness is outclassed by a good navigation, which I assume as a given. Or isn't it? TL;DR: Can anybody tell me what exactly is the benefit of a human-readable list of all links? Regards, Nico
On-Page Optimization | | netzkern_AG0 -
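One practical way to satisfy the guideline without maintaining two lists by hand is to generate the human-readable page from the XML sitemap, so the two never drift apart. A minimal sketch, assuming a standard sitemap.xml using the sitemaps.org namespace (function name and styling are illustrative):

```python
# Sketch: turn the <loc> entries of an XML sitemap into an
# HTML link list suitable for a human-readable site map page.
import xml.etree.ElementTree as ET

SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_to_html(sitemap_xml: str) -> str:
    """Extract every <loc> URL and render it as an <li> link."""
    root = ET.fromstring(sitemap_xml)
    locs = [loc.text for loc in root.findall(".//sm:loc", SITEMAP_NS)]
    items = "\n".join(f'  <li><a href="{u}">{u}</a></li>' for u in locs)
    return f"<ul>\n{items}\n</ul>"
```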
How does Google treat Dynamic Titles?
Let's say my website can be accessed in only 3 states: Colorado, Arizona and Ohio. I want to display different information to each visitor based on where they are located. For this I would also like the title to change based on their location. Not quite sure how Google will treat the title and rank the site. Any resources you can provide would be helpful. Thanks
On-Page Optimization | | Firestarter-SEO0 -
My Meta Description changes when i use different keyword in google search.
Hello everyone, I have a question for the community. I have a website with several articles and news items that I manage. I set a specific meta description for every page, but when I search in Google it gives me back different meta descriptions depending on the keyword that I use to search. What I notice is that Google looks in my page for the part of the text most relevant to my keyword and returns that as the result. I thought this only happened when I had an empty meta description. Has anyone seen the same? Best, Ricardo www.meuportalfinanceiro.pt
On-Page Optimization | | Adclick0 -
Solve duplicate content issues by using robots.txt
Hi, I have a primary website, and besides that I also have some secondary websites which have the same content as the primary website. This leads to duplicate content errors. Because there are many URLs with duplicate content, I want to use the robots.txt file to prevent Google from indexing the secondary websites to fix the duplicate content issue. Is that OK? Thanks for any help!
On-Page Optimization | | JohnHuynh0 -
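For reference, the approach the question describes would be a blanket disallow in the robots.txt served on each secondary domain, sketched below. One caveat worth knowing: robots.txt blocks crawling, not indexing, so blocked pages can still appear in the index via external links, which is why a cross-domain rel="canonical" or a noindex directive is generally the preferred fix for duplicate content.

```
# robots.txt on a secondary (duplicate) domain -- a sketch of
# the approach described above. Blocks all crawlers from all paths.
User-agent: *
Disallow: /
```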
How does Google Detect which keywords my website should show up for in the SE?
When I checked my Google Webmaster Tools I found that my website is showing up for keywords that I didn't optimize for. For example, I optimized my website for "funny pictures with captions", and the website is showing up for "funny images with captions". I know that this is good, but the keyword is dancing all around: sometimes I search for "funny pictures with captions" and I show up on the 7th page, and sometimes I don't show up at all. The same goes for the other keyword. Of course I am optimizing for more than two keywords, but the results are not consistent. My question is: how does Google decide which keywords your website should show up for? Is it the on-page keywords, or is it the off-page anchor text keywords? Thank you in advance.
On-Page Optimization | | FarrisFahad0 -
Does Google index dynamically generated content/headers, etc.?
To avoid dupe content, we are moving away from a model where we have 30,000 pages, each with a separate URL that looks like /prices/<product-name>/<city><state>, often with dupe content because the product overlaps from city to city, and it's hard to keep 30,000 pages unique, where sometimes the only distinction is the price and the city/state. We are moving to a model with around 300 unique pages, where some of the info that used to be in the URL will move to the page itself (headers, etc.) to cut down on dupe content on those 300 unique pages. My question is this: if we have 300 unique-content pages with unique URLs, and we then put some dynamic info (year, city, state) into the page itself, will Google index this dynamic content? The question behind this one is: how do we continue to rank for searches for that product in the city/state being searched without having that info in the URL? Any best practices we should know about?
On-Page Optimization | | editabletext0 -
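On the indexing side, the key distinction is usually where the dynamic values are injected: if the city/state/year are filled in server-side, they are part of the initial HTML Google fetches and get indexed like any other on-page text. A minimal sketch of server-side templating (the template wording and field names are illustrative, not taken from the question's site):

```python
# Sketch: render "dynamic" header values server-side so they are
# present in the initial HTML response rather than injected by JS.
from string import Template

PAGE = Template("<h1>$product prices in $city, $state ($year)</h1>")

def render_header(product: str, city: str, state: str, year: int) -> str:
    """Substitute the dynamic values into the page header template."""
    return PAGE.substitute(product=product, city=city,
                           state=state, year=year)
```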
Do images on a CDN affect my Google Ranking?
I have recently switched my images to a CDN (MaxCDN) and all of the images within my posts are now loaded directly from the CDN. Will this affect my Google ranking? Does Google care whether the image is hosted physically on the domain?
On-Page Optimization | | Amosnet0