I have followed all the steps in Google's page speed recommendations to increase the speed of my website http://briefwatch.com/, but with no good result.
-
My website http://briefwatch.com/ has a very low speed score on Google PageSpeed. I followed all the steps given to me, but my website's speed still doesn't increase.
-
You need to optimize your theme and your images, and check whether your hosting provider is good or bad.
-
Even though website speed is not a dominant factor in Google rankings, this problem is not trivial. After many tries on my San Diego Hills website, these were the most effective ways to increase site speed.
Try some of these steps:
- Install WP Rocket (the Pro version is better) and Autoptimize; you can use one or combine them.
- Asset Cleanup (use this to manually remove unnecessary data that appears on the page)
- Use Google PageSpeed Insights to see which parts are contributing to the slowness of your website.
- Use a plugin to compress images to optimize loading speed, or use Ezgif to compress images before inserting them into the page. WebP image files are often recommended because the files are small (and, despite some reports to the contrary, Google can crawl and index WebP images).
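As a quick first pass on the image step above, you can inventory which images a page actually loads before deciding what to compress. This is a minimal stdlib-only sketch; the sample HTML fragment is invented for illustration:

```python
from html.parser import HTMLParser


class ImageLister(HTMLParser):
    """Collects the src attribute of every <img> tag on a page."""

    def __init__(self):
        super().__init__()
        self.images = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                self.images.append(src)


def list_images(html):
    """Return the src of every <img> in an HTML string, in document order."""
    parser = ImageLister()
    parser.feed(html)
    return parser.images


# Hypothetical page fragment, just for illustration:
sample = '<p>Hi</p><img src="/hero.png"><img src="/logo.webp" alt="logo">'
print(list_images(sample))  # → ['/hero.png', '/logo.webp']
```

Feeding it a saved copy of your homepage's HTML gives you the list of image URLs to check for oversized files.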
-
You can use the Web Console in Firefox to see the URL load speed. According to the attached screenshot, there is no delay in resolving the DNS or connecting to your website. However, there is a huge delay of >5 s waiting for the response, so something must be wrong on the server side. I'd recommend checking the server logs (or the Nginx cache logs, because there seems to be a cache in place).
Notice that this delay has nothing to do with the page size or contents. That's why following the steps to optimize the contents didn't help.
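The wait-for-response delay the Web Console shows is essentially the server's time to first byte, and you can reproduce it outside the browser. A minimal sketch using only the Python standard library (the commented URL is just this thread's example site, not an endorsement of any tool):

```python
import time
import urllib.request


def time_to_first_byte(url, timeout=30.0):
    """Return seconds from sending the request until the first byte of the
    response body arrives - roughly the server-side processing delay."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read(1)  # blocks until the first body byte is available
    return time.perf_counter() - start


# e.g. time_to_first_byte("http://briefwatch.com/")
# A value of several seconds points at the server, not the page contents.
```

Because this measures only the time before any content arrives, shrinking images or minifying assets cannot improve it; that is exactly why the content-optimization steps didn't help here.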
Related Questions
-
Checking subdomains/ site structure of a website for International SEO
Dear Moz community, I am looking into two websites for a friend and we want to understand the following:
- What is the site structure as per the subdomains? Currently it is .com/en/, .com/ru/, or .com/zh/. Using the crawl report, each page has an en or other-language version. I take it this means we have to create copy, meta titles, and descriptions for each of the languages, even if the page is the same but in a different language?
- To avoid duplicate content, would you suggest canonical tags be put in place?
- To check hreflang markup: I couldn't find anything in the code, which makes me think a script is automatically translating this?
This is the first time I have looked at international SEO and I want to understand what to look for in an audit of existing sites. Thank you,
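On the hreflang point in the question above: each language version of a page should carry reciprocal hreflang annotations in its head. As a hedged sketch, the tags can be generated from a mapping of language codes to URLs; the example.com domain and paths here are invented for illustration:

```python
def hreflang_tags(versions, default=None):
    """Build <link rel="alternate"> hreflang annotations for one page.

    `versions` maps language codes (e.g. "en", "ru", "zh") to that page's
    URL in each language; `default` becomes the x-default fallback URL.
    """
    lines = [
        '<link rel="alternate" hreflang="%s" href="%s" />' % (lang, url)
        for lang, url in versions.items()
    ]
    if default:
        lines.append('<link rel="alternate" hreflang="x-default" href="%s" />' % default)
    return "\n".join(lines)


# Hypothetical example domain, mirroring the .com/en/ structure above:
tags = hreflang_tags(
    {
        "en": "https://example.com/en/",
        "ru": "https://example.com/ru/",
        "zh": "https://example.com/zh/",
    },
    default="https://example.com/en/",
)
print(tags)
```

The same full set of tags goes on every language version of the page, which is also why a page-by-page audit should check that the annotations are reciprocal rather than present on only one version.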
Local Website Optimization | TAT1000
More pages on website better for SEO?
Hi all, Is creating more pages better for SEO? (The pages being valuable content, of course.) Is this because you want the user to spend as much time as possible on your site? A lot of my competitors' websites seem to have more pages than mine, and their domain authorities are higher. For example, the services we provide are all on one page, while for my competitors each service has its own page. Kind Regards, Aqib
Local Website Optimization | SMCCoachHire
Google for Jobs: how to deal with third-party sites that appear instead of your own?
We have shared our company's job postings on several third-party websites, including The Muse, as well as putting the job postings on our own website. Our site and The Muse have about the same schema markup except for these differences: The Muse...
Local Website Optimization | Kevin_P
• Lists Experience Requirements
• Uses HTML in the description with tags and other markup (our website just has plain text)
• Has a Name in JobPosting
• URL is specific to the position (our website's URL just goes to the homepage)
• Has a logo URL for Organization
When you type the exact job posting's title into Google, The Muse posting shows up in Google for Jobs, not our website's duplicate copy. The only way to see our website's job posting is to type in the exact job title plus "site:http://www.oursite.com". What is a good approach for getting our website's posting to be the priority in Google for Jobs? Do we need to remove postings from third-party sites? Structure them differently? Do organic factors affect which version of the job posting is shown, and if so, can I assume that our site will face challenges outranking a big third-party site?
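For comparison with the bullet list above, a minimal schema.org JobPosting object of the kind The Muse is described as using can be assembled like this. All field values are invented placeholders, and this covers only the fields mentioned in the question; see schema.org/JobPosting for the full vocabulary:

```python
import json


def job_posting_jsonld(title, description, org, logo_url, job_url, date_posted):
    """Assemble a minimal schema.org JobPosting JSON-LD object."""
    return {
        "@context": "https://schema.org/",
        "@type": "JobPosting",
        "title": title,
        "description": description,
        "datePosted": date_posted,
        "url": job_url,  # position-specific URL, not the homepage
        "hiringOrganization": {
            "@type": "Organization",
            "name": org,
            "logo": logo_url,
        },
    }


# All values below are hypothetical placeholders:
posting = job_posting_jsonld(
    title="Widget Engineer",
    description="<p>Build widgets.</p>",  # HTML description, as The Muse uses
    org="Example Corp",
    logo_url="https://example.com/logo.png",
    job_url="https://example.com/jobs/widget-engineer",
    date_posted="2024-01-01",
)
print(json.dumps(posting, indent=2))
```

The serialized object goes in a `<script type="application/ld+json">` tag on the job page; the position-specific `url` and populated `hiringOrganization` are exactly the gaps the question identifies between the two sites' markup.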
How can I see if my users are coming from google.com or google.ca?
Got a local search question for you here. Google will automatically redirect most users to their country's google product of choice - google.ca if you're in Canada. However, I'm seeing a ton of organic traffic to our website that is ranking poorly in Google.ca, but ranking well on Google.com. This is a local HVAC business in Alberta. Is there a way to see the amount of traffic coming from Google.com as opposed to Google.ca in analytics?
Local Website Optimization | brettmandoes
Theory: Local Keywords are Hurting National Rankings?
I've read a good amount here and in other blog posts about strategies for national brands to rank locally as well with local landing pages, citations, etc. I have noticed something strange that I'd like to hear if anyone else is running into, or if anyone has a definitive answer for. I'm looking at a custom business printing company where the products can and are often shipped out of state, so it's a national brand. On each product page, the client is throwing in a few local keywords near where the office is to help rank for local variations. When looking at competitors that have a lower domain authority, lower volume of linking root domains, less content on the page, and other standard signals, they are ranking nationally better than the client. The only thing they're doing that could be better is bolding and throwing in the page keyword 5-10 times (which looks unnatural). But when you search for keyword + home city, the client ranks better. My hypothesis is that since the client is optimizing product pages for local keywords as well as national, it is actually hurting on national searches because it's seen as local-leaning business. Has anyone run into this before, or have a definitive answer?
Local Website Optimization | Joe.Robison
Localized Search Results
I'll try to setup this question: I go to Google.com and set the search tools to a particular city that I am not in (say I live in Nashville but set the search tools for Rockville MD). I do a search for a specific term without a location modifier such as "chrysler town and country" and I don't see the website I'm looking for in the first 100 results. Then I keep the search tools the same, but change the specific search to "chrysler town and country rockville md" and the website I'm looking for is now the #1 result. What would affect the difference? I would have expected the website to have a similar ranking in both situations.
Local Website Optimization | perkfriday
Is it possible to target an English keyword to google.com.tr users?
Hey, I want to know: is it possible to target a keyword that is in English, but aimed at the .com.tr market? For that purpose, must we get backlinks from sites written in English but targeted to Turkey? Or from sites written in English but targeted anywhere? I know this question is a bit confusing, but my boss wants me to do it.
Local Website Optimization | atakala
Does Google play fair? Is 'relevant content' and 'usability' enough?
It seems there are two opposing views, and as a newbie I find this very confusing. One view is that as long as your site pages have relevant content and are easy for the user, Google will rank you fairly. The other view is that Google has 'rules' you must follow, and even if the site is relevant and user-friendly, if you don't play by the rules your site may never rank well. Which is closer to the truth? No one wants a great website that won't rank because Google wasn't sophisticated enough to see that they weren't being unfair.
Here's an example to illustrate one related concern: I've read that Google doesn't like duplicated content. But here are two cases in which it is more 'relevant' and 'usable' to the user to have duplicate content. Say a website helps you find restaurants in a city. Restaurants may be listed by city region and by type of restaurant. The home page may have links to 30 city regions. It may also have links for 20 types of restaurants. The user has a choice.
Say the user chooses a region. The resulting new page may still be relevant and usable by listing ALL 30 regions, because the user may want to choose a different region. Alternatively, say the user chooses a restaurant type for the whole city. The resulting page may still be relevant and usable by giving the user the ability to choose another type OR another city region. In other words, there may be a 'mega-menu' at the top of the page which duplicates on every page in the site but is very helpful: instead of requiring the user to go back to the home page to click a new region or a new type, the user can do it on any page. That's duplicate content in the form of a mega-menu, but it is very relevant and usable.
YET, my sense is that Google MAY penalize the site even though it is arguably the most relevant and usable approach for someone who may or may not have a specific region or restaurant type in mind. Thoughts?
Local Website Optimization | couponguy