Sudden Drop in Mobile Core Web Vitals
-
For some reason, after all URLs had previously been classified as Good, our mobile Core Web Vitals report suddenly dropped, and the change doesn't correspond with any site changes on our end.
Has anyone else experienced something similar, or have any idea what might have caused such a shift?
Curiously, I'm not seeing a drop in session duration, conversion rate, etc. for mobile traffic despite the seemingly sudden change.
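One way to sanity-check a shift like this is to query the Chrome UX Report (CrUX) API directly, since that field data is what drives the Search Console report; if the p75 values moved, the change is in real-user measurements rather than anything Search Console did. A minimal sketch in Python, assuming a CrUX API key (the origin below is a placeholder):

```python
# Minimal sketch: query the Chrome UX Report (CrUX) API for mobile field data.
# Assumes a CrUX API key from the Google Cloud console; the origin is a placeholder.
import requests

API_KEY = "YOUR_API_KEY"
ENDPOINT = f"https://chromeuxreport.googleapis.com/v1/records:queryRecord?key={API_KEY}"

resp = requests.post(ENDPOINT, json={
    "origin": "https://www.example.com",  # placeholder origin
    "formFactor": "PHONE",               # compare with "DESKTOP"
})
resp.raise_for_status()
metrics = resp.json()["record"]["metrics"]

# Google classifies each metric by its 75th-percentile (p75) value.
for name in ("largest_contentful_paint",
             "cumulative_layout_shift",
             "interaction_to_next_paint"):
    if name in metrics:
        print(name, "p75 =", metrics[name]["percentiles"]["p75"])
```

Note that CrUX aggregates a trailing 28-day window of real-user data, so a sudden reclassification can reflect a slow drift that only just crossed a Good/Poor threshold, which would also explain why engagement metrics look unchanged.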
-
I can't understand the algorithm behind Core Web Vitals. I made some technical updates to our website for speed optimization, but what has happened in Search Console is very confusing for my site.
For desktop, pages are indexed as Good URLs, while mobile-indexed URLs are displayed as Poor URLs.
Our website is a collective resource for people looking into Canadian immigration (PAIC), and about 70% of it is text only. We are using WebP images for optimization, but the site is still not passing Core Web Vitals. I am looking forward to experts' suggestions on how to overcome this problem.
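Since the split is desktop Good versus mobile Poor, it can help to run both strategies through the PageSpeed Insights v5 API side by side; mobile is tested under simulated throttling, so even text-heavy pages can fail on CPU-bound work like render-blocking scripts. A minimal sketch (the page URL is a placeholder; an API key is optional for occasional use):

```python
# Minimal sketch: run the same page through PageSpeed Insights v5 for both
# strategies to see where mobile diverges. The page URL is a placeholder.
import requests

PSI = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
page = "https://www.example.com/"

for strategy in ("mobile", "desktop"):
    data = requests.get(PSI, params={"url": page, "strategy": strategy}).json()
    lh = data["lighthouseResult"]
    score = lh["categories"]["performance"]["score"]
    lcp = lh["audits"]["largest-contentful-paint"]["displayValue"]
    print(f"{strategy}: performance score {score:.2f}, LCP {lcp}")
```

The audits list in the mobile response usually points to the specific bottleneck, which on text-heavy pages is often fonts or JavaScript rather than images.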
-
@rwat Hi, did you find a solution?
-
Yes, I am experiencing the same on one of my websites. Most of the affected pages are blog posts, and I am using a lot of images without proper optimization, so that could be the reason, but I'm not sure.
It is also quite possible that Google may be adding more parameters to its Core Web Vitals scoring.
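For an image-heavy blog, a quick audit of the markup often explains a Core Web Vitals downgrade before reaching for anything deeper. A minimal sketch, assuming requests and beautifulsoup4 are installed, with a placeholder URL:

```python
# Minimal sketch: flag <img> tags with the issues that most often hurt
# Core Web Vitals on image-heavy posts. The post URL is a placeholder.
import requests
from bs4 import BeautifulSoup

html = requests.get("https://www.example.com/blog/some-post").text
soup = BeautifulSoup(html, "html.parser")

for img in soup.find_all("img"):
    src = img.get("src", "")
    problems = []
    if not (img.get("width") and img.get("height")):
        problems.append("no width/height attributes (layout-shift risk)")
    if img.get("loading") != "lazy":
        problems.append("not lazy-loaded")
    if not src.lower().endswith((".webp", ".avif")):
        problems.append("not a next-gen format")
    if problems:
        print(src, "->", "; ".join(problems))
```

One caveat: the image that paints largest above the fold (the LCP element) should not be lazy-loaded, since deferring it delays LCP rather than helping it.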
Related Questions
-
Unsolved: Google Search Console Still Reporting Errors After Fixes
Hello, I'm working on a website that was too bloated with content. We deleted many pages and set up redirects to newer pages. We also resolved an unreasonable number of 400 errors on the site, and I removed several ancient sitemaps listing content deleted years ago that Google was still crawling. According to Moz and Screaming Frog, these errors have been resolved. We've submitted the fixes for validation in GSC, but the validation repeatedly fails. What could be going on here? How can we resolve these errors in GSC?
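One way to double-check the fixes independently of GSC's slow validation cycle is to re-audit every URL Google is being given, straight from the live sitemap. A minimal sketch in Python (placeholder sitemap URL, assuming requests is installed):

```python
# Minimal sketch: fetch a live sitemap and report every URL that does not
# resolve cleanly to a 200. The sitemap URL is a placeholder.
import requests
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(requests.get("https://www.example.com/sitemap.xml").content)

for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    r = requests.head(url, allow_redirects=True, timeout=10)
    if r.status_code != 200:
        print(r.status_code, url)
    elif r.history:
        # The URL redirects; sitemaps should list final destinations directly.
        print("redirect:", url, "->", r.url)
```

Validation fails if even a handful of the re-crawled URLs still show the error, and Google keeps re-crawling URLs it has already discovered long after the sitemap files that listed them are deleted, so a failed state can persist well past the actual fix.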
Technical SEO | tif-swedensky
-
Unsolved: Duplicate LocalBusiness Schema Markup
Hello! I've been having a hard time finding an answer to this specific question, so I figured I'd drop it here. I always add custom LocalBusiness markup to clients' homepages, but sometimes the client's website provider will include their own automated LocalBusiness markup. The markup I create often includes more information. Assuming the website provider is unwilling to remove their markup, is it a bad idea to include my code as well? It seems like it could potentially be read as spammy by Google. Do the pros of having more detailed markup outweigh the potential negative impact?
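To see exactly what Google receives, it can help to dump every LocalBusiness block a page emits; duplicates with conflicting values (hours, phone, address) are the risky case, more so than mere repetition. A minimal sketch with a placeholder URL, assuming requests and beautifulsoup4:

```python
# Minimal sketch: dump every LocalBusiness JSON-LD block a page emits, to see
# exactly what duplicated markup Google receives. The URL is a placeholder.
import json
import requests
from bs4 import BeautifulSoup

html = requests.get("https://www.example.com/").text
soup = BeautifulSoup(html, "html.parser")

blocks = []
for tag in soup.find_all("script", type="application/ld+json"):
    try:
        data = json.loads(tag.string or "")
    except json.JSONDecodeError:
        continue
    items = data if isinstance(data, list) else [data]
    blocks += [i for i in items
               if isinstance(i, dict) and i.get("@type") == "LocalBusiness"]

print(f"{len(blocks)} LocalBusiness block(s) found")
for b in blocks:
    print(json.dumps(b, indent=2)[:400])  # preview each block
```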
Local Website Optimization | GoogleAlgoServant
-
Breadcrumbs on Mobile: How important are they for SEO?
Due to the unsightly look of breadcrumbs and the space they take up above the fold, we only employ breadcrumbs on our desktop version. Breadcrumbs are hidden from view on the mobile version. However, as mobile-first indexing is now in play, what technical SEO impacts will this have? One thing that comes to mind is crawling of deeper pages, where breadcrumbs made them accessible in fewer than three link clicks. But I am unsure now of the impact of not having breadcrumbs visible on the mobile version of our site.
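If the visual element has to stay desktop-only, one common compromise is to keep the hierarchy machine-readable via BreadcrumbList structured data, which does not require a visible breadcrumb. A minimal sketch that emits the JSON-LD for a hypothetical trail:

```python
# Minimal sketch: emit BreadcrumbList JSON-LD so the page hierarchy stays
# machine-readable without a visible breadcrumb. The trail is hypothetical.
import json

trail = [
    ("Home", "https://www.example.com/"),
    ("Category", "https://www.example.com/category/"),
    ("Product", "https://www.example.com/category/product/"),
]

breadcrumbs = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {"@type": "ListItem", "position": i, "name": name, "item": url}
        for i, (name, url) in enumerate(trail, start=1)
    ],
}
print(f'<script type="application/ld+json">{json.dumps(breadcrumbs)}</script>')
```

Structured data does not replace internal links for crawl depth, though, so deep pages still need a link path that doesn't depend on the desktop-only breadcrumb.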
Technical SEO | oceanstorm
-
Best Website Structure / SEO Strategy for an Online Travel Agency?
Dear Experts! I need your help pointing me in the right direction. So far I have found scattered tips around the Internet, but it's hard to form a full picture from all these bits and pieces of information without professional advice. My primary goal is to understand how I should build my online travel agency website's (https://qualistay.com) structure so that I target my keywords on the correct pages and do not create duplicate content. In my particular case I have very similar properties in similar locations in Tenerife. Many of them are located in the same villa or apartment complex, so it is very hard to come up with a unique description for each of them, not to mention the amenities and pricing blocks, which are standard and almost identical (I don't know if Google sees this as duplicate content). From what I have read so far, it's better to target archive pages rather than every single property. At the moment my archive pages are: all properties (includes all property types and locations) and a page for each location (includes all property types). My questions:
1. Does it make sense to add archive pages by property type in addition to, or instead of, the location ones if I, for instance, target separate keywords like 'villas costa adeje' and 'apartments costa adeje'? At the moment, the title of the respective archive page, "Properties to rent in costa adeje: villas, apartments", in principle targets both keywords.
2. Does using the same keyword in a single property listing cannibalize the ranking of the archive page it links back to? Or does it not, unless Google specifically identifies this as duplicate content (which one can see in Google Search Console under HTML Improvements) and/or the archive page has more incoming links than a single property?
3. If targeting only archive pages, how should I optimize them in such a way that they stay user-friendly? I have created (though not yet fully optimized) descriptions for each archive page just below the main header, but I keep them partially hidden (collapsible) using JavaScript in order to keep visitors' focus on the properties. I know that Google does not rank hidden content highly, at least at the moment, but since the new mobile-first indexing algorithm is coming, they promise not to punish mobile sites for collapsible content and will use the mobile version to rate the desktop one. Does this mean I should not worry about hidden content anymore, or should I move the description to the bottom of the page and make it fully visible?
Your feedback will be highly appreciated! Thank you! Dmitry
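Whichever structure you settle on, it's worth verifying what each near-duplicate listing actually declares as its canonical, since that's the signal telling Google where to consolidate ranking. A minimal sketch (the property paths are placeholders, assuming requests and beautifulsoup4):

```python
# Minimal sketch: report the rel=canonical each near-duplicate listing declares,
# to verify where each page asks Google to consolidate. Paths are placeholders.
import requests
from bs4 import BeautifulSoup

pages = [
    "https://qualistay.com/property-a",
    "https://qualistay.com/property-b",
]

for url in pages:
    soup = BeautifulSoup(requests.get(url).text, "html.parser")
    link = soup.find("link", rel="canonical")
    target = link["href"] if link and link.has_attr("href") else "no canonical"
    print(url, "->", target)
```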
Technical SEO | qualistay
-
Sudden jump in the number of 302 redirects on my Squarespace Site
My Squarespace site www.thephysiocompany.com has seen a sudden jump in 302 redirects in the past 30 days, going from 0 to 302 (ironically). They are not detectable using generic link-redirect testing sites, and Squarespace has no explanation. Any help would be appreciated.
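Generic checkers often miss hops added by the platform itself (http-to-https or bare-domain-to-www normalization), so tracing the chain one hop at a time can show where the 302s come from. A minimal sketch with a placeholder starting URL:

```python
# Minimal sketch: follow a redirect chain one hop at a time to see where the
# 302s originate. The starting URL is a placeholder; test each domain variant.
import requests
from requests.compat import urljoin

url = "http://example.com/"  # try http://, https://, bare and www variants
for _ in range(10):  # cap the chain to avoid loops
    r = requests.get(url, allow_redirects=False, timeout=10)
    print(r.status_code, url)
    if r.status_code not in (301, 302, 303, 307, 308):
        break
    url = urljoin(url, r.headers["Location"])
```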
Technical SEO | Jcoley
-
Why has my search traffic suddenly tanked?
On 6 June, Google search traffic to my WordPress travel blog http://www.travelnasia.com tanked completely. There are no warnings or indicators in Webmaster Tools that suggest why this happened. Traffic from search has remained at zero since 6 June and shows no sign of recovering. Two things happened on or around 6 June: (1) I dropped my premium theme, which was proving to be not mobile-friendly, and replaced it with the ColorMag theme, which is responsive. (2) I relocated off my previous hosting service, which was showing long server lag times, to a faster host. Both of these should have improved my search performance, not tanked it.
There were some problems with the relocation to the new web host, which resulted in a lot of "out of memory" errors on the website for 3-4 days. The allowed memory was simply not enough for the complexity of the site and the volume of traffic. After a few days of trying to resolve these problems, I moved the site to another web host that allows more PHP memory, and the site now appears reliably accessible for both desktop and mobile. But my search traffic has not recovered. I am wondering if in all of this I've done something that Google considers a cardinal sin and I can't see it. The clues I'm seeing include:
Moz Pro was unable to crawl my site last Friday. It seems like every URL it tried to crawl was of the form http://www.travelnasia.com/wp-login.php?action=jetpack-sso&redirect_to=http://www.travelnasia.com/blog/bangkok-skytrain-bts-mrt-lines, which resulted in a 500 status error. I don't know why this happened, but I have disabled the Jetpack login function completely, just in case it's the problem.
GWT tells me that some of my resource files are not accessible by Googlebot because my robots.txt file denies access to /wp-content/plugins/. I have removed this restriction after reading the latest advice from Yoast, but I still can't get GWT to fetch and render my posts without some resource errors.
On 6 June I see in Structured Data in GWT that "items" went from 319 to 1478 and "items with errors" went from 5 to 214. There seems to be a problem with both hatom and hcard microformats, but when I look at the source code they seem to be OK. What I can see in GWT is that each hcard has a node called "n [n]" which is empty, and Google is generating a warning about this. I see that this is because the author vcard URL class now says "url fn n", but I don't see why it says this or how to fix it. I also don't see how this would cause my search traffic to tank completely.
I wonder if anyone can see something I'm missing on the site. Why would Google completely deny search traffic to my site all of a sudden without notifying me of any kind of penalty? Note that I have NOT changed the content of the site in any significant way. And even if I had, it's unlikely to result in a complete denial of traffic without some kind of warning.
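For the robots.txt part specifically, the live file can be tested with nothing but the Python standard library, using the blog URL from the question (the plugin asset path below is a placeholder):

```python
# Minimal sketch: test the live robots.txt with the standard library only.
# The blog URL is from the question; the plugin asset path is a placeholder.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("http://www.travelnasia.com/robots.txt")
rp.read()

for url in (
    "http://www.travelnasia.com/blog/bangkok-skytrain-bts-mrt-lines",
    "http://www.travelnasia.com/wp-content/plugins/some-plugin/style.css",
):
    print(rp.can_fetch("Googlebot", url), url)
```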
Technical SEO | Gavin.Atkinson
-
Google rankings dropped dramatically after 24 hrs of hosting suspension
Hi, One of my websites (http://www.traveldestinationsearch.com/) dropped most of its Google rankings after 24 hours of hosting suspension (from April 26 until April 27, 2012). The hosting company suspended my website for exceeding its bandwidth limit: there was no unusual activity on the website; it just exceeded its bandwidth limit by 20-30 MB for the previous month. Anyway, the website has been back online since April 27, but the problem is that, following those 24 hours of no service, I see a dramatic decrease in my website's Google rankings for its main keywords. Even today, April 29, I can't find my website anywhere in the first 100 results for most of its targeted keywords. Before the suspension, the website ranked #1 for its main keyword and somewhere in the first 2-3 pages of Google search results for two other main keywords. My question is: is the hosting suspension the reason for the Google ranking drop, and (assuming this is a temporary problem) when should I expect my website to regain the rankings it had before the hosting suspension? Thanks for your support. Regards, Adrian
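For whatever it's worth going forward: if a site has to go down again, answering every request with a 503 plus a Retry-After header tells crawlers the outage is temporary, which is far gentler on rankings than whatever a host's suspension screen returns (often a 200 or a redirect). A minimal sketch of such a maintenance responder, assuming Flask:

```python
# Minimal sketch: a maintenance responder that answers every path with a 503
# and a Retry-After header, signalling a temporary outage to crawlers.
from flask import Flask

app = Flask(__name__)

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def maintenance(path):
    # 503 = temporarily unavailable; Retry-After hints when to re-crawl (seconds)
    return "Down for maintenance", 503, {"Retry-After": "3600"}

if __name__ == "__main__":
    app.run()
```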
Technical SEO | AdrianBanu