Mobile Googlebot vs Desktop Googlebot - GWT reports - Crawl errors
-
Hi Everyone,
I have a very specific SEO question. I am doing a site audit, and one of the crawl reports is showing tons of 404s for the "smartphone" bot, with very recent crawl dates. Our website is responsive and we do not have a separate mobile version, so I do not understand why the desktop report has tons of 404s and yet the smartphone report does not. I think I am not understanding something conceptually.
I think it has something to do with this little message in the Mobile crawl report:
"Errors that occurred only when your site was crawled by Googlebot (errors didn't appear for desktop)."
If I understand correctly, the "smartphone" report will only show URLs that are not on the desktop report. Is this correct?
-
Hey Carla,
I'm not entirely sure what you're saying with:
"one of the crawl reports is showing tons of 404's for the "smartphone" bot and with very recent crawl dates. If our website is responsive, and we do not have a mobile version of the website I do not understand why the desktop report version has tons of 404's and yet the smartphone does not. I think I am not understanding something conceptually."
You first say that the smartphone bot is seeing tons of 404s, but then that the desktop report is showing tons of 404s and the smartphone report is not. If you can clarify that, I can probably better answer your question.
However, the answer is likely that Google may decide not to re-crawl URLs it has already identified as 404s in one context. That is to say, if the smartphone crawler identifies a URL as a 404, Google knows not to crawl it again when the desktop crawler encounters it, and vice versa, so the same broken URL will not necessarily appear in both reports.
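If you want to sanity-check a specific URL yourself, one option is a quick script that requests it with both user-agent strings and compares the status codes. Here's a minimal sketch using Python and the third-party requests library; the user-agent values are the ones Google has published for its desktop and smartphone crawlers, and the URL is a placeholder:

```python
import requests

# Googlebot's published desktop and smartphone user-agent strings
# (treat the exact values as illustrative; Google updates them over time).
USER_AGENTS = {
    "desktop": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "smartphone": (
        "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
        "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 Mobile "
        "Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
    ),
}

def check_url(url):
    """Fetch the URL once per user agent and print the status codes."""
    for device, user_agent in USER_AGENTS.items():
        response = requests.get(url, headers={"User-Agent": user_agent},
                                allow_redirects=False)
        print(f"{device}: HTTP {response.status_code}")

check_url("https://www.example.com/some-missing-page")  # placeholder URL
```

If both requests return 404, the error is device-independent, and the split between the two reports is just down to how Search Console groups them.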
-Mike
-
OK, I forgot to add something to this as well. Why would the URLs show up on the smartphone report if they are not on the desktop report? After all, a 404 from either device is still a 404, isn't it?
Thanks
Related Questions
-
[Very Urgent] More than 100 "/search/adult-site-keywords" Crawl errors under Search Console
I just opened my Google Search Console and was shocked to see more than 150 Not Found errors under Crawl errors. Mine is a WordPress site (it's consistently updated, too). Here's how they show up:
Example 1:
URL: www.example.com/search/adult-site-keyword/page2.html/feed/rss2
Linked From: http://an-adult-image-hosting.com/search/adult-site-keyword/page2.html
Example 2 (this surprised me the most when I looked at the Linked From data):
URL: www.example.com/search/adult-site-keyword-2.html/page/3/
Linked From: www.example.com/search/adult-site-keyword-2.html/page/2/ (this is showing as if it's from our own site)
http://a-spammy-adult-site.com/search/adult-site-keyword-2.html
Example 3:
URL: www.example.com/search/adult-site-keyword-3.html
Linked From: http://an-adult-image-hosting.com/search/adult-site-keyword-3.html
How do I address this issue?
Intermediate & Advanced SEO | rmehta10
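One common way to tackle this kind of spam-generated crawl error (a sketch, not an official recommendation): make sure those URLs return 404 or 410, and disallow the /search/ path in robots.txt so Googlebot stops requesting them. You can verify a proposed rule with Python's standard-library robots.txt parser; the rule and URL below are illustrative:

```python
from urllib.robotparser import RobotFileParser

# A proposed robots.txt rule to keep crawlers out of the spam-targeted path.
proposed_rules = """\
User-agent: *
Disallow: /search/
"""

parser = RobotFileParser()
parser.parse(proposed_rules.splitlines())

# Prints False: Googlebot would no longer request URLs under /search/.
print(parser.can_fetch("Googlebot",
                       "http://www.example.com/search/adult-site-keyword-3.html"))
```
-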
Duplicate H1 on single page for mobile and desktop
I have a responsive site, and whilst this works and is liked by Google, from a user perspective the pages could look better on mobile. I have a WordPress site using the Divi Builder from Elegant Themes, and I have developed a separate page header for mobile that uses a manipulated background image and a smaller H1 font size. When crawling the site, two H1s can be detected on the same page; they contain exactly the same words, and only one shows according to device. However, I need to know whether this will cause me a problem with Google and SEO. As the mobile changes are not just font size but also adaptations to some visual elements, it is not something I can simply alter in the CSS. I would appreciate some input as to whether this is a problem or not.
Intermediate & Advanced SEO | Cells4Life0
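For reference, the usual way to avoid this (an illustrative sketch; class names and sizes are placeholders) is to output a single H1 and restyle it per device with a CSS media query, rather than rendering two H1s and hiding one:

```html
<!-- One H1 in the markup; the media query resizes it on small screens. -->
<style>
  .page-title { font-size: 42px; }
  @media (max-width: 767px) {
    .page-title { font-size: 24px; }
  }
</style>
<h1 class="page-title">Same Heading on Desktop and Mobile</h1>
```

Other visual elements that differ per device can be swapped inside the same media query while the H1 itself stays singular.
-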
Lazy Loading of Blog Posts and Crawl Depths
Hi Moz Fans, We are looking at our blog and improving the content as much as we can for SEO purposes, but we have hit a bit of a blank in terms of lazy-loading implications and issues with crawl depth. We introduced lazy loading onto the blog home page to increase site speed initially, and it works well with infinite scroll, but we were wondering whether this would cause any issues regarding SEO. A lot of the resources online seem to conflict and some are very outdated, so some clarification on what is best in terms of lazy loading and crawl depth for blogs would be fantastic! I hope someone can help and give us some up-to-date insights. If you need any more information, I'll reply ASAP.
Intermediate & Advanced SEO | Victoria_0
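A common pattern for this (illustrative markup, not a specific recommendation): keep infinite scroll for visitors, but back it with real paginated URLs so a crawler that doesn't execute the scrolling JavaScript can still reach older posts:

```html
<!-- Each listing page points to the next paginated URL, so posts beyond
     the lazy-loaded first batch stay reachable without JavaScript. -->
<link rel="next" href="https://www.example.com/blog/page/2/">
<noscript>
  <a href="https://www.example.com/blog/page/2/">Older posts</a>
</noscript>
```
-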
Description vs meta description
I have an e-commerce website and am trying to create product category pages. I am under the impression that the description is the text that would appear under the title in a Google search, and I believe the meta description is just what Google reads. Is having BOTH important, or just the description? Is it OK to duplicate the description for the meta description? I know it's not good to duplicate descriptions across other products and pages.
Intermediate & Advanced SEO | nchachula0
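For reference, the two live in different places: the meta description is a tag in the page's head that visitors never see on the page, while the visible description is ordinary body content. A minimal illustration with placeholder text:

```html
<head>
  <!-- Search engines may use this for the result snippet. -->
  <meta name="description" content="Placeholder: a unique summary of this product category.">
</head>
<body>
  <!-- The on-page description is normal, visible content. -->
  <p>Placeholder: the description shoppers actually read on the page.</p>
</body>
```
-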
Would you rate-control Googlebot? How much crawling is too much crawling?
One of our sites is very large: over 500M pages. Google has indexed 1/8th of the site, and they tend to crawl between 800k and 1M pages per day. A few times a year, Google will significantly increase their crawl rate, overnight hitting 2M pages per day or more. This creates big problems for us, because at 1M pages per day Google is consuming 70% of our API capacity, and the API overall is at 90% capacity. At 2M pages per day, 20% of our page requests are 500 errors. I've lobbied for an investment in and overhaul of the API configuration to allow for more Google bandwidth without compromising user experience. My tech team counters that it's a wasted investment, as Google will crawl to our capacity, whatever that capacity is. Questions for enterprise SEOs:
* Is there any validity to the tech team's claim? I thought Google's crawl rate was based on a combination of PageRank and the frequency of page updates. That suggests there is some upper limit, which we perhaps haven't reached, but at which crawling would stabilize.
* We've asked Google to rate-limit our crawl rate in the past. Is that harmful? I've always looked at a robust crawl rate as a good problem to have. Is 1.5M Googlebot API calls a day desirable, or something any reasonable enterprise SEO would seek to throttle back?
* What about setting a longer refresh rate in the sitemaps? Would that reduce the daily crawl demand? We could increase it to a month, but at 500M pages Google could still have a ball at the 2M pages/day rate.
Thanks
Intermediate & Advanced SEO | lzhao0
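On the sitemap point specifically: the refresh hint is the per-URL changefreq element, and Google treats it as a hint at best, so lengthening it is unlikely to throttle crawling by itself. An illustrative entry (the URL and dates are placeholders):

```xml
<url>
  <loc>https://www.example.com/some-page/</loc>
  <lastmod>2015-06-01</lastmod>
  <!-- A hint only; Google may ignore it in favour of observed change patterns. -->
  <changefreq>monthly</changefreq>
</url>
```
-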
When to Use Schema vs. Facebook Open Graph?
I have a client who for regulatory reasons cannot engage in any social media: no Twitter, Facebook, or Google+ accounts. No social sharing buttons allowed on the site. The industry is medical devices. We are in the process of redesigning their site, and would like to include structured markup wherever possible. For example, there are lots of schema types under MedicalEntity: http://schema.org/MedicalEntity Given their lack of social media (and no plans to ever use it), does it make sense to incorporate OG tags at all? Or should we stick exclusively to the schemas documented on schema.org?
Intermediate & Advanced SEO | Allie_Williams0
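For what it's worth, schema.org markup and Open Graph tags are independent of each other, so skipping social media doesn't limit what you can do with schema. A minimal JSON-LD sketch using one of the MedicalEntity subtypes (the device name and description are placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "MedicalDevice",
  "name": "Example Infusion Pump",
  "description": "Placeholder device used only to illustrate the markup."
}
```
-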
Googlebot on paywall made with cookies and local storage
My question is about paywalls made with cookies and local storage. We are changing a website with free content to an open paywall with a five-article weekly view limit. The paywall is made to work with cookies and local storage: the article views are stored in local storage, but you have to have cookies enabled to read the free articles. If you don't have cookies enabled, we would serve an error page (otherwise the paywall would be easy to bypass). Can you say how this affects SEO? We would still like Google to index all the article pages that it indexes now. Would it be cloaking if we treated Googlebot differently, so that when it does not have cookies enabled, it would still be able to index the page?
Intermediate & Advanced SEO | OPU1
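For reference, Google's documented route for metered paywalls is structured data rather than pure user-agent detection: marking the gated section with isAccessibleForFree signals that serving Googlebot the full text is sampling, not cloaking. A minimal sketch (the headline and CSS selector are placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Placeholder article title",
  "isAccessibleForFree": "False",
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": "False",
    "cssSelector": ".paywalled-section"
  }
}
```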