What if page exists for desktop but not mobile?
-
I have a domain (no subdomains) that serves different dynamic content to mobile and desktop users from the exact same page URLs--a kind of semi-responsive design--and I will be using the "Vary: User-Agent" response header to give Google a heads-up on this setup.
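For context, here's roughly the kind of serving logic I mean--a minimal Flask-style sketch with a naive user-agent check, not my actual code (the route and template names are just placeholders):

```python
from flask import Flask, request, render_template, make_response

app = Flask(__name__)

def is_mobile(user_agent):
    # Naive check for illustration only; a real setup would use a proper
    # device-detection library or a fuller list of user-agent patterns.
    ua = (user_agent or "").lower()
    return "mobile" in ua or "android" in ua or "iphone" in ua

@app.route("/some-page")
def some_page():
    # Same URL, different dynamic content depending on the requesting device.
    if is_mobile(request.headers.get("User-Agent")):
        body = render_template("some_page_mobile.html")
    else:
        body = render_template("some_page_desktop.html")
    resp = make_response(body)
    # Tell caches and crawlers that the response varies by User-Agent.
    resp.headers["Vary"] = "User-Agent"
    return resp
```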
However, some of the pages are only valid for mobile, and some only for desktop. When a page is valid only for mobile (call it mysite.com/mobile-page-only), Google Webmaster Tools gives me a soft 404 error under Desktop, saying that the page does not exist. Apparently it does that because my program is actually redirecting the user/crawler to the home page. From what Google says about soft 404 errors, it seems I should instead give the user a real 404 page--which I could customize to offer a link to the home page, a menu of other links, etc.
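In other words, the change Google seems to be nudging me toward would look roughly like this for mysite.com/mobile-page-only (again just a sketch, continuing the Flask example above; the template names are placeholders):

```python
# Continues the sketch above (same imports, plus the `app` and `is_mobile` helpers).

@app.route("/mobile-page-only")
def mobile_page_only():
    if is_mobile(request.headers.get("User-Agent")):
        resp = make_response(render_template("mobile_page_only.html"))
    else:
        # What I do today: redirect desktop visitors (and the desktop bot)
        # to the home page, which is what triggers the soft 404 report:
        #     return redirect("/")
        # What the soft-404 guidance suggests instead: return a real 404
        # status with a helpful custom page (link home, menu of links, etc.).
        resp = make_response(render_template("custom_404.html"), 404)
    resp.headers["Vary"] = "User-Agent"
    return resp
```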
My concern is that if I tell the desktop bot that mysite.com/mobile-page-only is basically a 404 error (i.e. doesn't exist), it could mess up the mobile bot's indexing of that page--since it definitely DOES exist for mobile users.
Does anyone here know for sure whether Google will index a page for mobile that returns a 404 Not Found for desktop, and vice versa? Obviously it is important not to remove something from an index where it belongs, so whether Google is careful to differentiate the two is a very important issue. Has anybody here dealt with this or seen anything from Google that addresses it? Might one be better off leaving it as a soft 404 error?
EDIT: also, what about Bing and Yahoo? Can we assume they will handle it the same way?
EDIT: A closely related question--in a case like mine, does Google need a separate sitemap for the valid mobile pages and the valid desktop pages, even though most URLs will appear in both? I can't tell from reading several Q&As on this.
Thanks, Ted
-
Monica,
I'm going to open a new thread to ask a similar question, as I think I didn't ask it very well.
Thanks for your input,
Ted
-
Thanks. If I understand you, the mobile bot won't crawl a URL that the desktop bot has flagged as needing to be fixed before it works right for desktop. Would you agree that doesn't really sound right on Google's part, since the URL is fine for mobile use? I don't know why it wouldn't crawl it for mobile, but if that's the way it is, I can try fixing it on desktop to see if that lets the mobile version get crawled.
Once I do that, I guess I'll find out whether a 404 Not Found for desktop keeps it from being crawled for mobile (yes, that link is accessible from other pages)--I was hoping to avoid trial and error on that because the time lag seems hard to pin down.
In a nutshell, here's what I'm concerned will happen:
The Google mobile bot crawls my mobile page and indexes it. Then the desktop bot crawls the same URL and gets a 404 Not Found. Because of the desktop Not Found, Google removes the page from the mobile index.
I don't see a good way to test that, since it depends on when each crawler comes around. And if this is what Google is doing, I can't think of a good solution for a responsive-style site where some content is meant only for mobile indexing or only for desktop indexing.
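The closest thing I can do to a test is to keep logging which of Google's crawlers hits which URL and what status code it was given, then compare the timestamps. Something along these lines (just a rough sketch--the user-agent checks are simplified and crawl_log.csv is a made-up filename, not anything Google provides):

```python
import csv
from datetime import datetime, timezone

def classify_googlebot(user_agent):
    # Very rough split of Google's desktop vs. smartphone crawlers by UA string;
    # real UA strings change over time, so treat this as an approximation.
    ua = user_agent or ""
    if "Googlebot" not in ua:
        return None
    if "Mobile" in ua or "Android" in ua or "iPhone" in ua:
        return "googlebot-smartphone"
    return "googlebot-desktop"

def log_crawl(user_agent, url, status_code, logfile="crawl_log.csv"):
    # Append one row per Googlebot request: when it came, which bot it was,
    # which URL it asked for, and what HTTP status it was served.
    bot = classify_googlebot(user_agent)
    if bot is None:
        return
    with open(logfile, "a", newline="") as f:
        csv.writer(f).writerow(
            [datetime.now(timezone.utc).isoformat(), bot, url, status_code]
        )
```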
-
If a URL is labeled a 404, it will not be crawled again unless there is a reason to--for example, you mark it as fixed or you edit the link in some form or fashion. Mark it as fixed and see if the error comes back. There is no harm in doing this.
Can you get to the page on your mobile device just by clicking through your site? If you can, that is good; it will eventually encourage the mobile bot to crawl it. If you can Fetch and Render as Google, then I would just give it some time. I am not sure if there is a string of code you can add to the head of that page telling the robots that it is a mobile-only page; I don't know how that works.
I would just mark it as fixed right now and see what happens over the next couple of days.
-
Hi Monica--thanks for your reply.
OK, for a page that is supposed to be mobile-only within a responsive-like setup (i.e. one domain), here's what I see:
The desktop bot crawls the link and gives a soft 404 error -- presumably because the page is currently being redirected to the home page.
The mobile bot is not crawling that link despite it being prominent on the main site's home page: my database tracks bot crawls and shows no mobile crawl of that link (though it does show desktop crawls), and a search on my smartphone doesn't show that link either (even though it does show other links for pages used by both). **Yet, if I fetch the mobile-only page in Webmaster Tools using their mobile bot, it finds it and renders it perfectly.** So why isn't it crawling it? Is it because, when the mobile bot comes along, it sees that the link is already flagged as a soft 404 for desktop? Or is it because the mobile crawler is getting hung up on some other link on the mobile home page that has nothing to do with this mobile-only link?
It appears that the mobile bot is influenced by the desktop bot's results--which is my fear. It seems to me the two bots should be independent of each other. If they aren't independent, then if I change it to a 404 Not Found for desktop, would that even help, or would that prevent the mobile bot from ever trying to crawl it?
I would think that anybody who has a responsive page design and has blocked out certain content so that it renders only for mobile or only for non-mobile has had to face this issue.
Not sure what to do--I could fix the soft errors (change them to 404 Not Found) and then just see whether Google starts indexing for mobile or not, but I was hoping to get some feedback before experimenting.
Thanks again, and please share more if you have more thoughts!
-
Did you look at your Mobile 404 errors? Google uses a different bot for mobile sites and anything related to that mobile page. Chances are, if it isn't reflecting a 404 in the Mobile errors in GWT, it is being indexed properly.
Check it out from your phone. Google the exact keyword and your company name. See if you can get to the page and whether it is in fact the correct page.