Differences between Lynx Viewer, Fetch as Googlebot and SEOMoz Googlebot Rendering
-
Three tools to render a site as Googlebot would see it:
SEOMoz toolbar
Lynx Viewer (http://www.yellowpipe.com/yis/tools/lynx/lynx_viewer.php)
Fetch as Googlebot

I have a website where I can see the dropdown menus in a regular browser, in Lynx Viewer, and in Fetch as Googlebot. However, in the SEOMoz toolbar's 'render as googlebot' tool, I am unable to see these dropdown menus when JavaScript is disabled.
Does this matter? Which of these tools is the better way to see how Googlebot views your site?
-
Each tool processes pages differently while attempting to emulate the actual Googlebot crawler. You may want to jump over to SEOmoz's Help Desk for specifics on the Moz version; however, the only way to be sure you are always seeing what Googlebot actually sees, even as Googlebot changes over time, is to use Google Webmaster Tools.
Sign in to GWT, click through to "Diagnostics" and then "Fetch as Googlebot", and enter a URL there. It may take a few minutes to get the results, but you'll see what they see.
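Whichever tool you use, the underlying question is whether the dropdown links exist in the raw HTML the server returns, or only appear after JavaScript runs. A minimal sketch of that check (the HTML string is a made-up example, not from any real site):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href values from anchor tags in raw HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# Hypothetical server response: if the dropdown links are in the raw
# HTML like this, a non-JavaScript crawler can still discover them.
html = '<ul class="dropdown"><li><a href="/services">Services</a></li></ul>'
parser = LinkExtractor()
parser.feed(html)
print(parser.links)  # links found without executing any JavaScript
```

If the links show up here but not in a given tool's rendering, the difference is in how that tool emulates Googlebot, not in what your server sends.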
Related Questions
-
Resolving 301 Redirect Chains from Different URL Versions (http, https, www, non-www)
Hi all, Our website has undergone both a redesign (with new URLs) and a migration to HTTPS in recent years. I'm having difficulty ensuring all URLs redirect to the correct version while preventing redirect chains. Right now everything redirects to the correct version, but it usually takes up to two redirects to get there. See below for an example:
http://www.theyoungfirm.com/blog/2009/index.html
301 > https://theyoungfirm.com/blog/2009/index.html
301 > https://theyoungfirm.com/blog/
The code below is what we added to our htaccess file. Prior to adding it, the various versions (www, non-www, http, etc.) were not redirecting properly. But ever since we added it, it has created these additional URLs (see the middle URL above) as an intermediate step before resolving to the correct URL.
RewriteEngine on
RewriteCond %{HTTP_HOST} ^www.(.*)$ [NC]
RewriteRule ^(.*)$ https://%1/$1 [R=301,L]
RewriteCond %{HTTPS} !on
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
Your feedback is much appreciated. Thanks in advance for your help. Sincerely, Bethany
Technical SEO | theyoungfirm
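The chain happens because the two rules each fire on a separate request: one hop strips www, a second hop upgrades to https. The usual fix is to compute the final destination in a single step. A rough Python model of that one-hop mapping (illustrative logic, not Apache syntax):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_redirect(url):
    """Map any scheme/host variant (http, www, both) straight to the
    https, non-www URL so only a single 301 is ever needed."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    return urlunsplit(("https", host, parts.path, parts.query, parts.fragment))

print(canonical_redirect("http://www.theyoungfirm.com/blog/2009/index.html"))
# https://theyoungfirm.com/blog/2009/index.html
```

In mod_rewrite terms this means a single rule whose conditions match "www present OR https off" and whose target is always the https non-www form, rather than two rules that each issue their own 301.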
Different breadcrumbs for each productpage
Hi all, I have a question related to breadcrumbs. We have an e-commerce site, and there is a difference in the breadcrumb trail depending on whether you navigate to a product or browse directly to the product's URL. When you navigate to the product, the breadcrumb looks like this (also in the source code):
Home > Sand > Sandpit sand > Bigbag Sandpit sand type xyz
When you visit the product URL directly, the breadcrumb looks like this (also in the source code):
Home > Bigbag Sandpit sand type xyz
It seems to me this could confuse a search engine, making it unclear what the site's structure/hierarchy is (and of course confusing for a user as well). Is that true? If so, does it have a significant direct negative impact on SEO? Thanks in advance!
Technical SEO | AMAGARD
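One common way to keep the hierarchy consistent regardless of how the visitor arrived is to declare the full trail in structured data generated server-side from the category tree, not from the referrer. A sketch using schema.org BreadcrumbList markup (the URLs are hypothetical examples):

```python
import json

def breadcrumb_jsonld(crumbs):
    """Build schema.org BreadcrumbList JSON-LD from (name, url) pairs,
    declaring the same hierarchy on every visit to the page."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(crumbs, start=1)
        ],
    })

# Illustrative URLs for the trail from the question.
markup = breadcrumb_jsonld([
    ("Home", "https://example.com/"),
    ("Sand", "https://example.com/sand/"),
    ("Sandpit sand", "https://example.com/sand/sandpit/"),
    ("Bigbag Sandpit sand type xyz", "https://example.com/sand/sandpit/bigbag-xyz"),
])
```

The visible breadcrumb in the page can then be rendered from the same data, so the source code never shows a shortened trail for direct visits.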
Cross domain canonical for different branded sites
Hi everyone, We are working on 5 websites that offer the same products but are of different brands and locations. They are owned by the same company, but each run independently. On the sites, they have content such as privacy policies, terms and conditions and guides that are the same across all brands. Will publishing these be flagged as duplicate content by Google? If yes, is it recommended to add rel=canonical to all duplicate pages across all sites pointing to one of the five? We are just concerned that the 4 sites with duplicate content would be valued less than the canonical as a result of passed link equity. We are doing SEO optimisations for all and are trying to rank them well in SERPs. If a canonical is not the best solution here, what would be the best to do apart from completely rewriting content? Is it noindex tag or turning the texts into images and adding to PDFs? Thank you.
Technical SEO | nhhernandez
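If you do go the canonical route, the mechanics are simple: shared boilerplate pages on the four secondary brands point their rel=canonical at one chosen brand's copy, while unique pages stay self-canonical. A sketch of that decision (the domains and page list are made-up examples):

```python
SHARED_PAGES = {"/privacy-policy", "/terms"}   # identical across the five brands
CANONICAL_HOST = "https://brand-a.example"      # illustrative canonical brand

def canonical_tag(host, path):
    """Point shared boilerplate pages at one brand's copy; pages with
    unique content remain self-canonical."""
    target_host = CANONICAL_HOST if path in SHARED_PAGES else host
    return f'<link rel="canonical" href="{target_host}{path}">'

print(canonical_tag("https://brand-b.example", "/privacy-policy"))
# cross-domain canonical for the shared policy page
print(canonical_tag("https://brand-b.example", "/products/widget"))
# unique page stays self-canonical
```

Note the canonical only consolidates signals for the duplicated pages themselves; it does not devalue the rest of each site, whose product and landing pages keep their own canonicals.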
Crawl Issues / Partial Fetch Via Google
We recently launched a new site that doesn't have any ads, but in Webmaster Tools, under "Fetch as Google", the rendering of the page says: Googlebot couldn't get all resources for this page. Here's the list:
URL | Type | Reason | Severity
https://static.doubleclick.net/instream/ad_status.js | Script | Blocked (robots.txt) | Low
https://googleads.g.doubleclick.net/pagead/id | AJAX | Blocked (robots.txt) | Low
Not sure where that would be coming from, as we don't have any ads running on our site. Also, it's stating that the fetch is a "partial" fetch. Any insight is appreciated.
Technical SEO | vikasnwu
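Those resources are blocked by DoubleClick's own robots.txt, not yours, which is what makes the fetch "partial". You can verify how such a rule is evaluated with Python's standard robots.txt parser (the Disallow rule here is a hypothetical one matching the blocked script's path):

```python
from urllib.robotparser import RobotFileParser

# Feed the parser an illustrative robots.txt equivalent to the rule
# that blocks the DoubleClick script on its own host.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /instream/",
])

print(rp.can_fetch("Googlebot", "https://static.doubleclick.net/instream/ad_status.js"))
print(rp.can_fetch("Googlebot", "https://static.doubleclick.net/other.js"))
```

A Low-severity block on a third-party host is cosmetic: Googlebot simply skips that resource, and there is nothing to fix on your own server.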
Dynamically serving different HTML on the same URL
Dear Mozers, We are creating a mobile version of a real estate website. We are planning to dynamically serve different HTML on the same URL. I'm a little confused about the on-page optimization for the mobile version. The desktop version's pages have a lot of text content, and I strongly believe that is what made us rank for various keywords. Now, in creating this mobile version, do I need to serve exactly the same text content on the mobile version too? I found zillow.com using the same method: their desktop version has a lot of text content, and the mobile version is clean without any text. Does this affect the site's SEO in any way? Please help and share your thoughts. Riyas
Technical SEO | riyas_
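The mechanical part of dynamic serving is straightforward: pick the template from the user agent and send a Vary: User-Agent header so caches and crawlers know the same URL returns different HTML. A simplified sketch (the token list and template names are illustrative):

```python
# Simplistic user-agent sniffing for illustration only; real detection
# libraries cover far more devices.
MOBILE_TOKENS = ("iphone", "android", "mobile")

def serve(user_agent):
    """Choose a template for the same URL and signal the variation."""
    is_mobile = any(t in user_agent.lower() for t in MOBILE_TOKENS)
    template = "mobile.html" if is_mobile else "desktop.html"
    headers = {"Vary": "User-Agent"}  # tells caches/crawlers the HTML differs by UA
    return template, headers

template, headers = serve("Mozilla/5.0 (iPhone; CPU iPhone OS 9_0 like Mac OS X)")
```

Whether to strip the ranking text from the mobile template is a separate judgment call; the header above only makes the serving arrangement transparent, it doesn't protect content that exists on one version but not the other.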
Googlebot Crawl Rate causing site slowdown
I am hearing from my IT department that Googlebot is causing a massive slowdown/crash of our site. We get 3.5 to 4 million pageviews a month and add 70-100 new articles to the website each day. We provide daily stock research and market analysis, so it's all high-quality, relevant content. Here are the crawl stats from WMT: http://imgur.com/dyIbf I have not worked with many high-volume, high-traffic sites before, but these crawl stats do not seem to be out of line. My team is getting pressure from the sysadmins to slow down the crawl rate, or to block some or all of the site from Googlebot. Do these crawl stats seem in line with comparable sites? Would slowing down the crawl rate have a big effect on rankings? Thanks
Technical SEO | SuperMikeLewis
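Before blocking anything, it's worth quantifying how much of the load is actually Googlebot by counting its requests in the access log, rather than relying on impressions. A rough sketch (the log format is illustrative, and real checks should also verify the IPs via reverse DNS, since anyone can fake the user agent):

```python
def googlebot_hits(log_lines):
    """Count access-log lines whose user-agent string claims Googlebot."""
    return sum(1 for line in log_lines if "Googlebot" in line)

# Made-up sample log lines in a common combined-log-style format.
sample_log = [
    '66.249.66.1 - - [10/Oct/2012:13:55:36] "GET /article-1 HTTP/1.1" 200 "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '203.0.113.9 - - [10/Oct/2012:13:55:37] "GET /article-1 HTTP/1.1" 200 "Mozilla/5.0 (Windows NT 6.1)"',
]
print(googlebot_hits(sample_log))  # 1
```

If the numbers really are the problem, Webmaster Tools offers a crawl-rate setting, which is far safer than blocking Googlebot in robots.txt.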
Using differing calls to action based on IP address
Hi, We have an issue with a particular channel on a lead generation site, where sales staff require different qualities of lead in different parts of the country. In saturated markets they require a stricter lead qualification process than in more challenging markets. To combat the problem, I am toying with the idea of serving slightly different content based on IP address. The main changes would be to the calls to action and the lead qualification process. We plan to have a "standard" version of the site for when the IP location cannot be detected; URLs on this version would be the rel="canonical" for the location-specific pages. Is there a way to do this without creating duplicate content, cloaking, or other such issues on the site? Any advice, theories, or case studies would be greatly appreciated.
Technical SEO | SEM-Freak
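The lower-risk shape of this idea is to vary only the call to action by detected region while the URL and main content stay identical, falling back to the standard form when the region is unknown (which is also what a crawler with no recognizable location would get). A sketch of that selection (the region names are made-up examples):

```python
# Illustrative set of markets that warrant stricter lead qualification.
SATURATED_REGIONS = {"new-york", "los-angeles"}

def cta_for(region):
    """Stricter qualification form in saturated markets; the standard
    form everywhere else, including when the region cannot be detected."""
    if region in SATURATED_REGIONS:
        return "strict-qualification-form"
    return "standard-form"

print(cta_for("new-york"))   # strict-qualification-form
print(cta_for(None))         # standard-form
```

Because every visitor in a given situation sees the same page at the same URL, and crawlers get the standard fallback rather than special treatment, this stays on the right side of the cloaking line.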
What are the SEOmoz-suggested best practices for limiting the number of 301 redirects for a given site?
I've read some vague warnings of potential problems with having a long list of 301 redirects within an htaccess file. If this is a problem, could you provide any guidance on how much is too much? And if there is a problem associated with this, what is that problem exactly?
Technical SEO | roush
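The practical concern with a very long htaccess rule list is that Apache evaluates the rules sequentially on every request; the standard alternative is a single lookup table (what mod_rewrite calls a RewriteMap) consulted once per request. A Python model of the idea (the mappings are made-up examples):

```python
# One table, one constant-time lookup per request, instead of a long
# sequential list of per-URL rules.
REDIRECTS = {
    "/old-page": "/new-page",
    "/2011/post-title": "/blog/post-title",
}

def resolve(path):
    """Return (status, location) for a request path: a 301 to the new
    location if the path was moved, otherwise serve it as-is."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path

print(resolve("/old-page"))   # (301, '/new-page')
print(resolve("/contact"))    # (200, '/contact')
```

This keeps per-request cost flat no matter how many legacy URLs accumulate, which addresses the performance half of the warnings; the SEO half (Google following each 301) is unaffected by how the redirects are stored.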