Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies.
What is the best tool to view your page as Googlebot?
-
Our site was built with ASP.NET and a lot of scripting. I want to see what Google can see and what it can't. What is the best tool that replicates Googlebot? I have found several, but they seem old or inaccurate.
-
I just used it today. The simple crawl is free. Advanced is paid.
-
I think it's free again? I used it today...
-
Used to be free; now it's $9.99 per year.
Certainly not unreasonable, but some sort of demo would be nice so one knows what to expect. Do you have another recommendation?
-
Hi Niners52,
Personally, I like to use Webconfs' search engine spider simulator (http://www.webconfs.com/search-engine-spider-simulator.php), but there are loads out there that you can use - it really comes down to personal preference.
I also use Screaming Frog for similar tasks.
Matt.
-
I'm a fan of http://www.seo-browser.com/ - use their simple search and it's totally free.
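If you'd rather not rely on a third-party simulator at all, you can get a first approximation by requesting the page with Googlebot's user-agent string yourself. A minimal sketch in Python (assuming the requests library is installed; note that this only shows the raw HTML served to that user agent - it does not execute JavaScript, so anything the page builds client-side won't appear):

import requests

# Googlebot's desktop user-agent string.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def fetch_as_googlebot(url):
    # Request the page while identifying as Googlebot.
    resp = requests.get(url, headers={"User-Agent": GOOGLEBOT_UA}, timeout=10)
    resp.raise_for_status()
    return resp.text

html = fetch_as_googlebot("http://www.example.com/")
print(html[:500])  # compare against "view source" in a normal browser

Comparing that output with what a normal browser receives is often enough to spot content that is hidden from, or served differently to, crawlers.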
Related Questions
-
HTTPS pages indexed but all web pages are HTTP - please can you offer some help?
Dear Moz Community, please could you take a look and offer some definite steps or advice? I contacted the host provider, and his initial thought was that WordPress was causing the https problem: e.g. when an https version of a page is called, things like videos and media don't always show up. An SSL certificate attached to a website can allow pages to load over https. The host said that there is no active configured SSL - it's just waiting as part of the hosting package, just in case - but I found that the SSL certificate is still showing up during a crawl.
It's important to eliminate the https problem before external backlinks point to any of the unwanted https pages that are currently indexed. Luckily I haven't started any intense backlinking work yet, and any links I have posted in search land have all been to the http version.
I checked a few more URLs to see if it's necessary to create a permanent redirect from https to http. For example, I tried requesting domain.co.uk with the https:// prefix, and the https:// page loaded instead of redirecting automatically to the http version. I know that if I am automatically redirected to the http:// version of the page, then that is the way it should be: search engines and visitors will stay on the http version of the site and not get lost anywhere in https. This also helps to eliminate duplicate content and to preserve link juice. What are your thoughts on that?
As I understand it, most server configurations should redirect by default when https isn't configured, and from my experience I've seen cases where pages requested via https return the default server page, a 404 error, or duplicate content. So I'm confused as to where to take this.
One suggestion would be to disable all https, since there is no need to have any traces of SSL when the site is crawled. I don't want to enable https in the htaccess only to then create an https-to-http rewrite rule; https shouldn't even be a crawlable function of the site at all:
RewriteEngine On
RewriteCond %{HTTPS} off
Or to disable the SSL completely for now, until it becomes a necessity for the website. I would really welcome your thoughts, as I'm really stuck as to what to do for the best, short term and long term. Kind Regards
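One note on the snippet quoted above: a RewriteCond does nothing on its own; it only qualifies the RewriteRule that immediately follows it. If the intent is to force https requests back to http, a minimal mod_rewrite sketch might look like this (an assumption about the intended behaviour, not a drop-in fix - note it tests for HTTPS being on, since those are the requests that need redirecting):
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]
As written in the question, the condition %{HTTPS} off would match the plain-http requests instead and never fire for the https pages.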
Web Design | | SEOguy10 -
WordPress Category page title h1 or h2
Hi friends, I know this is a minor technical change, but we are in an extremely competitive market and I don't want to have any points against us. On our WordPress category pages, i.e. http://www.domain.com/category/%category-title%/, I looked at the code behind the title of the category page, which is "Browsing: %Category Title%". The code is an h2. I looked at the posts in the category archive below, and those are also h2s. The theme preview is here - you can click on Entertainment - Reviews to see exactly what I'm referring to: http://themeforest.net/item/smartmag-responsive-retina-wordpress-magazine/full_screen_preview/6652608 I changed the code for the "Browsing: %Category Title%" heading to an h1, which I believe is more consistent and standard formatting. 1. Is this a correct technical on-page optimization? 2. Would it be beneficial to remove "Browsing"?
Web Design | | JustinMurray0 -
Best SEO practice - Umbrella brand with several domains
Hi, we have several blogs and comparison sites on specific topics. All the domains rank in top positions in very competitive niche markets. We think we can get more profit out of the domains if we put them under an umbrella brand: customers who visit domain A can then also find products easily on domain B. We see this, for example, with health.com, which has several brands ranking at the top. To maintain or improve our rankings, I'm looking for specific guidance on the link structure. For example, is it better to have the 'about us'/rel=author on each domain, with contributors on that specific domain, or is it better to have them all on the (umbrella) brand domain? At the moment we have the structure like this: domainA.com, domainA.com/blog, domainA.com/about-us and domainB.com, domainB.com/blog, domainB.com/about-us. I think that to maintain the rankings it is best to keep specific content (like the blog and about-us pages) on each domain. So is it best to just use site-wide links with a logo (like health.com)? And what about hosting - we work with WordPress, so will all domains be hosted on one IP if we use the WordPress multisite option? All information on this topic is more than welcome 🙂
Web Design | | remkoallertz0 -
Can anyone recommend a tool that will identify unused and duplicate CSS across an entire site?
Hi all, So far I have found this one: http://unused-css.com/ It looks like it identifies unused CSS, but perhaps not duplicates? It also has a 5,000-page limit and our site is 8,000+ pages, so we really need something that can handle a site larger than that. I do have Screaming Frog. Is there a way to use Screaming Frog to locate unused and duplicate CSS? Any recommendations and/or tips would be great. I am also aware of the Firefox extensions, but to my knowledge they only work on one page at a time? Thanks!
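For the duplicate side specifically, a rough script can at least produce a list of rules to review by hand. A sketch in Python (a deliberately naive parser: it ignores comments and @media context, so treat its output as hints rather than a definitive report):

import re
from collections import defaultdict

def find_duplicate_rules(css_text):
    # Crude match of "selector { declarations }" pairs; nested blocks
    # (e.g. @media) lose their context but their inner rules still match.
    rules = re.findall(r'([^{}]+)\{([^{}]*)\}', css_text)
    by_body = defaultdict(list)
    for selector, body in rules:
        # Normalise declarations so ordering and whitespace don't matter.
        normalised = ";".join(sorted(
            decl.strip() for decl in body.split(";") if decl.strip()
        ))
        by_body[normalised].append(selector.strip())
    return {b: sels for b, sels in by_body.items() if len(sels) > 1}

with open("styles.css") as f:  # hypothetical local copy of a stylesheet
    for body, selectors in find_duplicate_rules(f.read()).items():
        print(selectors, "all share:", body)

Running it over each stylesheet URL that Screaming Frog discovers would cover the whole site; unused-selector detection is harder, since it requires checking selectors against the rendered DOM of every page.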
Web Design | | danatanseo0 -
One Page Guide vs. Multiple Individual Pages
Howdy, Mozzers! I am having a battle with my inner self regarding how to structure a resources section for our website. We're building out several pieces of content that are meant to be educational for our clients, and I'm having trouble deciding how to lay out the content structure. We could either lay out all eight short sections on a single page, or create individual pages for each section. The goal is obviously to attract new potential clients by targeting terms that they may be searching for in an information-gathering stage. Here's my dilemma: with the single-page guide, it would be nice because it will have a lot of content (and of course, keywords) to be picked up by the SERPs, but I worry that it is going to be a bit crammed (because of eight sections) for the user. The individual pages would be much better organized and could target more specific keywords, but I worry that they may get flagged for light content, as some pages may have as little as a 150-word description. I have always been mindful of writing copy for searchers over spiders, but now I'm at a more technical crossroads as far as potentially getting dinged for not having robust content on each page. Here's where you come in: what do you think is the better of the two options? I like the idea of having multiple pages because of the ability to hone in on a keyword and the clean, organized feel, but I worry about the lack of content (and possibly losing out on long-tail opportunities). I'd love to hear your thoughts. Please and thank you. Ready annnnnnnnnnnnd GO!
Web Design | | jpretz0 -
ECWID How to fix Duplicate page content and external link issue
I am working on a site that has a HUGE number of duplicate pages due to the ECWID ecommerce platform. The site is built with Joomla! How can I rectify this situation? The pages also show up as "external" links in crawls. Is it the ECWID platform? I have never worked on a site that uses this. Here is an example of a page with the issue (there are 6,280 issues): http://www.metroboltmi.com/shop-spare-parts?Itemid=218&option=com_rokecwid&view=ecwid&ecwid_category_id=3560081
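As a stopgap while the platform issue is investigated, it can help to measure exactly which query parameters are multiplying the URLs. A rough Python sketch that collapses a crawled URL list down to the parameters that actually select content (the KEEP set here is an assumption - adjust it to whichever parameters genuinely change what the page shows):

from collections import defaultdict
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

# Assumption: only this parameter selects different content; the rest
# (Itemid, option, view, ...) are treated as duplicate-causing noise.
KEEP = {"ecwid_category_id"}

def canonical(url):
    parts = urlparse(url)
    kept = sorted((k, v) for k, v in parse_qsl(parts.query) if k in KEEP)
    return urlunparse(parts._replace(query=urlencode(kept)))

def group_duplicates(urls):
    groups = defaultdict(list)
    for u in urls:
        groups[canonical(u)].append(u)
    return {c: us for c, us in groups.items() if len(us) > 1}

Feeding it the URL list from a crawl export shows which URLs collapse to the same page; each such group is a candidate for a single rel=canonical target.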
Web Design | | Atlanta-SMO0 -
Best method to stop crawler access to extra Nav Menu
Our shop site has a 3-tier drop-down mega-menu so it's easy to find your way to anything from anywhere. It contains about 150 links and probably 300 words of text. We also have a more context-driven single layer of sub-category navigation, as well as breadcrumbs on our category pages. You can get to every product and category page without using the drop-down mega-menu. Although the mega-menu is a helpful tool for customers, it means that every single page in our shop has an extra 150 links on it that go to content that isn't necessarily related or relevant to the page. This means that, viewed from the perspective of a crawler, rather than a nice tree-like crawling structure, we've got more of an unstructured mesh where everything is linked to everything else. I'd like to hide the mega-menu links from being picked up by a crawler, but what's the best way to do this? I can add a nofollow to all mega-menu links, but are the links still registered as page content even if they're not followed? It's a lot of text if nothing else. Another possibility we're considering is to set the mega-menu to only populate with links when its main button is hovered over, so it's not part of the initial page load content at all. Or we could use a crude yet effective system, which we have used for some other menus, of base-encoding the content inline so it's not readable by a spider. What would you do and why? Thanks, James
Web Design | | DWJames0 -
Mobile Site Pages: Word Count Help
Hi there, I am doing a mobile website for a client and they asked me what the ideal word count would be per page. They are SEO-conscious, but we are not doing SEO on this site. I would just like to know a general rule of thumb. Regards, Stef
Web Design | | stefanok0