Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will still be viewable), we have locked both new posts and new replies.
What is the best tool to view your page as Googlebot?
-
Our site was built with ASP.NET and a lot of scripting. I want to see what Google can see and what it can't. What is the best tool that emulates Googlebot? I have found several, but they seem old or inaccurate.
-
I just used it today. The simple crawl is free. Advanced is paid.
-
I think it's free again? I used it today...
-
It used to be free; now it's $9.99 per year.
Certainly not unreasonable, but some sort of demo would be nice so one knows what to expect. Do you have another recommendation? -
Hi Niners52,
Personally, I like to use Webconfs' search engine spider simulator (http://www.webconfs.com/search-engine-spider-simulator.php), but there are loads out there that you can use; it really comes down to personal preference.
Also, I use Screaming Frog for similar tasks.
Matt.
-
I'm a fan of http://www.seo-browser.com/ - use their simple search and it's totally free.
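Spider simulators like these generally show the raw text and HTML the server returns rather than executing scripts, so for a heavily scripted ASP.NET site it can also help to compare what the server sends to a bot versus a normal browser yourself. Below is a minimal sketch of that check (not any of the tools mentioned above); it assumes Node 18+ for the built-in fetch, and the URL at the bottom is a placeholder.

```typescript
// Sketch: fetch a URL twice, once with a Googlebot user-agent, and compare
// the raw HTML the server returns. Assumes Node 18+ (built-in fetch); the
// URL at the bottom is a placeholder.
const GOOGLEBOT_UA =
  "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";

async function fetchHtml(url: string, userAgent?: string): Promise<string> {
  const res = await fetch(url, {
    headers: userAgent ? { "User-Agent": userAgent } : {},
  });
  return res.text();
}

async function compare(url: string): Promise<void> {
  const asBrowser = await fetchHtml(url);
  const asGooglebot = await fetchHtml(url, GOOGLEBOT_UA);
  console.log(`default UA:   ${asBrowser.length} bytes of HTML`);
  console.log(`Googlebot UA: ${asGooglebot.length} bytes of HTML`);
  // A big difference suggests the server serves bots different content;
  // near-identical output means any gap comes from client-side scripts,
  // which a plain text fetch (like the simulators above) never executes.
}

compare("https://www.example.com/").catch(console.error);
```

If the two responses differ a lot, the server is treating bots differently; if they are the same, whatever is missing in the simulators is being added by client-side scripts after load.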
Related Questions
-
How is Single Page Application (SPA) bad for SEO
Hi guys. I am quite inspired by the SPA technique. It's really amazing when all your interaction with the site happens on the fly and you don't see any page reloads. I've started implementing the site with this instruction and have already found nice guys to do the design. The only downside of using SPA that I can see is the SEO part. That's because the URL does not really change and different pages don't have their own unique URL addresses.
Web Design | | Billy_gym
Actually they do, but they look like: yoursite.com/#/products, yoursite.com/#/prices, yoursite.com/#/contact. So all of them come after the # and are just anchors. For Google this means all of these pages are just yoursite.com/. My question is: what is a proven method to implement the URL structure in a Single Page Application so that all the pages are indexed by Google correctly (sorry, I don't mention the other search engines because of market share)? The other question, of course, is examples. It would be great to see real-life site examples, ideally authority sites, which use the SPA technique and are well indexed by search engines. -
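One commonly recommended answer to the question above is to use the HTML5 History API so each view gets a real path such as /products rather than a #fragment (the old #! "AJAX crawling" scheme has since been deprecated by Google), and to make sure the server also returns meaningful HTML at those paths. Below is a rough client-side sketch, where renderRoute() is a placeholder for whatever view-switching function the app already has.

```typescript
// Minimal sketch of History API routing so each "page" gets a real path
// (/products, /prices, /contact) instead of a #fragment. renderRoute() is a
// placeholder for the app's own view-switching function.
declare function renderRoute(path: string): void; // hypothetical app function

function navigate(path: string): void {
  history.pushState({}, "", path); // update the address bar without a reload
  renderRoute(path);               // swap the view client-side
}

// Intercept same-origin link clicks so they use pushState instead of reloading.
document.addEventListener("click", (event) => {
  const link = (event.target as HTMLElement).closest("a");
  if (link && link.origin === location.origin) {
    event.preventDefault();
    navigate(link.pathname);
  }
});

// Keep the back/forward buttons working.
window.addEventListener("popstate", () => renderRoute(location.pathname));
```

The catch is that pushState only fixes the address bar; the server still needs to respond with indexable HTML at /products, /prices and so on (server-side rendering or prerendering), otherwise a crawler that doesn't execute JavaScript still sees an empty shell.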
Internal Linking: What is the best practice for pages not included in Nav bar?
I never quite understood why internal linking was such a big deal for SEO, but now I'm having second thoughts and perhaps understanding it more. I always thought that since most websites have a navigation feature (usually the menu bar located at the top and often another one in the footer), internal navigation was already built in to most websites and it was therefore a silly topic to make a fuss over; however, I may be the silly one after all. I am now creating pages that are not included in the navigation, so... What is the best practice for this? If I am creating, say, pages for certain locations and those location pages begin to number in the hundreds, it makes my navigation bar a little too cumbersome to have all those pages in a drop-down menu. So I made a Locations page and just link to all those pages from that page (and from nowhere else). But now I'm wondering if this could be a bad internal linking practice and perhaps hurt my online visibility as an SEO ranking factor. Is this a crawl problem? And if so, is there a better option that provides a good visitor experience while appeasing the search engines?
Web Design | | Dino640 -
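A hub page like the Locations page described above is a common approach; the usual refinement once the count runs into the hundreds is to split it into smaller index pages, so every location page is linked from somewhere crawlable but no single page carries hundreds of links. A rough sketch follows; the Location type and writePage() are illustrative placeholders, not anything from the question.

```typescript
// Rough sketch of the hub-page idea: split a long list of location pages into
// index pages of at most 50 links each, so every location page is linked from
// somewhere crawlable but no single hub carries hundreds of links.
interface Location { name: string; url: string; }

declare function writePage(path: string, html: string): void; // hypothetical

function buildLocationHubs(locations: Location[], perPage = 50): void {
  const sorted = [...locations].sort((a, b) => a.name.localeCompare(b.name));
  for (let i = 0; i < sorted.length; i += perPage) {
    const chunk = sorted.slice(i, i + perPage);
    const page = i / perPage + 1;
    const links = chunk
      .map((loc) => `<li><a href="${loc.url}">${loc.name}</a></li>`)
      .join("\n");
    writePage(`/locations/page-${page}/`, `<ul>\n${links}\n</ul>`);
  }
}
```

The hub pages themselves still need a persistent link from somewhere (the footer or the main Locations entry in the nav) so they aren't one click away from being orphans.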
Can anyone recommend a tool that will identify unused and duplicate CSS across an entire site?
Hi all, So far I have found this one: http://unused-css.com/ It looks like it identifies unused CSS, but perhaps not duplicates? It also has a 5,000-page limit and our site is 8,000+ pages... so we really need something that can handle a site larger than their limit. I do have Screaming Frog. Is there a way to use Screaming Frog to locate unused and duplicate CSS? Any recommendations and/or tips would be great. I am also aware of the Firefox extensions, but to my knowledge they only do one page at a time? Thanks!
Web Design | | danatanseo0 -
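If no hosted tool handles the page count, a rough do-it-yourself pass is possible: pull the selectors out of the stylesheet, crawl a list of pages (a Screaming Frog URL export works as input), flag selectors that never match anywhere, and flag rule bodies repeated under different selectors. The sketch below assumes Node 18+ and the cheerio package (npm i cheerio); its regex-based CSS parsing is deliberately naive and will misread @media blocks and similar constructs, so treat the output as hints rather than a definitive audit.

```typescript
// Rough sketch: flag CSS selectors that match nothing on any crawled page, and
// rule bodies repeated under different selectors. Naive regex parsing; treat
// results as hints only. Assumes Node 18+ (built-in fetch) and cheerio.
import * as cheerio from "cheerio";

interface Rule { selector: string; body: string; }

function parseRules(css: string): Rule[] {
  const rules: Rule[] = [];
  const re = /([^{}]+)\{([^}]*)\}/g;
  let match: RegExpExecArray | null;
  while ((match = re.exec(css)) !== null) {
    rules.push({ selector: match[1].trim(), body: match[2].trim() });
  }
  return rules;
}

async function auditCss(cssUrl: string, pageUrls: string[]): Promise<void> {
  const rules = parseRules(await (await fetch(cssUrl)).text());
  const docs = await Promise.all(
    pageUrls.map(async (url) => cheerio.load(await (await fetch(url)).text()))
  );

  // Possibly unused: the selector matches nothing on any crawled page.
  for (const rule of rules) {
    const used = docs.some(($) => {
      try { return $(rule.selector).length > 0; } catch { return false; }
    });
    if (!used) console.log(`possibly unused: ${rule.selector}`);
  }

  // Possible duplicates: identical declaration blocks under different selectors.
  const byBody = new Map<string, string[]>();
  for (const rule of rules) {
    byBody.set(rule.body, [...(byBody.get(rule.body) ?? []), rule.selector]);
  }
  for (const [, selectors] of byBody) {
    if (selectors.length > 1) {
      console.log(`same declarations shared by: ${selectors.join(" | ")}`);
    }
  }
}

auditCss("https://www.example.com/styles.css", [
  "https://www.example.com/",
  "https://www.example.com/about/",
]).catch(console.error);
```

At 8,000+ pages this means 8,000+ fetches, so in practice you would batch the requests and check selectors incrementally rather than holding every document in memory at once.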
Best way to indicate multiple Lang/Locales for a site in the sitemap
So here is a question that may be obvious, but I'm wondering if there is some nuance here that I may be missing. Question: Consider an ecommerce site that has multiple sites around the world that are all variations of the same thing, just in different languages. Now let's say some of these exist on a normal .com domain while others exist on different ccTLDs. When you build out the XML sitemap for these sites, especially the ones on the other ccTLDs, we want to ensure that using
Web Design | | DRSearchEngOpt
<loc>http://www.example.co.uk/en_GB/</loc>
<xhtml:link rel="alternate" hreflang="en-AU" href="http://www.example.com.AU/en_AU/" />
<xhtml:link rel="alternate" hreflang="en-NZ" href="http://www.example.co.NZ/en_NZ/" />
would be the correct way of doing this. I know I have to change this for each different ccTLD, but it just looks weird when you start putting about 10-15 different language/locale variations as alternate links. I guess I am just looking for a bit of reaffirmation that I am doing this right. Thanks! -
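For what it's worth, the long list of alternates is expected: in sitemap hreflang annotations each <url> entry is supposed to list every language/region variant, including a self-referencing one, and each regional sitemap repeats the same full set. Below is a small sketch that generates such entries from one table of locales; the domains reuse the example ones from the question, so adjust paths and hreflang codes as needed.

```typescript
// Sketch: build sitemap <url> entries where every locale lists the full set of
// alternates, including a self-referencing one. The locale table reuses the
// example domains from the question above.
const locales: Record<string, string> = {
  "en-GB": "http://www.example.co.uk/en_GB/",
  "en-AU": "http://www.example.com.au/en_AU/",
  "en-NZ": "http://www.example.co.nz/en_NZ/",
};

function urlEntry(loc: string): string {
  const alternates = Object.entries(locales)
    .map(
      ([lang, href]) =>
        `    <xhtml:link rel="alternate" hreflang="${lang}" href="${href}" />`
    )
    .join("\n");
  return `  <url>\n    <loc>${loc}</loc>\n${alternates}\n  </url>`;
}

const sitemap = [
  `<?xml version="1.0" encoding="UTF-8"?>`,
  `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"`,
  `        xmlns:xhtml="http://www.w3.org/1999/xhtml">`,
  ...Object.values(locales).map(urlEntry),
  `</urlset>`,
].join("\n");

console.log(sitemap);
```

In practice each ccTLD keeps its own sitemap and only the <loc> values change per site; the xhtml:link block stays identical everywhere, which is exactly why it ends up looking repetitive.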
Best Webhosting Suggestions??
Good morning my fellow Mozzers! I am currently looking at adding some diversity to my current web hosting and I was hoping I could get some suggestions. I don't currently need a VPS or dedicated server, I just need some shared hosting, you know, packages that are sub $20 a month... I mean I will pay more than that, but so far everything I look at that meets my needs (basic hosting, email, etc.)... This is for client sites and they are growing in number somewhat rapidly. I currently host with GoDaddy and they are amazing in the support department, but I do question whether their servers are causing slow page loads, etc.; all in all, though, I am happy with them. I have used Network Solutions in the past, but left them because I was not a big fan of talking to support people in India and Malaysia. I do think that their servers might have performed better than GoDaddy's, so I am not ruling them out. At this point I am looking for a provider that has excellent support and servers that are not so overloaded that they render pages and content slowly. Performance is very important to me. I am not looking for the cheapest, I am looking for the overall best. Thanks in advance, SEOmoz family!!!
Web Design | | WebbyNabler0 -
Does it do harm if you add a rel="canonical" tag on a page that doesn't need it?
If a page is clearly unique and there is obviously no canonical tag needed, does it hurt anything if one has been added?
Web Design | | jaychow0 -
Best method to stop crawler access to extra Nav Menu
Our shop site has a 3-tier drop-down mega-menu so it's easy to find your way to anything from anywhere. It contains about 150 links and probably 300 words of text. We also have a more context-driven single layer of sub-category navigation as well as breadcrumbs on our category pages. You can get to every product and category page without using the drop-down mega-menu. Although the mega-menu is a helpful tool for customers, it means that every single page in our shop has an extra 150 links on it that go to stuff that isn't necessarily related or relevant to the page content. This means that when viewed from the context of a crawler, rather than a nice tree-like crawling structure, we've got more of an unstructured mesh where everything is linked to everything else. I'd like to hide the mega-menu links from being picked up by a crawler, but what's the best way to do this? I can add a nofollow to all mega-menu links, but are the links still registered as page content even if they're not followed? It's a lot of text if nothing else. Another possibility we're considering is to set the mega-menu to only populate with links when its main button is hovered over, so it's not part of the initial page load content at all. Or we could use a crude yet effective system we have used for some other menus: base-encoding the content inline so it's not readable by a spider. What would you do and why? Thanks, James
Web Design | | DWJames0 -
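The hover idea from the question above is straightforward to prototype: ship the mega-menu container empty in the HTML and fill it from a separate endpoint on first hover, so the 150 links never appear in the initial page source. Below is a rough sketch; the element IDs and the /menu.html path are placeholders. Note that Google does render JavaScript these days, so this reduces rather than guarantees that the links stay out of the crawled content, and visitors without JavaScript would lose the menu entirely.

```typescript
// Sketch of the "populate on hover" idea: the mega-menu container ships empty
// and is filled from a separate endpoint the first time a visitor hovers the
// trigger. Element IDs and the /menu.html path are placeholders.
const trigger = document.querySelector<HTMLElement>("#mega-menu-trigger");
const container = document.querySelector<HTMLElement>("#mega-menu");
let menuLoaded = false;

trigger?.addEventListener("mouseenter", async () => {
  if (menuLoaded || !container) return;
  menuLoaded = true;
  const res = await fetch("/menu.html");  // menu markup kept out of every page's HTML
  container.innerHTML = await res.text(); // injected only for real visitors
});
```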
Landing pages vs internal pages.
Hey everyone, I have run into a problem and would greatly appreciate anyone who could weigh in on it. I have a web client that went to an outside vendor for marketing. The client asked me to help them target some keywords, and since I am new to the SEO world I have proceeded by researching the best keywords for the client. I found 6 that see excellent monthly searches. I then registered the .com and/or .net domain names that match these words. I then started building landing pages that make reference to the keyword and then have links to his site to get more info. My customer sent the first of these sites to the marketer and he says I am doing things all wrong. He says rather than having landing pages like this I should just point the domain names at internal pages of the website. He also says that I should not have different looks for the landing pages from the main site, and that I should have the full site menu on each landing page. I wanted to hear what everyone here has to say about the pros and cons of the way to do this, because the guy giving the advice has a lower-ranking site than I do, and I have only started working on getting my site ranked this year. He has, at least according to him, been doing this forever. Thanks, Ron
Web Design | | bsofttech0