Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
What is the best tool to view your page as Googlebot?
-
Our site was built with ASP.NET and a lot of scripting. I want to see what Google can see and what it can't. What is the best tool for duplicating Googlebot's view? I have found several, but they seem old or inaccurate.
-
I just used it today. The simple crawl is free. Advanced is paid.
-
I think it's free again? I used it today...
-
It used to be free; now it's $9.99 per year.
Certainly not unreasonable, but some sort of demo would be nice so one knows what to expect. Do you have another recommendation?
-
Hi Niners52,
Personally, I like to use Webconfs' search engine spider simulator (http://www.webconfs.com/search-engine-spider-simulator.php), but there are loads of tools out there you can use; it really comes down to personal preference.
I also use Screaming Frog for similar tasks.
Matt.
-
I'm a fan of http://www.seo-browser.com/ - use their simple search and it's totally free.
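If you'd rather sanity-check this without a third-party tool, you can also request a page with Googlebot's user-agent string and inspect the raw HTML that comes back: that is roughly what a non-rendering crawler sees, and anything your ASP.NET pages inject with client-side scripting will be missing from it. A minimal sketch, assuming Node 18+ for the built-in fetch (the URL is a placeholder):

```typescript
// Fetch a page while identifying as Googlebot and print the raw HTML.
// Content that only appears after client-side scripts run won't be here.
const GOOGLEBOT_UA =
  "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";

async function fetchAsGooglebot(url: string): Promise<void> {
  const response = await fetch(url, {
    headers: { "User-Agent": GOOGLEBOT_UA },
  });
  const html = await response.text();
  console.log(`Status: ${response.status}`);
  console.log(html.slice(0, 2000)); // first 2,000 characters of raw markup
}

// Placeholder URL; substitute your own page.
fetchAsGooglebot("https://www.example.com/").catch(console.error);
```

Keep in mind that Google's own crawler does render JavaScript these days, so treat the un-rendered HTML as a worst case rather than exactly what Google indexes.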
Related Questions
-
How is a Single Page Application (SPA) bad for SEO?
Hi guys. I am quite inspired by the SPA technique. It's really amazing when all your interaction with the site happens on the fly and you don't see any page reloads. I've started implementing the site with this instruction and have already found nice guys to do the design. The only downside of using an SPA that I can see is the SEO part. That's because the URL does not really change and different pages don't have their own unique URL addresses.
Actually they do, but they look like: yoursite.com/#/products, yoursite.com/#/prices, yoursite.com/#/contact. All of them come after the # and are just anchors, so for Google all of these pages are simply yoursite.com/. My question is: what is a proven method of implementing the URL structure in a Single Page Application so that all the pages are indexed correctly by Google? (Sorry, I don't mention the other search engines, because of market share.) The other question, of course, is examples. It would be great to see real-life examples of sites, ideally authority sites, that use the SPA technique and are well indexed by search engines.
Web Design | Billy_gym
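A widely used fix is to route with the HTML5 History API instead of # fragments, so every view gets a real path (yoursite.com/products rather than yoursite.com/#/products) that search engines can index, with the server configured to serve the app shell at each of those paths. A minimal sketch, with hypothetical route names and no framework assumed:

```typescript
// History API routing sketch: each view gets a real, indexable path.
// Route paths and the renderView helper are hypothetical.
const routes: Record<string, () => void> = {
  "/products": () => renderView("products"),
  "/prices": () => renderView("prices"),
  "/contact": () => renderView("contact"),
};

function navigate(path: string): void {
  history.pushState({}, "", path); // update the URL without a reload
  (routes[path] ?? (() => renderView("not-found")))();
}

// Re-render when the user navigates with the back/forward buttons.
window.addEventListener("popstate", () => {
  (routes[location.pathname] ?? (() => renderView("not-found")))();
});

function renderView(name: string): void {
  const root = document.getElementById("app");
  if (root) root.textContent = `Rendered view: ${name}`;
}
```

The server side matters just as much: /products, /prices, and /contact must each return valid HTML directly, otherwise a crawler (or a visitor following a deep link) gets a 404.
-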
Is having a site map page necessary?
Hello all! I know it's important to reference your XML sitemap file in your robots.txt file, and to submit your XML sitemap to Google and Bing. However, I am wondering: is it beneficial for your site's SEO to also have a sitemap page displayed on your website? Or is this redundant if you have already done the two things above with your XML sitemap? Thanks in advance!
Web Design | Myles92
-
How to find out whether any of the images on my site violate copyright? Is there any tool that can do this without having to check manually, image by image?
We plan to add several thousand images to our site, and we outsourced the image search to some freelancers who were instructed to use only royalty-free pictures. Is there any easy and quick way to check that none of these images in fact violates copyright, without having to check image by image? In case there are violations we are unaware of, do you think we need to be concerned about the risk of receiving DMCA takedown notices without the owner first notifying us and giving us an opportunity to remove the photo?
Web Design | lcourse
-
Have an eBook. What is best practice for SEO?
Hello. We have a free eBook; it's a great resource and a great piece of content. It is available to download on our website here: http://re-timer.com/the-product/how-to-sleep-better/ The book is available as a whole or as individual chapters (i.e. http://re-timer.com/app/uploads/2015/07/Chapter8.pdf?b0df38). The PDF chapters appear to be doing well in Google search for certain keywords, though I can't measure this in GA. I would like the eBook to assist the SEO of my website overall. If I create a web page and embed the PDF into it, will Google still crawl this page? At the moment we are also using the eBook to collect email addresses. This is a nice-to-have, and it is OK if people get the eBook without doing so (if they find a chapter in Google, they currently don't have to enter their email address). I'm sure lots of people have eBooks now. What is best practice, and what is the best way to use this as a tool to maximise SEO for the whole website (http://re-timer.com)? Thank you! Laura
Web Design | LauraFalls
-
Lots of Listing Pages with Thin Content on Real Estate Web Site - Best to Set Them to No-Index?
Greetings Moz Community: As a commercial real estate broker in Manhattan I run a web site with over 600 pages. Basically the pages are organized in the following categories:
1. Neighborhoods (Example: http://www.nyc-officespace-leader.com/neighborhoods/midtown-manhattan) - 25 pages, low bounce rate
2. Types of Space (Example: http://www.nyc-officespace-leader.com/commercial-space/loft-space) - 15 pages, low bounce rate
3. Blog (Example: http://www.nyc-officespace-leader.com/blog/how-long-does-leasing-process-take) - 30 pages, medium/high bounce rate
4. Services (Example: http://www.nyc-officespace-leader.com/brokerage-services/relocate-to-new-office-space) - 3 pages, high bounce rate
5. About Us (Example: http://www.nyc-officespace-leader.com/about-us/what-we-do) - 4 pages, high bounce rate
6. Listings (Example: http://www.nyc-officespace-leader.com/listings/305-fifth-avenue-office-suite-1340sf) - 300 pages, high bounce rate (65%), thin content
7. Buildings (Example: http://www.nyc-officespace-leader.com/928-broadway) - 300 pages, very high bounce rate (exceeding 75%)
Most of the listing pages do not have more than 100 words. My SEO firm is advising me to set them to "No-Index, Follow". They believe the thin content could be hurting me. Is this an acceptable strategy? I am concerned that when Google detects 300 pages set to "No-Index" they could interpret this as the site seeking to hide something and penalize us. Also, the building pages have a low click-through rate. Would it make sense to set them to "No-Index" as well? Basically, would it increase authority in Google's eyes if we set pages that have thin content and/or low click-through rates to "No-Index"? Any harm in doing this for about half the pages on the site? I might add that while I don't suffer from any manual penalty, volume has gone down substantially in the last month. We upgraded the site in early June and somehow 175 pages were submitted to Google that should not have been indexed. A removal request has been made for those pages. Prior to that we were hit by Panda in April 2012, with search volume dropping from about 7,000 per month to 3,000 per month. Volume had increased back to 4,500 by April this year, only to start tanking again. It was down to 3,600 in June. About 30 toxic links were removed in late April, and a disavow file was submitted to Google in late April for removal of links from 80 toxic domains. Thanks in advance for your responses!! Alan
Web Design | Kingalan1
-
Best way to indicate multiple Lang/Locales for a site in the sitemap
So here is a question that may be obvious, but I'm wondering if there is some nuance here that I may be missing. Question: Consider an ecommerce site that has multiple sites around the world that are all variations of the same thing, just in different languages. Some of these exist on a normal .com domain while others exist on different ccTLDs. When you build out the XML sitemap for these sites, especially the ones on the other ccTLDs, we want to ensure that using

    <loc>http://www.example.co.uk/en_GB/</loc>
    <xhtml:link rel="alternate" hreflang="en-AU" href="http://www.example.com.AU/en_AU/" />
    <xhtml:link rel="alternate" hreflang="en-NZ" href="http://www.example.co.NZ/en_NZ/" />

is the correct way of doing this. I know I have to change this for each different ccTLD, but it just looks weird when you start putting in about 10-15 different language/locale variations as alternate links. I guess I am just looking for a bit of reaffirmation that I am doing this right. Thanks!
Web Design | DRSearchEngOpt
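Since every locale's entry has to repeat the full alternate set, including a self-referencing link per Google's hreflang guidelines, these blocks are usually generated by a script rather than written by hand. A rough sketch of that idea, reusing the hypothetical example domains above:

```typescript
// Generate one sitemap <url> entry per locale, each listing every variant
// (itself included) as an xhtml:link alternate. Domains are examples.
interface LocaleVariant {
  hreflang: string; // e.g. "en-GB"
  href: string;     // fully qualified URL for that locale
}

const variants: LocaleVariant[] = [
  { hreflang: "en-GB", href: "http://www.example.co.uk/en_GB/" },
  { hreflang: "en-AU", href: "http://www.example.com.AU/en_AU/" },
  { hreflang: "en-NZ", href: "http://www.example.co.NZ/en_NZ/" },
];

function urlEntry(loc: string, alternates: LocaleVariant[]): string {
  const links = alternates
    .map(v => `  <xhtml:link rel="alternate" hreflang="${v.hreflang}" href="${v.href}" />`)
    .join("\n");
  return `<url>\n  <loc>${loc}</loc>\n${links}\n</url>`;
}

// Each ccTLD's sitemap repeats the same alternate set with its own <loc>.
for (const v of variants) {
  console.log(urlEntry(v.href, variants));
}
```

Ten to fifteen alternates per entry is normal for an international site, so the verbosity you are seeing is expected rather than a sign something is wrong.
-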
ECWID: How to fix duplicate page content and external link issues
I am working on a site that has a HUGE number of duplicate pages due to the ECWID ecommerce platform. The site is built with Joomla! How can I rectify this situation? The pages also show up as "external" links on crawls... Is it the ECWID platform? I have never worked on a site that uses this. Here is an example of a page with the issue (there are 6,280 issues): http://www.metroboltmi.com/shop-spare-parts?Itemid=218&option=com_rokecwid&view=ecwid&ecwid_category_id=3560081
Web Design | Atlanta-SMO
-
Decreasing Page Load Time with Placeholder Images - Good Idea or Bad Idea?
In an effort to decrease our page load time, we are looking at making a change so that all product images on any page past page 1 load with a placeholder image. When the user clicks to the next page, it then loads all of the images for that page. Right now, all of the product divs are loaded into a Javascript array and loaded in chunks to the page display div. Product-heavy pages significantly increase load time, as the browser loads all of the images from the product HTML before the Javascript can rewrite the display div with page-specific product HTML. In order to get around this, we are looking at loading the product HTML with a small placeholder image and then substituting the appropriate product image URLs when each page is output to the display div. From a user-experience standpoint this change will be seamless: users won't be able to tell the difference, plus they will benefit from a potentially shorter wait when loading the images for the page in question. However, the source of the page will show every product image in a given category sharing the same placeholder image. How much of a negative impact will this have on SEO?
Web Design | airnwater
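For what it's worth, a common way to implement the swap is to keep the real URL in a data attribute and only assign it to src as the product scrolls near the viewport. A rough sketch, assuming markup like <img src="placeholder.gif" data-src="real.jpg"> (attribute and file names are illustrative):

```typescript
// Swap placeholder images for real ones as they approach the viewport.
// Assumes each product image carries its real URL in a data-src attribute.
const observer = new IntersectionObserver(
  (entries, obs) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      const img = entry.target as HTMLImageElement;
      const realSrc = img.dataset.src;
      if (realSrc) {
        img.src = realSrc; // triggers the real download
        img.removeAttribute("data-src");
      }
      obs.unobserve(img); // each image only needs swapping once
    }
  },
  { rootMargin: "200px" } // start loading a little before it's visible
);

document
  .querySelectorAll<HTMLImageElement>("img[data-src]")
  .forEach(img => observer.observe(img));
```

On the SEO side, a crawler that doesn't scroll or paginate will only see the placeholder src, so it's worth exposing the real image URLs somewhere crawlable as well, for example in an image sitemap.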