Can Googlebot read the content on our homepage?
-
Just for fun I ran our homepage through this tool:
http://www.webmaster-toolkit.com/search-engine-simulator.shtml
This spider detects little to no content on our homepage, while interior pages come through just fine. The tool looks pretty old. Does anyone here have a take on whether it is reliable? Should I just ignore the fact that it can't seem to spider our homepage?
Thanks!
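For what it's worth, simulators like that one typically just fetch the raw HTML without executing any JavaScript and then strip out the tags, so content that is injected client-side won't show up. Below is a minimal sketch of that kind of check, assuming Python with the requests and beautifulsoup4 packages installed; the URL is a placeholder.

```python
# Rough sketch of what an old "search engine simulator" does: fetch the raw
# HTML (no JavaScript execution) and strip the tags. Assumes the requests and
# beautifulsoup4 packages; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

resp = requests.get("http://www.example.com/", timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")

# Drop elements whose contents a text-only view would not show.
for tag in soup(["script", "style", "noscript"]):
    tag.decompose()

# Whatever prints here is roughly all a non-JavaScript spider can "see".
print(soup.get_text(separator="\n", strip=True))
```

If this prints your homepage copy but the simulator shows nothing, the tool itself is likely at fault; if it prints little or nothing, your content probably depends on JavaScript.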
-
Thanks, all! Yes, I was familiar with the "Text-only" version and Fetch as Googlebot, so I wasn't overly concerned. It just seemed odd that this particular spider couldn't get to the content. I think it is a very unsophisticated spider!
-
Assuming you've verified your site in Google Webmaster Tools, go to Crawl > Fetch as Googlebot. Enter the page's URL and have Googlebot fetch it. Once it's done, click the "Success" link, which shows you exactly what Googlebot fetched for that page. Make sure the source code you see there is what you expect.
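If you also want a quick approximation of that fetch from your own machine, you can request the page with Googlebot's user-agent string and inspect the returned source. This is only a sketch, assuming Python's requests package and a placeholder URL, and it won't render JavaScript the way Google itself may:

```python
# Approximate a raw Googlebot fetch: same user-agent string, no JavaScript.
# Assumes the requests package; the URL is a placeholder.
import requests

GOOGLEBOT_UA = (
    "Mozilla/5.0 (compatible; Googlebot/2.1; "
    "+http://www.google.com/bot.html)"
)

resp = requests.get(
    "http://www.example.com/",
    headers={"User-Agent": GOOGLEBOT_UA},
    timeout=10,
)

print(resp.status_code)   # anything other than 200 is worth investigating
print(resp.text[:1000])   # compare this source with what you expect to serve
```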
-
Hi Dana,
We would normally check with something like Website Auditor. I've run the tool on our own home page and it also seems to miss parts of our content; I'm not sure why. We've never had an issue with other tools, so I'd put it down to this particular tool.
Hope that helps.
-
Take a look at the text-only cached version of the page. If you are unsure how to do that, follow my crude instructions below.
To test whether Googlebot can view the content of my homepage, I do the following:
1. Do a Google search for 'site:example.com' and find your homepage in the results.
2. Next to the green URL in the SERP listing for your homepage there is a green arrow. Click it and select 'Cached'.
3. When viewing the cached version of the homepage, click 'Text-only version' in the bottom right corner of the grey bar that appears at the top of the browser.
If the content you are questioning shows up, Google has clearly been able to crawl and index it. If it is not there, there is a good chance it can't. Note that content in a hidden div will likely not appear in the text-only cache even if Google has crawled it.
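If you would rather script that last check, you can fetch the page and test whether a distinctive phrase from your homepage copy appears in the raw HTML a crawler receives. A minimal sketch, assuming Python's requests package; the URL and phrase are placeholders you would swap for your own:

```python
# Test whether a phrase from the visible page copy is present in the raw HTML
# that a crawler receives. The URL and phrase are placeholders.
import requests

URL = "http://www.example.com/"
PHRASE = "welcome to our homepage"  # swap in real copy from your page

html = requests.get(URL, timeout=10).text

if PHRASE.lower() in html.lower():
    print("Phrase found in the raw HTML, so a non-JS crawler can see it.")
else:
    print("Phrase missing: it may be injected by JavaScript or blocked.")
```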