Best posts made by RyanPurkey
-
RE: Pages are Indexed but not Cached by Google. Why?
The type of URL you posted will deliver a 404 from Google because they use a session code after the "cache:" parameter. If you go to their site, run a search, and select the down arrow to open the "Cached" and "Similar" menu, you should be able to click through and see their cached result. After doing so, you'll notice some code like ":l8wcNgU5elwJ:" before your URL. With it, the cached link works; delete it and you get a 404.
-
RE: I have removed a subdomain from my main domain. We have stopped the subdomain completely. However the crawl still shows the error for that sub-domain. How to remove the same from crawl reports.
Patrick's answer gives you a great checklist. I'd only add that within Moz Analytics you can customize the crawler to only report on a certain portion of the site if problems still arise in displaying your data. Still, by 301 redirecting the old subdomain to its new location, cleaning up any old referencing links, and blocking further indexation, you should see the errors disappear. Cheers!
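If you want a quick sanity check that those 301s are actually in place, a small script will do it. Here's a minimal Python sketch; the hostnames and paths below are placeholders for your own:

```python
# Check that old subdomain URLs return a 301 pointing at the new location.
# OLD_HOST, NEW_HOST, and the paths are placeholders -- swap in your own.
import requests

OLD_HOST = "http://old.example.com"   # the retired subdomain (placeholder)
NEW_HOST = "https://www.example.com"  # the main domain (placeholder)

for path in ["/", "/about/", "/products/"]:
    # allow_redirects=False so we inspect the redirect itself, not its target
    resp = requests.head(OLD_HOST + path, allow_redirects=False, timeout=10)
    target = resp.headers.get("Location", "(none)")
    ok = resp.status_code == 301 and target.startswith(NEW_HOST)
    print(f"{'OK   ' if ok else 'CHECK'} {path} -> {resp.status_code} {target}")
```

A 301 with a Location header pointing at the new domain for every old URL is what you want to see.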
-
RE: Pages are Indexed but not Cached by Google. Why?
The "site:" operator isn't as precise as what you'll find in Google Webmaster Tools (GWT) and doesn't return each and every page for a site Google has indexed for a few reasons. One reason, Google is preventing people without private access to the site from seeing each and every page it indexes for a site to keep that data from being scraped publicly. In your case, that's good if your competitor is running similar searches like you're doing now in the attempt to copy your site. Instead Google gives you that information privately via GWT.
The same goes for cached pages. The overarching reason is that it's about preventing over exposure publicly both in how Google operates and how a site is constructed. Ultimately you'll have to trust GWT and your own site's server records more than what you can find searching Google as an average user.
-
RE: SEO strategy for conversion-optimised home page
I think one of the best takeaways from Rand's work with Conversion Rate Experts is the understanding Rand gained from talking about his services in person, and how well those conversations "converted", versus how Moz talked about what it did and offered on the site. For your specific case the solution is probably similar: how would you first describe and introduce your product (home page, very well crafted), and then how would you address specific examples and use cases (blog posts referencing your core service, or other pages)?
Home pages can often rank for a robust set of terms, so you might be alright ranking with the smaller site format. Still, spend the time going through your Analytics carefully to see which pages you should keep and redesign versus which you could redirect to the higher-converting new ones. Also: test, test, test. Make sure the changes you're making are actual improvements. Optimizely should be able to help you in that regard: https://www.optimizely.com/statistics
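To make "test, test, test" concrete: the math behind those testing tools is typically a two-proportion significance test on your conversion rates. A rough, stdlib-only Python sketch with made-up visitor numbers, just to illustrate the idea:

```python
# Rough A/B significance check: did variant B really convert better than A?
# Two-proportion z-test; the visitor/conversion counts are made up.
from math import sqrt, erfc

def ab_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return p_a, p_b, erfc(abs(z) / sqrt(2))  # 2 * (1 - Phi(|z|))

p_a, p_b, p = ab_p_value(conv_a=120, n_a=4800, conv_b=156, n_b=4750)
print(f"A: {p_a:.2%}  B: {p_b:.2%}  p-value: {p:.4f}")
# Only roll out the new page if p is small (say < 0.05) and the lift matters.
```

Tools like Optimizely run this kind of calculation for you; the point is that a difference in raw conversion counts isn't an improvement until the sample is large enough to trust.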
If you're very local, spending time seeing how your referrals and leads arrive via sites like Yelp, Google Local and others would be good too. It sounds like you're on the right track though and just need to tie things together with Analytics.
-
RE: Pages are Indexed but not Cached by Google. Why?
I would caution against creating a tool to do what you're describing, as you might end up violating Google's terms of use. Instead, use a dedicated rank-monitoring tool, like Moz's pro feature set, around a specific set of keywords that have value for you, rather than tracking each and every page on your site. Chasing the immediate effect of ranking changes is akin to trying to precisely unravel Google's ranking algorithm, something Google very much doesn't want you to do. Instead, look at past performance of content (analytics, server logs, etc.) and whether or not it improves after changes. Improvement is also subjective: maybe you get fewer users and sessions, but much higher conversions...
Within GWT you're going to want to look at Index Status https://www.google.com/webmasters/tools/index-status and compare it with the number of pages in your sitemap(s). This most likely won't be an exact match, as Google at times limits how much of a site it caches and indexes based on its own determination of page worthiness (a high percentage of duplicate content on a page, for example). So look for a decent percentage of indexation rather than exact numbers. Also, having pages that perform really well for you indexed and ranking well is more important than 100 that don't.
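Counting the URLs in a sitemap is easy to script if you want a number to put next to GWT's Index Status figure. A minimal Python sketch; the sitemap URL is a placeholder, and nested sitemap index files aren't handled:

```python
# Tally the <loc> entries in a sitemap to compare against GWT's Index Status.
# SITEMAP_URL is a placeholder; sitemap *index* files would need a second pass.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as resp:
    root = ET.fromstring(resp.read())

urls = [loc.text for loc in root.findall(".//sm:loc", NS)]
print(f"{len(urls)} URLs listed in {SITEMAP_URL}")
# Indexed count / sitemap count gives the rough percentage discussed above.
```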
Ultimately the more precisely you try to deconstruct Google the more difficult things will be. Take old Ben's advice, "Let go..."
-
RE: Going to Mozcon - what to do in Seattle
Hey Andy. Seattle is great. I bet a bunch of mozzers could chime in on this one as they're there every day, but some other things to do: take a "duck" ride, go on a brewery tour, check out Pike Place Market... those are all classic Seattle activities.
-
RE: 2 businesses same phone number
I agree with Miriam and give a thumbs up to her thorough response. I would only add that not having a separate number for the second business is pretty inexcusable. With Skype, Anveo, or any of a myriad of other VoIP and cell options, getting another number is a dirt-cheap cost for someone running a legitimate business, and the value it adds should more than cover the $50 or so they'd spend yearly. If they still want the number to ring to the same line, they could just set up call forwarding on the new number... Again, all of that should cost less than $100 a year, and probably even less than that. Best of luck!
-
RE: Pages are Indexed but not Cached by Google. Why?
Yeup! Indexing time varies. You'll be able to tell the time between crawl and indexation by noting when Google shows version B of your page in its cache after you changed it from version A. So if the example.html page is already in Google's index, it goes like this:
1. You make changes to a page, example.html (version A is now version B).
2. Google crawls example.html (version B).
3. You check Google's cache to see whether example.html shows version A or B.
4. Still version A? Check again later... no? no? no? no?
5. Yes, version B. That's how long it takes.
OR, you make a new page. It gets crawled. You check whether it's indexed... no, no, no, no, yes?! That's how long it takes.
Again, this time period varies and having a site with excellent domain strength and trust usually makes it a shorter time period. It also tends to influence how many pages Google decides to keep in its index or show to users. Pretty much everything gets better for a site the stronger its domain authority and trust are.
-
RE: Is eLocal a scam or legitimate directory for local SEO?
eLocal has been discussed here in the past: http://moz.com/community/q/is-it-okay-to-use-elocal-services, and Kristy's thorough answer at the time still rings true: "There is nothing blackhat about using a citation building service like eLocal, Yext, etc. Rather, there are other considerations that need to be taken into account in determining whether this is the right move for you." I agree with her. It seems like you've got a handle on your site's listings, so eLocal wouldn't bring much additional value to the work you're already doing.
-
RE: What is Yandex and why do I care?
Yandex is another major search engine: http://en.wikipedia.org/wiki/Yandex. The webmaster tools it hosts are free: https://webmaster.yandex.com/. Like other search engines' offerings, they help with crawling, linking, and miscellaneous diagnostics. If your site is non-international it's probably not much of a worry, as Yandex is mostly a Russia-based search provider.