How do you check the Google cache for hashbang pages?
-
So we use http://webcache.googleusercontent.com/search?q=cache:x.com/#!/hashbangpage to check what Googlebot has cached, but when we try this method for hashbang pages we get x.com's cache... not x.com/#!/hashbangpage
That actually makes sense, because the hashbang is part of the homepage in that case, so I get why the cache returns the homepage.
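Worth spelling out why (a quick TypeScript illustration, nothing site-specific): everything after the # stays in the browser, so the cache lookup only ever receives the bare URL.

```typescript
// Everything after the # is a fragment; it is never sent in the HTTP request.
const lookup = new URL("http://webcache.googleusercontent.com/search?q=cache:x.com/#!/hashbangpage");
console.log(lookup.search); // "?q=cache:x.com/"  <- what the cache lookup actually asks for
console.log(lookup.hash);   // "#!/hashbangpage"  <- stays client-side, never reaches Google
```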
My question is: how can you actually look up the cache for a hashbang page?
-
I was actually trying to give you the tools to figure out what's cached and indexed. You can just run a site search for the content and look at the cache, though. For example:
If nothing shows up, it's probably not indexed.
-
Thanks Carson, but that wasn't the question.
The question was how to check the cache.
-
Generally I'd avoid hashes (#) or hashbangs (#!) if you have large amounts of content you want indexed behind them. Use pushState instead whenever it makes sense for the user to actually change the URL.
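For anyone reading along, here's a rough sketch of what the pushState approach looks like in a client-side app; renderView is a hypothetical stand-in for whatever your app uses to draw a view:

```typescript
// Hypothetical render function for this sketch; your app's own view logic goes here.
function renderView(path: string): void {
  const app = document.querySelector("#app");
  if (app) app.textContent = `Rendered view for ${path}`;
}

// Navigate with pushState so each view gets a real, crawlable URL
// (e.g. /stages/foo) instead of a hashbang (/#!/stages/foo).
function navigateTo(path: string): void {
  history.pushState({ path }, "", path); // updates the address bar, no full reload
  renderView(path);
}

// Keep the rendered view in sync when the user hits back/forward.
window.addEventListener("popstate", () => {
  renderView(location.pathname);
});
```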
The general rule is that if you can see the content in your page source (the Ctrl+U view), it's probably being indexed. That means client-side AJAX content behind hashbangs is generally not indexed, whereas server-side rendered content generally is.
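A quick way to sanity-check that rule is to fetch the raw HTML (what Ctrl+U shows) and look for a phrase from the content. A minimal sketch, assuming a runtime with fetch available; the URL and phrase are placeholders:

```typescript
// Sketch: does a phrase appear in the raw markup a crawler fetches (no JavaScript executed)?
async function isInPageSource(url: string, phrase: string): Promise<boolean> {
  const response = await fetch(url);
  const html = await response.text(); // raw HTML only, nothing client-rendered
  return html.includes(phrase);
}

isInPageSource("https://example.com/some-page", "a sentence from the page body").then((found) =>
  console.log(found ? "In the source: likely indexable" : "Not in the source: likely client-rendered only")
);
```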
If for some reason you must use hashbangs AND you must have client-rendered content, create an HTML snapshot of your page for Google. Generally, though, that's more effort than changing one of the above.
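If you do go the snapshot route, it boils down to detecting the _escaped_fragment_ parameter server-side and returning pre-rendered HTML. A bare-bones sketch using Node's built-in http module; renderSnapshot is a placeholder for however you pre-render your views:

```typescript
import { createServer } from "node:http";

// Placeholder: swap in your real pre-rendering (headless browser, templates, etc.).
function renderSnapshot(fragmentPath: string): string {
  return `<html><body><h1>Snapshot for ${fragmentPath}</h1></body></html>`;
}

createServer((req, res) => {
  const url = new URL(req.url ?? "/", "http://localhost");
  const fragment = url.searchParams.get("_escaped_fragment_");

  if (fragment !== null) {
    // Google requested ?_escaped_fragment_=/some/path, its crawlable stand-in for /#!/some/path.
    res.writeHead(200, { "Content-Type": "text/html" });
    res.end(renderSnapshot(fragment));
    return;
  }

  // Everyone else gets the normal client-rendered app shell.
  res.writeHead(200, { "Content-Type": "text/html" });
  res.end('<html><body><div id="app"></div><script src="/app.js"></script></body></html>');
}).listen(3000);
```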
-
I think Google has stopped responding to cache requests on hashbang pages altogether.
See here... **I'm just playing with random URLs and don't see Google's cache 404'ing as it should:** http://recordit.co/XBlo3U2A73
You can put anything in there; it won't work.
-
Searching for indexed & duplicate content: I put a line or two in quotes and Googled it. I found most of the UTM versions that way. Once you do that, it's a simple change to site:yoursite.com inurl:UTM.
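Purely as an illustration of that workflow (the domain and snippet below are placeholders), the two lookups are just search URLs you can build by hand or in code:

```typescript
// Illustrative only: build the two Google lookups described above for a snippet of page copy.
function buildLookups(domain: string, snippet: string) {
  return {
    // Exact-match search for the copy, to see which URL variants got indexed.
    contentSearch: `https://www.google.com/search?q=${encodeURIComponent(`"${snippet}"`)}`,
    // Narrow it to indexed URLs on your own site that carry UTM parameters.
    utmSearch: `https://www.google.com/search?q=${encodeURIComponent(`site:${domain} inurl:utm`)}`,
  };
}

console.log(buildLookups("yoursite.com", "a line or two of copy from the page"));
```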
-
Thanks a lot, Matt.
I'm curious... how exactly did you find the version with the UTM codes that's being cached?
-
Strangely, browseo sees it correctly: http://www.browseo.net/?url=https%3A%2F%2Fplaceit.net%2F%3F_escaped_fragment_%3D%2Fstages%2Fsamsung-galaxy-note-friends-park
I'm not 100% sure why this is happening on your site specifically. Normally the #! isn't too big of an issue for cache, but I've seen it have a few hiccups. These pages seem to be indexed fine, but they aren't generating a cache.
I did find a few working, but only those with UTM codes:
This doesn't look like it's working, but view the source code - the content is actually there. I found it by Googling the content in quotation marks.
-
What you're saying makes sense and our URLs are set up like this, but we still don't see the page's cache - just the homepage comes up when looking up the Google cache with the _escaped_fragment_ version:
http://webcache.googleusercontent.com/search?q=cache:https://placeit.net/?_escaped_fragment_=/stages/samsung-galaxy-note-friends-park
https://placeit.net/?_escaped_fragment_=/stages/samsung-galaxy-note-friends-park
homepage - http://webcache.googleusercontent.com/search?q=cache:https://placeit.net/?_escaped_fragment_=
-
Let's use a Wix site (not a client, just a sample from their page) as my example. Say you wanted to check:
http://www.kingskolacheny.com/#!press/crr2
In the source code I see the escaped fragment URL. This is the one you can find a cache for:
http://www.kingskolacheny.com/?_escaped_fragment_=press/crr2
That leads me to: http://webcache.googleusercontent.com/search?q=cache:http://www.kingskolacheny.com/?_escaped_fragment_=press/crr2
If your #! URLs are not set up this way, you will struggle to see it. One-page websites are ... one page. But if you have escaped fragment URLs set up, you should be able to submit those and go from there.
The easiest way I know to find these is Screaming Frog: the AJAX tab, Ugly URL field - try that one.
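If you'd rather script that mapping than eyeball it, here's a rough sketch of going from a #! URL to its _escaped_fragment_ equivalent and the matching webcache lookup (illustrative only; special characters in the fragment would need extra encoding per the AJAX crawling spec):

```typescript
// Sketch: map a hashbang URL to the ?_escaped_fragment_= URL Google crawls.
function toEscapedFragmentUrl(hashbangUrl: string): string {
  const url = new URL(hashbangUrl);
  if (!url.hash.startsWith("#!")) return hashbangUrl; // not a hashbang URL, leave it alone
  const fragment = url.hash.slice(2); // e.g. "press/crr2"
  return `${url.origin}${url.pathname}?_escaped_fragment_=${fragment}`;
}

// Sketch: the corresponding Google cache lookup for any URL.
function toWebcacheUrl(pageUrl: string): string {
  return `http://webcache.googleusercontent.com/search?q=cache:${pageUrl}`;
}

const ugly = toEscapedFragmentUrl("http://www.kingskolacheny.com/#!press/crr2");
console.log(ugly);                // http://www.kingskolacheny.com/?_escaped_fragment_=press/crr2
console.log(toWebcacheUrl(ugly)); // the cache URL to check
```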