Google cache tool help
-
This link is for the Ebay Google cache - http://webcache.googleusercontent.com/search?q=cache:www.ebay.com&strip=1
I wanted to do the same for my homepage, so I switched out the URLs and it worked. But when I try to use a different link, such as mysite.com/category, it won't work. I know my pages are indexed. Any ideas why it won't work for other pages?
-
Yes. Once you get to the cached page, there is a link at the bottom right of the grey bar at the top that says "Text-only version". Click on that, and there you have it.
-
I am trying to get the text-only version, but when I open the cache directly from search it shows the entire page. Is there a way to see text only?
-
Is there some particular reason you want to do it that way?
If you use the URL you are interested in as a search term, you can just click on the little green down arrow next to the green URL in the SERP and choose cache; this will show you what Google has cached and when it was cached.
-
Niners52,
Should work for a fully qualified URL. I just tried it for my root domain www.davenporttractor.com as well as one of our deep product-page URLs, www.davenporttractor.com/p-896-be-careful-plate-a-b-g-r-50-through-820-1948-1956.aspx
So I'm not sure which URL you are trying, but try removing the http:// portion of the URL you're inserting. That made a big difference in my experiment.
I suspect that you may not be able to use the wildcard /
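Putting the advice in this thread together, here is a minimal sketch of a helper that builds a cache-lookup URL. It assumes the webcache URL format quoted at the top of the thread (`http://webcache.googleusercontent.com/search?q=cache:...`), that `&strip=1` requests the text-only version, and that dropping the `http://` prefix from the target URL helps, as suggested above. The function name `cache_url` is just for illustration.

```python
from urllib.parse import quote

def cache_url(page_url: str, text_only: bool = True) -> str:
    """Build a Google cache lookup URL for a page.

    Per the thread: drop the scheme from the target URL, and append
    &strip=1 to request the text-only version of the cached page.
    """
    # Remove the scheme, since including it can cause the lookup to fail
    for prefix in ("http://", "https://"):
        if page_url.startswith(prefix):
            page_url = page_url[len(prefix):]
            break
    # Percent-encode the remainder, keeping path slashes intact
    url = ("http://webcache.googleusercontent.com/search?q=cache:"
           + quote(page_url, safe="/"))
    if text_only:
        url += "&strip=1"
    return url
```

For example, `cache_url("http://www.ebay.com")` reproduces the eBay link from the first post, and `cache_url("mysite.com/category")` gives the equivalent for a deeper page.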