Website Cached Version
-
Hi all
Why is my full content not appearing in the text-only version (cached version)? http://webcache.googleusercontent.com/search?q=cache:zakoopi.com&es_sm=93&strip=1
Original website link: http://www.zakoopi.com/
How can I resolve this issue?
-
Hi Ravi
When you select "Text-only version", it shows you the way that Googlebot (and potentially other robots and crawlers) sees your website. This is an important tool because it can tell you whether Google is picking up the content on your site. If you see nothing, then Google can't see your content, and you need to change that. Luckily for you, Google is seeing content.
However, it looks like you may be missing some basic SEO opportunities on this page - take a look at On-Page Factors.
Here are good posts from the Boostability Blog and SEO Theory with more information on Google's cache.
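If you want to check this programmatically rather than by eye, here is a minimal sketch (assuming Python with the `requests` library installed; Google may block or rate-limit automated hits on the cache endpoint, so treat the result as a rough signal). It builds the same strip=1 text-only cache URL used in your question and looks for a phrase from your page:

```python
# Minimal sketch: fetch Google's text-only cached copy of a page and
# check whether a phrase from your content survived into it.
import requests

def cached_text_contains(page_url: str, phrase: str) -> bool:
    # Same text-only cache URL format as in the question (strip=1).
    cache_url = ("http://webcache.googleusercontent.com/search"
                 f"?q=cache:{page_url}&strip=1")
    # A browser-like User-Agent avoids some automated-traffic blocks.
    resp = requests.get(cache_url,
                        headers={"User-Agent": "Mozilla/5.0"},
                        timeout=10)
    resp.raise_for_status()
    return phrase.lower() in resp.text.lower()

# "placeholder phrase" stands in for real wording from your homepage.
print(cached_text_contains("http://www.zakoopi.com/", "placeholder phrase"))
```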
Hope this all helps! Good luck!
Related Questions
-
Ecommerce Preferred URL Structure for Printing Website
Hello Mozers! We are adding ecommerce functionality to our existing website.
Our company offers a wide range of commercial printing and mail services. We have done a pretty good job over the years of building content, both in terms of our print offerings and a blog section highlighting those offerings. We have finally bitten the bullet and decided to add end-to-end ecommerce functionality: users will be able to price, pay, upload, and order through our website. My question to the community is which subfolder to use.
The ecommerce functionality is third-party software and needs to sit in a subfolder, and we can't seem to find a good fit. Most of our content pages for print items look something like this:
www.website/printing/ - pillar page
Examples of the URL structure for subpages:
www.website/printing/flyer-printing/
www.website/printing/booklet-printing/
www.website/printing/door-hangers/
www.website/printing/business-cards/
Options would be order-printing/ or prints/, but we were thinking /orders/ would be the best. We're not certain, though, and wanted some feedback from the community. If we did go this route, the URL structure would be: order/business-cards (this would be the default ecommerce page) and order/business-cards/full-uv-coaing-both-sides (individual product page). What are your thoughts?
Technical SEO | CheapyPP
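Purely as an illustration (not part of the question), a hypothetical Python sketch of how the existing /printing/ content pages could map onto the proposed /order/ pages, e.g. to seed a redirect map or internal links; the domain and slugs are the placeholders used in the question:

```python
# Map each /printing/ content page to its proposed /order/ counterpart.
content_pages = [
    "/printing/flyer-printing/",
    "/printing/booklet-printing/",
    "/printing/door-hangers/",
    "/printing/business-cards/",
]

for page in content_pages:
    slug = page.rstrip("/").split("/")[-1]  # e.g. "business-cards"
    print(f"www.website{page}  ->  www.website/order/{slug}/")
```
-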
Pages are Indexed but not Cached by Google. Why?
Hello, we run the Magento 2 extensions website mageants.com. For about a year, Google re-cached all of our pages roughly every 15 days, but for the last 15 days our pages have not been cached by Google; the cache shows a 404 error instead. I went to Search Console to check for errors but didn't find any, so I manually ran Fetch and Render, but most pages still show the same 404 error.
Example page: https://www.mageants.com/free-gift-for-magento-2.html
Error: http://webcache.googleusercontent.com/search?q=cache%3Ahttps%3A%2F%2Fwww.mageants.com%2Ffree-gift-for-magento-2.html&rlz=1C1CHBD_enIN803IN804&oq=cache%3Ahttps%3A%2F%2Fwww.mageants.com%2Ffree-gift-for-magento-2.html&aqs=chrome..69i57j69i58.1569j0j4&sourceid=chrome&ie=UTF-8
Does anyone have a solution for this issue?
Technical SEO | vikrantrathore
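As an illustration (not from the thread), a small Python sketch that reproduces the check described above, requesting the cache: URL for each page and reporting the HTTP status, so you can see which pages return the 404. It assumes the `requests` library is installed; Google may rate-limit or block automated requests, so treat the output as a rough signal:

```python
# Report the HTTP status of Google's cache endpoint for a list of pages.
import time
import requests
from urllib.parse import quote

pages = [
    "https://www.mageants.com/free-gift-for-magento-2.html",
    # add more URLs here
]

for page in pages:
    cache_url = ("http://webcache.googleusercontent.com/search?q="
                 + quote("cache:" + page, safe=""))
    status = requests.get(cache_url,
                          headers={"User-Agent": "Mozilla/5.0"},
                          timeout=10).status_code
    print(status, page)
    time.sleep(2)  # space out requests to avoid rate limiting
```
-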
US and UK Websites of Same Business with Same Content
Hello Community, I need your help to understand whether I can use the US website's content on my UK website.
US website's domain: https://www.fortresssecuritystore.com
UK website's domain: https://www.fortresssecuritystore.co.uk
Both websites have the same content on all pages, including testimonials/reviews. I am trying to gain business from AdWords and organic SEO marketing. Thanks.
Technical SEO | CommercePundit
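This isn't raised in the question itself, but for same-language sites targeting different regions, hreflang annotations are the mechanism Google documents for marking pages as regional variants rather than plain duplicates. A hypothetical Python sketch that generates the link tags both versions of a page would carry (the path is a placeholder):

```python
# Generate hreflang link tags pairing the US and UK versions of a page.
variants = {
    "en-us": "https://www.fortresssecuritystore.com",
    "en-gb": "https://www.fortresssecuritystore.co.uk",
}

def hreflang_tags(path: str) -> str:
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{root}{path}" />'
        for lang, root in variants.items()
    )

print(hreflang_tags("/placeholder-page/"))
```
-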
CSS and JavaScript files - website redesign project
UPDATED: We ran a crawl of the old website and have a list of CSS and JavaScript links that are part of the old website content. As the website has been redesigned from scratch, I don't think these old CSS and JavaScript files are being used for anything on the new site. I've read elsewhere online that you should redirect "all" content files when launching/migrating to a new site. We are debating whether this is needed for CSS and JavaScript files. Examples: (A) http://website.com/wp-content/themes/style.css (B) http://website.com/wp-includes/js/wp-embed.min.js?ver=4.8.1
Technical SEO | CREW-MARKETING
-
Why is Google's cache preview showing a different version of the webpage (i.e. not displaying content)?
My URL is: http://www.fslocal.com. Recently, we discovered that Google's cached snapshots of our business listings look different from what's displayed to users. The main issue: our content isn't displayed in cached results (although the content isn't visible on the front end of cached pages, the text can be found when you view the page source of that cached result).
These listings are structured so everything is coded and contained within one page (e.g. http://www.fslocal.com/toronto/auto-vault-canada/). But even though the URL stays the same, we've created separate "pages" of content (e.g. "About," "Additional Info," "Contact," etc.) for each listing, and only one "page" of content will ever be displayed to the user at a time. This is controlled by JavaScript and by using display:none in CSS.
Why do our cached results look different? Why would our content not show up in Google's cache preview, even though the text can be found in the page source? Does it have to do with the way we're using display:none? Are there negative SEO effects from how we're using it (i.e. we're employing it strictly for aesthetics, but is it possible Google thinks we're trying to hide text)?
Google's technical guidelines recommend against using "fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash." If we were to separate those business listing "pages" into actual separate URLs (e.g. http://www.fslocal.com/toronto/auto-vault-canada/contact/ would be the "Contact" page) and use static HTML instead of complicated JavaScript, would that solve the problem? Any insight would be greatly appreciated. Thanks!
Technical SEO | fslocal
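As a rough illustration of the setup described above (a sketch, not an official diagnostic), a Python snippet using `requests` and `beautifulsoup4` that measures how much of a page's text sits inside elements hidden with an inline display:none — roughly the text that "view source" contains but a text-only rendering would drop. It only catches inline styles, not rules living in external CSS files:

```python
# Compare a page's full text with the text left after removing
# elements hidden via inline display:none.
import re
import requests
from bs4 import BeautifulSoup

resp = requests.get("http://www.fslocal.com/toronto/auto-vault-canada/",
                    timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")
full_text = soup.get_text(" ", strip=True)

# Drop every subtree whose inline style hides it.
for tag in soup.find_all(style=re.compile(r"display\s*:\s*none")):
    tag.decompose()
visible_text = soup.get_text(" ", strip=True)

hidden = len(full_text) - len(visible_text)
print(f"~{hidden} characters of text sit inside display:none containers")
```
-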
Websites on the same C-class IP address
If two websites are on the same C-class IP address, what does that mean? Do the two websites belong to the same company?
Technical SEO | seoug_2005
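As a quick illustration, a small Python sketch that resolves two hostnames and compares the first three octets of their IPv4 addresses (the "C class"). The hostnames are placeholders. A shared C class generally just means the sites sit in the same /24 network block, often the same hosting provider; it does not by itself mean one company owns both:

```python
# Compare the /24 ("C class") network of two hostnames.
import socket

def c_class(host: str) -> str:
    ip = socket.gethostbyname(host)          # first IPv4 address
    return ".".join(ip.split(".")[:3])       # e.g. "93.184.216"

a, b = "example.com", "example.org"          # placeholder hostnames
print(a, c_class(a))
print(b, c_class(b))
print("same C class:", c_class(a) == c_class(b))
```
-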
Website has been penalized?
Hey guys, we have been link building and optimizing our website since the beginning of June 2010. Around August-September 2010, our site appeared on the second page for the keywords we were targeting for around a week. It then dropped off the radar, although we could still see our website at #1 when searching for our company name, domain name, etc. So we figured we had been put into the 'Google sandbox'. That was fine; we dealt with that.
Then in December 2010, we appeared on the first page for our keywords and maintained first-page rankings, even moving up the top 10 for just over a month. On January 13th 2011, we disappeared from Google for all of the keywords we were targeting; we don't even come up in the top pages for a company-name search. We do still come up when searching for our domain name in Google, and we are being cached regularly.
Before we dropped off the rankings in January, we made some semi-major changes to our site: changing meta descriptions, changing content around, and adding a disclaimer to our pages with click-tracking parameters (this is when SEOmoz prompted us that our disclaimer pages were duplicate content). So we added the disclaimer URL to our robots.txt so Google couldn't access it, made the disclaimer an onclick link instead of an href, added nofollow to the link, and told Google to ignore these parameters in Google Webmaster Central. We have fixed the duplicate-content side of things now, and we have continued to link build and add content regularly.
Do you think the duplicate content (on over 13,000 pages) could have triggered a loss in rankings? Or do you think it's something else? We changed index pages' meta descriptions and some subpages' titles and descriptions. We also fixed HTML errors flagged in Google Webmaster Central and SEOmoz. The only other reason I think we could have been penalized is having a link exchange script on our site, where people could add our link to their site and add theirs to ours, but we applied the nofollow attribute to those outbound links. Any information that will help me get our rankings back would be greatly appreciated!
Technical SEO | bigtimeseo