Cache my page
-
So I need to get this page cached: http://www.flowerpetal.com/index.jsp?info=13
It's been 4-5 months since it was uploaded. Now it's linked to from the homepage of a PR5 site. I've tweeted that link 10 times, facebooked it, stumbled it, linked to it from other articles, and still nothing. And I've submitted the URL to Google twice.
Any thoughts?
Thanks
Tyler
-
Thanks! It's weird that it has been seen but doesn't show as being cached: http://webcache.googleusercontent.com/search?q=cache%3Ahttp%3A%2F%2Fwww.flowerpetal.com%2Findex.jsp%3Finfo%3D13&pws=0&hl=en&num=10
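For reference, that cache-check URL is just Google's cache viewer with the page URL percent-encoded into the `q=` parameter. A minimal sketch of how it's built (Python standard library only):

```python
# Build the "cache:" check URL for Google's cache viewer.
# The page URL must be percent-encoded before going into q=.
from urllib.parse import quote

page = "http://www.flowerpetal.com/index.jsp?info=13"
cache_check = (
    "http://webcache.googleusercontent.com/search?q="
    + quote("cache:" + page, safe="")  # encodes ':', '/', '?', '=' etc.
    + "&pws=0&hl=en&num=10"
)
print(cache_check)
```

If that URL returns Google's "no cache" error page, the page has no stored cache copy even though it may still be indexed.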
-
The URL is in the index, it just doesn't have a meta description or cache link displayed. I did a search for ["Finding the Right Cremation Urn" site:flowerpetal.com] and that page does show up in the results.
It doesn't totally answer your question, but at least the page itself is indexed, and that can be a starting point for further investigation.
-
Thanks @wissam. Which GWT should I use? GWT Designer?
-
When you say you "submitted the url to google twice", did you submit it at google.com/addurl? If so, forget it! Go into your Google Webmaster Tools and submit an XML sitemap. Make sure it is clean and has no 404s. If you did submit an XML sitemap, then I don't know. Also, I can see you already went through site verification, so I don't know why it wouldn't be cached.
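For anyone following along, a minimal sitemap.xml looks like the fragment below (only `<loc>` is required per URL; the URL shown is the one from this thread, and every URL listed should return a 200, not a 404):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.flowerpetal.com/index.jsp?info=13</loc>
  </url>
</urlset>
```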
-
lol ok
When you click submit in a header-response checker, you should get:
| Status: HTTP/1.1 200 OK |
- Log in to your Google Webmaster Tools (I assume you already added and verified your site).
Go to your site dashboard; under "Diagnostics", go to "Crawl errors" and double-check whether Google is hitting crawl errors while fetching your page.
- In your GWT dashboard, go under "Diagnostics", choose "Fetch as Googlebot", enter your URL, and click Fetch.
What's going to happen is Googlebot will fetch that page and show you what it sees.
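If you don't have a header-checker tool handy, the same status check can be sketched with Python's standard library (the commented-out call uses the URL from this thread and needs network access to actually run):

```python
# Check what status code a page's header response reports.
import urllib.request

def parse_status_line(status_line: str) -> int:
    """Extract the code from a raw status line like 'HTTP/1.1 200 OK'."""
    return int(status_line.split()[1])

def header_status(url: str) -> int:
    """Issue a HEAD request and return the HTTP status code."""
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        return resp.status  # 200 = OK; 3xx/4xx/5xx need investigating

# header_status("http://www.flowerpetal.com/index.jsp?info=13")
```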
-
That sounds great! How do I see if it's giving a 200?
What would I do in GWT?
How can I fetch as Googlebot in Labs?
Damn, you got me on all three.
-
Tyler,
I would do two things:
- Check the page header and double-check that it's giving a 200 OK.
- Go into GWT to check this URL, and under Labs use "Fetch as Googlebot".
Please let us know the results.
Related Questions
-
I want to move some pages of my website to a folder and nav menu in those pages should only show inner page links, will it hurt SEO?
Hi, my website has a few SaaS products. To keep the website simple, I want to move some pages into product-specific folder structures, e.g. website.com/product1/features, website.com/product1/pricing, website.com/product1/information, and the same for product2 and so on. The website.com/product1/... menu will only show the links of product1 plus one link to the homepage (possibly in the footer). Please share your opinion on whether this is a good idea. From a UI perspective it will be simple, but I am not sure about the SEO perspective. Please help, thanks.
Technical SEO | | webbeemoz
-
How to find orphan pages
Hi all, I've been checking these forums for an answer on how to find orphaned pages on my site, and I can see a lot of people saying I should cross-check my XML sitemap against a Screaming Frog crawl of the site. However, the sitemap is created using Screaming Frog in the first place (I'm sure this is the case for a lot of people too). Are there any other ways to get a full list of orphaned pages? I assume it would be a developer request, but where can I ask them to look / extract? Thanks!
Technical SEO | | KJH-HAC
-
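On the orphan-page question above: since the sitemap was generated by the same crawl, compare the crawl against an independent URL source instead (server logs or analytics exports). The core check is just a set difference; a hypothetical sketch with made-up paths:

```python
# Orphan candidates = URLs known from an independent source (server
# logs, analytics) that a link-following crawl never reached.
log_urls = {"/", "/features", "/pricing", "/old-landing-page"}
crawled_urls = {"/", "/features", "/pricing"}

orphan_candidates = log_urls - crawled_urls
print(sorted(orphan_candidates))  # pages with traffic but no internal links
```

In practice the two sets would come from parsing an access log and a crawler's URL export rather than being typed in.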
Pages Not Getting Indexed
Hey there, I have a website with pretty much 3-4 pages. All of them had a canonical pointing to one page, and the same content (which happened by mistake). I removed that canonical URL and added one pointing to each page itself. I also added the original content that was supposed to be there to begin with. It's been weeks, but those pages are not getting indexed in the SERPs, while the one they used to point to with the canonical does.
Technical SEO | | AngelosS
-
Moving Some Content From Page A to Page B
Page A has written content, pictures, and videos. The written content from Page A is being moved to Page B. When Google crawls the pages next time around, will Page B receive the content credit? Will there be any issues because this content originally belonged to Page A? Page A is not a page I want to rank for (it just has great pictures and videos for users). Can I 301 redirect from Page A to B since the written content from A has been deleted, or is there no need? Again, I intend to keep Page A live because it has good value for users who want to see the pictures and videos.
Technical SEO | | khi5
-
My website pages are not crawled, what to do?
Hi all. I have made some changes on the website, so I would like them to be crawled by the search engines, Google especially. I made these changes around 2 weeks ago. I have submitted my website to good bookmarking websites. I also used the "Fetch as Google" tool available in Google Webmaster Tools and resubmitted the sitemap.xml. Still my pages are not crawled. Your opinion please. Thanks
Technical SEO | | lucidsoftech
-
How to verify a page-by-page level 301 redirect was done correctly?
Hello, I told some tech guys to do a page-by-page relevant 301 redirect (as talked about in Matt Cutts' video https://www.youtube.com/watch?v=r1lVPrYoBkA) when a company wanted to move to a new domain while their site was getting redesigned. I found out they did a 302 redirect by accident and had to fix that, so now I don't trust that they did the page-by-page relevant redirect. I have a feeling they just redirected all of the pages on the old domain to the homepage of the new domain. How could I confirm this suspicion? I run the old domain through Screaming Frog and it only shows 1 URL: the homepage. Does that mean they took all of the pages on the old domain offline? Thanks!
Technical SEO | | EvolveCreative
-
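On the 301 question above: one way to confirm the suspicion is to request each old URL without following redirects and record where it points; if every Location header is the new homepage, the mapping was collapsed. A sketch using only Python's standard library (the helper at the end is testable offline; the fetch itself needs network access):

```python
# Record each old URL's redirect target without following it.
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # returning None stops urllib from following 3xx

def redirect_target(url):
    """Return (status_code, Location header or None) for one URL."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open(url)
        return resp.status, None            # no redirect happened
    except urllib.error.HTTPError as err:   # the unfollowed 3xx lands here
        return err.code, err.headers.get("Location")

def looks_collapsed(targets):
    """True when several old pages all redirect to one destination."""
    return len(targets) > 1 and len(set(targets)) == 1
```

Run `redirect_target` over the old domain's page list and check both that the codes are 301 (not 302) and that `looks_collapsed` over the Location values is False.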
Why are pages linked with URL parameters showing up as separate pages with duplicate content?
Only one page exists, yet I link to it with different URL parameters for tracking purposes, and for some reason it is showing up as separate pages with duplicate content. Help? rpcIZ.png
Technical SEO | | BlueLinkERP
-
Ranking above PLACE PAGES
What does it take for results to show up above Place Page results? It seems like Google Local gets a lot of emphasis. Any thoughts?
Technical SEO | | musillawfirm