Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
My website hasn't been cached for over a month. Can anyone tell me why?
-
I have been working on an eCommerce site www.fuchia.co.uk.
I asked an earlier question about how to get it working and ranking, and I took on board what people said (such as optimising product pages), and I think I'm getting there.
The problem I have now is that Google hasn't indexed my site in over a month, and the homepage cache 404s when I check it on Google. At the moment there is a problem with the site being live on both the www and non-www versions; I have told Google in Webmaster Tools which preferred domain to use, and will also be getting the developers to 301 redirect to the preferred domain. Could this be the problem stopping Google from properly indexing me? Also, only around 30 of 137 pages were indexed from the last crawl.
Can anyone tell me or suggest why my site hasn't been indexed in such a long time?
Thanks
-
Fair point about the Sitemap. Thanks a lot, I'll take these on board and see what happens from there.
Thanks,
-
A cache won't be built or updated overnight, so sometimes the first few caches are a waiting game. How long has this site been live? If it's fairly new, what you're experiencing is common. If it's an older site and you recently started changing a lot of the technical stuff - redirects, canonicals, etc. - it may just take a little while to settle in.
The other major recommendation I would give you is to make your sitemap's "change frequency" values more accurate. Does this page http://www.fuchia.co.uk/products/clothing/dresses/dog-tooth-print-dress.aspx really change "daily"? By putting daily on every page you aren't helping Google prioritize its crawl, which means you may get a cache for your dog-tooth print dress before you get a new cache for your main page.
So I would fix that, resubmit the sitemap, and then it's a waiting game. It could be a week, it could be two; I've seen it take almost a month, but not if you use Google+.
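As a sketch, a sitemap with more realistic change frequencies might look like this (the product URL is taken from the answer above; the frequency and priority values are illustrative only):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Homepage changes often: suggest a frequent crawl -->
  <url>
    <loc>http://www.fuchia.co.uk/</loc>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <!-- A single product page rarely changes once published -->
  <url>
    <loc>http://www.fuchia.co.uk/products/clothing/dresses/dog-tooth-print-dress.aspx</loc>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```

Note that changefreq and priority are hints, not directives; the point is simply to stop telling Google every page changes daily.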
-
Hi Matt,
I used PingDevice and it's pinging fine.
I will work on the Google+ suggestion.
I have resubmitted a sitemap for both fuchia.co.uk and www.fuchia.co.uk, as I verified ownership of both to allow me to set the preferred domain. I submitted one this morning, so maybe that will help. But we will see.
It seems like the main priority at the moment is getting everything redirected and canonicalised, and then seeing if that helps anything.
-
Hi Sanket,
The site has been live for around 3 months I would say.
-
I've found that if you manually ping Google, they often update their cache at the same time.
Google doesn't have a cache for either cache:www.fuchia.co.uk or cache:fuchia.co.uk, so I don't think it's a canonical issue.
I would suggest a few things:
-
Use PingDevice http://www.pingdevice.com/
-
Put your main domain in a Google Plus post every now and then.
-
Resubmit a sitemap. Usually this gets you crawled fairly quickly and possibly updates your cache.
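For context on the "ping" step: tools like the one above simply issue an HTTP GET to a search engine's ping endpoint with the sitemap URL passed as a query parameter (Google's ping endpoint has since been retired, so this is historical). A minimal Python sketch of building such a request URL:

```python
from urllib.parse import urlencode

def sitemap_ping_url(sitemap_url: str) -> str:
    """Build the (now-retired) Google sitemap ping URL for a given sitemap."""
    # urlencode percent-escapes the sitemap URL so it survives as a query value
    return "https://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

print(sitemap_ping_url("http://www.fuchia.co.uk/sitemap.xml"))
# → https://www.google.com/ping?sitemap=http%3A%2F%2Fwww.fuchia.co.uk%2Fsitemap.xml
```

Fetching that URL was all a "ping" amounted to; resubmitting through Webmaster Tools achieves the same thing.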
-
Hi,
Your site is open both with and without www, so that is a major problem: you have to set up a proper 301 redirect in the .htaccess file. You also need to implement rel=canonical on your site; I did not find that code. I see 243 pages of your site are indexed by Google. Can I ask about the age of your domain? When did you take this site live?
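The non-www-to-www redirect described above could look something like this in Apache's .htaccess (a minimal sketch assuming mod_rewrite is enabled; adjust the domain and protocol to match the site):

```apache
# Permanently (301) redirect all non-www requests to the www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^fuchia\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.fuchia.co.uk/$1 [R=301,L]
```

Each page would then also carry a self-referencing canonical in its head, e.g. `<link rel="canonical" href="http://www.fuchia.co.uk/" />` on the homepage, so alternate hostnames and stray parameters consolidate to one URL.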
Related Questions
-
Why some domains and sub-domains have same DA, but some others don't?
Hi, I noticed that for some blog providers in my country, which provide a sub-domain address for their blogs, the sub-domain's authority is exactly the same as the main domain's, whereas for other blog providers every subdomain has its own, lower authority. For example, "ffff.blog.ir" and "blog.ir" both have a domain authority of 60. It is noteworthy that "ffff.blog.ir" does not even exist! Meanwhile, mihanblog.com and hfilm.mihanblog.com have different page authority.
Intermediate & Advanced SEO | | rayatarh5451230 -
If my website uses a CDN, can thousands of 301 redirects harm the website's performance?
Hi, If my website uses a CDN, can thousands of 301 redirects harm the website's performance? Thanks Roy
Intermediate & Advanced SEO | | kadut1 -
Can't generate a sitemap with all my pages
I am trying to generate a sitemap for my site nationalcurrencyvalues.com, but none of the tools I have tried pick up all of my 70,000 HTML pages... I have found that the one at check-domains.com crawls all my pages, but when it writes the XML file most of them are gone, seemingly at random. I have used this same site before and it worked without a problem. Can anyone help me understand why this is, or point me to a utility that will map all of the pages? Kindly, Greg
Intermediate & Advanced SEO | | Banknotes0 -
Will disallowing URLs in the robots.txt file stop those URLs being indexed by Google?
I found a lot of duplicate title tags showing in Google Webmaster Tools. When I visited the URLs that these duplicates belonged to, I found that they were just images from a gallery that we didn't particularly want Google to index. There is no benefit to the end user in these image pages being indexed in Google. Our developer has told us that these URLs are created by a module and are not "real" pages in the CMS. They would like to add the following to our robots.txt file: Disallow: /catalog/product/gallery/ QUESTION: If these pages are already indexed by Google, will this adjustment to the robots.txt file help to remove them from the index? We don't want these pages to be found.
Intermediate & Advanced SEO | | andyheath0 -
Duplicate Content through 'Gclid'
Hello, We've had the well-known problem of duplicate content through the gclid parameter added by Google AdWords. As per Google's recommendation, we added the canonical tag to every page on our site so that when the bot came to each page it would go 'Ah-ha, this is the original page'. We also added the parameter to the URL parameters in Google Webmaster Tools. However, it now seems that a canonical is automatically being given to these newly created gclid pages; see below: https://www.google.com.au/search?espv=2&q=site%3Awww.mypetwarehouse.com.au+inurl%3Agclid&oq=site%3A&gs_l=serp.3.0.35i39l2j0i67l4j0i10j0i67j0j0i131.58677.61871.0.63823.11.8.3.0.0.0.208.930.0j3j2.5.0....0...1c.1.64.serp..8.3.419.nUJod6dYZmI Therefore these new pages are now being indexed, causing duplicate content. Does anyone have any idea what to do in this situation? Thanks, Stephen.
Intermediate & Advanced SEO | | MyPetWarehouse0 -
My landing pages don't show up in the SERPs, only my frontpage does.
I am having some trouble getting the landing pages for a client's website to show up in the SERPs. As far as I can see, the pages are optimised well, and they also get indexed by Google. The website is a Danish webshop that sells wine, www.vindanmark.com. Take for instance this landing page, http://www.vindanmark.com/vinhandel/ It is optimised for the keywords "Vinhandel Århus". Vinhandel means "wine store" and Århus is a Danish city. As you can see, I manage to get them onto page 1 (#10), but it's the frontpage that ranks for the keyword. And this goes for all the other landing pages as well. I can't figure out why the frontpage keeps outranking the landing pages on every keyword. What am I doing wrong here?
Intermediate & Advanced SEO | | InmediaDK1
What Happens If a Hreflang Sitemap Doesn't Include Every Language for Missing Translated Pages?
As we are building a hreflang sitemap for a client, we are correctly implementing the tag across 5 different languages including English. However, the News and Events section was never translated into any of the other four languages. There are also a few pages that were translated into some but not all of the 4 languages. Is it good practice to still list out the individual non-translated pages like on a regular sitemap without a hreflang tag? Should the hreflang sitemap include the hreflang tag with pages that are missing a few language translations (when one or two language translations may be missing)? We are uncertain if this inconsistency would create a problem and we would like some feedback before pushing the hreflang sitemap live.
Intermediate & Advanced SEO | | kchandler0 -
How important is the optional <priority> tag in an XML sitemap of your website? Can this help search engines understand the hierarchy of a website?
Can the <priority> tag be used to tell search engines the hierarchy of a site, or should it be used to let search engines know the priority in which we want pages to be indexed?
Intermediate & Advanced SEO | | mycity4kids0