Can't get Google to index our site although everything seems fine
-
Hi there,
I am having issues getting our new site, https://vintners.co, indexed by Google, even though all the technical and content requirements seem to be in place. In the past, I've had far poorer websites with much worse setups and performance get indexed faster.
What concerns me, among other things, is that Google's crawler visits from time to time (according to Google Search Console) but doesn't seem to make progress or even follow any links, and the behaviour doesn't match what Google describes in the GSC help. For instance, our sitemap.xml was submitted, and for a few days it seemed to have an impact: many pages appeared in the coverage report as "Discovered - currently not indexed". Now they have disappeared from the coverage report, as if they were no longer detected at all.
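For what it's worth, the sitemap is a standard file of this shape (a minimal sketch of the format; the second URL is just a placeholder, not one of our real entries):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per indexable page; paths below are illustrative -->
  <url>
    <loc>https://vintners.co/</loc>
  </url>
  <url>
    <loc>https://vintners.co/blog/example-post/</loc>
  </url>
</urlset>
```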
Does anybody have advice on speeding up the indexing of a new website like ours? It launched almost two months ago, and I expected it to get indexed quickly, at least for some core keywords.
-
Hello, I just saw this thread of comments. For Google to index your pages properly you need to give it time, but more importantly, make sure those pages are high quality, from the content to the keywords and structure.
In some cases, when a page is translated, Google can detect a low-quality or machine-generated translation and may not index that page properly. It is advisable to have your web content translated by agencies that offer professional website translation services, such as Blarlo.
-
@pau4ner Thanks, very helpful and somewhat comforting. For the FR translations, we are indeed doing them "a posteriori". Since we have canonical tags in place, I thought it couldn't hurt, even if it is of course better once everything has been translated.
For the category pages that you suggest noindexing, are you thinking of pages like https://vintners.co/regions/france-greater-than-loire-greater-than-anjou, which are probably useful as internal links but are just content listings?
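If so, I'm guessing a robots meta tag in the head of those listing pages is what you mean, something like this (a sketch; noindex keeps them out of the index while still letting the crawler follow their links):

```html
<!-- On thin category/listing pages: stay crawlable, but don't get indexed -->
<meta name="robots" content="noindex, follow">
```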
-
I have the same problem. I created a gross-to-net salary calculator for Germany, and I proactively reply to questions on Quora, in Facebook groups, and on Reddit with links to my site, but it still isn't indexed. Also, when I check my site's speed score, I only get 30 for mobile. How much does this influence indexing speed?
-
Hi, nowadays new sites can take longer to get indexed by Google. Your site is indexed, although not all of its pages are.
I also see that you have submitted two sitemaps, which isn't really necessary in your case. Having had a quick look at other technical issues, everything seems fine to me.
I would try to get some more external links and also add more internal linking. Your site has many outbound links in your posts but almost no internal links, and that is something I would change. I would also noindex thin content pages, such as some unnecessary category pages.
I assume the content is 100% original and hreflang is implemented across your language variations. I have noticed, however, that in the French version you still keep your titles and blog posts in English. They are canonicalised to your original English content, but I'd probably have them translated and add hreflang instead.
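Once the French versions are fully translated, each page would self-canonicalise and declare its alternates, something like this in the head of the English page (the /fr/ path is just an illustrative pattern, not your actual URL structure):

```html
<!-- English page: self-referencing canonical plus the full hreflang set -->
<link rel="canonical" href="https://vintners.co/blog/example-post/">
<link rel="alternate" hreflang="en" href="https://vintners.co/blog/example-post/">
<link rel="alternate" hreflang="fr" href="https://vintners.co/fr/blog/example-post/">
<link rel="alternate" hreflang="x-default" href="https://vintners.co/blog/example-post/">
```

The French page would carry the same hreflang set but canonicalise to its own URL; hreflang only counts when both versions reference each other.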
Apart from that, getting more links to your website and improving your internal-to-outbound link ratio should work. I've had some recent cases where this helped, although it took 4-5 months to solve the indexing issues.
-
@tom-capper Thanks for the reply. The concern is indeed that only a few pages have been indexed (ranking will be a concern later). Although Google has discovered the sitemap, and Google Search Console says all the pages have been discovered, they don't seem to be getting indexed.
I'm surprised, as I've had websites in the same field, also with very few external links, that got indexed much faster!
-
The site appears to be indexed, but maybe not all pages.
Is your concern that the other pages are not being indexed, or that the pages that are already indexed are not ranking for any keywords?
I suspect in either case it doesn't help that this site has almost no external links (DA 1). With this level of obscurity, Google won't prioritise crawl resources for it.