Can't get Google to index our site, although everything seems fine
-
Hi there,
I am having trouble getting our new site, https://vintners.co, indexed by Google, although all the technical and content requirements seem to be in place. In the past I've had far poorer websites, with very bad setups and performance, get indexed faster.
What concerns me, among other things, is that Google's crawler visits from time to time (judging by Google Search Console) but doesn't seem to make progress or even follow any links, and the behaviour doesn't match what Google describes in the GSC help. For instance, after our sitemap.xml was submitted it seemed to have an impact for a few days: many pages appeared in the coverage report as "Discovered - currently not indexed". Now they have disappeared from the coverage report, as if they had never been detected.
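As a sanity check that the submitted sitemap actually lists what you think it does, you can parse it and count the `<loc>` entries before comparing against the coverage report. A minimal sketch in Python; the embedded XML is a stand-in for the fetched file, and the URLs are illustrative, not the site's actual ones:

```python
# Quick sanity check: parse a sitemap and list the URLs Google should see.
# In practice you would fetch https://vintners.co/sitemap.xml (as bytes)
# with urllib and feed that in instead of the sample string below.
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text: str) -> list[str]:
    """Return every <loc> listed in a urlset sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://vintners.co/</loc></url>
  <url><loc>https://vintners.co/regions/france</loc></url>
</urlset>"""

urls = sitemap_urls(sample)
print(len(urls), urls)
```

If the count here differs from the number of pages GSC says it discovered from the sitemap, that mismatch is worth investigating first.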
Does anybody have advice on speeding up the indexing of a new website like ours? It launched almost two months ago, and I expected to get indexed quickly, at least on some core keywords.
-
Hello, I just saw this thread. For Google to index your pages well you need to give it time, but more importantly, make sure those pages are of good quality, from the content to the keywords and structure.
In some cases, when a page has been translated, Google can detect a low-quality or machine translation and may not index that page properly. It is advisable to have your web content translated by agencies that offer professional website translation services, such as Blarlo.
-
@pau4ner Thanks, very helpful and somewhat comforting. For the FR translations, we are indeed doing them a posteriori. Since we have a canonical set, I figured it couldn't hurt, even though it is of course better once everything has been translated.
For the category pages you suggest noindexing, are you thinking of pages like https://vintners.co/regions/france-greater-than-loire-greater-than-anjou, which are probably useful as internal links but are just content listings?
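For reference, the noindex itself is lightweight. A minimal sketch (the exact template hook depends on your CMS) for a listing page you want crawled for its links but kept out of the index:

```html
<!-- In the <head> of a thin listing/category page -->
<!-- "noindex, follow" keeps the page out of the index while still
     letting Googlebot follow its internal links -->
<meta name="robots" content="noindex, follow">
```

Using `follow` rather than `nofollow` preserves the page's value as an internal-linking hub even though the page itself is excluded.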
-
I have the same problem. I created a gross-to-net salary calculator for Germany, and I proactively reply to questions on Quora, Facebook groups, and Reddit with links to my site. But my site is still not indexed. Also, when I check my site's speed I only get a score of 30 for mobile. How much does this influence indexing speed?
-
Hi, nowadays new sites can take longer to get indexed by Google. Your site is indexed, although not all of its pages are.
I also see that you have submitted two sitemaps; that isn't really necessary in this case. Having a quick look at other technical issues, everything seems fine to me.
I would try to get some more external links and also add more internal linking. This website has many outbound links in your posts but almost no internal links, and that is something I would change. I would also noindex thin-content pages, such as unnecessary category pages.
I assume the content is 100% original and hreflang is implemented across your language variations. I have noticed, however, that in the French version you still keep your titles and blog posts in English. They are canonicalised to your English originals, but I'd probably have them translated and add hreflang instead.
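To verify that hreflang annotations are actually in place across the language variations, you can extract them from a page's markup. A small sketch using only the standard library; the embedded HTML and URLs are stand-ins, not the site's actual markup:

```python
# Minimal sketch: pull hreflang annotations out of a page's HTML so you can
# check that the EN/FR variations reference each other both ways.
from html.parser import HTMLParser

class HreflangParser(HTMLParser):
    """Collects <link rel="alternate" hreflang="..."> tags from a page."""

    def __init__(self):
        super().__init__()
        self.alternates = {}  # hreflang value -> href

    def handle_starttag(self, tag, attrs):
        if tag != "link":
            return
        a = dict(attrs)
        if a.get("rel") == "alternate" and "hreflang" in a:
            self.alternates[a["hreflang"]] = a.get("href")

# Stand-in for a fetched page; in practice, fetch each language variation
# and confirm every one lists the full set of alternates.
page = """<head>
<link rel="alternate" hreflang="en" href="https://vintners.co/blog/post" />
<link rel="alternate" hreflang="fr" href="https://vintners.co/fr/blog/post" />
<link rel="alternate" hreflang="x-default" href="https://vintners.co/blog/post" />
</head>"""

parser = HreflangParser()
parser.feed(page)
print(parser.alternates)
```

Running this against both the English and French versions of the same post should yield the same set of alternates; a missing return-link is one of the most common hreflang mistakes.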
Apart from that, getting more links to your website and improving the internal-link/outbound-link ratio should work; I've had some recent cases where this helped, although it took 4-5 months to resolve the indexing issues.
-
@tom-capper Thanks for the reply. The concern is indeed that only a few pages have been indexed (ranking will be a concern later). Although the sitemap has been discovered by Google, and Google Search Console says the engine has discovered all the pages, it seems they are not being indexed.
I'm surprised, as I've had websites in the same domain, also with very few external links, that got indexed much faster!
-
The site appears to be indexed, but maybe not all pages.
Is your concern that the other pages are not being indexed, or that the pages that are already indexed are not ranking for any keywords?
I suspect that in either case it doesn't help that this site has almost no external links (DA 1); at this level of obscurity, Google will not prioritise crawl resources for it.