Can't get Google to index our site although everything seems in place
-
Hi there,
I'm having trouble getting our new site, https://vintners.co, indexed by Google, even though all the technical and content requirements seem to be in place. In the past I've had much poorer websites, with bad setups and poor performance, get indexed faster.
What concerns me, among other things, is that Google's crawler does visit from time to time according to Google Search Console, but it doesn't seem to make progress or even follow any links, and what happens doesn't match what Google's GSC help describes. For instance, after our sitemap.xml was submitted it seemed to have an impact for a few days: many pages appeared in the coverage report as "detected but not yet indexed". Now they have disappeared from the coverage report, as if the sitemap were no longer detected.
Does anyone have advice on how to speed up the indexing of a new website like ours? It launched almost two months ago, and I expected it to get indexed quickly, at least for some core keywords.
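In case it helps anyone debugging the same "discovered but not indexed" situation: a quick local sanity check of the sitemap can rule out formatting problems before blaming the crawler. A minimal stdlib-only Python sketch; the sitemap body and example.com URLs below are made-up placeholders, not the site's real file:

```python
import xml.etree.ElementTree as ET

# Sitemaps use this XML namespace, so lookups must be namespace-qualified.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Parse a sitemap and return the list of <loc> URLs it declares."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall("sm:url/sm:loc", NS)]

# Hypothetical sitemap content for illustration only.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/regions/anjou</loc></url>
</urlset>"""

urls = sitemap_urls(sample)
# Every URL should be absolute and on the canonical scheme/host.
assert all(u.startswith("https://") for u in urls)
print(urls)
```

Running this against the live file (fetched with any HTTP client) at least confirms the sitemap parses and lists the URLs you expect Google to discover.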
-
Hello, I just saw this thread. For Google to index pages well you need to give it time, but more importantly, make sure those pages are high quality, from the content to the keywords and structure.
In some cases, when a page is translated, Google can detect a low-quality or machine-generated translation and may not index that page properly. It is advisable to have your web content translated by an agency that offers professional website translation services, such as Blarlo.
-
@pau4ner Thanks, very helpful and somewhat comforting. For the FR translations, we are indeed doing them a posteriori. Since we have a canonical in place, I figured it couldn't hurt, even if it's of course better once everything has been translated.
For the category pages you suggest noindexing, are you thinking of pages like https://vintners.co/regions/france-greater-than-loire-greater-than-anjou, which are probably good for internal linking but are just content listings?
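For anyone following along: the noindex on a listing page like that is just a robots meta tag in the page's head. A small stdlib-only sketch to verify the tag is actually being emitted (the HTML snippet at the bottom is hypothetical):

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collect the content value of any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.append(a.get("content", ""))

def is_noindexed(html):
    finder = RobotsMetaFinder()
    finder.feed(html)
    return any("noindex" in d.lower() for d in finder.directives)

# Hypothetical category-page markup: noindex but still followable for internal links.
page = '<html><head><meta name="robots" content="noindex, follow"></head><body></body></html>'
print(is_noindexed(page))  # True
```

Using "noindex, follow" keeps those pages useful as internal-link hubs while keeping them out of the index.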
-
I have the same problem. I created a gross-to-net salary calculator for Germany, and I proactively reply to questions on Quora, in Facebook groups, and on Reddit with links to my site. But still, my site is not indexed. Also, when I check my site's speed score I only get 30 for mobile. How much does this influence indexing speed?
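If you want to track that speed score programmatically rather than re-running it by hand, the PageSpeed Insights v5 API exposes the same Lighthouse report. A sketch that just builds the request URL (no network call here; the page URL is a made-up example, and for production use you'd add your API key as a `key` parameter):

```python
from urllib.parse import urlencode

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url, strategy="mobile"):
    """Build a PageSpeed Insights v5 request URL for the given page."""
    query = urlencode({"url": page_url, "strategy": strategy})
    return f"{PSI_ENDPOINT}?{query}"

# Hypothetical calculator page; fetch this URL with any HTTP client
# to get the JSON Lighthouse report, including the performance score.
print(psi_request_url("https://example.com/salary-calculator"))
```

That said, a low mobile score mostly affects ranking and crawl efficiency; it is rarely the sole reason a site stays unindexed.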
-
Hi, nowadays new sites can take longer to get indexed by Google. Your site is indexed, although not all of its pages are.
I also see that you have submitted two sitemaps; that isn't really necessary in this case. Having a quick look at other technical aspects, everything seems fine to me.
I would try to get some more external links and also add more internal linking. This website has many outbound links in your posts but almost no internal links, and that is something I would change. I would also noindex thin-content pages, such as unnecessary category pages.
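If it helps to quantify that imbalance, here is a rough stdlib-only sketch that tallies internal versus outbound links in a post's HTML (the anchor snippet at the bottom is made up for illustration):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCounter(HTMLParser):
    """Tally links pointing at our own host vs. other hosts."""
    def __init__(self, own_host):
        super().__init__()
        self.own_host = own_host
        self.internal = 0
        self.outbound = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        host = urlparse(href).netloc
        if host == "" or host == self.own_host:
            self.internal += 1   # relative or same-host link
        elif href.startswith(("http://", "https://")):
            self.outbound += 1

counter = LinkCounter("vintners.co")
counter.feed('<a href="/regions/anjou">Anjou</a> '
             '<a href="https://en.wikipedia.org/wiki/Anjou">wiki</a>')
print(counter.internal, counter.outbound)  # 1 1
```

Run over each post's rendered HTML, this gives a quick per-page internal/outbound ratio to track as you add internal links.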
I assume the content is 100% original and hreflang is implemented across your language variations. I have noticed, however, that in the French version you still keep your titles and blog posts in English. They are canonicalised to your English originals, but I'd probably have them translated and add hreflang annotations instead.
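For the hreflang piece: the annotations are reciprocal link tags that every language variant's head (or the sitemap) should carry, including a self-reference. A sketch generating them for a hypothetical two-language setup:

```python
def hreflang_tags(variants, x_default=None):
    """Build <link rel="alternate" hreflang="..."> tags for each language variant.

    variants: mapping of language code -> absolute URL. Every variant's page
    should carry the full set, including a link to itself.
    """
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in sorted(variants.items())
    ]
    if x_default:
        # x-default tells Google which URL to use for unmatched languages.
        tags.append(f'<link rel="alternate" hreflang="x-default" href="{x_default}" />')
    return tags

# Hypothetical URLs for illustration only.
tags = hreflang_tags(
    {"en": "https://example.com/guide", "fr": "https://example.com/fr/guide"},
    x_default="https://example.com/guide",
)
print("\n".join(tags))
```

The key point is reciprocity: if the French page lists the English one, the English page must list the French one back, or Google ignores the annotation.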
Apart from that, getting more links to your website and improving the internal-to-outbound link ratio should work. I've had some recent cases where this helped, although it took 4-5 months to resolve the indexing issues.
-
@tom-capper Thanks for the reply. The concern is indeed that only a few pages have been indexed (ranking will be a concern later): although the sitemap has been discovered by Google, and Google Search Console says the engine has discovered all the pages, they don't seem to be getting indexed.
I'm surprised, as I've had websites in the same field, also with very few external links, that got indexed much faster!
-
The site appears to be indexed, but perhaps not all of its pages are.
Is your concern that the other pages are not being indexed, or that the pages that are already indexed are not ranking for any keywords?
I suspect in either case it doesn't help that this site has almost no external links (DA 1): at this level of obscurity, Google will not prioritise crawl resources for it.