Can't get Google to index our site although all seems very good
-
Hi there,
I'm having trouble getting our new site, https://vintners.co, indexed by Google, although all the technical and content requirements seem to be well in place. In the past, I've had far poorer websites, with much worse setups and performance, get indexed faster.
What concerns me, among other things, is that according to Google Search Console the crawler visits from time to time but doesn't seem to make progress or even follow any links, and things aren't evolving the way GSC's help documentation describes. For instance, after we submitted our sitemap.xml it seemed to have an impact for a few days: many pages appeared in the coverage report as "detected but not yet indexed". Now they have disappeared from the report, as if the sitemap were no longer detected.
Does anybody have advice on speeding up the indexing of a new website like ours? It was launched almost two months ago, and I expected it to get indexed quickly, at least for some core keywords.
-
Hello, I just saw this thread. For Google to index pages well you need to give it time, but more importantly, make sure those pages are of high quality, from the content to the keywords and structure.
In some cases, when a page is translated, Google can detect a low-quality or machine translation and may not index that page properly. It is advisable to have your web content translated by agencies that offer professional website translation services, such as Blarlo.
-
@pau4ner Thanks, very helpful and somewhat comforting. For the FR translations, we are indeed doing them a posteriori. Since we have a canonical, I was thinking it couldn't hurt, even if of course it's better once everything has been translated.
For the category pages you suggest noindexing, are you thinking of pages like https://vintners.co/regions/france-greater-than-loire-greater-than-anjou, which are probably useful as internal links but are just content listings?
-
I have the same problem. I created a gross-to-net salary calculator for Germany, and I proactively reply to questions on Quora, Facebook groups, and Reddit with links to my site. But still, my site is not indexed. Also, when I check my site's speed score, I only get 30 for mobile. How much does this influence indexing speed?
-
Hi, nowadays new sites can take longer to get indexed by Google. Your site is indexed, although not all of its pages are.
I also see that you have submitted two sitemaps, which is not really necessary in this case. Having had a quick look at other technical aspects, everything seems fine to me.
I would try to get some more external links and also add more internal linking. Your site has many outbound links in its posts but almost no internal links, and that is something I would change. I would also noindex thin content pages, such as unnecessary category pages.
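If it helps to quantify that imbalance, here is a rough sketch of counting internal vs. outbound links in a page's HTML using only Python's standard library. The sample HTML and hostname are placeholders, not taken from the actual site:

```python
# Count internal vs. outbound <a href> links in an HTML snippet.
# Relative URLs (no host) are treated as internal.
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCounter(HTMLParser):
    def __init__(self, site_host):
        super().__init__()
        self.site_host = site_host
        self.internal = 0
        self.outbound = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        host = urlparse(href).netloc
        if not host or host == self.site_host:
            self.internal += 1  # relative or same-host URL
        else:
            self.outbound += 1  # points at another domain

html = '<a href="/posts/one">one</a> <a href="https://example.org/x">x</a>'
counter = LinkCounter("vintners.co")
counter.feed(html)
print(counter.internal, counter.outbound)  # internal=1, outbound=1
```

In practice you would feed it each page's saved HTML and compare the two counts per page; a post full of outbound links and zero internal ones is the pattern described above.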
I assume the content is 100% original and hreflang is implemented across your language variations. I have noticed, however, that in the French version your titles and blog posts are still in English. They are canonicalized to your English originals, but I'd probably have them translated and add hreflang instead.
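For reference, a minimal hreflang setup for an EN/FR pair might look like this (the paths are hypothetical placeholders; each language version carries the same set of tags):

```html
<!-- In the <head> of both the English and the French page -->
<link rel="alternate" hreflang="en" href="https://vintners.co/blog/sample-post" />
<link rel="alternate" hreflang="fr" href="https://vintners.co/fr/blog/sample-post" />
<link rel="alternate" hreflang="x-default" href="https://vintners.co/blog/sample-post" />
```

Note that hreflang annotations must be reciprocal: if the French page doesn't point back at the English one, Google may ignore the tags.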
Apart from that, getting more links to your website and improving the internal-to-outbound link ratio should work. I've had some recent cases where this helped, although it took 4-5 months to solve the indexing issues.
-
@tom-capper Thanks for the reply. The concern is indeed that only a few pages have been indexed (ranking will be a concern later), and that although Google has discovered the sitemap, and Google Search Console says all the pages have been discovered, they don't seem to be getting indexed.
I'm surprised, as I've had websites in the same domain, also with very few external links, that got indexed much faster!
-
The site appears to be indexed, but maybe not all pages.
Is your concern that the other pages are not being indexed, or that the pages that are already indexed are not ranking for any keywords?
I suspect in either case it doesn't help that this site has almost no external links (DA 1); with this level of obscurity, Google will not prioritise crawl resources for it.
Related Questions
-
Unsolved What would the exact text be for robots.txt to stop Moz crawling a subdomain?
I need Moz to stop crawling a subdomain of my site, and am just checking what the exact text should be in the file to do this. I assume it would be: User-agent: Moz Disallow: / But I'm just checking so I can tell the agency who will apply it, to avoid paying for their time with the incorrect text! Many thanks.
Getting Started | | Simon-Plan
-
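For what it's worth, Moz's crawler identifies itself as rogerbot rather than "Moz", so the robots.txt on the subdomain would likely need something like the following (worth double-checking against Moz's current crawler documentation before handing it to the agency):

```
User-agent: rogerbot
Disallow: /
```

This file goes at the subdomain's own root (the subdomain serves its own robots.txt, separate from the main domain's).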
Merger & Acquisition Best Practices
Our company (DA 40) recently acquired another company (DA 20). The domain for the acquired company is still up, and I am being asked whether we should keep paying for it. I have a couple of questions. Is it best practice to continue paying for the older domain and 301-redirect it to our website? If yes, do I have to add 301 redirects for individual matching pages? For example, if the old site has a page on topic XYZ and we have a similar page on topic XYZ, should the one 301 to the other? What about contact pages, about us, etc.? Do we redirect all non-matching topics to the new home page? And in the case of their blogs, do we redirect them to our blog home page even though we are not keeping their old posts? If our company keeps acquiring other companies, should we assume we have to keep paying for all the acquired domains? Or would anyone say stop paying for the old domain and instead review the company's inbound links, reach out to those sites, and ask them to link to the new site? Thanks for the help in advance.
Intermediate & Advanced SEO | | CharityHBS
-
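The page-level redirects asked about above could be sketched as follows on the acquired domain, assuming Apache with mod_alias (the domains and paths are hypothetical placeholders; mod_alias applies the first matching directive, so the catch-all goes last):

```apache
# .htaccess on the acquired company's domain
# Page-to-page 301s where a close topical match exists
Redirect 301 /topic-xyz https://acquiring-company.com/topic-xyz
Redirect 301 /about-us https://acquiring-company.com/about-us

# Fallback: anything without a close match goes to the home page
RedirectMatch 301 ^/.*$ https://acquiring-company.com/
```

Matching page-to-page where possible preserves more link equity than a blanket redirect to the home page.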
Unsolved Why did I stop ranking on a keyword and how will I rank on it again?
I often see in my campaigns that keywords which ranked between spots 1 and 5 on the SERP stop being ranked on that page, causing the website to drop to page 5 or worse on Google. I also see that the keyword is no longer linked to a page. What causes this to happen, and how can I prevent it in the future?
Moz Pro | | Ginovdw
-
50% Visibility drop following June 2021 Google Update
Hello everyone,
We've observed a 50% drop in our Visibility score in the last week. This is our biggest drop ever, and it coincides with the June Google updates. We're an established ecommerce website located in Canada, and this has obviously severely impacted sales. I'm frantically searching for information on fixes to recover ASAP, but if anybody could point us in the right direction, that would be hugely appreciated. Thanks!
Algorithm Updates | | yacpro13
-
How do I handle a redirect chain issue pertaining to a page that doesn't actually exist on my site?
I have a page showing up on the insights report as being part of a redirect chain. This page, however, does not exist as far as I can tell. It is not on my dashboard anywhere, and pointing a browser at it produces a messy page with WordPress theme error code spat out. How do I track this down to clean it up if the page does not exist within my WordPress installation? The page for reference is https://butlermobility.com/dealers/downloads. As it stands today, the dealers and downloads pages are separate; there is no downloads sub-page within the dealers section.
Technical SEO | | NiteSkirm
-
Google indexing staging / development site that is redirected...
Hi Moz fans! Please help. We had acme.stagingdomain.com while a site was in development. When it went live, it redirected (302) to acmeprofessionalservices.com (real names redacted!). There are no known external links to the staging site, although its URL has been emailed from Google Apps. We have now found that the staging site is in the index even though it redirects to the proper public site, and some (but not all) of its pages are in the index too; they all redirect to the public site when visited. It is convenient to keep the redirect from the staging site to the new one for the team, since Chrome etc. remember frequently visited sites, and it would be a shame to lose that. Yes, these pages can be removed using Webmaster Tools. But how did they get into the index to start with? And if we're building a new site for a customer who has an existing site, is there a danger of duplicate content penalties caused by the staging site? We had a similar incident recently when a PDF that was not linked anywhere on the site appeared in the index; the link had been emailed through Google Apps and visited in Chrome, but that was it. So, three questions: Why is the staging site still in the index despite the redirects? How did it get into the index in the first place? Will the new staging site affect the rank of the existing site, e.g. through duplicate content penalties?
Technical SEO | | mozroadjan
-
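One common way to keep a future staging host out of the index entirely is to send a noindex header for everything it serves, and ideally put it behind a login as well. A sketch for nginx, with hypothetical hostnames and paths (equivalent directives exist for Apache):

```nginx
server {
    listen 443 ssl;
    server_name acme.stagingdomain.com;
    root /var/www/staging;

    # Tell crawlers not to index anything served from this host
    add_header X-Robots-Tag "noindex, nofollow" always;

    # Better still: require a login so crawlers never see the content at all
    auth_basic "Staging";
    auth_basic_user_file /etc/nginx/.htpasswd;
}
```

Note that HTTP auth and noindex are complementary: auth stops content being crawled, while the X-Robots-Tag header covers anything that slips through.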
How do I get my pages to go from "Submitted" to "Indexed" in Google Webmaster Tools?
Background: I recently launched a new site, and it's performing much better than the old site in terms of bounce rate, page views, pages per session, session duration, and conversions. As suspected, sessions, users, and % new sessions are all down, which I'm okay with because the old site had a lot of low-quality traffic going to it. The traffic we have now is much more engaged and targeted. The site was built on Squarespace and launched in the middle of August. Question: When reviewing the Sitemaps section of Google Webmaster Tools, I noticed it says 57 web pages Submitted, but only 5 Indexed! The sitemap that was submitted seems to be all there. I'm not sure if this is a Squarespace thing or what. Anyone have any ideas? Thanks!
Technical SEO | | Nate_D
-
Best way to handle pages with iframes that I don't want indexed? Noindex in the header?
I am doing a bit of SEO work for a friend, and the situation is the following: the site is a place to discuss articles on the web. Clicking on a posted link sends the user to a URL on the main site at URL.com/article/view. This page has a large iframe containing the article itself, and a small bar at the top with various links back to the original site. I'd like to make sure that the comment pages (URL.com/article) are indexed instead of all of the URL.com/article/view pages, which won't really do much for SEO. However, all of these pages are currently indexed. What would be the best approach to making sure the iframe pages aren't indexed? My intuition is to put a "noindex" in the header of those pages and make sure the conversation pages themselves are properly linked throughout the site so that they get indexed. Does this seem right? Thanks for the help.
Technical SEO | | jim_shook
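The intuition described above matches Google's documented behaviour: a robots meta tag on the iframe wrapper pages keeps them out of the index while the comment pages stay indexable. A minimal version, assuming the /article/view template can be edited:

```html
<!-- In the <head> of every URL.com/article/view page -->
<meta name="robots" content="noindex">
```

The equivalent can also be sent as an X-Robots-Tag HTTP header if editing the template is awkward; either way, the pages must remain crawlable (not blocked in robots.txt), or Google will never see the noindex directive.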