Website is not indexed in Google
-
Hi Guys,
I have a problem with a customer's website: it is not indexed in Google (except for the homepage), and I could not find anything that could possibly be the cause.
I already checked the robots.txt, the sitemap, and the plugins on the website. In the HTML code I also couldn't find anything that would make indexing harder than usual.
This is the website I am talking about: http://www.xxxx.nl/ (it's in Dutch).
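For completeness, this is roughly how I checked the usual blockers (a quick Python sketch; the domain and page path below are placeholders for the real ones, and it needs the "requests" package):
```python
# Rough sketch of the checks I ran; the URLs are placeholders.
import requests
from urllib import robotparser

SITE = "http://www.example.nl"            # placeholder for the real domain
PAGE = SITE + "/voorbeeld-pagina/"        # placeholder: any subpage you expect to be indexed

# 1. Does robots.txt allow Googlebot to fetch the page?
rp = robotparser.RobotFileParser(SITE + "/robots.txt")
rp.read()
print("robots.txt allows Googlebot:", rp.can_fetch("Googlebot", PAGE))

# 2. Is the page blocked by an X-Robots-Tag header or a meta robots noindex tag?
resp = requests.get(PAGE, timeout=10)
print("HTTP status:", resp.status_code)
print("X-Robots-Tag header:", resp.headers.get("X-Robots-Tag", "none"))
print("'noindex' appears in the HTML:", "noindex" in resp.text.lower())  # crude check
```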
The only thing I can still think of is the Google sandbox, but even that seems quite unlikely.
I hope you can spot something I could not find!
Thanks in advance
-
Baldea,
The domain was new indeed.
We are going to try your suggestions and hope for the best!
Fingers crossed indeed
-
Bastiaan,
The domain was new, right? I mean it wasn't dropped/expired etc.
Try to get a dofollow link from a relevant website that has some traffic (Google tends to index such websites, and the resources they link out to, very quickly).
Also, make sure you have a sitemap and try including this line in robots.txt:
Sitemap: http://www.wikiboedel.nl/sitemap.xml
Submit the sitemap in Google's Webmaster Tools.
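If you don't have a sitemap generator handy, even a tiny hand-rolled file is enough to start with; here is a rough Python sketch (the page paths are just examples, list the site's real URLs):
```python
# Writes a minimal sitemap.xml - a rough sketch, not a full generator.
# The paths below are only examples; replace them with the site's real pages.
pages = [
    "http://www.wikiboedel.nl/",
    "http://www.wikiboedel.nl/voorbeeld-pagina/",   # example path
    "http://www.wikiboedel.nl/nog-een-pagina/",     # example path
]

entries = "\n".join("  <url><loc>{}</loc></url>".format(p) for p in pages)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + entries +
    "\n</urlset>\n"
)

with open("sitemap.xml", "w") as f:
    f.write(sitemap)
```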
If you do all of this, it is almost impossible not to get indexed within a few days.
Anyway, fingers crossed.
-
I will try the line in the robots.txt. I have already created an XML sitemap and submitted it via Google Webmaster Tools.
Thanks for helping.
-
Hmm, this is strange indeed. Google should follow the links on the home page and index the available subpages, and two months should be plenty of time. Maybe try these two things:
- Add this line to robots.txt:
Allow: /
Even though the current robots.txt looks in order, this explicitly tells search engines to crawl everything except the /wp- pages you excluded.
- Create an XML sitemap (or just a manual .txt file) and submit it via GWT.
This might speed things up; a combined robots.txt example is below. BTW, the site is correctly indexed in Bing.
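Putting those pieces together, the whole robots.txt could look something like this (a rough sketch; keep whatever /wp- exclusions you already have and make sure the Sitemap URL points at your real file):
User-agent: *
# keep the /wp- exclusions you already use here, e.g.:
Disallow: /wp-admin/
Allow: /
Sitemap: http://www.wikiboedel.nl/sitemap.xml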
-
Thanks for the quick answer, Baldea.
This website has been online for about two months now, I think. It has been verified in Google Webmaster Tools.
I also did a little link building (about 3 links) on Dutch websites like www.ekudos.nl, which did not seem to help.
-
Hi Bastiaan,
If the site was created recently (1-2 weeks ago) and you used robots.txt to block search engines, then this is normal. (When I build a website, I usually block search engines via robots.txt until everything is working fine, and only afterwards do I open up the permissions.)
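For reference, the temporary block I am describing is just a robots.txt like this; if something similar was ever live, Google will simply have stayed away until it was removed:
User-agent: *
Disallow: /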
Verify the website in Google's Webmaster Tools and go to the robots.txt section; there you should be able to see whether that is the reason.
I see your robots.txt file is fine, and you have no noindex meta tag or anything like that.
To "stimulate" the indexing process you could make use of social media: share a post and wait 1-2 days. Or get a relevant direct link pointing to one of the posts.
I hope this helps.