Very well-established blog, new posts now being indexed very late
-
I have an established blog that we update on a daily basis. In the past, when I would publish a new post, it would get indexed within a minute or so.
But for the past month or so, it's been taking hours, sometimes 10-12 hours, for new posts to get indexed. The only thing I have changed is robots.txt.
This is the current robots file.
User-agent: *
Disallow: /cgi-bin
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes
Disallow: /wp-login.php
Disallow: /*wp-login.php*
Disallow: /trackback
Disallow: /feed
Disallow: /comments
Disallow: /author
Disallow: /category
Disallow: */trackback
Disallow: */feed
Disallow: */comments
Disallow: /login/
Disallow: /wget/
Disallow: /httpd/
Disallow: /*.php$
Disallow: /*?*
Disallow: /*.js$
Disallow: /*.inc$
Disallow: /*.css$
Disallow: /*.gz$
Disallow: /*.wmv$
Disallow: /*.cgi$
Disallow: /*.xhtml$
Disallow: /*?*
Disallow: /*?
Allow: /wp-content/uploads

User-agent: TechnoratiBot/8.1
Disallow:

# ia_archiver
User-agent: ia_archiver
Disallow: /

# disable duggmirror
User-agent: duggmirror
Disallow: /

# allow google image bot to search all images
User-agent: Googlebot-Image
Disallow: /wp-includes/
Allow: /*

# allow adsense bot on entire site
User-agent: Mediapartners-Google*
Disallow:
Allow: /*

Sitemap: http://www.domainname.com/sitemap.xml.gz
Site has tons of backlinks. Just wondering if something is wrong with the robots file or if it could be something else.
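In case it helps anyone debugging a similar file: a rough way to sanity-check the plain prefix rules above is Python's built-in parser. This is only a sketch over a trimmed subset of the rules, with example.com standing in for the real domain, and note that urllib.robotparser implements the original robots.txt spec, so it cannot evaluate Google's wildcard extensions like Disallow: /*?* or the $ anchor.

```python
# Rough sanity check of the prefix rules using Python's built-in parser.
# NOTE: urllib.robotparser does not implement the * / $ wildcard extensions,
# so only plain prefix rules are meaningfully tested here.
from urllib import robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-content/plugins
Allow: /wp-content/uploads
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# example.com stands in for the real domain
print(rp.can_fetch("*", "http://example.com/"))                          # homepage
print(rp.can_fetch("*", "http://example.com/wp-admin/options.php"))      # blocked path
print(rp.can_fetch("*", "http://example.com/wp-content/uploads/a.jpg"))  # allowed path
```

If the homepage or post URLs come back as blocked here, the prefix rules are the problem; if they come back allowed, the wildcard rules (which this parser can't check) are the next suspect.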
-
A robots.txt file blocks crawling entirely for the paths it lists. Normally, if your robots.txt file were the factor, your content would not appear in SERPs at all.
It is possible for content to appear in SERPs even though it is blocked by robots.txt if it is linked from other sources. Since this is new content, it is less likely that is the case unless you are immediately sharing links and Google is seeing those links within the time frame you shared.
The first place I would look is your sitemap, or whatever tool is used to inform Google that you have new content. When you publish a new blog article, your software should ping Google to say there is new content. That is where any investigation should begin. The next step is to check server logs to see how long it takes Google to respond to the ping. If it takes them 12 hours, then there is nothing further you can do about it.
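The log check is straightforward to script. A minimal sketch, assuming the common/combined access log format; the sample line, IP, and post URL are placeholders, not the asker's real data:

```python
# Scan access log lines for Googlebot's first request to a newly published
# URL; comparing that timestamp against the publish time gives the crawl lag.
import re
from datetime import datetime

LINE_RE = re.compile(
    r'\S+ \S+ \S+ \[(?P<ts>[^\]]+)\] "(?:GET|HEAD) (?P<path>\S+)[^"]*" '
    r'\d+ \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

def first_googlebot_hit(log_lines, path):
    """Return the timestamp of Googlebot's first request for `path`, or None."""
    for line in log_lines:
        m = LINE_RE.match(line)
        if m and m.group("path") == path and "Googlebot" in m.group("ua"):
            return datetime.strptime(m.group("ts"), "%d/%b/%Y:%H:%M:%S %z")
    return None

sample = [
    '66.249.66.1 - - [01/May/2012:09:15:02 +0000] "GET /new-post/ HTTP/1.1" '
    '200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
]
print(first_googlebot_hit(sample, "/new-post/"))
```

Genuine Googlebot traffic can be confirmed with a reverse DNS lookup on the IP, since the user-agent string alone is easy to spoof.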
I would be interested in a lot more detail. How many articles have you confirmed as being affected by this issue? Exactly how did you confirm it?
As a side note, your robots.txt file is bloated and doesn't adhere to any standards I have seen. How exactly was it created? Did someone go in and make manual modifications to the file?
-
Are you using Feedburner? Has the feed publishing service gotten out of sync? You can re-sync it under the Troubleshootize section.
-
Yes, it's a WordPress site and I have always had the All in One SEO plugin enabled.
-
Do you use the WordPress platform? If so, do you use an SEO plugin? Different plugins can affect indexing time.
-
Could you possibly revert the robots.txt changes to a previous "working" version from when your site was getting indexed quicker?
Related Questions
-
Google indexing is slowing down?
I have up to 20 million unique pages, and so far I've only submitted about 30k of them in my sitemap. We had a few load-related errors during Google's initial visits, and it thought some were duplicates, but we fixed all that. We haven't gotten a crawl-related error for 2 weeks now. Google appears to be indexing fewer and fewer URLs every time it visits. Any ideas why? I am not sure how to get all our pages indexed if it's going to operate like this... would love some help, thanks!
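One structural point worth checking for a site this size: the Sitemaps protocol caps each sitemap file at 50,000 URLs, so 20 million pages need roughly 400 sitemap files tied together by a sitemap index. A minimal sketch, with example.com and the file names as placeholders:

```python
# Split a large URL list into 50,000-URL sitemap files and build a sitemap
# index, per the sitemaps.org per-file limits.
CHUNK = 50_000

def chunk_urls(urls, size=CHUNK):
    """Break the full URL list into sitemap-sized chunks."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]

def sitemap_index(base, n_files):
    """Build a sitemap index XML document referencing n_files sitemaps."""
    entries = "\n".join(
        f"  <sitemap><loc>{base}/sitemap-{i}.xml.gz</loc></sitemap>"
        for i in range(1, n_files + 1)
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</sitemapindex>"
    )

urls = [f"https://example.com/page/{i}" for i in range(120_000)]
chunks = chunk_urls(urls)
print(len(chunks))  # 120,000 URLs -> 3 sitemap files
print(sitemap_index("https://example.com", len(chunks)))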
-
Blogging on multiple domains
We have three different domains for geotargeting (.za, .uk and .com). Each site at the moment has the same content, with only country-specific details like currency changed. What is the best way to get maximum SEO benefit when posting new content? When we post new content, should we repost the same content to all three domains, or will Google only index the URL on the domain which is crawled first? Thanks in advance
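Same-content country sites are the textbook case for rel="alternate" hreflang annotations, which tell Google the domains are localized variants of each other rather than duplicates. A sketch of the tags each page variant would carry; the domains and locale codes here are illustrative, not the asker's real sites:

```python
# Emit hreflang link tags for three country variants of the same page.
# Every variant should carry the full set, including a self-reference.
VARIANTS = {
    "en-za": "https://www.example.co.za",
    "en-gb": "https://www.example.co.uk",
    "en": "https://www.example.com",
}

def hreflang_tags(path):
    """Tags to place in the <head> of every variant of `path`."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{base}{path}" />'
        for lang, base in VARIANTS.items()
    )

print(hreflang_tags("/blog/new-post/"))
```

With reciprocal annotations in place, all three URLs can be indexed and each country serves its own version, instead of Google picking whichever domain it crawled first.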
-
Google Indexing - what did I miss??
Hello, all SEOers~ I relaunched my web site about 3 weeks ago, and in order to preserve SEO value as much as possible, I set up 301 redirects, an XML sitemap and so on to minimize possible losses. But the problem is that about a week after the relaunch, my team somehow made a mistake and removed all of the 301 redirects. So now my old site URLs are all gone from Google's index and my new site is not getting indexed by Google. My traffic and rankings are also gone....OMG I checked Google Webmaster Tools, but it didn't show any special message other than that Googlebot found an increase in 404 errors, which is obvious. I also used "Fetch as Googlebot" from Webmaster Tools to increase the chance of indexing, but it doesn't seem to be doing much. I am re-doing the 301 redirects today, but I am not sure it means anything anymore. Any advice or opinions?? Thanks in advance~!
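Since the old and new URLs differ only by domain, rebuilding the redirect map is a one-line host swap rather than a per-URL lookup. A minimal sketch, with both domains as placeholders:

```python
# Compute where the 301 for each old URL should point: same scheme, path,
# query and fragment, with only the host swapped.
from urllib.parse import urlsplit, urlunsplit

OLD_HOST = "www.old-example.com"
NEW_HOST = "www.new-example.com"

def redirect_target(old_url):
    """Return the 301 target for `old_url` on the new domain."""
    parts = urlsplit(old_url)
    assert parts.netloc == OLD_HOST, "not an old-site URL"
    return urlunsplit((parts.scheme, NEW_HOST, parts.path, parts.query, parts.fragment))

print(redirect_target("http://www.old-example.com/about/?ref=1"))
```

Running every URL from the old sitemap through a check like this (and confirming the server actually answers 301, not 302 or 404) is a quick way to verify the restored redirects before resubmitting to Google.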
-
Should I remove these pages from the Google index?
Hi there, Please have a look at the following URL: http://www.elefant-tours.com/index.php?callback=imagerotator&gid=65&483. It's a "sitemap" generated by a WordPress plugin called NextGEN Gallery, and it maps all the images that have been added to the site through this plugin, which in this case is quite a lot. I can see that these "sitemap" pages have been indexed by Google, and I'm wondering whether I should remove them or not. In my opinion these are pages that a search engine would never want to serve as a search result and that a visitor would never want to see. Attracting traffic through Google Images is irrelevant in this case. What is your advice? Block them, leave them indexed, or something else?
-
GWT Images Indexing
Hi guys! How long does it normally take for Google to index the images within a sitemap? I recently submitted a new, up-to-date sitemap and most of the pages have been indexed already, but no images have. Any reason for that? Cheers
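One thing worth verifying is that the images are declared with Google's image sitemap extension, not just present on the indexed pages. A sketch of the per-page entry format, with placeholder URLs; the containing urlset must also declare xmlns:image="http://www.google.com/schemas/sitemap-image/1.1":

```python
# Build one <url> entry using Google's image sitemap extension, listing the
# images that appear on that page.
def image_sitemap_entry(page_url, image_urls):
    images = "\n".join(
        f"    <image:image><image:loc>{u}</image:loc></image:image>"
        for u in image_urls
    )
    return f"  <url>\n    <loc>{page_url}</loc>\n{images}\n  </url>"

entry = image_sitemap_entry(
    "https://example.com/gallery/",
    ["https://example.com/img/1.jpg", "https://example.com/img/2.jpg"],
)
print(entry)
```

Image indexing generally lags page indexing even when the markup is right, but entries like this at least make the images discoverable without waiting for a full image crawl of each page.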
-
Getting Posts Indexed
On a WordPress site I'm working on, you can get to any product from the home page in 2 clicks, but I'm a little concerned about the URL, which looks like this: domain/categoryname/subcategoryname/productpage Will I have trouble getting my products indexed?
-
Correcting an indexing problem
I recently redirected an old site to a new site. All the URLs were the same except the domain. When I redirected them, I failed to realize the new site had HTTPS enabled on all pages. I have noticed that Google is now indexing both the HTTP and HTTPS versions of pages in the results. How can I fix this? I am going to submit a sitemap, but I don't know if there is more I can do to get this fixed faster.
-
HTML and noindex, follow
I’m just learning about HTML and I was wondering: can a noindex, follow meta tag be put into a dynamic HTML page?
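For what it's worth, yes: a crawler only sees the rendered HTML, so a dynamically generated page can emit the robots meta tag like any static page. A minimal sketch (the rendering function and page body are illustrative, not tied to any particular framework):

```python
# A dynamically generated page emitting a robots meta tag in its <head>.
# The crawler cannot tell this HTML was produced by code at request time.
def render_page(body, index=False):
    robots = "index, follow" if index else "noindex, follow"
    return (
        "<!DOCTYPE html>\n<html>\n<head>\n"
        f'  <meta name="robots" content="{robots}">\n'
        "</head>\n<body>\n"
        f"  {body}\n"
        "</body>\n</html>"
    )

print(render_page("<p>Dynamic content</p>"))
```

With noindex, follow the page is kept out of the index while its links still pass discovery, which is the usual choice for pages like paginated archives.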