Very well-established blog, new posts now being indexed very late
-
I have an established blog. We update it on a daily basis. In the past, when I published a new post, it would get indexed within a minute or so.
But for the past month or so, it has been taking hours, sometimes 10-12 hours, for new posts to get indexed. The only thing I have changed is the robots.txt file.
This is the current robots.txt file:
User-agent: *
Disallow: /cgi-bin
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes
Disallow: /wp-login.php
Disallow: /*wp-login.php*
Disallow: /trackback
Disallow: /feed
Disallow: /comments
Disallow: /author
Disallow: /category
Disallow: */trackback
Disallow: */feed
Disallow: */comments
Disallow: /login/
Disallow: /wget/
Disallow: /httpd/
Disallow: /*.php$
Disallow: /*?*
Disallow: /*.js$
Disallow: /*.inc$
Disallow: /*.css$
Disallow: /*.gz$
Disallow: /*.wmv$
Disallow: /*.cgi$
Disallow: /*.xhtml$
Disallow: /*?*
Disallow: /*?
Allow: /wp-content/uploads

User-agent: TechnoratiBot/8.1
Disallow:

# ia_archiver
User-agent: ia_archiver
Disallow: /

# disable duggmirror
User-agent: duggmirror
Disallow: /

# allow google image bot to search all images
User-agent: Googlebot-Image
Disallow: /wp-includes/
Allow: /*

# allow adsense bot on entire site
User-agent: Mediapartners-Google*
Disallow:
Allow: /*

Sitemap: http://www.domainname.com/sitemap.xml.gz
The site has tons of backlinks. I'm just wondering if something is wrong with the robots.txt file or if it could be something else.
-
The robots.txt file is designed to block content from being crawled entirely. Normally, if your robots.txt file were the factor, your content would not appear in the SERPs at all.
It is possible for content to appear in the SERPs even though it is blocked by robots.txt if it is linked from other sources. Since this is new content, that is less likely to be the case unless you are sharing links immediately and Google is seeing those links within the time frame you described.
The first place I would look is your sitemap, or whatever mechanism is used to inform Google that you have new content. When you publish a new blog article, your software should ping Google to announce that there is new content; that is where the investigation should begin. The next step is to check your server logs to see how long Google takes to respond to that ping. If it takes them 12 hours, then there is nothing further you can do about it.
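As a rough way to measure that gap, here is a minimal sketch, assuming a combined-format access log at /var/log/nginx/access.log and a placeholder post path (both are assumptions, not your real values), that pulls Googlebot requests for a new article so you can compare the first crawl against the publish time:

import re
from datetime import datetime

# Assumptions: combined/common log format; the log path and post path below
# are placeholders and should be swapped for the real values.
LOG_PATH = "/var/log/nginx/access.log"
POST_PATH = "/new-post-slug/"
TIME_RE = re.compile(r"\[(\d{2}/\w{3}/\d{4}:\d{2}:\d{2}:\d{2})")

googlebot_hits = []
with open(LOG_PATH) as log:
    for line in log:
        if "Googlebot" in line and POST_PATH in line:
            match = TIME_RE.search(line)
            if match:
                googlebot_hits.append(
                    datetime.strptime(match.group(1), "%d/%b/%Y:%H:%M:%S")
                )

if googlebot_hits:
    print("First Googlebot fetch of the post:", min(googlebot_hits))
else:
    print("Googlebot has not requested the post yet.")

If the gap between publishing and the first Googlebot fetch is itself 10-12 hours, the delay is on Google's side rather than anything on your site.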
I would be interested in a lot more detail. How many articles have you confirmed as being affected by this issue? Exactly how did you confirm it?
As a side note, your robots.txt file is bloated and doesn't adhere to any standards I have seen. How exactly was it created? Did someone go in and make manual modifications to the file?
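To rule the robots.txt itself in or out, here is a quick sketch using Python's standard urllib.robotparser; the domain and post URL below are placeholders based on the sitemap line in the file, not your real URLs:

from urllib import robotparser

# Assumption: the domain and post URL are placeholders; use the real ones.
ROBOTS_URL = "http://www.domainname.com/robots.txt"
TEST_URLS = [
    "http://www.domainname.com/a-brand-new-post/",
    "http://www.domainname.com/sitemap.xml.gz",
]

parser = robotparser.RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # fetches and parses the live robots.txt

for url in TEST_URLS:
    verdict = "ALLOWED" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(verdict, url)

Keep in mind the standard-library parser does not handle wildcard rules exactly the way Googlebot does, so treat this as a rough check and confirm anything suspicious with Google's own robots.txt testing tool.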
-
Are you using FeedBurner? Has the feed publishing service gotten out of sync? You can resync it under the Troubleshootize section.
-
Yes, it's a WordPress site, and I have always had the All in One SEO plugin enabled.
-
Do you use the WordPress platform? If so, do you use an SEO plugin? Different plugins can affect indexing time.
-
Could you revert the robots.txt to a previous "working" version from when your site was being indexed more quickly?