Very well established blog, new posts now being indexed very late
-
I have an established blog that we update on a daily basis. In the past, when I published a new post, it would get indexed within a minute or so.
But for the past month or so, it has been taking hours, sometimes 10-12 hours, for new posts to get indexed. The only thing I have changed is robots.txt.
This is the current robots file.
User-agent: *
Disallow: /cgi-bin
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes
Disallow: /wp-login.php
Disallow: /*wp-login.php*
Disallow: /trackback
Disallow: /feed
Disallow: /comments
Disallow: /author
Disallow: /category
Disallow: */trackback
Disallow: */feed
Disallow: */comments
Disallow: /login/
Disallow: /wget/
Disallow: /httpd/
Disallow: /*.php$
Disallow: /*?*
Disallow: /*.js$
Disallow: /*.inc$
Disallow: /*.css$
Disallow: /*.gz$
Disallow: /*.wmv$
Disallow: /*.cgi$
Disallow: /*.xhtml$
Disallow: /*?*
Disallow: /*?
Allow: /wp-content/uploads

User-agent: TechnoratiBot/8.1
Disallow:

# ia_archiver
User-agent: ia_archiver
Disallow: /

# disable duggmirror
User-agent: duggmirror
Disallow: /

# allow google image bot to search all images
User-agent: Googlebot-Image
Disallow: /wp-includes/
Allow: /*

# allow adsense bot on entire site
User-agent: Mediapartners-Google*
Disallow:
Allow: /*

Sitemap: http://www.domainname.com/sitemap.xml.gz
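As a quick sanity check that the wildcard rules above don't accidentally match normal post URLs, here is a small, simplified matcher. It is only a sketch: it ignores Allow precedence, is not a full robots.txt parser, and the sample paths are made up.

```python
import re

def robots_pattern_to_regex(pattern: str) -> re.Pattern:
    # Translate a Googlebot-style robots.txt path pattern to a regex:
    # '*' matches any run of characters; a trailing '$' anchors the end.
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    return re.compile("^" + regex + ("$" if anchored else ""))

def is_blocked(path: str, disallow_patterns) -> bool:
    # A path is blocked if any Disallow pattern matches its start.
    return any(robots_pattern_to_regex(p).match(path) for p in disallow_patterns)

# A few of the Disallow rules from the file above.
rules = ["/wp-admin", "/*.php$", "/*?*", "/category"]

print(is_blocked("/my-new-post/", rules))               # False: plain post URLs crawl fine
print(is_blocked("/my-new-post/?replytocom=5", rules))  # True: any URL with a query string
print(is_blocked("/category/news/", rules))             # True
```

Note that the `/*?*` and `/*?` rules block every URL containing a query string, which is worth keeping in mind if any internal links or feeds use query parameters.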
Site has tons of backlinks. Just wondering if something is wrong with the robots file or if it could be something else.
-
A robots.txt file either blocks content or it doesn't; there is no mechanism in it for merely slowing indexing down. Normally, if your robots.txt file were the factor, your new content would not appear in SERPs at all.
It is possible for content to appear in SERPs even though it is blocked by robots.txt, if it is linked from other sources. Since this is new content, that is less likely to be the case unless you are sharing links immediately and Google is discovering those links within the time frame you described.
The first place I would look is your sitemap, or whatever tool informs Google that you have new content. When you publish a new blog article, your software should ping Google to announce the new content; that is where any investigation should begin. The next step is to check your server logs to see how long it takes Googlebot to respond to that ping. If it takes them 12 hours, then there is nothing further you can do about it.
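To put a number on that delay, one rough approach is to compare the post's publish time against the first Googlebot fetch of the new URL in your access log. A minimal sketch, assuming a standard Apache/Nginx "combined" log format; the log line, timestamps, and URL here are all hypothetical:

```python
import re
from datetime import datetime, timezone

# Hypothetical "combined" format log line; adapt the regex to your log format.
log_lines = [
    '66.249.66.1 - - [10/Mar/2014:09:15:02 +0000] "GET /new-post/ HTTP/1.1" 200 5120 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
]

published_at = datetime(2014, 3, 10, 8, 0, tzinfo=timezone.utc)  # when the post went live
log_re = re.compile(
    r'\[(?P<ts>[^\]]+)\] "GET (?P<path>\S+) [^"]*" \d+ \d+ "[^"]*" "(?P<ua>[^"]*)"'
)

for line in log_lines:
    m = log_re.search(line)
    if m and "Googlebot" in m.group("ua") and m.group("path") == "/new-post/":
        fetched_at = datetime.strptime(m.group("ts"), "%d/%b/%Y:%H:%M:%S %z")
        print("Googlebot delay:", fetched_at - published_at)  # 1:15:02 here
```

If the gap between the ping and the first Googlebot hit is consistently many hours, the delay is on Google's side rather than yours.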
I would be interested in a lot more detail. How many articles have you confirmed as being affected by this issue? Exactly how did you confirm it?
As a side note, your robots.txt file is bloated and doesn't adhere to any standards I have seen. How exactly was it created? Did someone go in and make manual modifications to the file?
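For comparison, something far leaner along these lines is usually enough for a WordPress site. This is only a suggested starting point, not your actual file; adjust the paths to your install, and the sitemap URL is a placeholder:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-login.php
Disallow: /trackback/
Allow: /wp-content/uploads/

Sitemap: http://www.domainname.com/sitemap.xml
```

Every extra wildcard rule is another chance for an unintended match, so the shorter the file, the easier it is to rule out as a cause.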
-
Are you using Feedburner? Has the feed publishing service gotten out of sync? You can re-sync it under the Troubleshootize section.
-
Yes, it's a WordPress site, and I have always had the All in One SEO plugin enabled.
-
Do you use the WordPress platform? If so, do you use an SEO plugin? Different plugins can affect indexing time.
-
Could you revert robots.txt to a previous "working" version from when your site was getting indexed more quickly?