Blogs Not Getting Indexed Intermittently - Why?
-
Over the past 5 months, many of our clients have been having indexing issues with their blog posts.
A blog from 5 months ago could be indexed, and a blog from 1 month ago could be indexed, but blogs from 4, 3, and 2 months ago aren't indexed. It isn't consistent, and there is no commonality across all of these clients that would point to why this is happening.
We've checked sitemaps, robots.txt, canonical issues, and internal linking, combed through Search Console, and run Moz and SEMrush reports (sorry, Moz), but can't find anything.
We are now manually submitting URLs to be indexed to try and ensure they get into the index.
Search Console reports for many of the URLs show that the blog has been fetched and crawled, but not indexed (with no errors).
In some cases we find that the paginated blog pages (i.e. /blog/page/2, /blog/page/3, etc.) are getting indexed, but not the blog posts themselves.
There aren't any nofollow tags on the links going to the blogs either.
Any ideas?
*I've added a screenshot of one of the URL inspection reports from Search Console
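Since most of those checks (robots, canonicals) were presumably done by hand, a small script can help spot-check each URL's on-page signals at scale. This is a rough stdlib-only sketch (the function and class names are illustrative, not from any tool mentioned here) that pulls the meta robots directive and the rel=canonical URL out of a page's HTML; you would feed it the fetched source of each blog post:

```python
# Hypothetical helper for spot-checking on-page indexability signals:
# the meta robots directive and the rel=canonical link.
from html.parser import HTMLParser

class IndexSignalParser(HTMLParser):
    """Collects <meta name="robots"> content and <link rel="canonical"> href."""
    def __init__(self):
        super().__init__()
        self.robots = None
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            self.robots = a.get("content", "")
        elif tag == "link" and (a.get("rel") or "").lower() == "canonical":
            self.canonical = a.get("href")

def indexability_signals(html):
    """Return whether the page declares noindex, and its canonical URL."""
    p = IndexSignalParser()
    p.feed(html)
    noindex = bool(p.robots) and "noindex" in p.robots.lower()
    return {"noindex": noindex, "canonical": p.canonical}

sample = ('<head><meta name="robots" content="noindex,follow">'
          '<link rel="canonical" href="https://example.com/blog/post"></head>')
print(indexability_signals(sample))
```

Anything with an unexpected noindex or an off-site canonical is an immediate suspect; pages that come back clean point toward a quality or demand decision on Google's side rather than a technical block.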
-
Very interesting. I never thought of deleting a URL, creating a new (better) one, and then getting it indexed successfully. I'll have to keep that in mind if I need an important URL indexed.
-
@johnbracamontes Hello John, I would recommend verifying whether the content of these articles is similar to other posts on your blog. I would also suggest downloading the featured image and adding a description related to the title of your article, verifying that you have only one H1 at the beginning of the article, and slightly modifying the H2 headings that you have.
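For the single-H1 check suggested here, a quick regex count is usually enough for a rough audit. A minimal sketch, assuming reasonably well-formed HTML (a real audit would use a proper HTML parser; the helper name is illustrative):

```python
# Rough count of opening <h1> tags, with or without attributes.
import re

def count_h1(html: str) -> int:
    """Return the number of <h1> opening tags in the given HTML."""
    return len(re.findall(r"<h1(\s[^>]*)?>", html, flags=re.IGNORECASE))

print(count_h1("<h1>Post title</h1><h2>Section</h2><h2>Another</h2>"))
```

A page should normally report exactly 1; anything else is worth a look.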
-
Google has become much pickier about which pages it indexes lately, on top of suffering some indexing bugs. So yeah, indexing can be a real pain.
According to Google, when they crawl but do not index a blog post, it is probably due to content quality issues, either with that post or with the website overall.
Based on what's worked for us, I'd suggest substantially modifying the content of those posts (adding content, images, etc.) and then manually resubmitting them. If that doesn't get them indexed, then delete the post, publish the content at a new post URL, and submit that.
Hope that helps.
-
I was facing the same problem again and again. I changed the URL and resubmitted it, and it worked. I then changed the URL back to the previous one and resubmitted it. It is now indexed on Google.
-
Nothing?
Would love to hear any thoughts.
Related Questions
-
What Should We Do to Fix Crawled but Not Indexed Pages for Multi-location Service Pages?
Hey guys! I work as a content creator for Zavza Seal, a contractor out of New York, and we're targeting 36+ cities in the Brooklyn and Queens areas with several home improvement services. We got about 340 pages into our multi-location strategy, targeting our target cities with each service we offer, when we noticed that 200+ of our pages were "Crawled but not indexed" in Google Search Console. Here's what I think we may have done wrong. Let me know what you think:
- We used the same page template for all pages. (We changed the content and sections, formatting, targeted keywords, and entire page strategy for areas with unique problems, trying to keep the user experience as unique as possible to avoid duplicate content or looking like we didn't care about our visitors.)
- We used the same featured image for all pages. (I know this is bad and wouldn't have done it myself, but hey, I'm not the publisher.)
- We didn't use rel canonicals to tell search engines that these pages were made specially for the areas.
- We didn't use alt tags until about halfway through.
- A lot of the URLs don't use the target keyword exactly.
- The NAP info and Google Maps embed are in the footer, so we didn't use them on the pages.
- We didn't use any content about the history of the city or anything like that. (Some pages did use content about historic buildings, a low water table, flood-prone areas, etc. if the area was known for that.)
We were thinking of redoing the pages, starting from scratch and building unique experiences around each city, with testimonials, case studies, and content about problems that are common for property owners in the area, but I think they may be fixable with a rel canonical, the city-specific content added, and unique featured images on each page. What do you think is causing the problem? What would be the easiest way to fix it?
I knew the pages had to be unique, so I switched up the page strategy every 5-10 pages out of fear that duplicate content would start happening, because you can only say so much about, for example, "basement crack repair". Please let me know your thoughts. Here is one of the pages that is indexed as an example: https://zavzaseal.com/cp-v1/premier-spray-foam-insulation-contractors-in-jamaica-ny/ Here is one like it that is crawled but not indexed: https://zavzaseal.com/cp-v1/premier-spray-foam-insulation-contractors-in-jamaica-ny/ I appreciate your time and concern. Have a great weekend!
Local SEO | | everysecond0 -
What Tools Should I Use To Investigate Damage to my website
I would like to know what tools I should use, and how, to investigate damage to my website, in2town.co.uk. I hired a person to do some work on my website, but they damaged it. That person was on a freelance platform and was removed because of all the complaints made about them. They also put backdoors on websites, including mine, and added content. I also had a second problem where my content was being stolen. My site always did well and had lots of keywords in the top five and ten, but now they are not even in the top 200. This happened in January and February. When I write unique articles, they are not showing in Google, and I need to find out what the problem is and how to fix it. Can anyone please help?
Technical SEO | | blogwoman10 -
GoogleBot still crawling HTTP/1.1 years after website moved to HTTP/2
The whole website moved to the https://www. HTTP/2 version 3 years ago. When we review log files, it is clear that, for the home page, GoogleBot continues to access only via the HTTP/1.1 protocol.
- Robots file is correct (simply allowing all and referring to the https://www. sitemap)
- Sitemap is referencing https://www. pages, including the homepage
- Hosting provider has confirmed the server is correctly configured to support HTTP/2 and provided evidence that accessing via HTTP/2 works
- 301 redirects are set up for the non-secure and non-www versions of the website, all to the https://www. version
- Not using a CDN or proxy
- GSC reports the home page as correctly indexed (with the https://www. version canonicalised) but does still have the non-secure version of the website as the referring page in the Discovery section. GSC also reports the homepage as being crawled every day or so.
Totally understand it can take time to update the index, but we are at a complete loss to understand why GoogleBot continues to go through only HTTP/1.1, not HTTP/2. A possibly related issue, and of course what is causing concern, is that new pages of the site seem to index and perform well in the SERPs... except the home page. It never makes it to page 1 (other than for the brand name) despite rating multiple times higher in terms of content, speed, etc. than other pages, which still get indexed in preference to the home page. Any thoughts, further tests, ideas, direction or anything will be much appreciated!
Technical SEO | | AKCAC1 -
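One way to quantify the log-file observation in the question above is to tally which protocol Googlebot used per requested path. A minimal sketch, assuming the common Apache/Nginx combined log format (the field layout and the simple user-agent substring match are simplifying assumptions; adjust for your server, and note that serious verification should also reverse-DNS-check the Googlebot IPs):

```python
# Tally (path, protocol) pairs for Googlebot requests in access-log lines.
import re
from collections import Counter

# Matches the quoted request section, e.g. "GET /blog/ HTTP/2.0"
REQUEST_RE = re.compile(r'"\S+ (\S+) (HTTP/[\d.]+)"')

def googlebot_protocols(log_lines):
    """Return a Counter of (path, protocol) for Googlebot requests."""
    counts = Counter()
    for line in log_lines:
        if "Googlebot" not in line:
            continue  # only interested in Googlebot hits
        m = REQUEST_RE.search(line)
        if m:
            counts[(m.group(1), m.group(2))] += 1
    return counts

logs = [
    '66.249.66.1 - - [01/May/2024:09:00:00 +0000] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [01/May/2024:09:01:00 +0000] "GET /blog/ HTTP/2.0" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.7 - - [01/May/2024:09:02:00 +0000] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]
print(googlebot_protocols(logs))
```

Run over a few weeks of logs, this makes it easy to see whether the HTTP/1.1-only pattern really is specific to the home page or affects other URLs too.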
How Can I influence the Google Selected Canonical
Our company recently rebranded and launched a new website. The website was developed by an overseas team, and they created the test site on their subdomain. The only problem is that Google crawled and indexed both their site and ours. I noticed Google indexed their subdomain ahead of our domain, and based on Search Console it has deemed our content the duplicate of theirs and selected theirs as the canonical.
Community | | Spaziohouston
The website in question is https://www.spaziointerni.us
What would be the best course of action to get our content ranked and selected instead of being marked as the duplicate?
Not sure if I have to modify the content to make it more unique or have them submit a removal in their search console.
Our indexed pages continue to go down due to this issue.
Any help is greatly appreciated.1 -
Unsolved Using NoIndex Tag instead of 410 Gone Code on Discontinued products?
Hello everyone, I am very new to SEO and I wanted to get some input & second opinions on a workaround I am planning to implement on our Shopify store. Any suggestions, thoughts, or insight you have are welcome & appreciated! For those who aren't aware, Shopify as a platform doesn't allow us to send a 410 Gone Code/Error under any circumstance. When you delete or archive a product/page, it becomes unavailable on the storefront. Unfortunately, the only thing Shopify natively allows me to do is set up a 301 redirect. So when we are forced to discontinue a product, customers currently get a 404 error when trying to go to that old URL. My planned workaround is to automatically detect when a product has been discontinued and add the NoIndex meta tag to the product page. The product page will stay up but be unavailable for purchase. I am also adjusting the LD+JSON to list the products availability as Discontinued instead of InStock/OutOfStock.
Technical SEO | | BakeryTech
Then I let the page sit for a few months so that crawlers have a chance to recrawl and remove the page from their indexes. I think that is how that works?
Once 3 or 6 months have passed, I plan on archiving the product, followed by setting up a 301 redirect pointing to our internal search results page. The redirect will send the user to search with a query aimed at similar products. That should prevent people with open tabs, bookmarks, and direct links to that page from receiving a 404 error. I do have Google Search Console set up and integrated with our site, but manually telling Google to remove a page obviously only impacts their index. Will this work the way I think it will?
Will search engines remove the page from their indexes if I add the NoIndex meta tag after they have already been indexed?
Is there a better way I should implement this? P.S. For those wondering why I am not disallowing the page URL to the Robots.txt, Shopify won't allow me to call collection or product data from within the template that assembles the Robots.txt. So I can't automatically add product URLs to the list.0 -
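For reference, the two pieces this workaround describes (a robots noindex meta tag plus Product JSON-LD with availability set to Discontinued) could be generated like this. This is an illustrative sketch only: the helper is hypothetical and not any Shopify API; on Shopify these tags would actually be emitted from Liquid templates:

```python
# Build a noindex meta tag and Product JSON-LD marking an item Discontinued.
# schema.org defines https://schema.org/Discontinued as an ItemAvailability value.
import json

def discontinued_tags(name, url):
    """Return (meta_tag, jsonld_script) for a discontinued product page."""
    meta = '<meta name="robots" content="noindex">'
    ld = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "url": url,
        "offers": {
            "@type": "Offer",
            "availability": "https://schema.org/Discontinued",
        },
    }
    script = '<script type="application/ld+json">%s</script>' % json.dumps(ld)
    return meta, script

meta, script = discontinued_tags(
    "Old Widget", "https://example.com/products/old-widget")
print(meta)
print(script)
```

The general approach (noindex first, let crawlers drop the page, then redirect) is sound; the noindex does need to be served to crawlers for a while before the 301 goes in, since a redirected URL's meta tags are never re-read.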
Page missing from Google index
Hi all, One of our most important pages seems to be missing from the Google index. A number of our collections pages (e.g., http://perfectlinens.com/collections/size-king) are thin, so we've included a canonical reference in all of them to the main collection page (http://perfectlinens.com/collections/all). However, I don't see the main collection page in any Google search result. When I search using "info:http://perfectlinens.com/collections/all", the page displayed is our homepage. Why is this happening? The main collection page has a rel=canonical reference to itself (auto-generated by Shopify so I can't control that). Thanks!
Technical SEO | | leo920 -
Old Blog
I have an old blog that I started long ago, and it has tons of content. I'm thinking about migrating it to my current blog but am worried about Panda and bringing over mediocre content. The content is fine, not bad, not good. Should I bring it over, or should I just delete the blog?
Technical SEO | | tylerfraser0 -
Too many links on your blog?
In all of my campaigns, I have a lot of URLs with too many links on the page (defined loosely as around or over 100 links per page); these links are virtually all found on blog pages. The link count shoots up quickly when you start using things like tag clouds and showing all the tags/categories a post is in, in addition to all the cross-linking that's typical of blog posts. My question is: does this matter? Do you work to get blog pages down under that 100-link limit, or just assume most blogs are like this and move along? If you think it does matter, what strategies have you used to cut down the number of links while still keeping popular elements like tag clouds?
Technical SEO | | AdoptionHelp0