Pages not indexable?
-
Hello,
I've been trying to find out why Google Search Console finds these pages non-indexable:
https://www.visitflorida.com/en-us/eat-drink.html
https://www.visitflorida.com/en-us/florida-beaches/beach-finder.html
Moz and SEMrush both crawl the pages without errors, but GSC reports them as "blocked by robots.txt", even though I've confirmed they are not.
Anyone have any thoughts?
-
Hello, please also guide me; my website's pages are not being indexed either.
-
Hi, I am also facing a similar issue. Do let me know as well if you find any helpful guide.
-
Thanks, Mazen.
The robots.txt tester doesn't show these URLs as blocked, but I still get the error when submitting them. I can also see that Google has indexed them at some point, yes. Thanks for that. The dev team is reviewing further for any issues.
Ken
-
Hi Specscart,
I think if your assessment were correct, the entire site would not be indexed, as it all falls under /en-us/.
Ken
-
Hi Ken,
Running a quick "site:" query on both URLs, I can see that they are indexed.
You can use the robots.txt tester from the Old Search Console interface and comment out the lines that could be contributing to the block.
In both cases, the pages are sitting in the index as of this writing.
-
Hi, I have found an issue with the robots.txt file. It currently contains:
User-agent: *
Allow: /
Disallow: /content/lookup
Disallow: /en-us/deals/*
Disallow: /content/visitflorida/en-us/deals/*
Sitemap: https://www.visitflorida.com/sitemap.xml
Make a change to the robots.txt file and this issue will be corrected. With Disallow: /en-us/deals/* you wanted to disallow the deals pages, but Google will consider it as disallowing pages starting with /en-us. Mention only Disallow: /deals/* instead. If you correct this, the issue will be resolved.
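For anyone who wants to double-check which of these paths the quoted rules actually block, here is a minimal sketch using Python's built-in robots.txt parser. It is only a rough local check (the standard-library parser does not handle wildcards exactly the way Googlebot does), and the deals URL is a made-up example path; Google's own robots.txt tester remains the authority.

from urllib.robotparser import RobotFileParser

# The rules quoted above, pasted in as a string for a quick local test.
rules = """\
User-agent: *
Allow: /
Disallow: /content/lookup
Disallow: /en-us/deals/*
Disallow: /content/visitflorida/en-us/deals/*
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

urls = [
    "https://www.visitflorida.com/en-us/eat-drink.html",
    "https://www.visitflorida.com/en-us/florida-beaches/beach-finder.html",
    "https://www.visitflorida.com/en-us/deals/example-deal.html",  # hypothetical deals URL
]

for url in urls:
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "blocked"
    print(url, "->", verdict)

-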
Related Questions
-
How to index e-commerce marketplace product pages
Hello! We are an online marketplace that submitted our sitemap through Google Search Console two weeks ago. Although the sitemap was submitted successfully, out of ~10,000 links (we have ~10,000 product pages), only 25 have been indexed. I've attached images of the reasons Search Console gives for not indexing the pages. How would we go about fixing this?
Technical SEO | fbcosta
-
Drop in traffic, spike in indexed pages
Hi, we've noticed a drop in traffic compared to the previous month and the same period last year. We've also noticed a sharp spike in indexed pages (almost doubled) as reported by Search Console. The two seem to be linked, as the drop in traffic coincides with the spike in indexed pages. The only change we made to our site during this period is that we reskinned our blog. One of those changes is that we've enabled 'normal' (not AJAX) pagination. Our blog has a lot of content, and we have about 550-odd pages of posts. My question is: would this impact the number of pages indexed by Google, and if so, could it negatively impact organic traffic? Many thanks, Jason
Technical SEO | Clickmetrics
-
Need to de-index certain pages fast
I need to de-index certain pages as fast as possible. These pages are already indexed. What is the fastest way to do this? I have added the noindex meta tag and ran a few of the pages through Search Console/Webmaster Tools (Fetch as Google) earlier today, but nothing has changed yet. The Fetch as Google service does see the noindex tag, but the SERPs haven't changed yet. I know I should be patient, but if there is a faster way to get Google to de-index these pages, I want to try it. I am also considering the removal tool, but I'm unsure whether that is risky. And even if it's not, I understand it's not a permanent solution anyway. What should I do?
Technical SEO | WebGain
-
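On the de-indexing question above: before waiting on Google, it can be worth confirming that the noindex signal is actually being served on the live URLs, since templates, caches, or CDNs sometimes strip it. A rough stdlib sketch, with a placeholder URL, that reports both the X-Robots-Tag header and any robots meta tag:

import re
import urllib.request

def noindex_signals(url):
    # Fetch the page and report the robots directives it serves.
    req = urllib.request.Request(url, headers={"User-Agent": "noindex-check"})
    with urllib.request.urlopen(req) as resp:
        x_robots = resp.headers.get("X-Robots-Tag", "")
        html = resp.read().decode("utf-8", errors="replace")
    # Naive regex for <meta name="robots" content="...">; fine for a spot check.
    meta = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
        html, re.IGNORECASE)
    return {"x_robots_tag": x_robots, "meta_robots": meta.group(1) if meta else ""}

print(noindex_signals("https://example.com/page-to-deindex.html"))

-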
Investigating a huge spike in indexed pages
I've noticed an enormous spike in pages indexed through WMT in the last week. Now I know WMT can be a bit (OK, a lot) off base in its reporting, but this was pretty hard to explain. See, we're in the middle of a huge campaign against duplicate content and we've put a number of measures in place to fight it. For example: implemented a strong canonicalization effort; programmatically NOINDEX'd content we know to be duplicate; and we are currently fixing true duplicate content issues by rewriting titles, descriptions, etc. So I was pretty surprised to see the blow-up. Any ideas as to what else might cause such a counterintuitive trend? Has anyone else seen Google suddenly glom onto a bunch of phantom pages?
Technical SEO | farbeseo
-
41,000 pages indexed two years after the site was redirected to a new domain
Hi! Two years ago, we changed the domain elmundodeportivo.es to mundodeportivo.com. Apparently everything was OK, but more than two years later there are still 41,000 pages indexed in Google (https://www.google.com/search?q=site%3Aelmundodeportivo.es), even though all the domains have been redirected with a 301 redirect. I detected some problems with redirections that were 303 instead of 301, but we fixed that one month ago. A secondary problem is that the PageRank for elmundodeportivo.es is still 7 while mundodeportivo.com is 3. What am I doing wrong? Thank you all, Oriol
Technical SEO | MundoDeportivo
-
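On the redirect question above: when old URLs stay indexed, it is worth spot-checking what status code the old domain actually returns on the first hop, since a stray 302/303 behaves differently from a 301. A minimal stdlib sketch that does not follow the redirect; the article path is a made-up placeholder:

import http.client
from urllib.parse import urlparse

def first_hop(url):
    # Issue a HEAD request and report the raw status code and Location header.
    parsed = urlparse(url)
    conn = http.client.HTTPSConnection(parsed.netloc, timeout=10)
    path = parsed.path or "/"
    if parsed.query:
        path += "?" + parsed.query
    conn.request("HEAD", path, headers={"User-Agent": "redirect-check"})
    resp = conn.getresponse()
    return resp.status, resp.getheader("Location")

print(first_hop("https://elmundodeportivo.es/old-article.html"))

-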
How do I get Google to index the right pages with the right keywords?
Hello, I notice that even though I have a sitemap, Google is indexing the wrong pages under the wrong keywords. As a result, the site is not as relevant as it should be and is not ranking properly.
Technical SEO | ursalesguru
-
2 links on home page to each category page ..... is page rank being watered down?
I am working on a site whose home page contains two links to each category page: one text link and one image link. I think I'm right in thinking that Google will only pay attention to the anchor text/alt text of the first link it spiders, with the anchor text/alt text of the second being ignored. That is not my question, however. My question is about the PageRank that is passed to each category page. Because of the double links on the home page, my reckoning is that PR is being divided up twice as many times as necessary. Am I also right in thinking that if Google ignores the 2nd identical link on a page, only one lot of this divided-up PR will be passed to each category page rather than two lots, hence horribly watering down the 'link juice' that is being passed to each category page? Please help me win this argument with a developer and improve the ranking potential of the category pages on the site 🙂
Technical SEO | QubaSEO
-
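To put rough numbers on the worry above, here is a toy sketch under the simplified model where a page's passable equity is split evenly across its outgoing links. The figures are invented and Google's real handling of duplicate links is more nuanced, but it shows the halving the questioner is describing:

# Hypothetical figures, classic even-split PageRank model only.
categories = 10       # number of category pages linked from the home page
home_equity = 1.0     # equity the home page can pass, normalised

per_link_single = home_equity / categories          # one link per category
per_link_doubled = home_equity / (2 * categories)   # text link + image link

# If only the first of each duplicate pair counted, every category page would
# receive per_link_doubled (0.05) instead of per_link_single (0.1).
print(per_link_single, per_link_doubled)

-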
Google News not indexing .index.html pages
Hi all, we've been asked by a blog to help them improve indexing and ranking on Google News (the site is already included in Google News, with poor results). The blog had a chronic URL duplication problem, with each post existing under 3 different URLs:
#1) www.domain.com/post.html (currently noindexed for editorial reasons, as it shows all the comments)
#2) www.domain.com/post/index.html (currently indexed, showing only top comments)
#3) www.domain.com/post/ (the very same as #2)
We've chosen URL #2 (/index.html) as the canonical URL and included a rel=canonical tag on URL #3 (/) linking to URL #2. We also submitted yesterday a Google News sitemap consistently listing the type #2 URLs from the last 48 hours. The sitemap has been properly "digested" by Google and shows that all URLs have been sent and indexed. However, if we use the site:domain.com command on Google News we see something completely different: Google News has actually indexed only some of the news, and more specifically only the type #3 URLs (ending with the trailing slash instead of /index.html). Why? What's wrong?
a) Does the Google News bot have problems indexing URLs ending with /index.html? While figuring out what's wrong, we found that http://news.google.it/news/search?aq=f&pz=1&cf=all&ned=us&hl=en&q=inurl%3Aindex.html gives no results... it seems the Google News index overall does not include any URLs ending with /index.html.
b) Does the Google News bot recognise the rel=canonical tag?
c) Is it just a matter of time until Google News picks up the right URLs (/index.html), and/or should we communicate any changes to the Google News team?
d) Any suggestions? Or should we do it the other way around, i.e. make URL #3 the canonical one? While Google News is showing these problems, Google Web search has received the changes well, so we don't know what to do. Thanks for your help, Matteo
Technical SEO | H-FARM
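On the Google News question above: one quick check is to confirm that both URL variants of a post really declare the /index.html version as canonical, since a missing or inconsistent tag on either variant could explain mixed results. A rough sketch using naive regex parsing; the URLs follow the placeholder pattern used in the question:

import re
import urllib.request

def canonical_of(url):
    # Fetch the page and pull out the href of its rel=canonical link element.
    with urllib.request.urlopen(url) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    match = re.search(
        r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']',
        html, re.IGNORECASE)
    return match.group(1) if match else "(no canonical found)"

for variant in ("http://www.domain.com/post/index.html",
                "http://www.domain.com/post/"):
    print(variant, "->", canonical_of(variant))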