Any idea why pages are not being indexed?
-
Hi Everyone,
One section on our website is not being indexed. The product pages are, but some of the subcategories are not. These are very old pages, so I thought it was strange. Here is an example of one:
https://www.moregems.com/loose-cut-gemstones/prasiolite-loose-gemstones.html
If you search for a chunk of text from the page, it is not found in Google. No issues in Bing/Yahoo; only Google. Do you think it takes a submission to Search Console?
Jeff
-
So I am testing removing some of the restrictions in the robots.txt file to see if that helps, as I still can't get the page indexed.
-
Yeah...it's very close to what I have. I also checked other websites I own with the same category structure and robots.txt file...no issues.
I even checked other subcats on www.moregems.com, and no issues. It seems to be all the pages under "GEMSTONES" that are not being indexed. Any thoughts there?
-
I usually write custom robots.txt files for Magento sites, but I did find a good example to use. Check out this site: https://www.magikcommerce.com/blog/set-up-robots-txt-in-magento/
I would edit anything that doesn't fit your site.
Hope this helps!
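For reference, here is a trimmed-down sketch of the kind of Magento robots.txt that article recommends. This is illustrative only: the exact directories vary by install, and the sitemap URL is a placeholder you would replace with your own.

```
User-agent: *
# Block Magento system directories that should never be crawled
Disallow: /app/
Disallow: /downloader/
Disallow: /includes/
Disallow: /lib/
Disallow: /var/
# Block session/checkout/search pages that create duplicate or thin URLs
Disallow: /checkout/
Disallow: /customer/
Disallow: /catalogsearch/
# Do NOT add rules matching category or product paths you want indexed
Sitemap: https://www.example.com/sitemap.xml
```

The key thing to verify before deploying a template like this is that none of the `Disallow` patterns happen to match your category URL structure (e.g. `/loose-cut-gemstones/`).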
-
So I submitted https://www.moregems.com/loose-cut-gemstones/prasiolite-loose-gemstones.html and fetched it in Google Search Console a few hours ago. Still not being indexed. I don't see any issues in the robots.txt file. Any thoughts?
-
Hi Nicholas,
I asked him this as well, but do you have any resources for a "good" Magento specific robots.txt file? I want to try updating it, as it has been the same for about 7 years.
The strange thing is the deeper product pages are indexed, but not the subcats.
-
Hi Christian,
Do you have any resources for a recommended Magento robots.txt file? I added this probably 6-7 years ago, and have not updated it since. I can definitely try that.
Jeff
-
Hi Jeff,
In addition to Christian's recommendation (which I would do first), use Google Search Console's Fetch & Render tool to request indexing of your non-indexed pages. Sometimes this tool in GSC gets them indexed almost immediately.
It is not uncommon for deep links, or internal pages of internal pages, to not be immediately indexed. It is definitely important to use new pages to link out to other pages on your website, and if possible go in and link to your new pages from older (already indexed) pages on your website.
-
Hey Jeff,
I just ran a quick scan of the site, and it looks like you have a lot of links, pages, and directories being blocked by your robots.txt file: https://www.moregems.com/robots.txt
I would make sure the pages you want to be indexed by search engines are not being blocked in your robots.txt.
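One quick way to check this is with Python's standard-library robots.txt parser. The snippet below uses an inline robots.txt for illustration (the rules shown are assumptions, not the site's actual file); in practice you would fetch https://www.moregems.com/robots.txt and test the exact subcategory URLs that aren't indexing.

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt content -- replace with the live file's contents,
# e.g. fetched from https://www.moregems.com/robots.txt
robots_txt = """\
User-agent: *
Disallow: /checkout/
Disallow: /customer/
Disallow: /catalogsearch/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

def is_allowed(url: str, agent: str = "Googlebot") -> bool:
    """Return True if the given user agent may fetch the URL under these rules."""
    return rp.can_fetch(agent, url)

# The subcategory page from the thread is not matched by any Disallow rule here
print(is_allowed("https://www.moregems.com/loose-cut-gemstones/prasiolite-loose-gemstones.html"))  # True
# A checkout URL is blocked by the Disallow: /checkout/ rule
print(is_allowed("https://www.moregems.com/checkout/cart/"))  # False
```

If the subcategory URLs come back blocked against the real file, you've found the culprit; if they come back allowed, the problem lies elsewhere (meta robots tags, canonicals, or simply crawl priority).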