Help! Pages not being indexed
-
Hi Mozzers, I need your help.
Our website (www.barnettcapitaladvisors.com) stopped being indexed in search engines following a round of major changes to URLs and content. There were a number of dead links for a few days before 301 redirects were properly put in place. Now only 3 pages show up in Bing when I do the search "site:barnettcapitaladvisors.com". A bunch of pages show up in Google for that search, but none of them are the pages we want to show up. Our home page and most important services pages are nowhere in the search results.
What's going on here?
Our sitemap is at http://www.barnettcapitaladvisors.com/sites/default/files/users/AndrewCarrillo/sitemap/sitemap.xml
Robots.txt is at: http://www.barnettcapitaladvisors.com/robots.txt
Thanks!
-
Thanks for the help! The Screaming Frog site crawler really makes it much easier to understand the robots.txt file. It looks like the file was set to disallow all robots.
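For anyone hitting the same issue: you can reproduce what a compliant crawler sees using Python's standard-library robots.txt parser. A minimal sketch, assuming a blanket-disallow rule like the one described (the rules and path below are illustrative):

```python
# Sketch: parse a robots.txt body and check whether crawlers may fetch a page.
from urllib.robotparser import RobotFileParser

# A blanket Disallow like this blocks every compliant crawler (hypothetical rules).
robots_txt = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# False: with "Disallow: /" under "User-agent: *", nothing may be fetched.
print(parser.can_fetch("Googlebot", "/wealth-management"))
```

In practice you would point `RobotFileParser.set_url()` at the live robots.txt and call `read()` instead of parsing a string.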
-
There's nothing that will directly "resolve" this.
Those tools simply give you cues and signs to figure out what went wrong and when. Traffic and referral data for those pages can also indicate exactly when they stopped appearing in search; you can then correlate that with your dev actions, if you have some memory or log of the steps taken, and backtrack to find out what was set wrong.
So you see absolutely no errors or messages in GWT, per this old post?
If you are using Drupal, check this thread about it; it may well be your problem.
Also see the very bottom of this page, which may help you diagnose whether you have a wrong setting in your robots.txt file.
-
Hi Brien
The X-Robots tag has set many of your pages to noindex,nofollow. X-Robots is delivered as part of the HTTP response header so you won't find it in your HTML code.
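You can see the header yourself with `curl -I` on any page. As a rough sketch, a small helper can also flag the directive programmatically; the header values here are illustrative examples, not your site's actual responses:

```python
# Sketch: decide whether a page is noindexed from its HTTP response headers.
# HTTP header names are case-insensitive, so match them case-insensitively.

def is_noindexed(headers: dict) -> bool:
    value = next(
        (v for k, v in headers.items() if k.lower() == "x-robots-tag"), ""
    )
    # The header may carry several comma-separated directives,
    # e.g. "noindex, nofollow"; "none" is shorthand for noindex + nofollow.
    directives = {d.strip().lower() for d in value.split(",")}
    return "noindex" in directives or "none" in directives

print(is_noindexed({"X-Robots-Tag": "noindex, nofollow"}))  # True
print(is_noindexed({"Content-Type": "text/html"}))          # False
```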
In addition to that, your canonical tags are a little confused: the homepage canonical tag is set to
http://barnettcapitaladvisors.com/miami-financial-advisor-south-florida-financial-planner
when it should be barnettcapitaladvisors.com
I found these issues using the Screaming Frog site crawler.
In your Robots file you're not allowing Yandex or Baidu to crawl the site - are you aware of this?
Have your developer correct the X-Robots and canonical issues.
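Once the developer has deployed the fix, you can verify the homepage canonical without waiting for a recrawl. A minimal sketch using only the standard library; the HTML snippet is an illustrative stand-in for the fetched homepage source:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of the first <link rel="canonical"> tag seen."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical" and self.canonical is None:
            self.canonical = a.get("href")

# Illustrative snippet; in practice, feed the actual homepage HTML.
finder = CanonicalFinder()
finder.feed(
    '<head><link rel="canonical" href="http://www.barnettcapitaladvisors.com/"></head>'
)
print(finder.canonical)  # http://www.barnettcapitaladvisors.com/
```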
-
Yes, I've checked Google Webmaster Tools and see a big drop-off in impressions on 5/4/13.
There are 14 pages in the sitemap that should be showing up. Instead, what's getting indexed (9 pages in Google and 2 of the 3 pages in Bing) are pages that aren't in the sitemap at all: pages that were created and then taken down, or that we don't want indexed.
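To compare the sitemap against the site: counts, you can list the sitemap's URLs directly. A quick sketch with the standard library, assuming a standard sitemaps.org-format file; the two-entry XML below is a made-up sample, not the real sitemap:

```python
# Sketch: extract the <loc> entries from a sitemaps.org-format sitemap.
import xml.etree.ElementTree as ET

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.barnettcapitaladvisors.com/</loc></url>
  <url><loc>http://www.barnettcapitaladvisors.com/wealth-management</loc></url>
</urlset>"""

root = ET.fromstring(sitemap)
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]
print(len(urls))  # 2 in this sample; the count from the real file should be 14
```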
Yes, some of the URLs are new (three weeks old), and that may be why they haven't appeared yet. But what about the home page? That isn't a new URL, and it isn't appearing either. Also, one new URL (which is also one of the important services pages we want to appear) is showing up in Bing: www.barnettcapitaladvisors.com/wealth-management, but the others are not.
I use Analytics as well, but what in Google Analytics specifically will help me resolve this?
-
Have you checked Google/Bing Webmaster Tools yet? That often gives you the best answer in these situations.
Do you use Analytics? Have you checked that as well?
If any pages show up when you do a site: search for your domain on Google, then you are being indexed. If the pages you made are too new, it may take some time for them to appear in Google's SERPs.
It all depends: how long has it been?
I see only 2 links in Bing results for your site, but 9 results in Google. How many should there be? 14?