Site Not Indexing After 2 Weeks - PA at 1
-
Hi Moz Community!
I'm working as a digital marketing consultant for an organization that uses us for their online registration - we do not manage their web page. The issue I'm hoping you might have some ideas on is that their SERPs still aren't making much of a recovery since they revamped their site in mid-August. I ran a Moz campaign for them, and despite the fact that they (eventually) got all their 301s in place, submitted an updated sitemap to Google, aren't hitting any crawl errors, and have a working robots.txt, over two-thirds of their site pages don't seem to be indexing. Moz is giving most of them a Page Authority of 1, and when I log in to their GWT, it shows that only 3 of the 315 URLs submitted have been indexed.
I know Google doesn't make any guarantees on index update timelines, but 2+ weeks seems like a long time.
Their website is https://www.northshoreymca.org/. The site has a DA of 43, but most of the pages in the main nav are still at a PA of 1. They gave me permission to share it in this forum because we're really trying to figure out a recovery strategy.
Any thoughts or ideas as to what might be causing this? Is there anything else you think I should check, or anything else that might be causing an issue? Is it possible that Google is just taking this long to index their site? Note: the site is built with Drupal.
THANK YOU!
-
Being a standard Drupal setting doesn't mean it's search-engine friendly. I have no hard evidence for that theory; I'm just assuming that new sites should be kept as simple as possible so Google can get a better grasp of them and start building trust before you complicate things unnecessarily. Unless the node URL is adding any value for you, I would get rid of it and 301 it to its canonical.
-
Thanks again! I noticed all of this too before I submitted to Q&A, but doing a little bit of research (and with my admittedly limited knowledge of the topic), I thought this was pretty standard for how clean Drupal URLs are set up? Is that incorrect?
-
I would not focus too much on GSC if the site has been recently built; the place you want your pages to be is Google's index.
The place I would check is GA, to see how many of your pages are getting at least 1 visit and which ones aren't. You can then look at the pages that aren't getting traffic and work out whether they aren't being indexed or just aren't ranking high.
Take this page, for example: https://www.northshoreymca.org/programs/creative-arts. I've noticed that you're using rel="shortlink", which I don't think adds much value; it's probably making things messier, because you end up with two different versions of your page. In fact, if you check the /node/ path you'll find pages you don't want (e.g. https://www.northshoreymca.org/node/145 instead of https://www.northshoreymca.org/content/ymca-north-shore-annual-gala). I would set up a 301 at the very least, so you ensure Google isn't deciding which is the best page to index and serve to users. I know you have a canonical, but it's something you could test to help Google avoid making too many decisions, especially because the canonical is just a recommendation to bots, and one Google often doesn't follow blindly on new sites.
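If you want a quick way to confirm that the /node/ duplicates really do 301 to their clean aliases, a rough Python sketch like the one below (using the requests library) can do it. The URL pair is just the example mentioned above; you'd build the full list from your own site's node IDs and path aliases.

```python
# Rough sketch: confirm that duplicate /node/ URLs 301 to their clean aliases.
# The URL pair below is only the example from this thread; replace it with the
# real node-to-alias mapping from your own site.
from urllib.parse import urljoin
import requests

redirect_checks = {
    "https://www.northshoreymca.org/node/145":
        "https://www.northshoreymca.org/content/ymca-north-shore-annual-gala",
}

for node_url, expected_target in redirect_checks.items():
    resp = requests.get(node_url, allow_redirects=False, timeout=10)
    # Resolve a possibly relative Location header against the requested URL
    location = urljoin(node_url, resp.headers.get("Location", ""))
    if resp.status_code == 301 and location.rstrip("/") == expected_target.rstrip("/"):
        print(f"OK    {node_url} -> {location}")
    else:
        print(f"CHECK {node_url}: got {resp.status_code} {location or '(no redirect)'}")
```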
Hope this helps, let me know how it goes!
-
Yes! Better safe than sorry!
-
Don't get spammy and add all 300 of your pages at once. You never know what will set Google off. Be discreet and don't try to rush all the indexing.
(This is probably just a myth, but let's not try to test it.) -
You are too kind! I thought we had checked into all that stuff too. I'm glad to at least know that we're probably just up against a waiting game and there isn't something fundamentally wrong with the site that's preventing it from being indexed. I'll start manually adding the main nav links via addurl and cross my fingers.
Thank you again!
-
Yep. Just wait.
I've checked whether there was a problem with robots.txt or the meta robots tag, and there isn't any issue with that particular page.
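In case it's useful, here's roughly how you could run that same check yourself: a small Python sketch (requests plus the standard-library robots parser) that tests whether a page is blocked by robots.txt or carries a noindex meta tag. The URL is just the creative-arts page from this thread; swap in any page you want to verify.

```python
# Quick sketch: check whether a page is blocked by robots.txt or a noindex meta tag.
# The URL below is the example page from this thread; adjust as needed.
import re
import urllib.robotparser
import requests

url = "https://www.northshoreymca.org/programs/creative-arts"

# 1. robots.txt check, as Googlebot would see it
robots = urllib.robotparser.RobotFileParser("https://www.northshoreymca.org/robots.txt")
robots.read()
print("Allowed by robots.txt:", robots.can_fetch("Googlebot", url))

# 2. meta robots check on the page itself
html = requests.get(url, timeout=10).text
meta_robots = re.findall(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.IGNORECASE)
print("Meta robots tags:", meta_robots or "none found (indexable by default)")
```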
You can speed up the indexing of some pages with google.com/addurl -
So here's an example of a problem page from the main nav on their site:
site:https://www.northshoreymca.org/programs/creative-arts
I guess we're just stuck in a waiting game for pages like this?
Thanks again!
-
You're welcome. We are here to help.
I can't give you an exact amount of time. I've had sites that were fully indexed and reported in WMT within 3 weeks, and other sites that took over 2 months and only had roughly 90% of their pages reported.
Sorry, Google is just like that. Sometimes it gives us a bunch of information and sometimes nothing.
Again, I'd trust what you get in the search results, not what appears in WMT. -
Thanks for the input, Oleg. I did verify in GSC that they are pointing to the https://www. version, so I'm not sure what explains the discrepancy between the 3 indexed pages showing in GSC vs. the several hundred pages showing when you run site:https://www.northshoreymca.org.
Very new to Moz and SEO, so I really appreciate your time and input!
-
Thanks, Gaston! In your experience, are there any generalities you can offer on how long it takes to index a site? I understand the answer is probably relative to a site's size, age, update frequency, etc., but would you care to ballpark it?
I'm really new to SEO (and entirely self-taught) so appreciate all the info! Thanks again!
-
Thanks so much! Any thoughts on why their GSC says only 3 pages are indexed? I checked, as Oleg suggested above, that it was referencing https://www. and verified that it is. I'm very new to the technical side of things, so thanks for the help!
-
Hey Camarin, so far I see 271 URLs indexed for the search "site:https://www.northshoreymca.org". There are of course a couple of changes I would make, like giving the tag URLs a friendlier look instead of using the tag parameter, but otherwise it seems to be working. Also, the PA for this page, https://www.northshoreymca.org/locations/haverhill/, is 33. I think, as you said, it was just a matter of time. I hope your problem got fixed.
-
Hi there,
There are a few points to note here:
1- The site was redesigned; as you say, it has been completely restructured.
2- As some of the URLs seem to be new, none of them have been crawled by Roger (Moz's bot) yet, and/or they are likely considered to have no link equity yet. That explains the PA of 1.
3- It's known that Google takes its time to index a whole site, and it also takes its time to report that in WMT. My advice is to do a search with 'site:yoursite.com' and check whether there has been an improvement in indexation. Just now I'm getting 620 results for that search.
4- Being built in Drupal is not a signal Google uses (at least as far as I know) to harm or benefit you. It's transparent to Google. Hope I've helped.
GR. -
I see a good amount of pages indexed: https://www.google.com/search?q=site%3Ahttps%3A%2F%2Fwww.northshoreymca.org%2F
Make sure the profile you are using in GSC (formerly GWT) is the https://www. version of the website (any other profile will show fewer/no indexed pages).