Massive Amount of Pages Deindexed
-
On or about 12/1/17 a massive amount of my site's pages were deindexed. I have done the following:
- Ensured all pages are "index,follow"
- Ensured there are no manual penalties
- Ensured the sitemap correlates to all the pages
- Resubmitted to Google
- ALL pages are gone from Bing as well
In the new SC interface, there are 661 pages that are Excluded, with 252 being "Crawled - currently not indexed: The page was crawled by Google, but not indexed. It may or may not be indexed in the future; no need to resubmit this URL for crawling." What in the world does this mean, and how the heck do I fix it? This is CRITICAL. Please help!
The url is https://www.hkqpc.com
-
The report was run prior to the canonical directives being added.
Also, remember to noindex your robots.txt:
https://yoast.com/x-robots-tag-play/
There are cases in which the robots.txt file itself might show up in search results. By using an alteration of the previous method, you can prevent this from happening to your website:
<FilesMatch "robots.txt">
Header set X-Robots-Tag "noindex"
</FilesMatch>
**And in Nginx:**
location = robots.txt { add_header X-Robots-Tag "noindex"; }
-
Looking at the first report, "Redirect Chains"... As I understand the table, these are correct:
Column A is the page (source) with the redirecting link
Column B is the link that is redirecting (http://www.hkqlaw.com)
Column C shows 2 redirects happening
Column I shows the first redirect (http://www.hkqlaw.com -> http://www.hkqpc.com) (non-SSL version)
Column N shows the second redirect (http://www.hkqpc.com -> https://www.hkqpc.com) (SSL version)
The original link (hkqlaw.com) is a link in the footer of our news section, so it is common on those pages, which is why it shows up so often. So, like I said, this appears to be correct.
I added the canonical directives to the pages earlier so perhaps that report was run prior to me doing that?
Again, thanks so much for your effort in helping me!
-
Now I'm really baffled. I just ran Screaming Frog and don't see any of the redirects or other stats. Which software are you using that is showing this information? I'm trying to replicate it and figure out if there's something, somewhere else doing this.
-
Wow, I got it.
You're 301 redirecting a ton of URLs back to the homepage.
- Redirect chains https://bseo.io/cZW0w0
- internal URLs https://bseo.io/4sFqUk
- insecure content https://bseo.io/YDDKGD
- no canonical https://bseo.io/fWey1Q
- crawl overview https://bseo.io/Zg6bpM
- canonical errors https://bseo.io/YtTh7W
-
Ok, canonical is set for each page (and I fixed the // issue). I used the X-Robots-Tag header to noindex the robots.txt and sitemap.xml files, along with a few other extensions while I was at it.
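Such a rule can be written along these lines (a sketch only, assuming Apache with mod_headers enabled; the exact extension list is illustrative, not the actual rule used):

```apache
# Send "X-Robots-Tag: noindex" for robots.txt, sitemap.xml and other
# .txt/.xml files so they stay out of the search results
<FilesMatch "\.(txt|xml)$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```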
I'll get the Secure cookie header set after this is resolved. We don't store any sensitive data via cookies for this site, so it's not of immediate concern, but it is still one I'll address.
EDIT: The https://www.hkqpc.com/attorney/David-Saba.html/ page no longer exists, which was the cause of the errors. I've redirected it to the appropriate page.
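A redirect like that can be set up in .htaccess (a sketch, assuming Apache; the destination path is a hypothetical placeholder, not the actual target page):

```apache
# Permanently redirect the removed attorney page to its replacement
# (destination URL below is illustrative)
Redirect 301 /attorney/David-Saba.html/ https://www.hkqpc.com/attorneys/
```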
-
https://cryptoreport.websecurity.symantec.com/checker/
This server cannot be scanned for these vulnerabilities:
- Heartbleed: Server scan unsuccessful. See possible causes.
- Poodle (TLS): Server scan unsuccessful. See possible causes.
- BEAST: This server is vulnerable to a BEAST attack. More information.
I am sorry, I said your IP was Network Solutions when it was 1&1. I still strongly recommend changing hosting companies, even though I am German and so is 1&1.
DNS resolves www.hkqpc.com to 74.208.236.66
The SSL certificate used to load resources from https://www.hkqpc.com will be distrusted in M70. Once distrusted, users will be prevented from loading these resources. See https://g.co/chrome/symantecpkicerts for more information.
Look: https://cl.ly/pCY5
Look: https://cl.ly/pAKa
Symantec SSL certificates are now owned by DigiCert:
https://www.digicert.com/help/
https://www.dareboost.com/en/report/5a70b33e0cf28f017576367f
The Set-Cookie HTTP header can be configured with your Apache server. Make sure that the mod_headers module is enabled. Then, you can specify the header (in your .htaccess file, for example). Here is an example:
<IfModule mod_headers.c>
# only for Apache >= 2.2.4:
Header edit Set-Cookie ^(.*)$ $1;HttpOnly;Secure
# lower versions:
Header set Set-Cookie HttpOnly;Secure
</IfModule>
- robots.txt file appearing in the SERPs (big photo: https://i.imgur.com/cJeDR9t.png)
- XML sitemap appearing in the SERPs; it should be noindexed (big photo: https://i.imgur.com/tlx5jc7.png)
Double forward slashes after /verdicts/ serve the same page as the URL without the double forward slashes, so you need to add rel=canonical tags; there are zero canonicals on any page whatsoever.
- https://www.hkqpc.com/news/verdicts//hkq-attorneys-win-carbon-county-real-estate-case/
- https://www.hkqpc.com/news/verdicts/hkq-attorneys-win-carbon-county-real-estate-case/
The URLs above need a rel=canonical tag; I have created an example below for you, pointing at the page without the double forward slashes. This tells Google which version you'd prefer to have indexed, and it also keeps query-string pages and junk pages out of Google's index. Please see the resources below and add the tags to your website. Because I do not know what type of CMS you're using, I cannot recommend a plugin to do it, but if you were using something like WordPress it would be handled automatically by something like Yoast SEO. For a site of that size it may be a wise move to migrate to something like WordPress; it is a solid platform and makes it a lot easier to implement changes across the entire site quickly.
- https://moz.com/blog/complete-guide-to-rel-canonical-how-to-and-why-not
- https://yoast.com/rel-canonical/
- https://moz.com/blog/canonical-url-tag-the-most-important-advancement-in-seo-practices-since-sitemaps
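For the pair of URLs above, the tag would go in the `<head>` of both versions and point at the single-slash URL, along these lines:

```html
<link rel="canonical" href="https://www.hkqpc.com/news/verdicts/hkq-attorneys-win-carbon-county-real-estate-case/" />
```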
You need to add a canonical tag.
- Bigger photo of problem https://i.imgur.com/1qMMPSM.png
- this page https://www.hkqpc.com/attorney/David-Saba.html/
- Warning: Creating default object from empty value in /homepages/43/d238880598/htdocs/classes/class.attorneys.php on line 38
- Warning: Invalid argument supplied for foreach() in /homepages/43/d238880598/htdocs/headers/attorney.php on line 15
- **Fix for this:**
- https://stackoverflow.com/questions/14806959/how-to-fix-creating-default-object-from-empty-value-warning-in-php
- http://thisinterestsme.com/invalid-argument-supplied-for-foreach/
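The usual fixes for those two warnings look something like this (a sketch only; I can't see the actual code in class.attorneys.php or attorney.php, so the variable and property names are hypothetical):

```php
<?php
// Fix for "Creating default object from empty value":
// initialise the object explicitly before assigning properties to it.
$attorney = new stdClass();
$attorney->name = 'Example Attorney';

// Fix for "Invalid argument supplied for foreach()":
// make sure the value is actually an array before looping over it.
$attorneys = isset($attorneys) ? $attorneys : array();
if (is_array($attorneys)) {
    foreach ($attorneys as $attorney) {
        // ... render each attorney ...
    }
}
```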
You have:
Heartbleed Vulnerability: "An unknown error occurred while scanning for the Heartbleed Bug."
-
Thanks for the great feedback! The hkqlaw.com URL simply forwards (301) to hkqpc.com. The IP address you have is for hkqlaw.com, which is registered through Network Solutions, but hosting of hkqpc.com is on 1and1.com. Also, the timeout error you're getting is because there is no SSL cert for hkqlaw.com; again, it's just forwarded to hkqpc.com (which does have an SSL cert attached to it). As far as SC goes, everything is set up to index hkqpc.com.
-
Right now I cannot get that site to load in my browser, and when I used https://tools.pingdom.com it was unable to load as well. You could be having some serious server problems, and that could be causing the issue, although I was getting it to run through Screaming Frog, which is surprising.
This is a zip file of your Screaming Frog results. It will show whether there are any noindexed pages; I found none, so it looks to me like you have a server issue. Zip file: http://bseo.io/BXYpZh
I checked your site for malware using https://sitecheck.sucuri.net/results/www.hkqlaw.com/ (please understand this only checks the homepage and a handful of other pages) and found none, though when I checked your IP address I noticed a lot of ransomware information tied directly to your IP:
https://ransomwaretracker.abuse.ch/ip/205.178.189.131/
Here is a large screenshot of when I tried to browse your website: https://i.imgur.com/OzcLhbx.png
Here is Pingdom (remember to test from something outside of your local computer, because you have caching and other things that could give you incorrect results):
https://tools.pingdom.com/#!/bd6d52/https://www.hkqlaw.com/
In my experience, Network Solutions hosting is terrible. I would strongly suggest doing two things:
Get a better hosting company for your site.
Good hosts that are not too expensive, and also managed, include Liquid Web, Cloudways, Rackspace, and Pairnic. You can also build out your own system on non-managed hosting like Linode, DigitalOcean, AWS, Google Cloud, or Microsoft Azure. If you want a high-quality, inexpensive managed host that offers more than one backend, like the ones I've listed above, https://www.cloudways.com/en/ will host anything and manage it, and you can use the backends mentioned before. If you want what I think is the best, and price is not a big deal considering you're not running WordPress, https://armor.com is my preferred hosting company. Otherwise, Cloudways or Liquid Web would be where I would host your site.
Considering you already have an IP address attached to ransomware, and you're using a hosting company that will not be beneficial to you in security terms, I would add a web application firewall/reverse proxy. You can do that with https://sucuri.net/website-firewall/, https://incapsula.com, or https://fastly.com; if you want the most basic and least secure option that is still better than what you have, https://cloudflare.com.
At the very least, put Cloudflare on there. But what I'm seeing is a severe problem coming from your web host, and knowing that hosting company, I would strongly advise you to move to a better one.
I hope this was of help,
Thomas
-
Not sure if this is of help to you, and I suppose it depends on how many pages you are expecting to be indexed, but according to John Mu at Google, Google does not necessarily index all pages.
https://www.seroundtable.com/google-index-all-pages-20780.html
-
Not recently. It migrated well over a year ago to HTTPS.
-
First thing to confirm - did you recently migrate to HTTPS?