Massive Amount of Pages Deindexed
-
On or about 12/1/17 a massive amount of my site's pages were deindexed. I have done the following:
- Ensured all pages are "index,follow"
- Ensured there are no manual penalties
- Ensured the sitemap correlates to all the pages
- Resubmitted to Google
- ALL pages are gone from Bing as well
In the new SC interface, there are 661 pages that are Excluded, with 252 being "Crawled - currently not indexed: The page was crawled by Google, but not indexed. It may or may not be indexed in the future; no need to resubmit this URL for crawling." What in the world does this mean, and how the heck do I fix it? This is CRITICAL. Please help!
The url is https://www.hkqpc.com
-
The report was run prior to the canonical directives being added.
Also, remember to noindex your robots.txt file:
https://yoast.com/x-robots-tag-play/
There are cases in which the robots.txt file itself might show up in search results. By using an alteration of the previous method, you can prevent this from happening to your website:
<FilesMatch "robots.txt">
Header set X-Robots-Tag "noindex"
</FilesMatch>
**And in Nginx:**
location = robots.txt { add_header X-Robots-Tag "noindex"; }
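To double-check that the header is actually being served after adding either rule, here's a minimal sketch. The header dict is a stand-in for a real response (fetch the live headers with `curl -I` or your HTTP client of choice); the matching logic is the part that matters:

```python
def has_noindex_header(raw_headers: dict) -> bool:
    """Return True if an X-Robots-Tag header (case-insensitive) contains 'noindex'."""
    for name, value in raw_headers.items():
        if name.lower() == "x-robots-tag" and "noindex" in value.lower():
            return True
    return False

# Example header set, as Apache/Nginx would send it with the config above:
headers = {"Content-Type": "text/plain", "X-Robots-Tag": "noindex"}
print(has_noindex_header(headers))  # True
```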
-
Looking at the first report, "Redirect Chains".. As I understand the table, these are correct..
Column A is the page (source) with the redirecting link
Column B is the link that is redirecting (http://www.hkqlaw.com)
Column C shows 2 redirects happening
Column I shows the first redirect (http://www.hkqlaw.com -> http://www.hkqpc.com) (non ssl version)
Column N shows the second redirect (http://www.hkqpc.com -> https://www.hkqpc.com) (ssl version).
The original link (hkqlaw.com) is a link in the footer of our news section so is common on those pages, which is why it shows so often. So, like I said, this appears to be correct.
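As an illustration of the chain logic described above, here's a small sketch. The redirect map is reconstructed from this thread, not pulled from the live server config:

```python
def resolve_chain(url, redirects, max_hops=10):
    """Follow a map of known 301s and return (final_url, hops_taken)."""
    hops = []
    while url in redirects and len(hops) < max_hops:
        url = redirects[url]
        hops.append(url)
    return url, hops

redirects = {
    "http://www.hkqlaw.com": "http://www.hkqpc.com",   # old domain -> new domain (non-SSL)
    "http://www.hkqpc.com": "https://www.hkqpc.com",   # HTTP -> HTTPS
}

final, hops = resolve_chain("http://www.hkqlaw.com", redirects)
print(final)      # https://www.hkqpc.com
print(len(hops))  # 2, matching the two redirects in the report
```

Two hops is normal for an old-domain link hitting an HTTPS site; collapsing it to one hop (old domain straight to the https URL) would shorten the chain.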
I added the canonical directives to the pages earlier so perhaps that report was run prior to me doing that?
Again, thanks so much for your effort in helping me!
-
Now I'm really baffled. I just ran Screaming Frog and don't see any of the redirects or other stats. Which software are you using that is showing this information? I'm trying to replicate it and figure out if there's something, somewhere else doing this.
-
Wow, I got it
You're 301 redirecting a ton of URLs back to the homepage.
- Redirect chains https://bseo.io/cZW0w0
- internal URLs https://bseo.io/4sFqUk
- insecure content https://bseo.io/YDDKGD
- no canonical https://bseo.io/fWey1Q
- crawl overview https://bseo.io/Zg6bpM
- canonical errors https://bseo.io/YtTh7W
-
Ok, canonical is set for each page (and I fixed the // issue). I used x-robots header to noindex the robots.txt and sitemap.xml files, along with a few other extensions while I was at it.
I'll get the secured cookie header set after this is resolved. We don't store any sensitive data via cookies for this site so it's not of immediate concern but still one I'll address.
EDIT: The https://www.hkqpc.com/attorney/David-Saba.html/ page no longer exists which was the cause of the errors. I've redirected that to the appropriate page.
-
https://cryptoreport.websecurity.symantec.com/checker/
This server cannot be scanned for these vulnerabilities:
- Heartbleed: Server scan unsuccessful. See possible causes.
- Poodle (TLS): Server scan unsuccessful. See possible causes.
- BEAST: This server is vulnerable to a BEAST attack. More information.
I am sorry I said your IP was Network Solutions when it was 1&1. I still strongly recommend changing hosting companies, even though I am German and so is 1&1.
DNS resolves www.hkqpc.com to 74.208.236.66
The SSL certificate used to load resources from https://www.hkqpc.com will be distrusted in M70. Once distrusted, users will be prevented from loading these resources. See https://g.co/chrome/symantecpkicerts for more information.
Look: https://cl.ly/pCY5
Look: https://cl.ly/pAKa
symantec SSL certificates are now owned by DigiCert
https://www.digicert.com/help/
https://www.dareboost.com/en/report/5a70b33e0cf28f017576367f
The Set-Cookie HTTP header can be configured with your Apache server. Make sure that the mod_headers module is enabled. Then, you can specify the header (in your .htaccess file, for example). Here is an example:
<IfModule mod_headers.c>
# only for Apache > 2.2.4:
Header edit Set-Cookie ^(.*)$ $1;HttpOnly;Secure
# lower versions:
Header set Set-Cookie HttpOnly;Secure
</IfModule>
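A quick sketch for checking whether a Set-Cookie value already carries the flags the Apache rule above appends (the cookie string is a made-up example):

```python
def cookie_flags_ok(set_cookie: str) -> bool:
    """Return True if a Set-Cookie header value includes both HttpOnly and Secure."""
    attrs = {a.strip().lower() for a in set_cookie.split(";")}
    return "httponly" in attrs and "secure" in attrs

print(cookie_flags_ok("PHPSESSID=abc123; Path=/; HttpOnly; Secure"))  # True
print(cookie_flags_ok("PHPSESSID=abc123; Path=/"))                    # False
```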
- robots.txt file inside of the SERPs (big photo): https://i.imgur.com/cJeDR9t.png
- XML sitemap inside of the SERPs should be noindexed (big photo): https://i.imgur.com/tlx5jc7.png
Double forward slashes after /verdicts/ serve the same page as the URL without double forward slashes. You need to add rel=canonical tags; right now there are zero canonicals on any page whatsoever.
- https://www.hkqpc.com/news/verdicts//hkq-attorneys-win-carbon-county-real-estate-case/
- https://www.hkqpc.com/news/verdicts/hkq-attorneys-win-carbon-county-real-estate-case/
The URLs above need a rel=canonical tag pointing to the page without the double forward slashes; this tells Google which one you'd prefer to have indexed, and it also keeps query-string pages and junk pages out of Google's index. Please see the resources below and add canonicals to your website. Because I do not know what type of CMS you're using, I cannot recommend a plug-in to do it; if you were using something like WordPress, it would be handled automatically by something like Yoast WordPress SEO. For a site this size, it may be a wise move to switch to something like WordPress: it is a solid platform and makes it a lot easier to implement changes across the entire site quickly.
- https://moz.com/blog/complete-guide-to-rel-canonical-how-to-and-why-not
- https://yoast.com/rel-canonical/
- https://moz.com/blog/canonical-url-tag-the-most-important-advancement-in-seo-practices-since-sitemaps
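For illustration, the double-slash duplication above can be collapsed to one canonical URL like this (a sketch only; in practice your CMS would compute and emit the rel=canonical tag itself):

```python
import re
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url: str) -> str:
    """Collapse repeated slashes in the path (never the scheme's '//')."""
    parts = urlsplit(url)
    clean_path = re.sub(r"/{2,}", "/", parts.path)
    return urlunsplit((parts.scheme, parts.netloc, clean_path, parts.query, parts.fragment))

print(canonical_url(
    "https://www.hkqpc.com/news/verdicts//hkq-attorneys-win-carbon-county-real-estate-case/"
))
# https://www.hkqpc.com/news/verdicts/hkq-attorneys-win-carbon-county-real-estate-case/
```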
You need to add a canonical
- Bigger photo of problem https://i.imgur.com/1qMMPSM.png
- this page https://www.hkqpc.com/attorney/David-Saba.html/
- Warning: Creating default object from empty value in /homepages/43/d238880598/htdocs/classes/class.attorneys.php on line 38
- Warning: Invalid argument supplied for foreach() in /homepages/43/d238880598/htdocs/headers/attorney.php on line 15
- **Fix for this:**
- https://stackoverflow.com/questions/14806959/how-to-fix-creating-default-object-from-empty-value-warning-in-php
- http://thisinterestsme.com/invalid-argument-supplied-for-foreach/
Heartbleed vulnerability check:
An unknown error occurred while scanning for the Heartbleed Bug.
-
Thanks for the great feedback! The hkqlaw.com url simply forwards (301) to hkqpc.com. The IP address you have is for hkqlaw.com which is registered through Network Solutions, but hosting of hkqpc.com is on 1and1.com hosting. Also, the timeout error you're getting is because there is no SSL cert for hkqlaw.com, again, it's just forwarded to hkqpc.com (which does have an SSL attached to it). As far as SC, everything is setup to index hkqpc.com.
-
Right now I cannot get that site to load in my browser, and when I used https://tools.pingdom.com it was unable to load as well. You could be having some serious server problems, and that could be causing the issue, although I was getting it to run through Screaming Frog, which is surprising.
This is a zip file of your Screaming Frog results; it will show if there are any noindexed pages, of which I found none. It looks to me like you have a server issue. Zip file: http://bseo.io/BXYpZh
I checked your site for malware using https://sitecheck.sucuri.net/results/www.hkqlaw.com/ (please understand this only checks the homepage and a handful of other pages) and found none, though when I checked your IP address I noticed a lot of ransomware information tied directly to your IP:
https://ransomwaretracker.abuse.ch/ip/205.178.189.131/
Here is a large screenshot of when I tried to browse your website: https://i.imgur.com/OzcLhbx.png
Here is Pingdom (remember to test on something outside of your local computer, because you have caching and other things that could give you incorrect results):
https://tools.pingdom.com/#!/bd6d52/https://www.hkqlaw.com/
In my experience, Network Solutions hosting is terrible. I would strongly suggest doing two things.
Get a better hosting company for your site.
Good managed hosts that are not too expensive include Liquid Web, Cloudways, Rackspace, and pairNIC; you can also build out your own system on non-managed hosting like Linode, DigitalOcean, AWS, Google Cloud, or Microsoft Azure. If you want a high-quality, inexpensive managed host that offers more than one backend like the ones I've listed above, https://www.cloudways.com/en/ will host anything and manage it, and you can use the backends mentioned before this. If you want what I think is the best, and price is not a big deal considering you're not running WordPress, https://armor.com is my preferred hosting company. Otherwise, Cloudways or Liquid Web would be where I would host your site.
Considering you already have an IP address attached to ransomware and you're using a hosting company that will not be beneficial to you in security terms, I would add a web application firewall/reverse proxy. You can do that with https://sucuri.net/website-firewall/, https://incapsula.com, or https://fastly.com, and if you want the most basic and least secure option that is still better than what you have, https://cloudflare.com.
At the very least, put Cloudflare on there. But what I'm seeing is a severe problem coming from your web host, and knowing that hosting company, I would strongly advise you to move to a better host.
I hope this was of help,
Thomas
-
Not sure if this is of help to you, I suppose it depends how many pages you are expecting to be indexed, but according to John Mu at Google - Google does not necessarily index all pages.
https://www.seroundtable.com/google-index-all-pages-20780.html
-
Not recently. It migrated well over a year ago to HTTPS.
-
First thing to confirm - did you recently migrate to HTTPS?