Massive Amount of Pages Deindexed
-
On or about 12/1/17, a massive number of my site's pages were deindexed. I have done the following:
- Ensured all pages are "index,follow"
- Ensured there are no manual penalties
- Ensured the sitemap correlates to all the pages
- Resubmitted to Google
- ALL pages are gone from Bing as well
In the new Search Console interface, 661 pages are listed as Excluded, with 252 marked "Crawled - currently not indexed: The page was crawled by Google, but not indexed. It may or may not be indexed in the future; no need to resubmit this URL for crawling." What in the world does this mean, and how the heck do I fix it? This is CRITICAL. Please help!
The url is https://www.hkqpc.com
-
The report was run prior to the canonical directives being added.
Also, remember to noindex your robots.txt file:
https://yoast.com/x-robots-tag-play/
There are cases in which the robots.txt file itself might show up in search results. By using the X-Robots-Tag header, you can prevent this from happening to your website:
<FilesMatch "robots.txt">
Header set X-Robots-Tag "noindex"
</FilesMatch>
**And in Nginx:**
location = /robots.txt { add_header X-Robots-Tag "noindex"; }
-
Looking at the first report, "Redirect Chains": as I understand the table, these are correct.
Column A is the page (source) with the redirecting link
Column B is the link that is redirecting (http://www.hkqlaw.com)
Column C shows 2 redirects happening
Column I shows the first redirect (http://www.hkqlaw.com -> http://www.hkqpc.com) (non-SSL version)
Column N shows the second redirect (http://www.hkqpc.com -> https://www.hkqpc.com) (SSL version).
The original link (hkqlaw.com) is a link in the footer of our news section, so it is common on those pages, which is why it shows up so often. So, like I said, this appears to be correct.
I added the canonical directives to the pages earlier so perhaps that report was run prior to me doing that?
Again, thanks so much for your effort in helping me!
-
Now I'm really baffled. I just ran Screaming Frog and don't see any of the redirects or other stats. Which software are you using that is showing this information? I'm trying to replicate it and figure out if there's something, somewhere else doing this.
-
Wow, I got it
Your site is 301-redirecting a ton of URLs back to the homepage. The reports are below, and a sketch for collapsing the redirect chain follows the list:
- Redirect chains https://bseo.io/cZW0w0
- internal URLs https://bseo.io/4sFqUk
- insecure content https://bseo.io/YDDKGD
- no canonical https://bseo.io/fWey1Q
- crawl overview https://bseo.io/Zg6bpM
- canonical errors https://bseo.io/YtTh7W
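For reference, one way to collapse those chained hops into a single redirect, as a sketch only (this assumes hkqpc.com runs on Apache with mod_rewrite and that the rules go in .htaccess; the hkqlaw.com forwarding itself may be handled at the registrar instead):

RewriteEngine On
# Send any plain-HTTP or non-canonical-host request straight to the https
# canonical host in one hop, instead of chaining http -> https redirects.
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\.hkqpc\.com$ [NC]
RewriteRule ^(.*)$ https://www.hkqpc.com/$1 [R=301,L]

Ideally the footer link would also be updated to point straight at the final https URL so crawlers never hit the redirect at all.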
-
OK, a canonical is set for each page (and I fixed the // issue). I used the X-Robots-Tag header to noindex the robots.txt and sitemap.xml files, along with a few other extensions while I was at it.
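For reference, a minimal sketch of what that header rule can look like in .htaccess (assuming Apache with mod_headers; the actual pattern on the server may cover more extensions than shown here):

<IfModule mod_headers.c>
# Keep robots.txt and sitemap.xml out of the search results.
<FilesMatch "(robots\.txt|sitemap\.xml)$">
Header set X-Robots-Tag "noindex"
</FilesMatch>
</IfModule>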
I'll get the secured cookie header set after this is resolved. We don't store any sensitive data via cookies for this site, so it's not an immediate concern, but it's still one I'll address.
EDIT: The https://www.hkqpc.com/attorney/David-Saba.html/ page no longer exists, which was the cause of the errors. I've redirected it to the appropriate page.
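(For reference, that kind of redirect is a single line in .htaccess; the target below is only a placeholder, not necessarily the page it was actually pointed to.)

# Placeholder target: replace /attorneys/ with the real destination page.
Redirect 301 /attorney/David-Saba.html/ https://www.hkqpc.com/attorneys/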
-
https://cryptoreport.websecurity.symantec.com/checker/
This server cannot be scanned for these vulnerabilities:
- Heartbleed: server scan unsuccessful.
- POODLE (TLS): server scan unsuccessful.
- BEAST: this server is vulnerable to a BEAST attack.
I am sorry I said your IP was Network Solutions when it was 1&1. I still strongly recommend changing hosting companies, even though I am German and so is 1&1.
DNS resolves www.hkqpc.com to 74.208.236.66
The SSL certificate used to load resources from https://www.hkqpc.com will be distrusted in M70. Once distrusted, users will be prevented from loading these resources. See https://g.co/chrome/symantecpkicerts for more information.
Look: https://cl.ly/pCY5
Look: https://cl.ly/pAKa
Symantec SSL certificates are now owned by DigiCert:
https://www.digicert.com/help/
https://www.dareboost.com/en/report/5a70b33e0cf28f017576367f
The Set-Cookie HTTP header can be configured with your Apache server. Make sure that the mod_headers module is enabled. Then, you can specify the header (in your .htaccess file, for example). Here is an example:
<IfModule mod_headers.c>
# only for Apache > 2.2.4:
Header edit Set-Cookie ^(.*)$ $1;HttpOnly;Secure
# lower versions:
Header set Set-Cookie HttpOnly;Secure
</IfModule>
- Your robots.txt file is showing up inside the SERPs (big photo: https://i.imgur.com/cJeDR9t.png)
- Your XML sitemap is inside the SERPs and should be noindexed (big photo: https://i.imgur.com/tlx5jc7.png)
Double forward slashes after /verdicts serve the same page as the URL without the double forward slashes. You need to add rel=canonical tags; right now there are zero canonicals on any page whatsoever.
- https://www.hkqpc.com/news/verdicts//hkq-attorneys-win-carbon-county-real-estate-case/
- https://www.hkqpc.com/news/verdicts/hkq-attorneys-win-carbon-county-real-estate-case/
The URLs above need a rel=canonical tag; I have created an example below for you, for the page without the double forward slashes. This tells Google which one you'd prefer to have indexed, and it also keeps query-string pages and junk pages out of Google's index. Please see the resources below and add canonicals to your website. Because I do not know what type of CMS you're using, I cannot recommend a plugin to do it, but if you were using something like WordPress it would be handled automatically by something like Yoast SEO. For a site this size it may be a wise move to migrate to something like WordPress; it is a solid platform and makes it a lot easier to implement changes across the entire site quickly.
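A sketch of the tag for this case, placed in the <head> of both URL versions (adjust the markup to your own templates):

<link rel="canonical" href="https://www.hkqpc.com/news/verdicts/hkq-attorneys-win-carbon-county-real-estate-case/" />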
- https://moz.com/blog/complete-guide-to-rel-canonical-how-to-and-why-not
- https://yoast.com/rel-canonical/
- https://moz.com/blog/canonical-url-tag-the-most-important-advancement-in-seo-practices-since-sitemaps
You also need to add a canonical here:
- Bigger photo of the problem: https://i.imgur.com/1qMMPSM.png
- This page: https://www.hkqpc.com/attorney/David-Saba.html/ throws the following PHP warnings:
- Warning: Creating default object from empty value in /homepages/43/d238880598/htdocs/classes/class.attorneys.php on line 38
- Warning: Invalid argument supplied for foreach() in /homepages/43/d238880598/htdocs/headers/attorney.php on line 15
- **Fix for this:**
- https://stackoverflow.com/questions/14806959/how-to-fix-creating-default-object-from-empty-value-warning-in-php
- http://thisinterestsme.com/invalid-argument-supplied-for-foreach/
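A minimal sketch of the usual fixes for those two warnings (the variable names here are hypothetical; the real code in class.attorneys.php and attorney.php will differ):

<?php
// "Creating default object from empty value": initialize the object explicitly
// instead of assigning a property to an undefined variable.
$attorney = new stdClass();
$attorney->name = 'Example Attorney';

// "Invalid argument supplied for foreach()": make sure the value is an array
// before looping over it.
$attorneys = null; // e.g. a lookup that returned nothing
if (!is_array($attorneys)) {
    $attorneys = array();
}
foreach ($attorneys as $attorney) {
    echo $attorney->name;
}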
You also have this from the Heartbleed check:
Heartbleed Vulnerability: An unknown error occurred while scanning for the Heartbleed Bug.
-
Thanks for the great feedback! The hkqlaw.com URL simply forwards (301) to hkqpc.com. The IP address you have is for hkqlaw.com, which is registered through Network Solutions, but hosting of hkqpc.com is on 1and1.com. Also, the timeout error you're getting is because there is no SSL cert for hkqlaw.com; again, it's just forwarded to hkqpc.com (which does have an SSL cert attached to it). As far as Search Console, everything is set up to index hkqpc.com.
-
Right now I cannot get that site to load in my browser, and when I used https://tools.pingdom.com it was unable to load either. You could be having some serious server problems, and that could be causing the issue, although I was able to run it through Screaming Frog, which is surprising.
Here is a zip file of your Screaming Frog results; it will show whether there are any noindexed pages (I found none). It looks to me like you have a server issue. Zip file: http://bseo.io/BXYpZh
I checked your site for malware using https://sitecheck.sucuri.net/results/www.hkqlaw.com/ (please understand this only checks the homepage and a handful of other pages) and found none, though when I checked your IP address I noticed a lot of ransomware information tied directly to it:
https://ransomwaretracker.abuse.ch/ip/205.178.189.131/
Here is a large screenshot of when I tried to browse your website: https://i.imgur.com/OzcLhbx.png
Here is Pingdom (remember to test from something outside of your local computer, because caching and other things on your machine could give you incorrect results):
https://tools.pingdom.com/#!/bd6d52/https://www.hkqlaw.com/
In my experience, Network Solutions hosting is terrible. I would strongly suggest doing two things:
Get a better hosting company for your site.
A good managed host that is not too expensive would be Liquid Web, Cloudways, Rackspace, or pairNIC. You can also build out your own system on unmanaged hosting like Linode, DigitalOcean, AWS, Google Cloud, or Microsoft Azure. If you want a high-quality, inexpensive managed host that offers more than one backend like the ones I've listed above, https://www.cloudways.com/en/ will host anything and manage it, and you can use the backends mentioned before. If you want what I think is the best, and price is not a big deal considering you're not running WordPress, https://armor.com is my preferred hosting company. Otherwise, Cloudways or Liquid Web would be where I would host your site.
Considering you already have an IP address attached to ransomware and you're using a hosting company that will not be beneficial to you in security terms, I would add a web application firewall/reverse proxy. You can do that with https://sucuri.net/website-firewall/, https://incapsula.com, or https://fastly.com, and if you want the most basic and least secure option that is still better than what you have, https://cloudflare.com.
At the very least put Cloudflare on there, but what I'm seeing is a severe problem coming from your web host, and knowing that hosting company, I would strongly advise you to move to a better host.
I hope this was of help,
Thomas
-
Not sure if this is of help to you (I suppose it depends on how many pages you are expecting to be indexed), but according to John Mu at Google, Google does not necessarily index all pages.
https://www.seroundtable.com/google-index-all-pages-20780.html
-
Not recently. It migrated to HTTPS well over a year ago.
-
First thing to confirm - did you recently migrate to HTTPS?