Homepage refusing to show up in Google (rest of pages fine)
-
Ah, I was wondering since they may have entirely different pricing based upon who you talk to.
-
SiteLock
-
So, on an invoice, do you or the client pay Incapsula or SiteLock?
-
Exactly, I've been told that these problems surfaced around the time the firewall was put up. I've just removed the timthumb file, and I'm working on disavowing the spammy links pointing to us. I'm considering ditching SiteLock in the next few days to see if that helps at all. We're also looking at Sucuri as a firewall option.
-
All of the header checks I've done come back with Incapsula. I don't really want to get much further into that for a number of reasons. But if you're actually paying SiteLock that's pretty interesting.
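For reference, the kind of header check I mean is just inspecting the raw HTTP response. A rough sketch in Python with the requests library; the header and cookie names are the ones Incapsula commonly sets, so treat them as indicative rather than definitive:

```python
import requests

# Sketch of a raw header check against the live homepage.
# Header/cookie names below are ones Incapsula commonly sets; they vary by account/config.
resp = requests.get("https://www.newstaradhesives.com/", timeout=10)

for name in ("Server", "X-CDN", "X-Iinfo", "X-Robots-Tag"):
    print(f"{name}: {resp.headers.get(name)}")

# Incapsula also tends to drop visitor/session cookies like visid_incap_* / incap_ses_*
print("incap cookies:", [c for c in resp.cookies.keys() if "incap" in c.lower()])
```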
But you're saying the site ranked for its brand term, at least, before implementing either SiteLock or Incapsula?
-
This is a huge help. I spent some time yesterday going through the site and updating my links to https where possible. Those don't all appear to have indexed yet. The bit about the timthumb exploit is particularly helpful. My theme lets me disable that, and I can get rid of the timthumb php file. I'm still concerned that SiteLock could be exaggerating the problem, though, since we started having these issues with Google around the time it was implemented.
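In case it's useful, here's roughly how I've been spot-checking that the old http URLs 301 over to https. Just a quick Python sketch; the /about/ path is a placeholder, not a real page:

```python
import requests

# Spot-check that old http:// URLs return a 301 pointing at the https:// version.
# The /about/ path is a placeholder; substitute real URLs from the old sitemap.
old_urls = [
    "http://www.newstaradhesives.com/",
    "http://www.newstaradhesives.com/about/",
]

for url in old_urls:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(url, "->", resp.status_code, resp.headers.get("Location"))
    # Hoping for: 301 and a Location header on the https:// equivalent.
```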
-
The site is using Incapsula as a CDN and web application firewall. The site still has a timthumb file. So I wouldn't recommend stepping out from behind that right now.
A wildcard search on the domain yields a lot of spam backlinks. Check Ahrefs.
-
The entire site appears to index fine. As Patrick pointed out, it appears some of the pages in the index aren't https. But I don't know when you made the move, so things may be chugging right along.
The issue is ranking. But I know what you mean.
So what we have is (not all bad, per se - just what I see):
- Previously hacked site
- Timthumb file
- Some very spammy links
- HTTPS implemented on unknown date
- Moved to CDN / WAF
- Redirects
No doubt, you're going to have to disavow the bad links. Takedown requests are nice and all, and you should note them in your disavow submission, but you don't have to manually contact each individual link/domain. The disavow isn't a fire-and-forget process, either; you can submit an updated file more than once.
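If it helps, the disavow file itself is just a plain-text list of domain: lines and URLs, with # comments for your notes. A minimal sketch of putting one together; the domains below are placeholders, not the site's actual spam links:

```python
# Assembling a disavow.txt for upload in Search Console's disavow tool.
# Domains and the URL below are placeholders, not the site's actual spam links.
disavow_lines = [
    "# Spam links acquired while the site was hacked; takedowns requested, no response",
    "domain:spam-example-one.com",   # disavow an entire referring domain
    "domain:spam-example-two.net",
    "http://spam-example-three.org/cheap-pills-post.html",  # or a single URL
]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(disavow_lines) + "\n")
```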
I would bet a shiny nickel the attack/hack exploited the timthumb file. The site still uses it. Stop using it. Find an alternative. All it does is resize images.
The https migration (redirects... etc.) is just a confounding factor.
After you've removed the timthumb file, request a security review. Also consider that the site may still have issues from the hack. So do a Fetch as Google from Webmaster Tools. If you see anything different from the real page, you still have a problem.
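A quick-and-dirty way to approximate that check outside of Webmaster Tools is to request the page with a Googlebot user-agent and with a normal browser user-agent and compare what comes back, since cloaked hack spam usually keys off the UA string. A sketch (note that Incapsula may block a spoofed Googlebot UA, and this is no substitute for the real Fetch as Google):

```python
import hashlib
import requests

URL = "https://www.newstaradhesives.com/"
agents = {
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
}

# Fetch the same page with each user-agent and compare size + hash of the body.
for label, ua in agents.items():
    body = requests.get(URL, headers={"User-Agent": ua}, timeout=10).text
    print(label, len(body), hashlib.md5(body.encode("utf-8")).hexdigest())

# Small differences are normal (nonces, ads); wildly different content for the
# "googlebot" request is the classic sign of cloaked hack spam.
```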
Read a little more about recovering from a hacked site here. I think that's more than likely the core of the problem right now.
-
Let me guess - you're using SiteLock after you were hacked to keep them out?
SiteLock creates this issue frequently (we solved it for another Q&A user about a month ago).
Disable SiteLock, check that your settings are all right in Webmaster Tools, and Fetch the page in WMT. Add a link to it on Google+ so it gets recrawled quickly.
I only see 1 backlink to the site from Ahrefs (https://ahrefs.com/site-explorer/overview/subdomains?target=www.newstaradhesives.com) and only 2 in Majestic (https://majestic.com/reports/site-explorer?folder=&q=www.newstaradhesives.com)
Very, very low authority & SiteLock - those would be the two I'd start with.
-
It absolutely was very hacked. I'm currently in the process of submitting takedowns manually for those spam posts in Google's index. The site has been cleaned up and relaunched since. Could these be harming the indexing of the homepage as well?
-
I think Incapsula is throwing the false noindex tag. But yeah, that's just how Incapsula does things. The home page shows up just fine with a site: operator.
Judging by the anchor text I see pointed at the site... and the timthumb.php file... the site was very, very hacked at some point.
Edit: Yep. It was hacked until late last year.
-
Hi Patrick
Thanks for taking a look. If I could ask, where are you seeing this noindex tag and what are you using to see it? I've got my homepage set up in the Yoast SEO plugin to index and follow, and I had also previously added a robots meta tag (index,follow) into my header just to make sure. My suspicion is that the SiteLock firewall installed on our site right now is blocking robots. Does this make any sense?
Thanks again
-
I wanted to attach this image - in my crawl, I am getting a "noindex,nofollow" but your code isn't showing it. I would check with your web development team to see what exactly is happening and how this can be fixed.
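One reason a crawl can report "noindex,nofollow" while the page source looks clean is that the directive is being sent as an X-Robots-Tag HTTP header by whatever sits in front of the site (the CDN/WAF), rather than as a meta tag in the HTML. A rough Python check of both, just as a sketch:

```python
import re
import requests

resp = requests.get("https://www.newstaradhesives.com/", timeout=10)

# Robots directives can arrive two ways: an HTTP response header or a meta tag.
print("X-Robots-Tag header:", resp.headers.get("X-Robots-Tag"))

meta_tags = re.findall(r'<meta[^>]+name=["\']robots["\'][^>]*>', resp.text, re.IGNORECASE)
print("meta robots tags:", meta_tags or "none found")
```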
-
Hi there
It appears your homepage has a "noindex,nofollow" tag - change this to "index,follow". Make sure this is fixed across the site.
If for some reason that doesn't solve it (though it should):
Have you checked to see if you have a manual action?
If you have multiple URLs serving the same content, check your canonical tags and do a content audit to see whether that content can be removed, consolidated, or updated. Your SSL also doesn't appear to be configured properly.
I would also do a backlink audit to see if any links can be removed or updated. Also check that your local SEO presence is consistent and on point, and the same for your on-site SEO.