Whitehat site suffering from drastic & negative Keyword/Phrase Shifts out of the blue!
-
I am the developer for a fairly active website in the education sector that offers around 30 courses, publishes to its blog a few times a week, and maintains social profiles.
The blog doesn't have comments enabled and the type of visitor that visits is usually looking for lessons or a course.
Over the past year we have actively developed the site to keep it up to date, fast, and in line with modern best practices: SSL certificates, quality content, relevant high-authority backlinks, etc.
Around a month ago we were hit by quite a large drop in our ranked keywords/phrases, which shocked us somewhat. We attributed it to Google's algorithm change muddying the waters, as things did settle back a couple of weeks later.
However, this week we have been hit again by another large change, dropping almost 100 keywords, some by very significant positions.
My question is quite simple (I wish)... what gives?
I don't expect to see drops this large when we haven't done anything negative, and I'm not sure it's an algorithm change, as my other clients on Moz don't seem to have suffered. So either it's isolated to this target area, or something is happening to, or on, the site.
-
Snowflake,
When you migrate to HTTPS, I believe you have to add the new protocol to Search Console. Google treats HTTP and HTTPS as two different sites, which is why you might be seeing your index count go down under your HTTP property in Search Console. If you add the HTTPS version of your website to Search Console, you may see that those pages have been indexed under the HTTPS protocol. Check it out, wait a few weeks, and see what happens.
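Since Google treats each protocol and host combination as a distinct property, it can help to register all four variants in Search Console so nothing gets missed. A minimal sketch of enumerating them (example.com is a placeholder domain):

```python
def property_variants(domain: str) -> list[str]:
    # Google treats each scheme/host combination as a separate
    # Search Console property, so enumerate all four.
    hosts = [domain, f"www.{domain}"]
    schemes = ["http", "https"]
    return [f"{scheme}://{host}/" for scheme in schemes for host in hosts]

for variant in property_variants("example.com"):
    print(variant)
```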
Secure Your Site With HTTPS - Search Console Help
https://support.google.com/webmasters/answer/6073543?hl=en
-
That is a very good shout!
-
Thanks Don,
I had actually read that article initially, which is why I thought a few weeks was enough for it all to settle back out, but maybe I'm expecting a bit much for a 600-page site.
Many thanks for your help. I'll maybe just be patient if there is nothing glaringly wrong.
-
Also a quick point: if you still have Google Search Console set up for HTTP, even though you use HTTPS now, I'd suggest looking at what is being reported as indexed there. That may be the missing link.
Cordialement,
Don
-
So I'm not seeing anything blocking crawling on your site, which is good. But I did notice that you have at some point used both http and https URLs, which leads me to believe you may have recently switched to HTTPS. In that case you should know that it may take Google some time to adjust. On a technical level, https and http are two different sites.
It is highly likely that Google has indexed the HTTP version of some of these pages, which is why your index count may be lower than normal for the HTTPS version.
I do see that you have properly 301 redirected these pages and that your sitemaps reflect the https URLs as well. If this was a recent change, it just looks like it's going to take a bit of time for Google to catch up.
This is worth a quick look: https://support.google.com/webmasters/answer/6073543?hl=en (scroll to the bottom and see the section "Migrating from HTTP to HTTPS").
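One thing worth spot-checking after a migration is that every old HTTP URL 301s to its exact HTTPS counterpart rather than to the homepage. A minimal sketch of building that expected redirect map with the standard library (the URLs are hypothetical):

```python
from urllib.parse import urlsplit, urlunsplit

def https_target(url: str) -> str:
    """Return the HTTPS counterpart an HTTP URL should 301 to."""
    parts = urlsplit(url)
    # Keep netloc, path, query, and fragment; swap only the scheme.
    return urlunsplit(("https",) + tuple(parts[1:]))

old_urls = ["http://example.com/", "http://example.com/courses/seo-basics"]
redirect_map = {url: https_target(url) for url in old_urls}
```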
I sent you additional info in PM.
Hope this helps,
Don
-
We did go from http to https about a month ago, but we were careful that all the redirects and sitemaps were updated correctly. I don't think there is an issue with the robots.txt (it is present, and nothing weird is being blocked).
I'll take a look at those links and send you a PM. Many thanks, Don.
-
Hi,
There are several reasons.
If you have recently changed your URL structure (e.g. switched between www and non-www, HTTP and HTTPS, or trailing slash and no trailing slash), Google could have already indexed the pages under the "other" version.
Google could also be hitting a crawling problem, such as an error in your robots.txt (or the lack of one), improper canonical tags, blocked access, or improper redirects, or you could be under a manual penalty.
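For the robots.txt case, Python's standard library can tell you whether a given URL is blocked for a crawler; a quick sketch, with a hypothetical rule that accidentally blocks the blog section:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that unintentionally blocks the blog.
rules = """\
User-agent: *
Disallow: /blog/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot falls under the wildcard group here, so /blog/ is blocked.
print(parser.can_fetch("Googlebot", "http://example.com/blog/lesson-1"))
print(parser.can_fetch("Googlebot", "http://example.com/courses/"))
```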
If you would like to post a link (or pm me) I will take a look and see if I can spot a potential problem for you.
Here are a couple links on Google that should help:
Why Pages Drop From Index
Overview: Pages Not Being Crawled
Hope this helps,
Don
-
Checking Webmaster Tools, it looks like Google has de-indexed 500 of our 630 pages in the last 2 weeks.
Is there any reason why this may be?
-
Thanks for your input Donford,
I've had another look in OSE and I can't see any spam links (all the genuine links are rated 0 through 3), which looks very good. So it doesn't appear to be a negative campaign against me.
I may try Majestic for peace of mind... it makes it all the stranger that we are being penalised so much.
-
Hi Snowflake,
You can use OSE (Open Site Explorer) here on Moz to check the links it has found. You can download that report as a CSV to easily sort it and see if there is a possible negative campaign running against you.
You could also use Majestic or SEMrush to find more links. Just note that no tool, free or paid, is going to find all the links pointing to your site.
If you don't find a lot of spam links to your site, chances are nobody is running a negative campaign against you.
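Once the link report is exported to CSV, triaging it takes only a few lines. A sketch assuming a hypothetical export with "URL" and "Spam Score" columns (column names and threshold are illustrative, not OSE's actual schema):

```python
import csv
import io

# Hypothetical CSV export of inbound links with a spam score column.
data = """URL,Spam Score
http://spammy.example/widget,89
http://partner.example/resources,2
http://linkfarm.example/page,95
"""

rows = list(csv.DictReader(io.StringIO(data)))

# Surface the worst offenders first, using an illustrative cutoff of 60.
suspect = sorted(
    (row for row in rows if int(row["Spam Score"]) >= 60),
    key=lambda row: int(row["Spam Score"]),
    reverse=True,
)
for row in suspect:
    print(row["URL"], row["Spam Score"])
```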
Hope it helps,
-
Thanks Eric,
There are a few language versions of the site, but as far as I'm aware there is no duplicate content in the same language. I will check with Siteliner just to be sure.
For disavowing backlinks: is it just via Webmaster Tools that you're recommending we do that? If so, we haven't done it yet, but it seems sensible to try. When I last checked backlinks there were a few random sites that we certainly hadn't submitted to and that looked spammy, but when I visited them I couldn't see our links.
Do you have a recommendation for a better backlink-checking tool?
-
I know this is frustrating. There are a few areas that I would look into that could be causing this: duplicate content issues and links. First, look to see if you have any duplicate content issues on the site. There could be a duplicate copy of the site (perhaps a dev version that should not be indexed) or even certain content on your site that's causing issues. You might try Siteliner's crawler to identify if there are any issues you can fix.
Another possible reason is the links to the site. The site could have been hit by negative SEO, and a lot of "low quality" links or off-topic links could be pointing to your site. I've seen this in the past, and the only thing you can do is identify the links and disavow them. Sometimes you can get them removed, but disavowing them should work.