Homepage alone dropped for one "keyword"
-
Hi Moz community,
Our website has dropped almost 50 positions for our main keyword, but is okay for other keywords. Other pages are ranking consistently for their keywords. We haven't made any changes to the website. What could be the reason for this scenario of only the homepage dropping for the main keyword? And could the recent unconfirmed algorithm update have anything to do with this?
Thanks
-
Google has left us in a "confusion world" while making money. First, can anybody guess what counts as a good or bad link as per Google? No. Maybe a very few SEO experts can, and only after using expensive tools. Google has admitted that it "tries hard" to stop negative SEO effects, but cannot guarantee that no links will hurt you. So it implies that Google's algorithm is never going to be accurate, even with updates like Penguin or Peacock. And it will be even less accurate for websites like ours with thousands of backlinks, where hundreds or thousands of new, unsolicited backlinks are added every month, making it hard to find the culprits.
A Wikipedia page is surely a strong signal. A Wikipedia page is not feasible for every company; only companies that hold some genuine reputation can get a page and a backlink. So even though it's technically nofollow, Google gives some weight to it.
One of our subdomains has been getting hit with backlinks from the same domain for the last few months, all adding up to 5k links, mostly from comments. Do we need to worry about this hurting our domain and website rankings?
-
It's not that Google will penalize you for using the disavow tool, but rather, that if you disavow a good link you'll potentially end up doing harm.
I don't think Matt's statements were confusing. He said that if you reavow a link it may not be given the weight that it once had. This is a measure Google takes to make it harder for people to experiment with the disavow tool. I wrote a lot more about this here:
https://searchenginewatch.com/sew/how-to/2409081/can-you-reavow-links-you-have-peviously-disavowed
Losing a link from a Wikipedia page should not cause a drop, IMO. Links from Wikipedia are nofollowed and do not pass PageRank. Now, in some cases you can get followed links from sites that scrape Wikipedia, but I would be surprised if losing those links hurt you.
In regards to Google ignoring bad backlinks, that's what they say they do now. Still, if I see a site that has a lot of self made SEO links then I'll disavow just to be sure. Also, there are other algorithms that use links and there is the potential for manual actions, so it's not like we can completely ignore unnatural links.
-
Hi Marie...Thanks for sharing your views and guidance.
I agree that disavowing useful links might push us down in the rankings. But I have only disavowed links with a high Moz spam score, after checking them manually. These links look spammy and I don't find any value in them. I don't think removing 10 links from a backlink profile with thousands of backlinks will affect us.
I disagree that Google will penalise a site just for using the disavow tool. Matt Cutts's statement strikes me as deliberately vague, meant to keep users from misusing the tool and putting extra load on Google to process the requests. As far as I know, disavow is a completely automated tool with no human interference, and Google will not punish you merely for using it. And if Google is against experimenting, it would first have to be able to judge whether a request is an experiment or a genuine attempt. That is next to impossible for Google; it would be like trying to read users' minds.
Something interesting happened in our case: we actually lost the backlink from the Wikipedia page. So we are presuming this might be the obvious cause of the drop. Do you think so?
And do you believe that Google completely ignores bad backlinks, and that only good backlinks are a ranking factor (besides on-site factors)?
-
The potential harm in using the disavow tool is that you could be disavowing links that are actually helping you. If a link is truly unnatural, then yes, it should be disavowed. But if you are disavowing, then re-avowing, and then trying different links to disavow, that could be dangerous.
Matt Cutts a few years ago said that Google had built in some features to the disavow tool to prevent people from trying to experiment with it. He hinted that a reavowed link may not carry the same power that it once did. Also, Cyrus Shepard from Moz did an experiment where he disavowed every link to his site and rankings plummeted. He later removed his entire disavow file and his rankings did not recover at all.
Regarding discounting links vs penalizing for links, Gary Illyes from Google made statements saying that the new Penguin algorithm no longer penalizes sites. With that said, if you have a lot of unnatural links I still recommend disavowing as you could get a manual penalty.
Also, there are other algorithms that use links, so yes, I still do disavow. My reasoning for advising that you don't disavow is that it sounds like you are experimenting with the tool, disavowing and reavowing. Again, if a link needs to be disavowed, then disavow it and leave it at that.
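For anyone following along, the disavow file itself is just a plain-text list uploaded through Search Console, one entry per line. Here's a minimal sketch using placeholder domains (these are hypothetical, not from this thread):

```text
# Spammy comment links, reviewed manually
# Lines beginning with "#" are comments and are ignored.

# Disavow every link from these domains:
domain:spammy-directory.example
domain:auto-comments.example

# Disavow links from a single URL:
http://blog.example/some-post/comment-page-1/
```

Because entries like these are easy to add but may not regain their full weight if later removed, it's worth reviewing each entry carefully before uploading rather than iterating experimentally.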
-
Thanks for your thoughts Marie.
I don't understand what's wrong with using the disavow tool for any number of links, or how it would cause harm. It's an automated mechanism where link juice stops passing from the links we disavowed, and nothing beyond that. Moreover, if the disavow tool caused our drop, why haven't we recovered even weeks after removing it? Also, we only kept the disavow file up for a few days.
I also don't agree that Google is just going to ignore unnatural links and consider only good links in its algorithm. After all, even humans these days can't conclusively judge some links, so expecting Google to do a smart job here is unrealistic, and it's never going to be fully accurate. Some links will definitely taint a backlink profile, and that's how SEO experts have gotten most penalties lifted these days: by using the disavow tool.
Page title: In my niche, I can see that most of the top-ranking pages start with the "brand & keyword" phrase, like the "vertigo tiles" example I mentioned. I see this mostly on homepages, which might be contributing to better rankings since that phrase has been mentioned more often across the internet. I mean, if "vertigo tiles" has more visibility, starting the homepage title with the same phrase should boost rankings.
Thanks
-
It's tough to comment without seeing the actual page, but here are my thoughts.
You should not try to experiment with the disavow tool. If you've got links that you yourself made for SEO purposes and they serve no other purpose then yes, disavow them. But, if you're not sure, it's best to just leave the disavow tool alone as it's possible to do more harm than good. Google's new version of Penguin just ignores unnatural links and doesn't penalize sites for having them. If you have lots of spammy links I still advise disavowing, but disavowing just a few links is not a good idea.
Regarding page titles, it's generally best practice to have your most important keywords at the beginning of the title tag, so with that in mind the old title tags look better to me.
-
We dropped on Jan 20th and dropped further in the last week of January. We hadn't changed anything around that time. We disavowed a few links and then removed them from the file in case they were what dropped us, but removing the disavow entries didn't help. I think it's more about on-page factors. I can see that page titles with an exact match are performing more effectively in the top results. Below is my hypothesis, based on monitoring the new results and comparing them:
Let's say, for the keyword "tiles", here is how the old and new page title styles are placing in the top results:
Old page titles: "tiles for kitchen, hall and bedroom - vertigo tiles"
New page titles: "vertigo tiles - tiles for kitchen, hall and bedroom"
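To make the comparison concrete, this is how those two styles would appear as actual title tags; the brand and phrases here are the hypothetical "vertigo tiles" examples from this thread, not a real site:

```html
<!-- Old style: topical keywords first, brand last -->
<title>Tiles for Kitchen, Hall and Bedroom - Vertigo Tiles</title>

<!-- New style: brand first, topical keywords after -->
<title>Vertigo Tiles - Tiles for Kitchen, Hall and Bedroom</title>
```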
Please share your thoughts.
Thanks
-
When did the drop happen? If it was within the last few days, I'd say not to change anything and just wait. I've had a number of clients recently that have noticed a huge drop and then within a week, they popped back up higher than they were before. I personally think this could be a part of how Google tests a site's worth. I think they may remove a page from the first page temporarily to see if it affects where people click.
One other thing to check is keyword stuffing. That can sometimes cause a page to drop for one keyword. But again, I wouldn't change anything just yet.
-
A few days ago I faced the same problem.
But I recovered by optimizing my on-page SEO: I used more keywords in the title and extended it to more than 70 characters.
After that, I added some internal links on the homepage and linked 1-2 outbound links to high-authority websites.
That's it. My website's rankings came back.
I'm sure this will help you get your rankings back soon.