Homepage alone dropped for one "keyword"
-
Hi Moz community,
Our website has dropped almost 50 positions for its main keyword, but is doing okay for other keywords. Other pages are performing consistently for their keywords too. We haven't made any changes to the website. What could be the reason for this odd scenario of only the homepage dropping for the main keyword? And could the recent unconfirmed algo update have anything to do with this?
Thanks
-
Google has left us in a "world of confusion" while making money. First, can anybody tell what counts as a good or bad link according to Google? No. Maybe a very few SEO experts, and only after using expensive tools. Google has admitted they "try hard" to stop negative SEO from having an effect, but they cannot guarantee that no links will hurt you. So it follows that Google's algorithm is never going to be fully accurate, even with updates like Penguin (or a future "Peacock"). It will be even less accurate for websites like ours with thousands of backlinks, where hundreds or thousands of new, unsolicited backlinks get added every month, making it hard to find the culprits.
A Wikipedia page is surely a strong signal. A Wikipedia page isn't feasible for every company; only those with some genuine reputation can get a page and a backlink. So even though the link is technically nofollow, Google gives it some weight.
One of our sub-domains has been getting hit with backlinks from the same domain for the last few months, adding up to about 5k links, mostly from comments. Do we need to worry about this hurting our domain and website rankings?
-
It's not that Google will penalize you for using the disavow tool, but rather, that if you disavow a good link you'll potentially end up doing harm.
I don't think Matt's statements were confusing. He said that if you reavow a link it may not be given the weight that it once had. This is a measure Google takes to make it harder for people to experiment with the disavow tool. I wrote a lot more about this here:
https://searchenginewatch.com/sew/how-to/2409081/can-you-reavow-links-you-have-peviously-disavowed
Losing a link from a Wikipedia page should not cause a drop, IMO. Links from Wikipedia are nofollowed and do not pass PageRank. Now, in some cases you can get followed links from sites that scrape Wikipedia, but I would be surprised if losing those links hurt you.
Regarding Google ignoring bad backlinks, that's what they say they do now. Still, if I see a site that has a lot of self-made SEO links, I'll disavow just to be safe. Also, there are other algorithms that use links, and there is the potential for manual actions, so it's not like we can completely ignore unnatural links.
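For anyone who does decide to disavow: Google's disavow file is just a plain UTF-8 text file where each line is either a full URL or a `domain:` entry, with `#` lines treated as comments. A minimal sketch (the domains below are made up for illustration):

```text
# Spammy comment links found in a link audit
domain:spammy-comments-example.com
domain:link-farm-example.net
# A single bad page rather than the whole domain
http://blog-example.org/cheap-links-page.html
```

Domain-level entries catch every link from that site, which is usually safer than listing URLs one by one when an entire domain is spamming you.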
-
Hi Marie...Thanks for sharing your views and guidance.
I agree that disavowing some useful links might push us down in rankings. But I have only disavowed the links with a high Moz spam score, after checking them manually. These links don't look useful, and I don't find any value in them. I don't think removing 10 links from a backlink profile with thousands of backlinks will affect us.
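As a rough illustration of the kind of manual check described above, here is a sketch that filters a backlink CSV export by spam score. The file layout and the `url` / `spam_score` column names are assumptions for the example, not something any particular tool guarantees:

```python
import csv

def high_spam_links(csv_path, threshold=60):
    """Return backlink URLs whose spam score meets or exceeds the threshold.

    Assumes a CSV export with 'url' and 'spam_score' columns.
    """
    flagged = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Scores come in as strings from the CSV; convert before comparing
            if int(row["spam_score"]) >= threshold:
                flagged.append(row["url"])
    return flagged
```

The point is only to shortlist candidates; each flagged link still deserves the manual review described above before it goes anywhere near a disavow file.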
I disagree that Google will penalise you just for using the disavow tool. Matt Cutts's statement is one of those confusing ones meant to keep users from misusing the tool and putting a processing load on Google. As far as I know, disavow is a fully automated tool with no human involvement, and Google will not punish you just for using it. And if Google is against experimenting, they would first need to be able to judge whether a request is an experiment or a genuine attempt, which is next to impossible; it's like trying to read users' minds.
Something interesting happened in our case: we actually lost the backlink from a Wikipedia page. So we are presuming this might be the obvious cause of the drop. Do you think so?
And do you believe that Google completely ignores bad backlinks, and that only good backlinks are a ranking factor (besides on-site factors)?
-
The potential harm in using the disavow tool is that you could be disavowing links that are actually helping you. If a link is truly an unnatural link, then yes, it should be disavowed. But if you are disavowing, then re-avowing, then trying different links to disavow, that could be dangerous.
Matt Cutts said a few years ago that Google had built some features into the disavow tool to prevent people from experimenting with it. He hinted that a re-avowed link may not carry the same power that it once did. Also, Cyrus Shepard from Moz ran an experiment where he disavowed every link to his site and rankings plummeted. He later removed his entire disavow file and his rankings did not recover at all.
Regarding discounting links vs penalizing for links, Gary Illyes from Google made statements saying that the new Penguin algorithm no longer penalizes sites. With that said, if you have a lot of unnatural links I still recommend disavowing as you could get a manual penalty.
Also, there are other algorithms that use links, so yes, I still do disavow. My reason for advising that you not disavow is that it sounds like you are experimenting with the tool, disavowing and re-avowing. Again, if a link needs to be disavowed, then disavow it and leave it at that.
-
Thanks for your thoughts Marie.
I don't understand what's wrong with using the disavow tool with any number of links, or how it would harm us. It's automated: link juice simply stops passing from the links we disavowed, and nothing more. Moreover, if the disavow file caused our drop, why didn't we recover even weeks after removing it? We also only kept the disavow file in place for a few days.
I also don't agree that Google is just going to ignore unnatural links and consider only good links in its algorithm. After all, even humans these days can't conclusively judge some links, so Google doing a flawless job here is impossible; it's never going to be fully accurate. Some links will definitely drag a backlink profile down, and that's how most penalties have been lifted by SEO experts these days, by using the disavow tool.
Page title: In my niche, I can see that most of the top-ranking pages start with "brand + keyword", like the "vertigo tiles" example I gave. I see this mostly on homepages, which might be contributing: if the phrase "vertigo tiles" has more visibility across the internet, starting the homepage title with that same phrase could boost rankings.
Thanks
-
It's tough to comment without seeing the actual page, but here are my thoughts.
You should not try to experiment with the disavow tool. If you've got links that you yourself made for SEO purposes and they serve no other purpose then yes, disavow them. But, if you're not sure, it's best to just leave the disavow tool alone as it's possible to do more harm than good. Google's new version of Penguin just ignores unnatural links and doesn't penalize sites for having them. If you have lots of spammy links I still advise disavowing, but disavowing just a few links is not a good idea.
Regarding page titles, it's generally best practice to have your most important keywords at the beginning of the title tag, so with that in mind the old title tags look better to me.
-
We dropped on Jan 20th and dropped further in the last week of January. We didn't change anything around that time. We disavowed a few links and then removed them from the disavow file in case they were the cause, but removing the disavow didn't help us. I think it's more about on-page factors. I can see page titles with exact-match keywords performing more effectively in the top results. Below is my hypothesis based on monitoring the new results and comparing them:
Let's say, for the keyword "tiles", this is how the old and new page title styles are placing at the top:
Old page titles: "tiles for kitchen, hall and bedroom - vertigo tiles"
New page titles: "vertigo tiles - tiles for kitchen, hall and bedroom"
Please share your thoughts.
Thanks
-
When did the drop happen? If it was within the last few days, I'd say not to change anything and just wait. I've had a number of clients recently that have noticed a huge drop and then within a week, they popped back up higher than they were before. I personally think this could be a part of how Google tests a site's worth. I think they may remove a page from the first page temporarily to see if it affects where people click.
One other thing to check is keyword stuffing. That can sometimes cause a page to drop for one keyword. But again, I wouldn't change anything just yet.
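If you want to sanity-check a page for keyword stuffing, one crude sketch is to strip the HTML tags and measure how often the phrase appears relative to the total word count. A real audit would use a proper HTML parser and weigh titles and headings separately; the regex tag-stripping here is only for illustration:

```python
import re

def keyword_density(html, phrase):
    """Rough share of a page's visible words accounted for by a phrase."""
    text = re.sub(r"<[^>]+>", " ", html).lower()  # crude tag stripping
    words = re.findall(r"[a-z0-9']+", text)
    if not words:
        return 0.0
    phrase_words = phrase.lower().split()
    # Count occurrences of the full phrase as a sliding window over the words
    hits = sum(
        words[i:i + len(phrase_words)] == phrase_words
        for i in range(len(words) - len(phrase_words) + 1)
    )
    return hits * len(phrase_words) / len(words)
```

A density in the low single-digit percent range is normal; a homepage where the main keyword makes up a large share of the visible text is worth a second look.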
-
A few days ago I faced the same problem,
but I recovered by optimizing my on-page SEO: I used more keywords in the title and extended it to more than 70 characters.
After that, I added some internal links on the homepage and linked 1-2 outbound links to high-authority websites.
That's it. I got my website rankings back.
I am sure this will help you get your rankings back soon.