Should I use the Disavow Tool at this point?
-
After Penguin, our site: www.stadriemblems.com jumped up to #1 for the keyword "embroidered patches." Now, months later, it's at the top of page two. I'm pretty sure this is because we do have a few shady links (I didn't do it!) that perhaps Penguin didn't catch the first time around, but now Google is either discounting them or counting them against us.
My question is, since I'm pretty sure those links are the reason we are gradually declining, should I submit them to Google as disavowed, even though technically, we're not penalized . . . yet?
I have done everything possible to get them removed, and it's not happening.
-
No, I've not received a notification. Thanks, Sean.
-
Totally feel your pain, Marisa. It's frustrating to see the decline because of someone else's shoddy work. However, it's probably a better fate than a full-blown Penguin penalty, plus it shows that your other links must be great if the page has only dropped to #11. What a great feeling it will be, too, when you get back to #1, knowing that everything you've done, Google loves.
Plus, I just checked your rank again and you've bopped back to #10 on my SERP, so the comeback is underway!
-
I would not disavow your links at this stage. You don't say whether you've received a notification from Google in your WMT.
If you did, take action to remove the links, then disavow any that cannot be removed. If you have not received a notification or warning, keep the links in place.
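(For reference, if you do end up needing to disavow later, Google expects a plain-text file uploaded through the disavow tool, one entry per line: lines starting with `#` are comments, `domain:` entries disavow every link from that domain, and bare URLs disavow a single page. A minimal sketch, using hypothetical example domains:)

```text
# Sites we asked to remove links, with no response after two attempts
domain:spammy-directory.example
domain:link-farm.example
# Disavow a single bad page rather than the whole domain
http://otherwise-fine.example/paid-links-page.html
```

Domain-level entries are usually the safer choice for clearly spammy sites, since page-level URLs can miss duplicates of the same link elsewhere on the domain.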
Build some good quality links and add new content.
I hope this helps
Sean
-
Tom,
Thanks for your advice. It seems logical, and now I think I remember reading that somewhere. It kinda sucks, though, because now all I can do is watch my rankings steadily decline and be powerless to stop it. I guess I need to hope I earn enough good links to drown the rest of those out.
-
Hi Marisa
Unless you've received a notification in your WMT saying you've received a penalty, I wouldn't use the disavow tool. Google has been quite insistent that it should only be used as a last resort for removing links as part of a reconsideration request. I don't believe it should be used as a precautionary measure, nor would it have any effect unless your site is under a penalty.
I'd actually be quite optimistic about this. What I think we've seen over the last month is Google getting a lot better at discounting links as the algorithm updates. This post over on Inbound explains it quite well: it looks as though Google is aiming towards continuous devaluation of links. I'm wondering whether this is the case for your site.
What could have happened is that the algorithm has looked at some of the links pointing towards your homepage (as that's the page ranking for that term), seen a few of them and thought "nah, these are crap. Gonna remove their value". With it being the homepage, this could cover a fair few of the older, spammier links the previous SEO put in place. This is consistent with what I've seen with a few sites in the UK: a quite sharp (but not huge) drop, all for keywords ranking for a particular page (usually the homepage).
Now, if this is the case, then I'd say it's a great leap forward by Google. Devaluing links on the fly could lead to less dramatic drops and clean-ups in the future. I'm fairly sure you wouldn't have had a penalty, so removing those few bad links would probably have been futile anyway, and especially so now if Google has devalued them.
All I'd recommend, Marisa, is continuing with the quality links you've been putting in place already. You may have lost the value of a certain few recently, but all in all it's a good thing, as they were probably a bit spammy anyway. I certainly wouldn't waste time disavowing if you haven't experienced a penalty.
Just my 2 copper, but I hope it helps!