What should I include in a disavow file and/or reconsideration request?
-
My client received a manual penalty notice. I need to submit a disavow file and a reconsideration request, which is new territory for me. The task of contacting/disavowing hundreds of sites to remove thousands of links is a bit overwhelming.
Answers to any of these questions would be greatly appreciated.
Search Console is showing hundreds of hacked websites pointing to the site. Many of the incoming links shown in Search Console are already gone. Should I include them in the disavow file, or is the disavow file only for links that persist?
I have read that Google does not actually read the #remarks in the disavow file. Since it's a manual penalty, should I include them anyway, given that a human could look it over? If anyone who has submitted a reconsideration request for unnatural links can comment on their use or non-use of #remarks, and the result, that would be very helpful.
Google clearly wants to see that an effort has been made to contact the site owners. What is the best way to document that: in the reconsideration request, in the disavow file, or both?
-
Hey KentH – it's such a pain to deal with linking penalties, and I'm sorry you're facing that now.
On to your problem: Google definitely prefers that you ask the linking website's webmaster to remove the link to your site, or at least to add a "nofollow" attribute to their link pointing to your site. That said, Google understands that it's not always possible to accomplish that. Here's what I recommend:
- First, reach out to the webmaster/host of the website (you can find contact details by looking up their domain on WhoIs)
- Then, if they respond and have "resolved" the linking issue (either by removing the link, or adding a nofollow attribute) you can consider that link/linking root domain addressed
- If you don't get a response, or the webmaster refuses to alter their link to your site, you can go ahead and add their domain to your disavow file.
It requires a bit of patience, but don't wait too long to hear back from the webmasters (I'd personally give it no longer than a week). A rough sketch of what a disavow file can look like is below. I hope this helps!
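Here's a hypothetical sketch of a disavow file (every domain and URL below is a placeholder). Lines starting with # are comments; as you noted, Google says it doesn't read them, but they're still handy for your own record-keeping. A "domain:" entry disavows an entire domain, while a bare URL disavows just that page:

```
# Hacked sites reported in Search Console - outreach attempted via WhoIs contact, no reply after a week
domain:hacked-blog.example.com
domain:spammy-directory.example.net

# Webmaster responded but refused to remove or nofollow the link
domain:link-network.example.org

# A single bad URL can be listed on its own without disavowing the whole domain
http://www.example.com/old-links-page.html
```

As for documenting the outreach itself, many people summarize it in the reconsideration request (dates, how each site was contacted, and what happened), since there's no guarantee a reviewer will read the comments in the disavow file.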
Related Questions
-
Paragraphs/Tables for Content & SEO
Hi, does anyone know if Google prefers paragraphs over content in a table, or doesn't it make much difference?
Intermediate & Advanced SEO | | BeckyKey0 -
Disavowing 100k Affiliate Links
Hi all, hope you're all good. I am updating our disavow file; we've noticed a couple more spammy links pointing at our site. While I was at it, affiliate links came to mind. At the moment we have over 100k affiliate links pointing to the root of our site and to other categories/products, and most of them are do-follow. Taking a look at WMT, the affiliate network is one of our 'Who links the most' entries, pointing a total of 115,065 links at us. My question: bearing in mind this site generates over 2 million hits a month, is it really worth disavowing the entire affiliate link network? That would result in all of those 100,000+ links being disavowed over time. Do you think this would have a positive result? Let me know your thoughts.
Intermediate & Advanced SEO | | Brett-S0 -
Parked Vs Addon/Redirect Domain
We have an old site and are trying to figure out what to do with it. Right now we have it as a parked domain, but we were considering changing it to an addon domain with a redirect. I have no reason why I chose parked vs. addon, other than that I had to pick one. Is one superior to the other? What are the pros and cons of each? Thanks, Ruben
Intermediate & Advanced SEO | | KempRugeLawGroup0 -
Issue with Robots.txt file blocking meta description
Hi, can you please tell me why the following error is showing up in the SERPs for a website that was just re-launched 7 days ago with new pages (301 redirects are built in)? "A description for this result is not available because of this site's robots.txt – learn more." Once we noticed it yesterday, we made some changes to the file and reduced the number of items in the disallow list. Here is the current robots.txt file:
```
# XML Sitemap & Google News Feeds version 4.2 - http://status301.net/wordpress-plugins/xml-sitemap-feed/
Sitemap: http://www.website.com/sitemap.xml
Sitemap: http://www.website.com/sitemap-news.xml
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
```
Other notes: the site was developed in WordPress and uses the following plugins: WooCommerce, All-in-One SEO Pack, Google Analytics for WordPress, and XML Sitemap & Google News Feeds. Currently, in the SERPs, it keeps jumping back and forth between showing the meta description for the www domain and showing the error message above. Originally, WP Super Cache was installed; it has since been deactivated, removed from wp-config.php, and deleted permanently. One other thing to note: we noticed yesterday that an old XML sitemap was still on file, which we have since removed, and we resubmitted a new one via WMT. Also, the old pages are still showing up in the SERPs. Could it just be that this will take time for Google to review the new sitemap and re-index the new site? If so, what kind of timeframes are you seeing these days for new pages to show up in the SERPs? Days, weeks? Thanks, Erin
Intermediate & Advanced SEO | | HiddenPeak0 -
Proper way to include Location & Zipcode Keywords
I have a client who is insisting that I add a list of approximately 50 cities and 80 zip codes that their business serves to the keywords meta tag. Based on what I have been reading, this will do absolutely nothing to help improve their search ranking. What would be the proper way today to inform search engines of the geolocations a business serves?
Intermediate & Advanced SEO | | mmurphy0 -
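One commonly suggested alternative to stuffing cities and zip codes into the keywords meta tag is LocalBusiness structured data with an areaServed list. A minimal, hypothetical sketch (all names, addresses, and values are placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Co.",
  "url": "https://www.example.com/",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  },
  "areaServed": [
    { "@type": "City", "name": "Springfield" },
    { "@type": "City", "name": "Chatham" },
    "62704"
  ]
}
```

Markup like this helps search engines understand the service area, but it's a supplement to, not a substitute for, useful location content on the site itself.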
We will be switching our shopping cart platform from Volusion to Magento and are really cautious/nervous about our rankings and SEO. Any advice from anyone who has migrated stores? These URLs are years old.
Shopping cart platform switch and SEO: what do you suggest? What's the best way to ensure we keep our rankings?
Intermediate & Advanced SEO | | PaulDylan0 -
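The usual safeguard in a platform migration is a one-to-one 301 redirect map from every old Volusion URL to its new Magento equivalent, kept in place long-term. A hypothetical Apache .htaccess sketch (the paths and domain are placeholders; the exact mechanism depends on how the new server is set up):

```apache
# Hypothetical examples: map each old URL path to its new home, one line per URL
Redirect 301 /leather-sofas.html https://www.example.com/furniture/leather-sofas
Redirect 301 /aboutus.html https://www.example.com/about-us
```

Crawl the old site first so you have a complete URL list to map, and expect some temporary fluctuation while the redirects are recrawled.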
We are changing ?page= dynamic URLs to /page/ static URLs. Will this hurt the progress we have made with the pages using dynamic addresses?
Question about changing URLs from dynamic to static to improve SEO, with concern about hurting the progress made so far.
Intermediate & Advanced SEO | | h3counsel0 -
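Progress generally carries over if each old ?page= address is 301-redirected to its new /page/ equivalent and internal links are updated to the new URLs. A hypothetical Apache mod_rewrite sketch, assuming the old URLs looked like /index.php?page=services (the filename and parameter pattern are assumptions):

```apache
RewriteEngine On
# Capture the page name from the query string, e.g. ?page=services
RewriteCond %{QUERY_STRING} (?:^|&)page=([^&]+)
# 301 to the new static path; the trailing "?" drops the old query string
RewriteRule ^index\.php$ /page/%1/? [R=301,L]
```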
Should we block URLs like this - domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=1 - within the robots.txt?
I've recently added a campaign within the SEOmoz interface and received an alarming number of errors (~9,000) on our eCommerce website. The site was built in Magento, and we are using search-friendly URLs; however, most of our errors were duplicate content/titles due to URLs like domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=1 and domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=4. Is this hurting us in the search engines? Is rogerbot too good? What can we do to cut off bots after the ".html?"? Any help would be much appreciated 🙂
Intermediate & Advanced SEO | | MonsterWeb280
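Googlebot does support wildcards in robots.txt (support in third-party crawlers varies), so one way to cut crawlers off at the "?" is a pattern like this hypothetical sketch:

```
# Hypothetical robots.txt sketch: block crawling of any /shop/ URL that carries a query string
User-agent: *
Disallow: /shop/*?
```

Keep in mind that robots.txt only stops crawling; it doesn't consolidate the duplicate signals, which is why a rel="canonical" tag on the filtered pages pointing at the parameter-free URL is often preferred for Magento layered navigation.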