Should I Use the Disavow Tool for a Spammy Site/Landing Page?
-
Here's the situation...
There's a site that is linking to about 6 articles of mine from about 225 pages of theirs (according to info in GWT). These pages are sales landing pages looking to sell their product. The pages are pretty much identical but have different URLs. (I actually have a few sites doing this to me.)
Here's where I think it's really bad -- when they link to me you don't see the link on the page; you have to view the page source and search for my site's URL. I'm thinking that having a hidden URL, and it being my article that's hidden, has got to be bad, on top of it being a sales page for a product.
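To illustrate what I mean by hidden, the markup in their page source looks something like this (I've swapped in a placeholder URL and anchor text rather than the real ones):

  <div style="display:none">
    <a href="http://www.my-site-example.com/my-article/">my article title</a>
  </div>

That's just a sketch of one pattern -- the same effect can come from a CSS class or offscreen positioning -- but either way the link never renders for a visitor.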
I've seen traffic to my site dropping, but I don't have a warning in GWT.
These aren't links that I've placed or asked for in any way. I don't see how they could be good for me, and I've already done what I could by emailing the site to ask them to remove the links (I didn't think it would work, but thought I'd at least try).
I totally understand that the site linking to me may not have any effect on my current traffic.
So should I use the Disavow tool to make sure this site isn't counting against me?
-
Thanks. It seems there are so many opinions on disavow that it's hard to know what's right. A lot of people say to only use it when you get a GWT warning, but others say it's OK as a preventative measure.
I think I'm going to put together a list of the sites pointing to me that I know are garbage and disavow them.
-
As Moosa said, try to get them to take the links down first; that would be better, and Google says you should try to get them removed BEFORE using the disavow tool. If you have already tried, then go ahead and use it. You can disavow links from an entire domain, so you can just do that -- the example file below shows the format.
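If it helps, the disavow file itself is just a plain text (.txt) file you upload through the disavow tool. A domain-level version could look something like this (the domains below are placeholders, not the actual sites linking to you):

  # spammy sales landing pages linking to my articles via hidden links
  # emailed the site to request removal, no response
  domain:spammy-landing-pages-example.com
  domain:another-spam-example.net
  # individual URLs can also be listed instead of whole domains:
  http://spammy-landing-pages-example.com/product-page.html

Lines starting with # are treated as comments and ignored, and the domain: prefix tells Google to discount every link from that domain, so you don't have to list all 225 linking pages one by one.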
-
The links were not built by you, you are sure the links pointing back from those URLs are bad links, and you have tried everything to remove those links but failed... now the only option you have left is the disavow tool, so go for it!
It is important not to use the disavow tool if you haven't first tried removing the bad links manually, but if you did attempt that and failed, then you should go with the option left to you, which is the disavow tool!
-
Use it.
Related Questions
-
Google Sites website https://www.opcfitness.com/ title NOT GOOD FOR SEO
We set up a website, https://www.opcfitness.com/home, on Google Sites, but the Google Sites page title is not good for SEO. How can we fix it?
Technical SEO | ahislop5740 -
524 links from a site unknown to me - contact then disavow?
Hi there, Google Webmaster Tools' 'Who links the most' report informs me there are 534 links from a site called econintersect.com linking to my site, and 102 links from similar pages.com. I have never requested these links and am keen to know whether they are harming my site due to poor link quality. Simon
Technical SEO | simmo2350 -
Dealing with high link juice/low value pages?
How do people deal with low-value pages on sites that tend to pool PageRank and internal links? For example, login pages, copyright pages, privacy notice pages, etc. I know Matt Cutts recently did a video saying not to worry about them, and in the past we all know various strategies like nofollow, etc. were effective, but no more. Are there any other tactics or techniques for dealing with these pages and leveraging them for SEO benefit? Maybe having internal links on these pages to strategically pass off some of the link juice?
Technical SEO | IrvCo_Interactive0 -
Webmaster Tools - Need to update every time you add a new product/page to an ecommerce site?
Hi guys, I run an ecommerce store and we are constantly receiving and uploading new products. Do we need to update the sitemap every time we upload a product? Google Webmaster Tools shows that the number of URLs received is higher than the number of indexed URLs. They should match, right? Thanks and regards
Technical SEO | footd0 -
Site command / Footprint Question
Hi All, I am looking for websites with keywords in the domain and I am using: inurl:keyword/s The results that come back include sub-pages and not only domains with the keywords in the root domain. Example of what I mean: www.website.com/keyword/ What I want displayed only: www.keyword/s.com Does anyone know of a site command I can use to display URLs with keywords in the root domain only? Thanks in advance, Greg
Technical SEO | AndreVanKets0 -
Should Canonical be used if your site does not have any duplicate content?
Should canonical tags be used site-wide even if my site is solid and no duplicate content is generated? Please explain your answer.
Technical SEO | ciznerguy0 -
Why is an error page showing when searching our website using Google's "site:" search function?
When I search our company website using the Google site search function "site:jwsuretybonds.com", a 400 Bad Request page is at the top of the listed pages. I had someone else at our company do the same site search and the 400 Bad Request did not appear. Is there a reason this is happening, and are there any ramifications to it?
Technical SEO | TheDude0 -
2 links on home page to each category page... is PageRank being watered down?
I am working on a site that has a home page containing 2 links to each category page. One of the links is a text link and one is an image link. I think I'm right in thinking that Google will only pay attention to the anchor text/alt text of the first link that it spiders, with the anchor text/alt text of the second being ignored. This is not my question, however. My question is about the PageRank that is passed to each category page... Because of the double links on the home page, my reckoning is that PR is being divided up twice as many times as necessary. Am I also right in thinking that if Google ignores the 2nd identical link on a page, only one lot of this divided-up PR will be passed to each category page rather than 2 lots... hence horribly watering down the 'link juice' that is being passed to each category page? Please help me win this argument with a developer and improve the ranking potential of the category pages on the site 🙂
Technical SEO | QubaSEO0
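To put rough numbers on the reasoning in the question above (hypothetical figures, assuming the simplified model where PageRank is split evenly across a page's outgoing links): if the home page links to 10 category pages once each, every link passes about 1/10 of the distributable PR; with 2 links per category there are 20 links, so each link passes about 1/20, and if only the first of each identical pair is counted, each category page ends up with 1/20 instead of 1/10 -- half of what a single clean link would pass.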