Any need to worry about spammy links in Webmaster Tools from sites that no longer exist?
-
I own an ecommerce website that had some spammy stuff done on it by an SEO firm through SEOLinkVine a few years ago.
I'm working on removing all those links, but some of the sites no longer exist. I'm assuming I don't have to worry about disavowing those in Webmaster Tools?
Thanks!
-
Thanks, Tom! That's what I assumed, but very helpful to hear from someone more experienced.
-
Hi Colin,
That's correct, you shouldn't have to worry about those. Webmaster Tools can be notoriously slow to update once links have been removed, but you can safely trust that if a link no longer exists, Google has recrawled it and no longer counts it.
If you want to be ultra-safe, pop them in the disavow file anyway; it won't do any harm. You can also follow Google's advice and disavow the entire domain, rather than half a dozen or so URLs from the same domain, to save time. You can do that by adding this to the disavow file:
domain:example.com
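For reference, the disavow file is just a plain UTF-8 text file with one entry per line: full URLs for individual pages, domain: lines for whole domains, and lines starting with # treated as comments. A minimal example (all domains and URLs below are placeholders):

# whole domains from the old link network
domain:spammy-network-example.com
domain:another-spammy-example.net
# a single URL disavowed on its own
http://www.directory-example.org/spammy-page.html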
Hope this helps.
Related Questions
-
If I'm using a compressed sitemap (sitemap.xml.gz), that's the URL that gets submitted to Webmaster Tools, correct?
I just want to verify that if a compressed sitemap file is being used, then the URL that gets submitted to Google, Bing, etc., and the URL that's used in robots.txt, should indicate that it's a compressed file, e.g. "sitemap.xml.gz". Thanks!
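For illustration, the robots.txt reference for a compressed sitemap is just the full URL of the .gz file itself (the hostname here is a placeholder):

Sitemap: http://www.example.com/sitemap.xml.gz

The same .gz URL is what you'd submit in Webmaster Tools.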
Technical SEO | jgresalfi
-
Trying to mark multiple errors as fixed in Webmaster Tools
We have 44,249 errors, and I have set up a 301 redirect for most of the URLs. I know exactly which links are correctly redirected; my problem is that I don't want to mark each one as fixed individually. Is there a way to upload a URL list to Webmaster Tools so it automatically marks them as fixed based on the list?
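For anyone wanting to script this: the Webmaster Tools API v3 (the "webmasters" service) exposed a urlcrawlerrorssamples.markAsFixed method that can be looped over a URL list. A rough sketch using the Python client, assuming OAuth credentials are already set up, with the site URL and file name as placeholders:

# Sketch: mark each already-redirected URL as fixed via the
# Webmaster Tools API v3 (google-api-python-client).
from googleapiclient.discovery import build

# `credentials` is assumed to already hold authorized OAuth2 credentials
service = build('webmasters', 'v3', credentials=credentials)

# redirected.txt: one already-redirected URL per line (hypothetical file)
with open('redirected.txt') as f:
    for url in (line.strip() for line in f if line.strip()):
        service.urlcrawlerrorssamples().markAsFixed(
            siteUrl='http://www.example.com/',  # must match the verified property
            url=url,             # the sample URL, relative to siteUrl
            category='notFound',
            platform='web',
        ).execute()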
Technical SEO | easyoffices
-
How are server-side redirects perceived compared to direct links (on a directory site)?
Hi, I'm creating some listings for a client on a relevant B2B directory (a good-quality directory). I asked if the links are followed or nofollowed, and they said they are "server-side redirects", so not direct links. Does anyone know how these are likely to be perceived by Google? All best, Dan
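One way to judge this yourself is to fetch one of the directory's outbound links and inspect the response headers, for example (the /go/ path is hypothetical):

curl -I http://directory-example.com/go/12345

A permanent redirect (301) from a URL that isn't disallowed in the directory's robots.txt is generally treated much like a direct link and can pass value; a 302, or a redirect path blocked by robots.txt, generally won't.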
Technical SEO | Dan-Lawrence
-
Need a better solution for 301s with a Jekyll/S3 site
Hey Mozzers, this isn't the first time I've come to the community with questions about my new site. Although running a site of static HTML-generated pages has been fantastic in the first few weeks as far as load times go, it's been a nightmare for a few other SEO-related concerns, namely redirects.
In the Q&A post above, Mat Shepherd pointed out a solution for adding 301s to an Amazon Web Services site using the "Redirection Rules" field on the "Configure Bucket for Website Hosting" page. However, I discovered soon after that this method limited me to only 50 redirects. Obviously, all things considered, this will not be enough.
At this point, I'm basically out of ideas. If anyone else out there has a website with a similar setup (Jekyll platform hosted on Amazon S3) that has overcome this redirect problem, I'd really appreciate hearing from you. Thanks in advance, everyone
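One workaround worth trying, since it sidesteps the 50-rule routing limit entirely: S3 also supports per-object redirects via the x-amz-website-redirect-location metadata, which you can set on an empty placeholder object for each old URL. A sketch using boto3, with the bucket name and paths as placeholders:

# Sketch: create one empty placeholder object per old URL; the S3
# website endpoint serves a 301 to WebsiteRedirectLocation for each.
import boto3

s3 = boto3.client('s3')

# hypothetical map of old keys -> new locations
redirects = {
    'old-post/index.html': '/new-post/',
    'another-old-page.html': '/its-replacement/',
}

for key, target in redirects.items():
    s3.put_object(
        Bucket='my-jekyll-site-bucket',
        Key=key,
        Body=b'',
        WebsiteRedirectLocation=target,
    )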
Technical SEO | danny.wood
-
Strange Webmaster Tools crawl report
Up until recently, my robots.txt blocked the indexing of my PDF files, which are all manuals for products we sell. Last week I changed this to allow indexing of those files, and now my Webmaster Tools crawl report is listing all my PDFs as not found. What is really strange is that Webmaster Tools is listing an incorrect link structure: "domain.com/file.pdf" instead of "domain.com/manuals/file.pdf". Why is Google indexing these particular pages incorrectly? My robots.txt has nothing else in it besides a disallow for an entirely different folder on my server, and my htaccess is not redirecting anything related to my manuals folder either. Even in the case of outside links present in the crawl report that supposedly link to this 404 file, when I visit those third-party pages they have the correct link structure. Hope someone can help, because right now my not-founds are up in the 500s and that can't be good 🙂 Thanks in advance!
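If the 404s are all root-level requests for files that really live in /manuals/, one stopgap while you investigate is an Apache rewrite that folds them back in. A hedged example for .htaccess (assumes every PDF lives in /manuals/):

RedirectMatch 301 ^/([^/]+\.pdf)$ /manuals/$1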
Technical SEO | Virage
-
Can I reduce link count by nofollowing links?
Hi, a large number of my pages contain over 100 links. This is due to a large drop-down navigation menu that appears on every page. To reduce my link count, could I just nofollow these navigation links, or would I have to remove the navigation completely?
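For reference, nofollowing a navigation link is just an attribute on the anchor, e.g.:

<a href="/category/blue-widgets" rel="nofollow">Blue Widgets</a>

Bear in mind, though, that since Google's 2009 change to PageRank sculpting, nofollowed links still consume their share of a page's link equity, so nofollowing the navigation won't reclaim the value those links take up.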
Technical SEO | moesian
-
Should you worry about adding geo-targeted pages to your site?
Post-Panda, should I worry about adding a bunch of geo-targeted landing pages at once? It's a community site; people have added their location on their profile pages. I'm worried that if we make all the locations into hyperlinks pointing to new geo-targeted pages, it could get us extra traffic for those geo-specific keyword phrases but penalize the site as a whole for having so many low-quality pages. What I'm thinking is to start small: turn, say, "United States" into a hyperlink pointing to a page that would house our community members residing in the United States, and add extra unique content to that page. Then only add a new location page when we know we'll be adding unique content to it, so it's not basically just page sorting. Thoughts? Hope that makes sense. Thanks!
Technical SEO | poolguy
-
Google Webmaster Tools: Keywords
Hi SEOmozzers! I'm the Dr./owner/in-house SEO for my eye care practice. The URL is www.ofallonfamilyeyecare.com, and our practice is in O'Fallon, MO. Since I'm an optometrist, my main keywords are "optometrist o'fallon" and "o'fallon optometrist".
As I get more familiar with SEO, Google Analytics, and Webmaster Tools, I've discovered the keywords that Google feels best represent my website. About a week ago I noted that Google counted 21 instances of "optometrist" across the 28-30 pages of my website, ranking it as the #32 most common keyword; #1 is "eye", with 506 instances. Even though 21 occurrences seemed low, I went through every page adding "optometrist" a couple of times in the body where it would naturally be appropriate. I also added it to the address shown in the footer of every page, and changed the top navigation option "meet Dr. Hegyi" to "our optometrist". I must have added at least 4 occurrences to every page on my site, and I submitted for a recrawl. I even tried to scale back the "eye" occurrences on a few pages.
Today I see that Google has recrawled the site and the keywords have been updated: "optometrist" has DROPPED from #32 to #33. Does anyone have any ideas about why I'm not seeing increased occurrence in Google's eyes? I realize this may not be a big factor in SERPs, but every bit of on-page optimization helps. Or is this too minor an issue to sweat? Thanks!
Technical SEO | JosephHegyi