Potential spam issue - back links
-
Hi - we have a client we work with for SEO. During a review of Webmaster Tools we noticed an IP address with over 30,000 links to our client's site. The IP address is 92.60.0.123.
From looking up the IP address details, it appears to be based in Europe - but we are unable to establish what it is, where the links are, or who created them.
We are concerned it could be a potential spammer trying to cause an issue with the SEO campaign.
Is there any way of finding out any more details apart from the basic information about the location of the IP address?
Also - if we submit a disavow via Webmaster Tools, we are unsure what effect it will have on the client's site given that we do not know what this IP is or what type of links it is creating. Any ideas?
Thanks for your help!
Phil.
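One way to get more than basic geolocation is to pull the regional registry's WHOIS/RDAP record for the address block, which usually names the network and an abuse contact. A rough sketch in Python - the rdap.org redirector and the fields printed here are one possible approach, not something verified in this thread:

```python
import json
import urllib.request

# Hypothetical example: look up who an IP block is registered to via RDAP,
# the JSON successor to WHOIS. rdap.org redirects to the responsible
# registry (RIPE for most European addresses).
ip = "92.60.0.123"

with urllib.request.urlopen(f"https://rdap.org/ip/{ip}") as resp:
    record = json.load(resp)

print(record.get("name"))     # network name the registry assigned to the block
print(record.get("country"))  # registered country code, if the registry provides it
for entity in record.get("entities", []):
    # registrant / abuse contacts for the block - useful for tracking down the owner
    print(entity.get("handle"), entity.get("roles"))
```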
-
I tried a reverse lookup on that IP, but no luck:
http://mxtoolbox.com/ReverseLookup.aspx
I also tried pinging the IP - no response.
Maybe it's a mistake, or whatever it was is gone now.
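For anyone who wants to script those same two checks rather than use the web tools, here is a minimal sketch using only the Python standard library (the ping flag assumes Linux/macOS, and a silent ping proves little since many hosts simply drop ICMP):

```python
import socket
import subprocess

ip = "92.60.0.123"

# Reverse DNS (PTR) lookup - the same check the mxtoolbox page performs.
try:
    hostname, _, _ = socket.gethostbyaddr(ip)
    print(f"PTR record: {hostname}")
except socket.herror:
    print("No reverse DNS (PTR) record for this IP")

# Ping - "-c" is the count flag on Linux/macOS (Windows uses "-n" instead).
result = subprocess.run(["ping", "-c", "3", ip], capture_output=True, text=True)
print(result.stdout or result.stderr)
```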
Related Questions
-
Link Getting Deleted for Few Days
If a link gets deleted for a few days and then reappears, will Google treat it as a "new link" or give it the same old link-age?
White Hat / Black Hat SEO | Akshayshr0
-
When to NOT USE the disavow link tool
I'm not here to say this is concrete and that you should never do this, and if you disagree with me then let's discuss. One of the biggest things out there today, especially after the second wave of Penguin (2.0), is the fear-stricken webmasters who run straight to the disavow tool after they have been hit with Penguin or noticed a drop shortly after. I had a friend whose site never felt the effects of Penguin 1.0, and he thought everything was peachy. Then P2.0 hit and his rankings dropped off the map. I got a call from him that night and he was desperately asking me to review his site and guess what might have happened. He then told me the first thing he did was compile a list of websites backlinking to him that might be the issue, create his disavow list and submit it.

I asked him, "How long did you research these sites before you came to the conclusion they were the problem?" He said, "About an hour." Then I asked him, "Did you receive a message in your Google Webmaster Tools about unnatural linking?" He said, "No." I said, "Then why are you disavowing anything?" He said, "Um.......I don't understand what you are saying?"

In reading articles, forums and even the Moz Q&A here, I tend to think there are some misconceptions about Google's disavow tool that do not seem to be clearly explained. Some of my findings about the tool and when to use it are purely based on logic, IMO. Let me explain.

When NOT to use the tool
If you spent an hour reviewing your backlink profile and are too eager to wait any longer to upload your list. Unless you have fewer than 20 root domains linking to you, you should spend a lot more than an hour reviewing your backlink profile.
If you DID NOT receive a message from GWT informing you that you have some "unnatural" links (I'll explain later).
If you spent a very short amount of time reviewing your backlink profile and did not look at each individual site linking to you and every link that exists, then you might be using it WAY TOO SOON.

The last thing you want to do is disavow a link that actually might be helping you. Take the time to really look at each link and ask yourself this question (straight from the Google guidelines): "A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you, or to a Google employee."

Studying your backlink profile
We all know when we have cheated. I'm sure 99.9% of us can admit to it at one point. Most of the time I can find backlinks from sites, look right at the owner and ask him or her, "You placed this backlink, didn't you?" I can see the guilt immediately in their eyes 🙂 Remember, not ALL backlinks you generate are bad or wrong just because you own the linking site. Ask yourself, "Was this link necessary and does it apply to the topic at hand?", "Was it relevant?" and, most important, "Is this going to help other users?". These are some questions you can ask yourself before each link you place.

You DID NOT receive a message about unnatural linking
This is where I think the most confusion takes place (and please explain it to me if I am wrong on this). If you did not receive a message in GWT about unnatural linking, then we can safely say that Google has not flagged any of your links as spammy. So if you did not receive any message yet your rankings dropped, then what could it be?
Well, it's still most likely your backlinks that did it, but it's more likely the "value" of previous links that now hold less or no value at all. So obviously when that value drops, so does your rank. So what do I do? Build more quality links....and watch your rankings come back 🙂
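For what it's worth, the disavow upload itself is just a plain text file: one URL or domain: rule per line, with # comment lines. A minimal sketch of generating one - the domains and URLs below are made up purely for illustration, and only belong in such a file after the kind of thorough review described above:

```python
# Hypothetical entries you have manually reviewed and decided are harmful.
reviewed_bad_domains = ["spammy-directory.example", "paid-link-network.example"]
reviewed_bad_urls = ["http://blog.example/comment-spam-page.html"]

with open("disavow.txt", "w") as f:
    f.write("# Reviewed the full backlink profile before adding these entries\n")
    for domain in reviewed_bad_domains:
        f.write(f"domain:{domain}\n")  # a domain: rule disavows every link from that site
    for url in reviewed_bad_urls:
        f.write(f"{url}\n")            # a bare URL disavows links from that single page
```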
White Hat / Black Hat SEO | cbielich1
-
How to handle footer links after Penguin?
With the launch of Google's Penguin, I know that footer links could possibly hurt rankings, and too many links on a page are also bad. I have a client, http://www.m-scribe.com, whose footer links create well over 100 links on many of their pages. How should I handle these footer links? Suggestions are greatly appreciated.
White Hat / Black Hat SEO | RonMedlin0
-
How is this achieved - SPAM
Hello everyone. Here's my problem: I just searched for "link inside iframe counts for backlinking?" and at #5 there's a site that caught my attention because of its description snippet. http://www.freelancer.com/job-search/iframe-links-count-backlinks/ This page is totally irrelevant to my query if you take the time to read what's on it, yet it ranks well. It's clever because the page contains all the required elements: one h1 with the keyword in it, a short paragraph under it, similar links (totally irrelevant though), a selection of people who are supposed to be relevant to my question but are not - all the good stuff. I looked in the source code and I found this:
<link href="http://www.freelancer.com/rss/search.xml?keyword=iframe+links+count+backlinks" rel="alternate" type="application/rss+xml" title="Latest projects">
Please take the time to look at this feed and you'll see something totally wrong here. Could someone please explain how this works? It's total spam, yet they managed to trick the system... Looking forward to hearing your answers. Alex
White Hat / Black Hat SEO | pwpaneuro0
-
Can't figure out how my competitor has so many links
I suspect something possibly black-hat is going on with the number of inbound links for www.pacificlifestylehomes.com ( http://www.opensiteexplorer.org/links?site=www.pacificlifestylehomes.com ), mainly because they have such a large volume of links (for my industry) with their exact targeted keyword. Can anyone help clear this up for me?
White Hat / Black Hat SEO | theChris0
-
Thought on optimising the perfect keyword location link
My site works a bit like a directory, so say I have a page called "Ice Cream Vendors" - on that page I would talk a bit about Ice Cream Vendors, then I will have a list of Ice Cream Vendor locations. My list of locations can be quite big depending on the product and the number of locations it occurs in - when you click a location, it goes to a page showing all "Ice Cream Vendors" in that location. So currently I will have a table on the page a bit like this:

ICE CREAM VENDOR LOCATIONS
New York
Miami
Las Vegas

This is all perfectly nice, simple and usable - BUT it is not producing perfect keyword links. For perfect keyword links the list should be like this:

ICE CREAM VENDOR LOCATIONS
New York Ice Cream Vendors
Miami Ice Cream Vendors
Las Vegas Ice Cream Vendors

Now I have my perfect anchor links - BUT it looks ridiculous and is NOT user friendly. So what do I do?
1/. Build it for users, not have perfect anchor links, and lose out in SEO?
2/. Build perfect SEO links and make it less usable and spammy looking? OR
3/. Deliver the search engines the perfect SEO links, and the user the user-friendly version?

By this I mean I could do the following.

Search engines (and screen readers, I think) would see:
ICE CREAM VENDOR LOCATIONS
New York Ice Cream Vendors
Miami Ice Cream Vendors
Las Vegas Ice Cream Vendors

Users would see:
ICE CREAM VENDOR LOCATIONS
New York
Miami
Las Vegas

Now in my view I am doing nothing wrong - I am merely giving the user the most user-friendly version and giving the search engine more information about the link that the user doesn't need. So - in my view I am doing something that is honest - but what are your thoughts? Has anyone tried to do this? Thanks
White Hat / Black Hat SEO | James77
-
Why Does Massive Reciprocal Linking Still Work?
It seems pretty well settled that massive reciprocal linking is not a very effective strategy and, in fact, may even lead to a penalty. However, I still see massive reciprocal linking (blog roll linking, even massive resource page linking) working all the time. I'm not looking to cast aspersions on any individual or company, but I work with legal websites and I see these strategies working almost universally. My question is: why is this still working? Is it because most of the reciprocally linking sites are all legally relevant? Has Google just not "gotten around" to the legal sector (doubtful, considering the money and volume of the online legal segment)? I have posed this question at SEOmoz in the past and it was opined that massively linking blogs through blog rolls probably wouldn't send any flags to Google. So why is it that, everywhere I look, this strategy is basically dismissed as a complete waste of time if not harmful? How can there be such a discrepancy between what leading SEOs agree to be "bad" and the simple fact that these strategies have been working en masse for a period of at least 3 years?
White Hat / Black Hat SEO | Gyi0