Question about spammy links to 404 Pages we never created ...
-
FYI, I'm new to the company, so this might be a basic question, but... I was going through Open Site Explorer checking www.partnermd.com for opportunities to reclaim links, and I found a bunch of 404 pages that we never created and that had nothing to do with the business. Out of curiosity, I plugged one of the weird URLs, like http://www.partnermd.com/images/2015-best-space-heater-best-wers.html, into Open Site Explorer and found several bad spammy links pointing to it. When I clicked on one of them, I got a notice that the site might have been hacked. I did some research, and it looks like Google doesn't penalize you for spammy links to 404 pages, but how do we prevent this from occurring in the first place, if possible?
-
Hi,
Although I agree with Andy Drinkwater, I would still go ahead and disavow these links now. It's the right thing to do; you never know whether a future Google update will start looking at these spammy links as well.
Always do the right thing first rather than waiting to react when a problem arrives.
I hope this helps; feel free to respond if you have further questions.
Best Regards,
Vijay
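For reference, a disavow file is just a plain-text list submitted through Google's Disavow Links tool, with one `domain:` line per referring domain (or one full URL per line) and `#` comments. A minimal sketch of generating one; the domain names below are made-up placeholders, not real offenders:

```python
# Build a Google disavow file: one "domain:" line per spammy
# referring domain, plus a "#" comment header.
# The domains below are hypothetical placeholders.

spammy_domains = [
    "cheap-space-heaters.example",
    "link-farm.example",
]

def build_disavow(domains):
    lines = ["# Spammy domains pointing at 404 pages we never created"]
    # Deduplicate and sort so the file is stable across runs.
    lines += [f"domain:{d}" for d in sorted(set(domains))]
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    print(build_disavow(spammy_domains), end="")
```

The resulting text file is what gets uploaded in Search Console's disavow tool.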
-
I agree with Andy.
This shouldn't be harmful to your website or your organic rankings, but it will be extremely difficult to stop other websites from linking to you.
I would monitor your backlinks on a regular basis to see if anything looks abnormal.
-
Hi,
This is quite common for sites that have been hacked; you then end up with these really strange spam links pointing to you.
The good news is that this won't do any harm to you. The bad news is that it's virtually impossible to stop someone else from linking to you, especially if the linking site has been hacked.
I would advise just keeping an eye on them, and if you think there's a problem, disavow the sites that are linking to you.
-Andy
Related Questions
-
SEO Content Audit Questions (removing pages from the website, extracting data, organizing data)
Hi everyone! I have a few questions. We are running an SEO content audit on our entire website, and I am wondering the best FREE way to extract a list of all indexed pages. Would I need to use a mix of Google Analytics, Webmaster Tools, AND our XML sitemap, or could I just use Webmaster Tools to pull the full list? Just want to make sure I am not missing anything.
As well, once the data is pulled and organized (helpful to know the best way to pull detailed info about the pages too!), I am wondering if it would be a best practice to sort by high-trafficked pages in order to rank them for prioritization (i.e., pages with the most visits get edited and optimized first).
Lastly, I am wondering what constitutes a 'removable' page. For example, when is it appropriate to fully remove a page from our website? I understand that if you need to remove a page, it is best to redirect the visitor to another similar page OR the homepage. Is this the best practice? Thank you for the help!
If you say it is best to organize by trafficked pages first in order to optimize them, I am wondering if it would be an easier process to use Moz tools like Keyword Explorer, Page Optimization, and Page Authority to rank pages and find ways to optimize them for the best top relevant keywords. Let me know if this option makes MORE sense than going through the entire data extraction process.
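As a starting point for the extraction step above, here is a minimal sketch of pulling a URL list out of a standard XML sitemap with Python's standard library. The inline sitemap is a made-up placeholder; in practice you would fetch your site's real sitemap.xml first:

```python
# Sketch: extract the URL list for a content audit from an XML sitemap.
# Parses the standard <urlset>/<url>/<loc> structure defined by the
# sitemaps.org protocol. The sitemap text below is a placeholder.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def urls_from_sitemap(xml_text):
    root = ET.fromstring(xml_text)
    # iter() with the namespaced tag finds every <loc>, nested or not.
    return [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""

print(urls_from_sitemap(sample))
```

Note that a sitemap lists pages you *want* indexed, not pages Google has actually indexed, so cross-checking against Webmaster Tools is still worthwhile.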
Technical SEO | PowerhouseMarketing
-
What's the best way to pass link juice to a page on another domain?
I'm working with a non-profit, and their donation form software forces them to host their donation pages on a different domain. I want to attempt to get their donation page to appear in their sitelinks in Google (under the main website's entry), but it seems like the organization's donation forms are at a disadvantage because they're not actually hosted on that site. I know that no matter what I do, there's no way to "force" a sitelink to appear the way I want it, but... I was trying to think if there's a way I can work around this. Do you think 1) creating a url like orgname.org/donate and having that be a 301 redirect to the donation form, and 2) using the /donate redirect all over the site (instead of linking directly to the form) would help? Are there alternatives other folks recommend?
Technical SEO | clefevre
-
So many internal links to the same page
Hey guys,
I'm working with a client that has a page with many internal links to the same page.
Let me illustrate it.
So as you can see, I have a page which is called "page" in the image. :D
As you can see, the page has many links to anchor links on solutions.html, which means they are all basically the same page (solutions.html).
Is it going to be a problem for us to do that?
And is there any way to handle this problem?
Thank you for your patience. And sorry for my bad English. 😄
Technical SEO | atakala
-
Should I consider the Webmaster Tools links / linked-pages ratio when removing unnatural links?
I don't know if this is a suitable place to post this question; anyway, here it is. According to Google Webmaster Tools' "Links to your site" page, my blog has a considerable number of links from linked pages (from certain domains). For instance, please refer to the following screenshot. When I am removing unnatural links, should I consider this ratio of links to linked pages? Almost all of these sites are social bookmarking sites. When I publish a new bookmark on those sites, they automatically add a homepage link. As a result, I got a huge number of homepage links from linked pages. What is your recommendation? Thanks!
Technical SEO | Godad
-
Can view pages of site, but Google & SEOmoz return 404
I can visit and view every page of a site (and can also see the source code), but Google, SEOmoz, and others say anything other than the home page is a 404, and Google won't index the sub-pages. I have checked robots.txt and .htaccess and can't find anything wrong. Is this a DNS or server-setting problem? Any ideas? Thanks, Fitz
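One way to narrow down a "browsers see 200, crawlers see 404" situation like this is to request the same URL with different User-Agent strings and compare the status codes, since misconfigured servers (and hacked ones) sometimes respond differently per agent. A hedged sketch of just the comparison logic; the actual fetching is left out, since it needs network access (e.g. gather the codes with `curl -A "<agent>" -o /dev/null -w "%{http_code}" <url>`):

```python
# Sketch: classify responses to the same URL seen by different agents.
# status_by_agent maps an agent label to the HTTP status it received;
# the labels and messages here are illustrative, not a standard API.

def diagnose(status_by_agent):
    browser = status_by_agent.get("browser")
    crawler = status_by_agent.get("crawler")
    if browser == crawler:
        return "consistent responses; check robots.txt and DNS instead"
    if browser == 200 and crawler == 404:
        return "user-agent-dependent response; suspect server config or cloaking"
    return "mixed responses; inspect server logs"

print(diagnose({"browser": 200, "crawler": 404}))
```

If the codes differ by agent, the server configuration (or injected code from a hack) is treating crawlers specially, which matches the symptoms described.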
Technical SEO | FitzSWC
-
Can 404 results from external links hurt site ranking?
Hello, I'm helping a university transition to a brand-new website. In some cases the URLs will change between the old site and the new site. They will put 301 redirects in place to make sure that people who have old URLs get redirected properly to the new ones. However, they also have a bunch of old pages that they aren't using anymore. They don't really care if people still try to reach them (because they don't think many will), but they do care about overall search engine rankings. I know that if a site has internal 404 links, that could hurt rankings. However, can external links that return a 404 hurt rankings? Ryan
Technical SEO | GreenHatWeb
-
What to do when you want the category page and landing page to be the same thing?
I'm working on structuring some of my content better, and I have a dilemma. I'm using WordPress, and I have a main category called "Therapy." Under Therapy I want to have a few subcategories such as "physical therapy," "speech therapy," and "occupational therapy" to separate the content. The URL would end up being mysite.com/speech-therapy. However, those are also phrases I want to create a landing page for. So I'd like to have a page like mysite.com/speech-therapy that I could optimize to help people searching for those terms find some of the most helpful content on our site for those words. I know I can't have two URLs that are the same, but I'm hoping someone can give me some feedback on the best way to go about this. Thanks.
Technical SEO | NoahsDad
-
Robots.txt file question? Never seen this directive before
Hey everyone! Perhaps someone can help me. I came across this directive in the robots.txt file of our Canadian corporate domain. I looked around online but can't seem to find a definitive answer (slightly relevant). The line is as follows: Disallow: /*?* I'm guessing this might have something to do with blocking query-string searches on the site. It might also have something to do with blocking subdomains, but the "?" mark puzzles me 😞 Any help would be greatly appreciated! Thanks, Rob
Technical SEO | RobMay
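For what it's worth, `Disallow: /*?*` relies on the wildcard extension that Google (and most major crawlers) support: `*` matches any run of characters, so the rule blocks any URL containing a `?`, i.e. any URL with a query string. A small sketch of that matching logic follows; it is a hand-rolled translation for illustration, since Python's built-in `urllib.robotparser` does not interpret these wildcards:

```python
# Sketch: translate a robots.txt rule with Google-style wildcards
# into a regex. "*" matches any character run; "$" anchors the end
# of the URL. This mimics crawler behavior for illustration only.
import re

def wildcard_rule_to_regex(rule):
    # Escape everything, then re-introduce the two wildcard tokens.
    pattern = re.escape(rule).replace(r"\*", ".*").replace(r"\$", "$")
    return re.compile("^" + pattern)

blocked = wildcard_rule_to_regex("/*?*")

print(bool(blocked.match("/search?q=shoes")))   # query string, so blocked
print(bool(blocked.match("/about.html")))       # no "?", so allowed
```

So the directive has nothing to do with subdomains (robots.txt rules only ever apply to paths on the host that serves the file); it blocks parameterized URLs, typically to keep faceted search or session-ID pages out of the index.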