Are lots of links from an external site to non-existent pages on my site harmful?
-
Google Webmaster Tools is reporting a heck of a lot of 404s, which are due to an external site linking incorrectly to my site.
The site itself has scraped content from elsewhere and has created hundreds of malformed URLs.
Since it's unlikely I will have any joy getting these links removed by the creator of the site, I'd like to know how much damage this could be doing, and whether there is anything I can do to minimise the impact.
Thanks!
-
Thanks for this - definitely some food for thought regarding how we handle 404s in general...
I am more worried about search engines than humans with this type of thing (we have had no referrals from this dodgy site), so I'd be interested to know whether you still think a 301 is the best way to go, given that the link text may not be appropriate to our site (and perhaps this would make things worse?!).
-
404s are something to be avoided, as they can make your website look 'abandoned'. If possible, I would set up 301s or similar to send these links to your main website rather than to 404 pages. You could also create a nice custom 404 page.
See these for more information:
http://www.seomoz.org/blog/are-404-pages-always-bad-for-seo
http://www.seomoz.org/blog/personalizing-your-404-error-pages
A bit of work can go a long way.
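If the site runs on Apache, a minimal sketch of the 301 approach might look like this (the URL patterns and target pages below are hypothetical placeholders - map them to the actual malformed URLs Webmaster Tools is reporting):

    # .htaccess - hedged example: 301 known malformed inbound URLs
    # to the closest relevant real page instead of letting them 404
    RewriteEngine On

    # e.g. the scraper appends junk to real slugs: /blog/my-post-xyz123
    RewriteRule ^blog/my-post([a-z0-9-]*)$ /blog/my-post [R=301,L]

    # e.g. an entire directory the scraper invented
    RedirectMatch 301 ^/scraped-junk/ /

Only redirect where a sensible target exists; for URLs with no meaningful destination, serving a fast 404 (or 410) with a helpful custom error page is perfectly fine, and it avoids the risk of looking like you're funnelling junk links into real pages.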
Related Questions
-
Unsolved: Why are my site pages getting a video index viewport issue?
Hello, I have been publishing a good number of blogs on my site, Flooring Flow. However, some of my articles are showing a video viewport error. I have tried fixing it, but the error is still showing in Google Search Console. Can anyone help me fix it?
Technical SEO | mitty270
-
Site's IP showing in WMT 'Links to My Site'
I have been going through, disavowing spam links in WMT, and one of my biggest referral sources is our own IP address. Site: Covers.com. IP: 208.68.0.72. We have recently fixed a number of 302 redirects, but the number of links actually seems to be increasing. Is this something I should ignore / disavow / fix using a redirect?
Technical SEO | evansluke
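A hedged sketch of the redirect option, assuming Apache (the hostname and IP come from the question; everything else is an assumption): answer any request that arrives addressed to the bare IP, or any other non-canonical hostname, with a 301 to the same path on the canonical hostname, so those links consolidate instead of accumulating.

    # Hypothetical vhost: 301 requests made to the raw IP (or any
    # non-canonical hostname) over to the canonical www hostname
    <VirtualHost *:80>
      ServerName 208.68.0.72
      RewriteEngine On
      RewriteCond %{HTTP_HOST} !^www\.covers\.com$ [NC]
      RewriteRule ^(.*)$ http://www.covers.com$1 [R=301,L]
    </VirtualHost>
-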
New site: More pages for usability, or fewer, more detailed pages for greater domain authority flow?
Ladies and gents! We're building a new site. We have a list of 28 professions, and we're wondering whether to include them all on one long, detailed page or to keep each on its own separate page. Thinking about the flow of domain authority, I could see 28 pages diluting it quite heavily - but at the same time, I think having the separate pages would be better for the user. What do you think?
Technical SEO | Muhammad-Isap1
-
Ecommerce website with too many links on page
Hi, I'm working on on-site SEO for an ecommerce website, and my recent report has shown a high number of pages with 'too many links on page' warnings. Does anyone have tips on how to avoid this when we're using mega menus, plenty of navigation for the user, and links to products on each page? Thanks
Technical SEO | Will_Craig1
-
Site rebuild without HTML extension = broken links?
I have a client whose site is pure HTML. He did a huge amount of link building, with some hundreds of links pointing to example.com/target.HTML. Now we have decided to move to WordPress with the same site structure. If we use the exact same URLs but without the .html ending, will that cause broken links? Is there any best practice for this kind of change? Many thanks for any ideas!
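For what it's worth, a minimal sketch of the usual fix, assuming Apache and that the WordPress permalinks mirror the old paths (the rule below is an untested assumption, not a definitive rule set): 301 each old .html URL to its extensionless twin so the built-up links keep passing through.

    # .htaccess - hedged example: /target.html -> /target via 301,
    # preserving the old link equity after the WordPress rebuild
    RewriteEngine On
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteRule ^(.+)\.html$ /$1 [R=301,L,NC]

The [NC] flag matters here because the old links apparently use an uppercase .HTML extension; place the rule above WordPress's own rewrite block so it runs first.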
Technical SEO | seozoltan0
-
Can dynamically translated pages hurt a site?
Hi all... looking for some insight, please. I have a site we have worked very hard on to get ranked well, and it is doing well in search. The site has about 1,000 pages and climbing, and about 50 of those are translated pages - static pages with unique URLs. I have had no problems with duplicate content or that sort of thing, and all pages were manually translated, so no translation issues. We have been looking at software that can dynamically translate the complete site into a handful of languages - let's say about 5. My problem is that these pages get produced dynamically, and I have concerns that Google will take issue with this, as well as with the huge sudden influx of new URLs - we could be looking at an increase of 5,000 new URLs (which usually triggers an alarm). My feeling is that it could risk the stability of the site we have worked so hard for, and that we should maybe just stick with the already translated static pages. I am sure the process could be fine, but I fear a manual inspection and a slap on the wrist for having dynamically created content, and also just risking a review trigger period. These days it is hard to know what could get you in "trouble", and my gut says keep it simple, leave it as is, and don't shake it up. Am I being overly concerned? Would love to hear from others who have tried similar changes, and also from those who have not due to similar "fear". Thanks
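Whichever route is taken, the translated URLs should at least be tied together with hreflang annotations so Google treats them as language alternates rather than suspicious near-duplicates. A hedged sketch, assuming Apache with mod_headers enabled (the paths and languages are made-up placeholders; hreflang can equally go in the HTML head or an XML sitemap):

    # Hypothetical: declare language alternates for one page via
    # HTTP Link headers (handy when pages are generated dynamically)
    <Location "/widgets/">
      Header add Link "<https://www.example.com/widgets/>; rel=\"alternate\"; hreflang=\"en\""
      Header add Link "<https://www.example.com/fr/widgets/>; rel=\"alternate\"; hreflang=\"fr\""
      Header add Link "<https://www.example.com/de/widgets/>; rel=\"alternate\"; hreflang=\"de\""
    </Location>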
Technical SEO | nomad-2023230
-
Too many on-page links
Hello, I have about 800 warnings for this. An example of one URL with this problem is: http://www.theprinterdepo.com/clearance?dir=asc&order=price. I was checking, and I think all the links are important. But I suppose that if I put a nofollow on the links on the left, which are only for navigation purposes, I could get rid of these warnings. Any other ideas?
Technical SEO | levalencia10
-
External Links from own domain
Hi all, I have a very weird question about external links to our site from our own domain. According to GWMT we have 603,404,378 links from our own domain to our domain. We noticed when we drilled down that these come from disabled sub-domains like m.jump.co.za. In the past we used to redirect all traffic from sub-domains to our primary www domain, but it seems that for some time Google had access to crawl some of our sub-domains. In December 2010 we fixed this so that all sub-domain traffic redirects (301) to our primary domain. Example: http://m.jump.co.za/search/ipod/ redirected to http://www.jump.co.za/search/ipod/
The weird part is that the number of external links kept on growing and is now sitting on a massive number. On 8 April 2011 we took a different approach: we created a landing page for m.jump.co.za, and all other requests generated 404 errors. We added all the directories to the robots.txt, and we also manually removed all the directories from GWMT. Now, 3 weeks later, the number of external links just keeps on growing. Here are some stats:
11-Apr-11 - 543 747 534
12-Apr-11 - 554 066 716
13-Apr-11 - 554 066 716
14-Apr-11 - 554 066 716
15-Apr-11 - 521 528 014
16-Apr-11 - 515 098 895
17-Apr-11 - 515 098 895
18-Apr-11 - 515 098 895
19-Apr-11 - 520 404 181
20-Apr-11 - 520 404 181
21-Apr-11 - 520 404 181
26-Apr-11 - 520 404 181
27-Apr-11 - 520 404 181
28-Apr-11 - 603 404 378
I am now thinking of cleaning up the robots.txt, re-including all the excluded directories in GWMT, and seeing if Google will be able to get rid of all these links. What do you think is the best solution to get rid of all these invalid pages?
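As a point of reference, a minimal Apache sketch of the subdomain-wide 301 described above (the jump.co.za hostnames come from the question; the rest is an assumption):

    # Hypothetical vhost: catch every non-www subdomain and 301 it
    # to the same path on the primary www domain
    <VirtualHost *:80>
      ServerName m.jump.co.za
      ServerAlias *.jump.co.za
      RewriteEngine On
      RewriteCond %{HTTP_HOST} !^www\.jump\.co\.za$ [NC]
      RewriteRule ^(.*)$ http://www.jump.co.za$1 [R=301,L]
    </VirtualHost>

One caveat worth noting: blocking those directories in robots.txt prevents Google from recrawling the old URLs at all, which also stops it from seeing the 301s (or 404s) and retiring the links - that may be part of why the count isn't dropping.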
Technical SEO | JacoRoux0