Best way to clean up a nasty backlink profile?
-
A new client of mine sadly has a TON of terrible links (3,800 links from 1,500 domains) pointing to landing pages that were created specifically to manipulate search engines. Besides contacting these sites and asking to have the links removed, the only solution I can think of is to delete these pages and let them 404. Obviously I'm not thrilled about that, but I'm not sure what else to do. Does anyone have any other ideas for how to clean up this backlink profile? Thanks
-
Thank you everyone for the comments and confirmation. Now I just have to break the news to the client that we are going to have to delete those pages haha. Wish me luck
-
Definitely remove the pages from the server and let them 404. You're lucky they linked to internal pages and not the homepage, or you'd have many, many hours of reaching out to webmasters on your hands.
This is a simple fix, and 404s are fine as long as they aren't caused by your own internal links pointing to pages that no longer exist. External sites linking to URLs that now 404 on your site is not a problem.
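Once the pages are deleted, it's worth confirming they actually return a 404 (or 410, "Gone") rather than a soft 200 or a redirect. A minimal sketch in Python using only the standard library — the URLs below are hypothetical placeholders for the deleted landing pages:

```python
from urllib.request import urlopen
from urllib.error import HTTPError

REMOVED_STATUSES = {404, 410}  # responses that tell crawlers the page is gone

def check_status(url):
    """Fetch a URL and return its HTTP status code."""
    try:
        with urlopen(url, timeout=10) as resp:
            return resp.status
    except HTTPError as err:
        return err.code  # urllib raises on 4xx/5xx; the code is what we want

def properly_removed(status):
    """True if the status signals a genuinely removed page."""
    return status in REMOVED_STATUSES

if __name__ == "__main__":
    # Hypothetical URLs -- replace with the actual deleted landing pages.
    for url in ["https://example.com/doorway-page-1",
                "https://example.com/doorway-page-2"]:
        status = check_status(url)
        flag = "OK" if properly_removed(status) else "STILL LIVE?"
        print(f"{status}  {flag}  {url}")
```

A soft 404 (the page is gone but the server answers 200) would show up here as "STILL LIVE?" and is worth fixing, since crawlers will keep treating the URL as a real page.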
-
No, 404 errors are a natural part of the Internet.
-
If you have 404 errors when Google crawls your site, does this not also impact your rankings?
-
Nakul,
Thank you for clarifying this. We are in the same situation, and it's so good to have someone in the field offering advice.
-
Honestly, I think your gut is telling you the right thing, and thankfully you answered the question yourself.
It sounds like those pages were created specifically to manipulate rankings (doorway pages?). If that's the case, and there are external links to those pages, I would do exactly what you're thinking: delete those pages and let them return a 404.
Did your client also receive Google's unnatural links penalty notice? If so, then it's a no-brainer to do this right away. You could also try to find out whether link networks are involved and whether certain webmasters control multiple websites, especially given 3,800 links from 1,500 domains. I would at least try to get "some" of them removed, or at least make the effort. If there's a way to send a blanket email to all 1,500 domains, it's well worth trying.
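The network-hunting idea above can be roughed out with a bit of scripting: count how many bad links each referring domain contributes (to prioritize outreach), and group domains that share an IP, which is a common sign of one webmaster controlling several sites. A sketch, assuming you've exported the bad backlinks to a list of URLs and separately resolved each domain to an IP (e.g., with `socket.gethostbyname`) — all domains and IPs below are hypothetical:

```python
from collections import Counter, defaultdict
from urllib.parse import urlparse

def links_per_domain(backlink_urls):
    """Count how many of the bad links come from each referring domain."""
    return Counter(urlparse(u).netloc.lower() for u in backlink_urls)

def group_by_ip(domain_to_ip):
    """Group referring domains that share an IP -- a possible link network."""
    networks = defaultdict(list)
    for domain, ip in domain_to_ip.items():
        networks[ip].append(domain)
    # Keep only IPs hosting more than one referring domain.
    return {ip: sorted(doms) for ip, doms in networks.items() if len(doms) > 1}
```

If one webmaster turns out to control dozens of the 1,500 domains, a single email can cover all of their sites at once, which makes the outreach far less daunting than it sounds.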
I hope this helps.