Site Hacked: Is it Faster and Better to 301 or 404 Irrelevant URLs?
-
Hey Everyone,
Our site was hacked, which created a large number of irrelevant URLs on our domain, resulting in thousands of 404 errors and pages appearing in searches unrelated to our brand. The question is, now that the issues have been resolved (and the site re-submitted), would it be quicker (and more ideal) to redirect the important 404 errors, those that see traffic, have links, etc. although not relevant, or just let everything 404 out?
We’re not as concerned with offering a relevant user experience, because these visitors are not in our demographic, but we want to avoid these pages cluttering our analytics, as well as any issues that might arise from Google thinking these topics do apply to us.
Any help or insight would be very appreciated.
Please let us know if you have any questions, concerns or we could provide further details that might help.
Looking forward to hearing from all of you!
Thanks in advance.
Best,
-
Hi Ben!
I have a similar problem with one of my sites. It was hacked before, and now Google is showing a number of irrelevant links, and in a different language at that. I have implemented redirects for most of them, but they still appear in results.
- What do you think about blocking them in robots.txt?
- Or requesting URL removals in GSC?
I'd really appreciate any help from the community!
Thanks!
SK
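For reference, a robots.txt block of the kind mentioned above is only a crawl directive; it stops Googlebot from fetching the URLs but does not by itself remove already-indexed ones. A minimal sketch, assuming the hacked pages all sit under a hypothetical /spam-pages/ directory:

```
User-agent: *
Disallow: /spam-pages/
```

Note that if the URLs are blocked here, Google can no longer recrawl them to see the 404/410 status, which is worth weighing before choosing this option.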
-
Hi Ben,
No, redirecting will not expedite it. It will take time for Google's index to catch up and remove them. I know it's frustrating, so just be patient with it. Most of us, including myself, have been through it. Good luck!
-
Hello Kevin,
Thank you for your response and help. That’s a great point. Do you think it would help speed things along if we also redirected on top of the 410 error? Just in case, and mostly in the meantime.
Please let us know if we could provide any further details that might help.
Looking forward to hearing from you!
Thanks again.
Best,
-
If they're not relevant, why 301 them? 410 them and remove the URLs in Search Console.
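Returning 410 Gone instead of the default 404 can be done at the web server level. A minimal sketch for Apache (mod_alias), assuming the hacked URLs share a recognizable path prefix; the /cheap-pills/ pattern here is hypothetical and would need to match whatever the hack actually generated:

```
# .htaccess: serve "410 Gone" for everything under the hacked path
RedirectMatch 410 ^/cheap-pills/
```

A 410 is an explicit "permanently gone" signal, which some crawlers treat as slightly more definitive than a 404, though Google has said it handles the two very similarly over time.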
Related Questions
-
Are these Search Console crawl errors a major concern to new client site?
We recently (4/1) went live with a new site for a client of ours. The client site was originally on Point2 before they made the switch to a template site with Real Estate Webmasters. Now when I look into the Search Console I am getting the following crawl errors: 111 server errors (photos), 104 soft 404s (blogs, archives, tags), and 6,229 not found (listings). I have a few questions. I don't know a lot about the server errors, so I generally ignore them. My main concerns are the soft 404s and the not found errors. The 404s are mostly tags and blog archives, which I wonder if I should leave alone or 301 each to /blog. The not found errors are all the previous listings from the IDX. My assumption is these will naturally fall away after some time, as the new ones have already been indexed. But I wonder what I should be doing here and which of these is affecting me. When we launched the new site there was a large spike in clicks (a 250% increase), which has now tapered off to an average of ~85 clicks versus ~160 at time of launch. Not sure if the crawl errors have any effect; I'm guessing not so much right now. I'd appreciate your insights, Mozzers!
-
Why are these sites outranking me?
I've been hit by every update and have spent thousands of dollars and hundreds of hours trying to survive. Survival looks doubtful if I can't get turned around in 4 weeks or less. I have found AdWords and Google errors and fixed them. Alexa says us-nano.com is the best-ranked site. I used my Moz bar and they are doing everything wrong: keyword stuffing, no H1 tags, poor design. How are they ranking and I'm not? My duplicate meta tags are from this week, when I added Alexa and Bing IDs to my header to verify my site ownership. http://imgur.com/a/I1bsw
-
Shall I 301 a Url that has a discontinued product or shall we remove from Googles index
My web site sells shoes. These items go out of fashion and are replaced. Should I 301 a URL that has a discontinued product, or should we remove it from Google's index using Webmaster Tools? I seem to have a massive 301 list that keeps on growing, and I'm concerned that carrying on doing 301s is not the right way.
-
Universal Analytics & Google Tag Manager - Track URLs that include hashes
Does anyone have any experience tracking URLs that include hashes (#) using Universal Analytics and Google Tag Manager? Can it be done using GTM's container for UA, using the "more settings" options? Or building another tag to work with the GTM UA container? The fallback I'm considering is implementing the UA code in GTM for every page as Custom HTML with the "ga('send', 'pageview', location.pathname + location.search + location.hash);" solution, rather than GTM's specialized UA tag. I'm not yet sure what problems may arise from that, if any. Thanks in advance.
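As a rough illustration of the fallback described above, the page path that Universal Analytics would record can be assembled from the location parts before being passed to `ga('send', 'pageview', ...)`. This is only a sketch; the `pageviewPath` helper name is mine, not part of any GA or GTM API:

```javascript
// Build the page path UA would record, including the URL hash.
// `loc` mirrors window.location's pathname/search/hash fields.
function pageviewPath(loc) {
  return loc.pathname + loc.search + loc.hash;
}

// In the browser, inside a GTM Custom HTML tag, this would be sent as:
// ga('send', 'pageview', pageviewPath(window.location));
```

One known trade-off of the Custom HTML route is losing GTM's built-in UA tag features (field overrides, built-in variables), which may be why the "more settings" approach on the standard tag is worth exploring first.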
-
Does anyone know of a way to do a profile level filter to exclude all traffic if it enters the site via certain landing pages?
Does anyone know of a way to do a profile-level filter to exclude all traffic if it enters the site via certain landing pages? The problem I have is that we have several pages that are served to visitors of numerous other domains but are also served to visitors of our site. We end up with inflated Google Analytics numbers because people are viewing these pages from our partners' domains but never actually entering our site. I've made an advanced segment that serves the purpose, but I'd really like to filter it at the profile level so the numbers across the board are more accurate without having to apply an advanced segment to every report. The advanced segment excludes visits that hit these pages as landing pages but includes visits where people have come from other pages on our domain. I know that you can do profile filters to exclude visits to pages or directories entirely, but is there a way to filter them only if they are landing pages? Any other creative thoughts? Thanks in advance!
-
.com version and .org version of site
So I just discovered that a site I now manage has a .com version, as well as the .org version that is the one everyone knows about! I'm guessing this is not a good thing... So the whole site, e.g. www.abc.org/example, has a mirror page at www.abc.com/example... What should I do about this? Is it really bad to have two versions out there? Thanks!
-
Why are Seemingly Randomly Generated URLs Appearing as Errors in Google Webmaster Tools?
I've been confused by some URLs that are showing up as errors in our GWT account. They seem to just be randomly generated alphanumeric strings that Google is reporting as 404 errors. The pages do 404 because nothing ever existed there or was linked to. Here are some examples that are just off of our root domain: /JEzjLs2wBR0D6wILPy0RCkM/WFRnUK9JrDyRoVCnR8= /MevaBpcKoXnbHJpoTI5P42QPmQpjEPBlYffwY8Mc5I= /YAKM15iU846X/ymikGEPsdq 26PUoIYSwfb8 FBh34= I haven't been able to track down these character strings in any internet index or anywhere in our source code so I have no idea why Google is reporting them. We've been pretty vigilant lately about duplicate content and thin content issues and my concern is that there are an unspecified number of urls like this that Google thinks exist but don't really. Has anyone else seen GWT reporting errors like this for their site? Does anyone have any clue why Google would report them as errors?
-
Tagging URLs, Link Building, and Anchor Links
Hi, I am going to publish a press release on a number of different websites. First and foremost, I want to build anchor links back to the website for specific keywords. Secondly, I want to measure clickthroughs from each site using parameter tracking in GA. I want to know: if I put in a URL with ?utm_source=xxx, will this have any impact on my link building efforts? I.e., will search engines attribute the keyword to the long URL with tracking, or to the URL without tracking? I understand that everything from the ? mark onwards is ignored. However, I just want to double-check before I publish the release. Thanks for your help. Mik
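One common safeguard for the tracking-parameter concern above is a canonical tag on the landing page, so that tagged and untagged variants consolidate to the clean URL. A sketch, with a hypothetical example.com page standing in for the real destination:

```
<link rel="canonical" href="https://www.example.com/landing-page/">
```

With this in place, search engines are told to treat ?utm_source=xxx variants as duplicates of the canonical URL, so any link equity should consolidate there.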