Site Hacked: Is it Faster and Better to 301 or 404 Irrelevant URLs?
-
Hey Everyone,
So our site was hacked, which created a large number of irrelevant URLs on our domain, resulting in thousands of 404 errors and pages coming up for searches unrelated to our brand. Now that the issues have been resolved (and the site has been re-submitted), would it be quicker (and more ideal) to 301 redirect the important 404s that see traffic, have links, etc., even though they're not relevant, or to just let everything 404 out?
We're not as concerned with offering a relevant user experience, since these visitors are not in our demographic, but we do want to keep these pages from cluttering our analytics and to avoid any issues that might arise from Google thinking these topics apply to our site.
Any help or insight would be very appreciated.
Please let us know if you have any questions or concerns, or if we can provide further details that might help.
Looking forward to hearing from all of you!
Thanks in advance.
Best,
-
Hi Ben!
I have a similar problem with one of my sites. It was hacked a while back, and now Google is showing a number of irrelevant links, many of them in a different language. I have implemented redirects for most of them, but they still appear in the results.
- What do you think about blocking them in robots.txt?
- Or requesting URL removals in GSC?
I'd really appreciate any help from the community!
Thanks!
SK
-
Hi Ben,
No, that will not expedite it. It will take time for Google's index to catch up and remove them. I know it's frustrating, so just be patient with it. Most of us, myself included, have been through it. Good luck!
-
Hello Kevin,
Thank you for your response and help. That's a great point. Do you think it would help speed things along if we also added redirects on top of the 410 errors, just in case, and more for the meantime?
Please let us know if we could provide any further details that might help.
Looking forward to hearing from you!
Thanks again.
Best,
-
If they're not relevant, why 301 them? 410 them and remove the URLs in Search Console.
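For anyone who wants to verify the cleanup afterwards, here is a minimal Python sketch (using the requests library; the input filename is just a placeholder) that reports which of the affected URLs still 301 and which now return 404 or 410:

# Minimal sketch: report the HTTP status each cleaned-up URL returns now.
# Assumes the affected URLs are listed one per line in hacked_urls.txt
# (a placeholder filename) and that the requests package is installed.
import requests

def check_statuses(path: str = "hacked_urls.txt") -> None:
    with open(path) as f:
        urls = [line.strip() for line in f if line.strip()]
    for url in urls:
        try:
            # allow_redirects=False so a 301/302 is reported as such,
            # not as the status of the redirect target
            resp = requests.head(url, allow_redirects=False, timeout=10)
            print(f"{resp.status_code}\t{url}")
        except requests.RequestException as exc:
            print(f"ERROR\t{url}\t{exc}")

if __name__ == "__main__":
    check_statuses()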
Related Questions
-
Submitted URL marked 'noindex'
Search Console is giving this issue for nearly 100 pages of my website. I have checked the Yoast plugin settings. We haven't used any meta robots tag for these pages, and these pages aren't disallowed in robots.txt either. Previously this issue affected some 20+ pages. I tried to reindex them by submitting the URLs again. Now the count has risen to 100+. There is also a "Submitted URL blocked by robots.txt" issue for pages which are NOT disallowed in robots.txt. Can anyone please suggest a solution here?
Reporting & Analytics | | Reema240 -
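One quick way to see what those pages actually serve is to check both places a noindex can come from: the X-Robots-Tag response header and the meta robots tag. A minimal Python sketch (requests plus BeautifulSoup; the URL is a placeholder):

# Minimal sketch: print the two usual sources of a 'noindex' flag for a page,
# the X-Robots-Tag response header and any <meta name="robots"> tag.
# Assumes requests and beautifulsoup4 are installed; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

def check_noindex(url: str) -> None:
    resp = requests.get(url, timeout=10)
    header = resp.headers.get("X-Robots-Tag", "")
    print(f"X-Robots-Tag header: {header or '(not set)'}")
    tags = BeautifulSoup(resp.text, "html.parser").find_all(
        "meta", attrs={"name": "robots"}
    )
    if not tags:
        print("no meta robots tag found")
    for tag in tags:
        print(f"meta robots: {tag.get('content')}")

if __name__ == "__main__":
    check_noindex("https://www.example.com/some-page/")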
Tool to check page size for multiple URLs at once
In Google Analytics under Site Speed > Page Timings, you can see all pages and their loading time compared to the average. This is very handy for checking which pages may need some optimization. I would also like to check the size of these pages in a similar way. There are multiple tools out there, like GTmetrix and Pingdom, that give specific information and performance insights. The problem is that they are limited to checking one URL at a time. Does someone know about a tool to check the page size of multiple URLs at once (and, if possible, to easily export to Excel)? That way I can check which pages are big in size and research/optimize them. Thanks in advance
Reporting & Analytics | | Mark.0 -
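If a dedicated tool doesn't turn up, a small script can get partway there. Here is a minimal Python sketch (requests plus the standard csv module; the filenames are placeholders) that writes each URL's response size to a CSV you can open in Excel. Note it only measures the size of the HTML response itself, not the full page weight with images, CSS, and JS that GTmetrix reports:

# Minimal sketch: fetch a list of URLs and write their response size to a CSV.
# Assumes requests is installed; urls.txt and page_sizes.csv are placeholders.
import csv
import requests

def page_sizes(in_path: str = "urls.txt", out_path: str = "page_sizes.csv") -> None:
    with open(in_path) as f:
        urls = [line.strip() for line in f if line.strip()]
    with open(out_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["url", "status", "size_kb"])
        for url in urls:
            try:
                resp = requests.get(url, timeout=15)
                writer.writerow([url, resp.status_code, round(len(resp.content) / 1024, 1)])
            except requests.RequestException as exc:
                writer.writerow([url, "error", str(exc)])

if __name__ == "__main__":
    page_sizes()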
Curious, anyone ever had over half of their indexed links drop on an e-commerce site?
In a year we went from around 300k indexed pages down to a little over 100k, according to GWT. Could this be a duplicate content issue, lost links, spam, aged links, or all of the above? Either way, an audit is in order. Thanks! Chris
Reporting & Analytics | | Sundance_Kidd0 -
Universal Analytics & Google Tag Manager - Track URLs that include hashes
Does anyone have any experience tracking URLs that include hashes (#) using Universal Analytics and Google Tag Manager? Can it be done using GTM's container for UA, using the "more settings" options? Or building another tag to work with the GTM UA container? The fallback I'm considering is implementing the UA code in GTM for every page as Custom HTML with the "ga('send', 'pageview', location.pathname + location.search + location.hash);" solution, rather than GTM's specialized UA tag. I'm not yet sure what problems may arise from that, if any. Thanks in advance.
Reporting & Analytics | | 352inc0 -
Irrelevant page with high bounce rate
I have a page on my site, www.waikoloavacationrentals.com/kolea-rentals/floor-plans, that gets me roughly 17% of my traffic. That being said, it is not really relevant traffic because it comes from the search term "floor plans", which has nothing to do with Hawaii vacation rentals, which is what I do. My question is: does Google know how to figure that out when looking at my stats, or is there a way to let Google know that the page probably should not show up for that search phrase? On the positive side, they are nice floor plans, and if someone searching for floor plan ideas sees one of them in Google Images, it could probably help them, but it really is not relevant to my business. The page has an 80% bounce rate, but does have an average time on page of 1.5 minutes, which is a fair amount for what is there.
Reporting & Analytics | | RobDalton0 -
Multiple-Domain tracking for sister sites- NO retail checkout- Please help
Hello, I have about 5 sites for which I want to set up multiple-domain tracking in Google Analytics. All the posts I read seem to be focused on cross-domain tracking for the purpose of following a visitor from one domain to another for shopping cart checkouts. I don't need that. I have 3 sister sites (sistersite1.com, sistersite2.com, sistersite3.com) related to my primary site (mastersite.com). I want one master Analytics profile to track traffic for all of these sites combined. My visitors will not jump from mastersite.com over to sistersite1.com; there will be no cross-domain visits. How can I set up one master Google Analytics profile that will aggregate traffic data from all sites and present it to me in one profile? Please help
Reporting & Analytics | | AndreGant0 -
How to change a URL for a Google Analytics account
We recently changed the URL of a client's website. Is there a way to change the URL on the GA account instead of creating a new account, so that we don't lose comparative data? Thanks! Sorry, I know this is a novice question.
Reporting & Analytics | | marketing12340 -
Setting up Goals in Google Analytics that involve a 3rd party site
I've set up several goals for one of my clients in Google Analytics. The ones that relate to things on the site, such as clicking on the "Contact Us" button, work just fine. However, I set up one that tracks when someone clicks on a purchase button, which sends the user to a third-party site (PayPal). This one doesn't seem to work (I purchased an item and the goal was not recorded). Looking to see if I have to do anything different when setting up the goal.
Reporting & Analytics | | EricVallee340