404 Errors For Pages That Never Existed
-
I'm seeing a lot of 404 errors with cryptocurrency-related slugs (not my website's industry at all). We've never created anything remotely similar, yet the errors keep containing keywords like "bitcoin", "litecoin", and "yelz". Any recommendations on what to do about this?
It usually presents like .../yelz/-ripper-vs-steller/ or .../bitcoin-vs-litecoin/. I don't really have the time to fix all the legitimate 404 errors, let alone these mysterious requests. Any advice is appreciated.
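A quick way to triage requests like these is to pull the 404 lines out of the server access log and split them into "junk" and "possibly legitimate" buckets, so the scraper garbage can be ignored in bulk while real broken links get fixed. A minimal sketch, assuming a combined-style access log; the keyword list is just the terms mentioned above and would need extending as new patterns show up:

```python
import re
from collections import Counter

# Keywords seen in the junk requests; extend as new patterns appear.
JUNK_KEYWORDS = ("bitcoin", "litecoin", "yelz", "ripple", "steller")

def bucket_404s(log_lines):
    """Split 404 request paths into junk vs. possibly-legitimate buckets."""
    junk, legit = Counter(), Counter()
    for line in log_lines:
        # Matches a common combined-log fragment: "GET /path HTTP/1.1" 404
        m = re.search(r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+" 404 ', line)
        if not m:
            continue
        path = m.group(1).lower()
        bucket = junk if any(k in path for k in JUNK_KEYWORDS) else legit
        bucket[path] += 1
    return junk, legit
```

Sorting each Counter by count then shows which legitimate 404s are worth redirecting first, and how much of the noise is pure junk.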
-
This is a great answer.
From the sounds of it, OP is dealing with one of the following:
- A scraper site gone wrong, creating malformed links to the site
- Some kind of shady negative-SEO attack trying to create garbage URLs on the site
- A data-pollution attack trying to mess up the analytics
- An actual hack of the site
- Any mixture of the above
If a site has been hacked, it can take some proper dev work to pull the problem out at the roots. A hacked site is a liability in Google's eyes, and Google doesn't like to rank hacked sites or content.
I would suggest checking with some urgency whether the site has been hacked, and I back everything Gaston has said.
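One quick hack symptom worth checking is cloaking: injected spam pages often only render when the visitor looks like a search-engine crawler. A rough sketch of that comparison; the user-agent strings and spam terms are illustrative, not an official detection method:

```python
import urllib.request
import urllib.error

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def fetch_body(url, user_agent):
    """Fetch a URL with a given User-Agent and return the response body."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.read().decode("utf-8", "replace")
    except urllib.error.HTTPError as e:
        return e.read().decode("utf-8", "replace")

def spam_only_for_bot(browser_body, bot_body, terms=("bitcoin", "litecoin")):
    """True if any spam term shows up only in the crawler's view of a page."""
    return any(t in bot_body and t not in browser_body for t in terms)

def looks_cloaked(url):
    """Compare a browser-like fetch against a Googlebot-like fetch."""
    return spam_only_for_bot(fetch_body(url, "Mozilla/5.0"),
                             fetch_body(url, GOOGLEBOT_UA))
```

If the crawler view contains spam the browser view doesn't, the site is almost certainly compromised and needs the dev-level cleanup described above.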
-
Hi Brandon,
It's perfectly fine to have some 404s. Google Search Console reports them only as a warning, since you might have 404s you aren't aware of.
That said, since you're seeing that many errors for terms your site has no business with, I'd suggest checking whether the site has been hacked or something else is going on. You might want to review your website's security and update your plugins and CMS if that's the case. Hope this helps.
Best of luck.
Gaston
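Since Google only warns about 404s, the main thing worth verifying is that the junk URLs return a hard 404/410 rather than a "soft 404" (a 200 page that merely says "not found"), so Google eventually drops them. A small checker along those lines; the HEAD-request approach and the labels are one possible convention, not a Moz or Google tool:

```python
import urllib.request
import urllib.error

def status_of(url):
    """Return the final HTTP status code for a URL (redirects are followed)."""
    req = urllib.request.Request(url, method="HEAD",
                                 headers={"User-Agent": "Mozilla/5.0"})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

def classify(status):
    """Label a status code for a 404 audit."""
    if status in (404, 410):
        return "hard-404"       # fine: Google will drop the URL over time
    if 200 <= status < 300:
        return "soft-404-risk"  # resolves OK; make sure it isn't an error page
    return "other"              # server errors, odd behavior worth a look
```

Running `classify(status_of(url))` over the URLs flagged in Search Console separates the harmless junk from anything the server is accidentally answering for.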