URL Errors Help - 350K Page Not Founds in 22 days
-
Got a good one for you all this time...
For our site, Google Search Console is reporting 436,758 "Page Not Found" errors within the Crawl Error report.
This is an increase of 350,000 errors in just 22 days (on August 21 we had 87,000 errors, a level that had held essentially steady for the previous 4 months or more). Then on August 22nd the errors jumped to 140,000, climbed steadily from the 26th until the 31st to reach 326,000, and then climbed again slowly from September 2nd until today's 436K.
Unfortunately I can only see the top 1,000 erroneous URLs in the console, and they appear to be custom Google tracking URLs my team uses to track our pages.
A few questions:
1. Is there any way to see the full list of 400K URLs Google is reporting it cannot find?
2. Should we be concerned at all about these?
3. Any other advice?
Thanks in advance!
C
-
No problem! Please let us know if you need any help once you have your results.
-
Thank you all for the feedback. A comprehensive deep crawl is being conducted on the site now to help us find out more. I truly appreciate all your guidance.
best
CC
-
I'm guessing this is for a news or ecommerce site? That is a lot of URLs.
Screaming Frog is a good resource, but I would also look at the format of the URLs and how your platform creates them. I remember years ago many people had issues with WordPress, Joomla and other CMSes creating alternate URLs without the publisher knowing about them. Most likely it's a setting in your system. Take a look at the URL settings, and also at the URLs that the tracking software says it cannot find. Look for patterns across URLs and categories. You may find what you are looking for.
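To hunt for those patterns at scale, one rough approach (assuming you can export the URLs, e.g. the top-1,000 list from Search Console, into a list) is to tally the first path segment and the query-parameter names. A minimal sketch in Python; the URLs below are purely illustrative:

```python
from collections import Counter
from urllib.parse import parse_qs, urlsplit

def summarize_404s(urls):
    """Tally first path segments and query-parameter names so that
    repeating patterns (e.g. one tracking parameter) stand out."""
    sections, params = Counter(), Counter()
    for url in urls:
        parts = urlsplit(url)
        segments = [s for s in parts.path.split("/") if s]
        sections[segments[0] if segments else "/"] += 1
        for name in parse_qs(parts.query):
            params[name] += 1
    return sections, params

# Illustrative URLs only -- feed in your exported error list instead.
urls = [
    "https://example.com/blog/post-1?utm_campaign=x",
    "https://example.com/blog/post-2?utm_campaign=y",
    "https://example.com/shop/widget?track_id=123",
]
sections, params = summarize_404s(urls)
print(sections.most_common())  # [('blog', 2), ('shop', 1)]
print(params.most_common())    # [('utm_campaign', 2), ('track_id', 1)]
```

If one section or one parameter dominates the counts, that is usually where the CMS or tracking setup is minting the extra URLs.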
-
Not sure if this is related, but someone else and I saw something similar happen around the same time; see here: https://moz.com/community/q/strange-increase-of-pages-not-found-gwt
-
Hi Usnseomoz,
1. Maybe perform a deeper crawl with a bit of kit like Screaming Frog. This should help to further highlight any missing pages, errors, etc.
2. I would always be concerned about any problem until you have either been able to resolve it or discount it. It sounds like it could be the URL tracking parameters that are causing you issues, especially if you are tracking users across multiple sources / sales affiliates. If they are used solely for tracking and no other purpose, I would consider adding these parameter variables to the crawl filters.
Side menu >> Crawl >> URL Parameters
https://support.google.com/webmasters/answer/6080548?rd=1
Hope this is of some use.
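If the culprit really is tracking parameters, a complementary sketch in Python (with purely hypothetical parameter names; substitute whatever your tracking URLs actually use) shows the normalisation idea behind those crawl filters: collapse every tracked variant back to one canonical URL:

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

# Hypothetical tracking parameters -- substitute your team's real ones.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "track_id"}

def strip_tracking(url):
    """Return the URL with known tracking parameters removed, so all
    tracked variants collapse to a single canonical address."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in TRACKING_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_tracking("https://example.com/page?id=7&utm_source=mail"))
# https://example.com/page?id=7
```

Running the erroring URLs through something like this and de-duplicating the output quickly shows how many genuinely distinct pages are behind the 400K figure.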
Cheers
Tim
Related Questions
-
URL indexed but not submitted in sitemap, however the URL is in the sitemap
Dear Community, I have the following problem and it would be super helpful if you guys were able to help. Cheers
Symptoms: In Search Console, Google says that some of our old URLs are indexed but not submitted in sitemap. However, those URLs are in the sitemap, and the sitemap has been successfully submitted with no error message.
Potential explanation: We have an automatic cache-clearing process within the company once a day, and in the sitemap we use this as the last-modification date. Let's imagine the URL www.example.com/hello was last modified in 2017. Because the cache is cleared daily, the sitemap will say last modified: yesterday, even though the content of the page has not changed since 2017. We have a Z after the sitemap time; could it be that the bot does not understand the time format? Also, we have only HTTP URLs in the sitemap, and our HTTPS URLs are not in it. What do you think?
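For what it's worth, the trailing Z is the standard UTC designator in the W3C Datetime format that sitemaps use, so the format itself is fine. The sketch below (a hedged example, not the asker's actual pipeline) shows emitting a lastmod value from the real content-change date rather than the nightly cache flush:

```python
from datetime import datetime, timezone

def lastmod_value(content_changed_at):
    """Format a sitemap <lastmod> value in W3C Datetime form; the
    trailing 'Z' is the valid UTC designator, not an error."""
    return content_changed_at.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")

# Use the date the content actually changed, not the nightly cache flush.
changed = datetime(2017, 3, 5, 14, 30, tzinfo=timezone.utc)
print(lastmod_value(changed))  # 2017-03-05T14:30:00Z
```

Keeping lastmod tied to genuine edits avoids telling Google that every page changed yesterday.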
Intermediate & Advanced SEO | ZozoMe
-
Can a duplicate page referencing the original page on another domain in another country using the 'canonical link' still get indexed locally?
Hi, I wonder if anyone could help me with a canonical-link query/indexing issue. I have given an overview, intended solution and question below. Any advice on this query will be much appreciated.
Overview: I have a client with a .com domain that includes blog content intended for the US market, using the correct lang tags. The client also has a .co.uk site without a blog, but is looking at creating one. As the target keywords and content are relevant across both the UK and US markets, and so as not to duplicate work, the client has asked whether it would be worthwhile centralising the blog, or for any other efficient blog-site-structure recommendations.
Suggested solution: As the domain authority (DA) on the .com/.co.uk sites is in the 60s, it would be risky moving domains/subdomains at this stage, and it would be a waste not to utilise the DA that has built up on both sites. I have suggested they keep both sites and share the same content between them using a content-curation WP plugin, with the 'canonical link' referencing the original source (US or UK) so as not to get duplicate-content issues.
My question: Let's say I'm a potential customer in the UK and I'm searching using a keyword phrase that is answered by content on both the UK and US sites, although the US content is the original source.
Will the US or UK version of the blog appear in UK SERPs? My gut says the UK blog will, as Google will try to serve me the most appropriate version of the content, and as I'm in the UK it will be this version, even though I have identified the US source using the canonical link.
Intermediate & Advanced SEO | JonRayner
-
Client wants to remove mobile URLs from their sitemap to avoid indexing issues. However, this will require SEVERAL billing hours. Is having both mobile and desktop URLs in a sitemap really that detrimental to search indexing?
We had an enterprise client ask to remove mobile URLs from their sitemaps. For their website, both desktop and mobile URLs are combined into one sitemap. Their website has a mobile template (not a responsive website) and is configured properly via Google's "separate URLs" guidelines. Our client is referencing a statement from John Mueller that having both mobile and desktop URLs in sitemaps can be problematic for indexing. Here is the article: https://www.seroundtable.com/google-mobile-sitemaps-20137.html
We would be happy to remove the mobile URLs from their sitemap. However, this will unfortunately take several billing hours for our development team to implement and QA, which will end up costing our client a great deal of money. Is it worth removing the mobile URLs from their sitemap to be in adherence with John Mueller's advice? We don't believe these extra mobile URLs are harming their search indexing, but we can't find any sources to explain otherwise. Any advice would be appreciated. Thx.
Intermediate & Advanced SEO | RosemaryB
-
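For the sitemap question above, generating a desktop-only sitemap may be cheaper than it sounds if the mobile URLs live on a separate host. A minimal sketch (assuming a hypothetical m.example.com mobile subdomain; the sample sitemap is illustrative) filters them out of an existing sitemap:

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlsplit

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def desktop_only(sitemap_xml, mobile_host="m.example.com"):
    """Return the <loc> values whose host is not the mobile subdomain."""
    root = ET.fromstring(sitemap_xml)
    locs = [el.text.strip() for el in root.findall("sm:url/sm:loc", NS)]
    return [u for u in locs if urlsplit(u).hostname != mobile_host]

sitemap = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/a</loc></url>
  <url><loc>https://m.example.com/a</loc></url>
</urlset>"""
print(desktop_only(sitemap))  # ['https://www.example.com/a']
```

A post-processing step like this on the generated sitemap avoids touching the sitemap-generation code itself, which may be where most of the billing hours would go.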
SEO is changing - how has your day to day changed?
I'm sure we've all read, in our alternatives to Google Reader, that SEO is changing: "here's what we must do to be relevant in 2014". I find these articles boring and uninformative, and I suspect I'm not alone. The reason I'm not their biggest fan is that I feel like I've invested 10 minutes in an article that gives me no actual guidance. Therefore, I thought I'd ask the real SEOs, you guys: what has actually changed for you? Are you now not creating content with the aim of getting links? If you run a commercial website, what are you doing differently to rank your product pages, directly or indirectly? Please share with the group. I'm sure many like me are still brainstorming and creating content they think will grab people's attention and gain them links, whilst also pushing their Facebook, Twitter, YouTube profiles, etc. What has changed about this?
Intermediate & Advanced SEO | purpleindigo
-
Better UX or more Dedicated Pages (and page views)?
Hi, I'm building a new e-commerce site and I'm conflicted about what to do on my category pages. Take, for example, a computer store. I have a category of laptops, and inside there are filters by brand (Samsung, HP, etc.). I have two options: either have the brand choice open a new dedicated page, i.e. Samsung-Laptops.aspx, or simply do a jQuery filter, which gives a better and faster user experience (immediate, animated, and with no refresh). Which should I use? (Or does it depend on the keyword it might target?) Samsung laptops / Dell laptops / HP laptops are great keywords on their own! By the way, splitting Laptops.aspx into many physical sub-category pages might also help by providing the site with many actual pages dealing with laptops altogether.
Intermediate & Advanced SEO | BeytzNet
-
I have removed over 2000+ pages but Google still says i have 3000+ pages indexed
Good afternoon, I run an office equipment website called top4office.co.uk. My predecessor decided to make an exact copy of the content on our existing site top4office.com and place it on the top4office.co.uk domain, which included over 2k thin pages. Since coming in, I have hired a copywriter who has rewritten all the important content, and I have removed over 2k thin pages. I set up 301s, blocked the thin pages using robots.txt, and then used Google's removal tool to remove the pages from the index, which was done successfully. But although they were removed and can no longer be found in Google, when I use site:top4office.co.uk I still have over 3k indexed pages (originally I had 3,700). Does anyone have any ideas why this is happening and, more importantly, how I can fix it? Our ranking on this site is woeful in comparison to what it was in 2011. I have a deadline and was wondering how quickly, in your opinion, these changes will impact my SERP rankings. Look forward to your responses!
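One known interaction worth checking in the setup described above: if robots.txt blocks the thin pages, Googlebot can no longer crawl them, so it never sees the 301s, and the old URLs can linger in the index. A small sketch (with a hypothetical robots.txt and made-up URLs) shows how to test what the file actually blocks:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt resembling the setup described above.
robots_txt = """User-agent: *
Disallow: /thin-pages/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# If the old URLs are disallowed, Googlebot never re-crawls them, so it
# never sees the 301s -- which can leave them lingering in site: counts.
print(rp.can_fetch("Googlebot", "https://top4office.co.uk/thin-pages/old-desk"))  # False
print(rp.can_fetch("Googlebot", "https://top4office.co.uk/products/new-desk"))    # True
```

If the redirected URLs turn out to be disallowed, temporarily unblocking them lets the 301s be crawled and the old pages drop out naturally.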
Intermediate & Advanced SEO | apogeecorp
-
Are 17000+ Not Found (404) Pages OK?
Very soon, our website will undergo a rapid change which will result in us removing 95% or more of our old pages (right now, our site has around 18,000 pages indexed). It's changing into something different (B2B from B2C), and hence our site design, content, etc. will change. Even our blog section will have more than 90% of its content removed. What would the ideal scenario be?
1. Remove all pages and let those links be 404 pages
2. Remove all pages and 301 redirect them to the home page
3. Remove all unwanted pages and 301 redirect them to a separate page explaining the change (although it wouldn't be that relevant since our audience has completely changed). I doubt this would be ideal, since at some point we'd need to remove this page as well and again do another redirection.
Intermediate & Advanced SEO | jombay
-
Should you replace the url on a damaged page and 301 to it ?
Hi, we have a couple of pages which have been damaged due to an SEO person we hired creating a stupid amount of bookmarks and generally poor links. I've tried to get the links removed where I can, but most of these blogging sites have no "contact webmaster" option, so I am struggling. The Panda update has also affected traffic by about 35%. My question is: should I consider creating new URLs for the "damaged" pages and doing 301 redirects from each damaged page to its new page, then start building up good links to the new pages while Google de-indexes the old pages over a couple of months? I'm just at my wits' end over how to get rid of this blogging rubbish. Thanks, Sarah.
Intermediate & Advanced SEO | SarahCollins