404 errors in Search Console
-
Hi all,
We have a number of URLs with 404 HTTP status listed in Search Console, and even after being fixed, the count does not decrease. Here is what happened:
- We launched a website with URLs like www.meusite.com/url-abc.
- We submitted these URLs in a sitemap.
- Google indexed them.
For some reason, the URLs were changed four days later by a developer on my team. So:
- I requested redirects from the old, already-indexed URLs to the new ones (from /url-abc to /url-xyz), each one mapped to its corresponding new URL.
- I submitted the sitemap with the new URLs.
- We fixed the internal links.
- And then marked the errors as fixed in Search Console.
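For reference, the one-to-one redirection of old URLs to new ones described above is usually expressed as permanent (301) redirects at the web server. A minimal sketch in Apache .htaccess form, using the example paths from the question (the real mapping would come from your own list of renamed URLs):

```apache
# Map each old, already-indexed URL to its new location with a
# 301 (permanent) redirect, one rule per renamed URL.
Redirect 301 /url-abc /url-xyz
```

A 301 (rather than a 302) tells Google the move is permanent, which is what lets the old URL eventually drop out of the error report.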
But it does not work!
Has anyone had a similar experience?
Thanks for any advice!
-
Thanks for the answer, Bernadette Coleman.
Yes, it could be a Google issue. But the biggest problem is that the Google AdWords team is trying to run ads with some of these URLs, and some of them are being blocked by Google AdWords.
Note: the disapproved URLs are not in the Search Console sample and do not have a 404 HTTP status or redirects. They are the final URLs, with a 200 HTTP status.
- Google Support answer - Disapproved ads
It seems that the issue is with the URLs. When our system crawled them, they resulted in 404 violations.
Hence, I request you to check Google Webmaster Tools to identify the issue. Also, please speak with your development team to identify it.
Once the issue has been fixed, I kindly ask you to re-submit the ads in the account, which will cause our system to re-review them.

About the 404 errors in the console: I had already taken the 404 URL sample from the console and crawled it with Screaming Frog:
- 246 URLs with status 404
- 303 URLs with status 200
- 459 URLs with status 301

70% of the URLs were corrected and marked as fixed in the console, but Google insists on putting them back on the list.
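A small script can keep a running tally like the one above from a crawl export. This is a sketch with a hypothetical helper, assuming a list of (url, status) pairs such as the ones you can parse out of a Screaming Frog CSV export:

```python
from collections import Counter

def tally_statuses(rows):
    """Count crawl results by HTTP status code.

    `rows` is an iterable of (url, status) pairs, e.g. parsed from a
    Screaming Frog CSV export.
    """
    return Counter(status for _url, status in rows)

# Example with made-up data mirroring the counts in the question:
rows = [("https://www.example.com/a", 404)] * 246 \
     + [("https://www.example.com/b", 200)] * 303 \
     + [("https://www.example.com/c", 301)] * 459
counts = tally_statuses(rows)
print(counts[404], counts[200], counts[301])  # 246 303 459
```

Re-running this on fresh crawls over a few weeks makes it easy to see whether the 404 bucket is actually shrinking, independent of what Search Console reports.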
-
This could be a Google issue. It takes some time for Google to "forget" about URLs they know about, so they may continue to crawl old URLs.
If you have redirected these URLs and they are not showing a 404 error, then you shouldn't have anything to worry about. I would still mark them as fixed in Google Search Console and then see if they come back again. I would also test those URLs randomly using the Googlebot user agent.
One thing you can do, however, is to crawl those URLs yourself using Screaming Frog or another similar spider tool. Set the user agent to Googlebot so that you're seeing what Google might potentially see. When you crawl, you should see the redirects. If not, then you will need to look into why you're seeing a 404 error.
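As a lightweight alternative to a full crawler, a sketch like the following (with a hypothetical URL) fetches a single URL with the Googlebot user agent and reports the raw status without following redirects, so a 301 shows up as a 301 rather than as its destination's status:

```python
import urllib.request
import urllib.error

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so we see the 301/302 itself."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def check_url(url):
    """Return (status, redirect_location) for `url` as first fetched."""
    opener = urllib.request.build_opener(NoRedirect)
    req = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    try:
        resp = opener.open(req)
        return resp.status, None
    except urllib.error.HTTPError as err:
        # With redirects suppressed, 3xx (and 4xx/5xx) responses land here.
        return err.code, err.headers.get("Location")

# Usage (hypothetical URL):
# status, location = check_url("https://www.example.com/url-abc")
# print(status, location)
```

If a URL that should be redirected comes back 404 here under the Googlebot user agent, that points to a server-side problem (e.g. user-agent-dependent behavior) rather than stale Search Console data.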