Some wrong links were entered on our site by mistake, and Google crawled them. We have since fixed those links, but they still show up in the Not Found errors report. Should we just mark them as fixed, or what is the best way to deal with them?
-
Some parameter was not sent, so the links were generated as:
null/city and null/country instead of cityname/city
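The "null/city" pattern typically comes from interpolating a missing value straight into a URL template. A minimal sketch of a defensive link builder, in Python (the function name and route shape are hypothetical, based on the pattern in the question):

```python
def build_city_url(city_name):
    """Build a '/<cityname>/city' link, or return None if the name is missing.

    Hypothetical helper: returning None lets the template skip the link
    entirely instead of emitting a broken 'null/city' URL.
    """
    if not city_name:
        return None
    return "/{}/city".format(city_name)

print(build_city_url("paris"))  # /paris/city
print(build_city_url(None))     # None (no broken link emitted)
```

Guarding at link-generation time prevents the broken URLs from ever reaching the page, so Google never crawls them in the first place.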
-
If you look at those links in Google Search Console (Crawl Errors), you'll see that each error has a date. If some show up with an older date (older than yesterday, for example), you can mark them as fixed. If those errors still exist, they'll simply show up again.
-
It's not a big issue. Just mark them as fixed: if they return 404 and no links point to them, they will disappear from the reporting (every site has 404 errors). Even if you don't do anything, it's not a problem.
Dirk
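Before marking errors as fixed, it can help to confirm that the broken URLs really return a 404 now. A minimal sketch in Python (the example URL in the comment is a placeholder following the broken pattern from the question):

```python
import urllib.request
import urllib.error

def status_of(url):
    """Fetch a URL and return its HTTP status code (4xx/5xx raise HTTPError)."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

def will_drop_from_report(status):
    """A hard 404 (or 410), with no links pointing at the URL, eventually
    drops out of the crawl errors report on its own."""
    return status in (404, 410)

# Example check (placeholder URL):
#   status_of("https://example.com/null/city")  -> 404 if the fix worked
print(will_drop_from_report(404))  # True
print(will_drop_from_report(200))  # False
```

If a supposedly removed URL still returns 200 or a soft redirect, marking it as fixed will only make it reappear in the report after the next crawl.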
Related Questions
-
How to deal with pages not present anymore on the site
Hi, we need to cut some destinations from the catalog for our tour operator, so basically we need to deal with destination pages and tour pages that are no longer present on the site. What do you think is the best approach to deal with these pages so as not to lose ranking? Do you think it is a good approach to 301-redirect these pages to the home page or to the general catalog page, or do you suggest another approach? Thanks for your help!
Technical SEO | Dreamrealemedia
-
Removed Subdomain Sites Still in Google Index
Hey guys, I've got kind of a strange situation going on and I can't seem to find it addressed anywhere. I have a site that at one point had several development sites set up at subdomains. Those sites have since launched on their own domains, but the subdomain sites are still showing up in the Google index. However, if you look at the cached version of pages on these non-existent subdomains, it lists the NEW url, not the dev one in the little blurb that says "This is Google's cached version of www.correcturl.com." Clearly Google recognizes that the content resides at the new location, so how come the old pages are still in the index? Attempting to visit one of them gives a "Server Not Found" error, so they are definitely gone. This is happening to a couple of sites, one that was launched over a year ago so it doesn't appear to be a "wait and see" solution. Any suggestions would be a huge help. Thanks!!
Technical SEO | SarahLK
-
Site's IP showing in WMT 'Links to My Site'
I have been going through, disavowing spam links in WMT, and one of my biggest referral sources is our own IP address. Site: Covers.com, IP: 208.68.0.72. We have recently fixed a number of 302 redirects, but the number of links actually seems to be increasing. Is this something I should ignore / disavow / fix using a redirect?
Technical SEO | evansluke
-
Moz showing 404 error on one of my sites
I have a problem. Everything seems to be OK, but Moz shows an HTTP code of 404 for http://www.centralevapeurguide.com and I don't really know why. All my other websites return 200, but this one returns 404. And obviously, only this website won't rank in Google. Thanks for your help. Sebastian
Technical SEO | sebagorka
-
Correct linking to the /index of a site and subfolders: what's the best practice? Link to domain.com/ or domain.com/index.html?
Dear all, starting with my .htaccess file:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^www.inlinear.com$ [NC]
RewriteRule ^(.*)$ http://inlinear.com/$1 [R=301,L]

RewriteCond %{THE_REQUEST} ^.*/index.html
RewriteRule ^(.*)index.html$ http://inlinear.com/$1 [R=301,L]

1. I redirect all URL requests with www. to the non-www version.
2. All requests for "index.html" are redirected to "domain.com/".

My questions are:
A) When linking from a page to my front page (home), the best practice is to link to "http://domain.com/" and NOT "http://domain.com/index.php", right?
B) When linking to the index of a subfolder, "http://domain.com/products/index.php", I should also link to "http://domain.com/products/" and not include the index.php, right?
C) When I define the canonical URL, should I also define it as just "http://domain.com/products/", or should I in this case link to the actual file, "http://domain.com/products/index.php"?

Are A) and B) best practice? And C)? Thanks for all replies! 🙂
Holger
Technical SEO | inlinear
-
Google shows 24K links between 2 sites that are not linked
Good morning, does anyone have any idea why Google WMT shows me 24,101 backlinks from one of my sites ( http://goo.gl/Jb4ng ) pointing to my other site ( http://goo.gl/JgK1e )? These sites have zero links between them, as far as I can see/tell. Can someone please help me figure out why Google is showing 24K backlinks? Thanks
Technical SEO | Prime85
-
Open Site Explorer - Showing No links
Hello, I have run a Link Analysis report in Open Site Explorer for my client qtmoving.com:
http://www.opensiteexplorer.org/comparisons?site=www.qtmoving.com
These competitors are the top 3 that consistently appear in the Local Search top 7 or the Local Search blended SERP. I don't understand why I have no internal followed links, no internal links, and only 2 external followed and external links. This doesn't make sense to me, because I know there are internal links and that there are some sites that link back to us. I'm not sure how this can happen when we have the following sites that link to us:
http://www.bbb.org/manitoba/business-reviews/moving-storage-companies/quick-transfer-ltd-in-winnipeg-mb-14125
http://www.yelp.ca/biz/quick-transfer-ltd-winnipeg
http://www.ourbis.com/617657-quick-transfer-ltd-winnipeg
Your help is greatly appreciated. Thank you, Lyn
Technical SEO | CohesiveMarketing
-
Best blocking solution for Google
Posting this for Dave Sottimano. Here's the scenario: you've got a set of URLs indexed by Google, and you want them out quickly. Once you've managed to remove them, you want to block Googlebot from crawling them again, for whatever reason. Below is a sample of the URLs you want blocked, but you only want to block /beerbottles/ and anything past it:
www.example.com/beers/brandofbeer/beerbottles/1
www.example.com/beers/brandofbeer/beerbottles/2
www.example.com/beers/brandofbeer/beerbottles/3
etc.
To remove the pages from the index, should you:
1. Add the meta noindex,follow tag to each URL you want de-indexed
2. Use GWT to help remove the pages
3. Wait for Google to crawl again
If that's successful, to block Googlebot from crawling again, should you add this line to robots.txt:
DISALLOW */beerbottles/
or add this line:
DISALLOW: /beerbottles/
"To add the * or not to add the *, that is the question." Thanks! Dave
Technical SEO | goodnewscowboy
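For reference, Google's robots.txt documentation confirms that directives take the form "Disallow: /path" (with a colon) and that Googlebot supports * as a wildcard inside path patterns. A sketch of the wildcard variant, assuming the goal is to block any URL whose path contains a /beerbottles/ segment:

```text
# Assumed pattern: * matches any intermediate path segments,
# so this covers /beers/brandofbeer/beerbottles/1 and the like.
User-agent: *
Disallow: /*/beerbottles/
```

Note that robots.txt blocking alone does not de-index pages that are already in the index; it only stops future crawling, which is why removal (noindex or the removal tool) comes first in the scenario above.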