Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Posts made by MikeRoberts
-
RE: Receiving 4XX status codes
Yes, I understand that much. But in the actual report from Moz, if there were any 4xx errors, it would list the exact URL that returned the error. If no URL was listed, then no URLs were found returning that error.
-
RE: Receiving 4XX status codes
Where did the original report say the error was coming from? Does the originating URL return a 4xx error if you go to it manually?
-
RE: Receiving 4XX status codes
Normally the reports will let you know what page the error was encountered on and/or originated from. If there's an error listed there then odds are either the crawl found a broken link within your site that returned a 404 Not Found or you have something that was blocked returning a 403 Forbidden. Have you run any other crawls of the site (such as with Screaming Frog) or checked Google Search Console to see if there are any crawl errors listed on the site?
-
RE: Include or exclude noindex urls in sitemap?
That opens up other potential restrictions to getting this done quickly and easily. I wouldn't consider it best practice to create what is essentially a spam page full of internal links, and Googlebot will likely not crawl all 4000 links if you put them all there. So now you'd be talking about making 20 or so thin, spammy-looking pages of 200+ internal links each to hopefully fix the issue.
The quick, easy-sounding options are often not the best ones. Considering you're doing all of this to fix issues that arose from an algorithmic penalty, I'd suggest following best practices for these changes. It might not be easy, but it lessens the chance that a quick fix becomes the cause of, or a contributor to, a future penalty.
So if Fetch As won't work for you (considering lack of manpower to manually fetch 4000 pages), the sitemap.xml option might be the better choice for you.
-
RE: Include or exclude noindex urls in sitemap?
You could technically add them to the sitemap.xml in the hope that this gets them noticed faster, but the sitemap is commonly used for the things you want Google to crawl and index. Plus, placing them in the sitemap does not guarantee Google will get around to crawling your change or those specific pages. Technically speaking, doing nothing and just waiting is equally valid: Google will recrawl your site at some point, and the sitemap.xml only helps if Google is crawling you to see it. Fetch As makes Google see your page as it is now, which is like forcing part of a crawl. So Fetch As will be the more reliable, quicker choice, though it is more labor-intensive. If you don't have the man-hours for a project like that at the moment, then waiting or using the sitemap could work for you. Google even suggests using Fetch As for URLs you want them to see that you have blocked with meta tags: https://support.google.com/webmasters/answer/93710?hl=en&ref_topic=4598466
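If you do go the sitemap route, a minimal sitemap.xml following the sitemaps.org protocol looks like this (the domain and paths here are placeholders, not from the original question):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want recrawled; example.com is hypothetical -->
  <url>
    <loc>https://www.example.com/page-with-new-noindex/</loc>
    <!-- lastmod hints that the page changed and is worth recrawling -->
    <lastmod>2016-05-01</lastmod>
  </url>
</urlset>
```

Submit it in Search Console under Crawl > Sitemaps so Google knows it exists.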
-
RE: Include or exclude noindex urls in sitemap?
If the pages are in the index and you've recently added a NoIndex tag with the express purpose of getting them removed from the index, you may be better served doing crawl requests in Search Console of the pages in question.
-
RE: What is the radius for local search results
For a local shop whose Google My Business listing indicates they serve customers at their location (as opposed to a delivery radius or areas served), Google will base the businesses shown in a search either on the location the user specifies or on what it knows about the area's businesses in relation to the query, plus any geolocation information for the user in question. So there isn't exactly a radius bubble you would need to fall into in those specific kinds of situations.
Now, for industries like landscaping, cleaning companies, food delivery, emergency auto repair, tow trucks, etc., businesses can set a radius they serve within or set specific areas. So a locksmith might set a handful of postal codes as the regions they will drive to in order to fix your locks, while a pizza delivery service might set a radius of 25km for their service area because they can't reliably deliver outside it.
All of these things can be set up in their Google My Business account.
I know from personal experience that Google will show me things easily 100 miles away from my location if there is nothing in between that fits my search.
-
RE: Do you need contact details (NAP) on every page of your website for local search ranking ?
If there's no footer, why not at the top of the page? Something along the lines of "Located at the intersection of Street and Road in the center of town," with a nice, obvious click-to-call?
-
RE: Do Search Engines Try To Follow Phone Number Links
While a bot might try to follow it (because it is, in its simplest form, a type of link), that will not adversely affect you in any way. The tel: in the tag tips crawlers off that it is a telephone number and/or should be click-to-call. So no link equity will be lost, you won't start seeing tons of 404 warnings, or anything of that sort.
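For reference, a click-to-call link is just an anchor whose href uses the tel: scheme, like this (the phone number is a made-up placeholder):

```html
<!-- The tel: scheme signals to browsers and crawlers that this is a
     phone number, not a page URL, so it won't be fetched like a link -->
<a href="tel:+15551234567">Call us: (555) 123-4567</a>
```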
-
RE: When is Too Many Categories Too Many on a eCommerce site?
Are the categories helpful for the customer? On one hand, you don't want to lump too many things into one category when they can be broken out into more granular categories that better serve visitors. On the other hand, it won't help you or your customers if you get too granular and break everything out into categories based on the most insignificant details.
While keyword cannibalization is a concern, serving your visitors/customers what they want and how they prefer to see it will likely improve metrics more on your site than concerning yourself with a nebulous concept like "how many categories is too many." If you have 200 different categories but they are well targeted and you want to add another (or ten more) that are also equally well targeted, then why wouldn't you do it?
-
RE: Weird 404 URL Problem - domain name being placed at end of urls
I had this problem in WordPress about a year ago. In my case it was caused by links entered into posts being turned into relative links instead of absolute links, which caused the domain name to be appended to the end of the URL. It turned out to be an incompatibility between plugins. Have you tested all your plugins to see if any of them are interfering and causing this issue?
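As an illustration of how that symptom arises (all URLs here are hypothetical): a link written without a protocol is treated by the browser as a relative path and resolved against the current page, which is exactly what puts the domain name at the end of the URL:

```html
<!-- Correct absolute link: resolves as intended -->
<a href="https://www.example.com/about/">About</a>

<!-- Missing protocol: treated as a relative path. Viewed from
     https://www.example.com/blog/post/ this resolves to
     https://www.example.com/blog/post/www.example.com/about/
     which returns a 404 -->
<a href="www.example.com/about/">About</a>
```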