Warnings, Notices, and Errors - don't know how to correct these
-
I have been watching my Notices, Warnings, and Errors increase since I added a blog to our WordPress site. Is this affecting our SEO? We now have the following:
2 4XX errors. One is for a page whose title and nav we changed in mid-March, and one is for a page we removed. The nav on the site is working as far as I can see. This seems like a cache issue, but who knows?
20 warnings for “missing meta description tag”. These are all blog archive and author pages. Some have resulted from pagination and are “Part 2, Part 3, Part 4” etc. Others are the first page for authors. And there is one called “new page” that I can’t locate in our Pages admin and have no idea what it is.
5 warnings for “title element too long”. These are also archive pages that include the blog name, so they are pages I can’t access through the admin to control the page title, plus “Part 2”s and so on.
71 Notices for “Rel Canonical”. The rel canonicals are all being generated automatically and are for pages of all sorts. Some are for content pages within the site, a bunch are blog posts, and the rest are archive pages for date, blog category, and pagination.
6 are 301’s. These are split between blog pagination, author pages, and a couple of site content pages - contact and portfolio. Can’t imagine why these are here.
8 meta-robots nofollow. These are blog articles but only some of the posts. Don’t know why we are generating this for some and not all. And half of them are for the exact same page so there are really only 4 originals on this list. The others are dupes.
8 Blocked by meta-robots. These are also for the same 4 blog posts, each duplicated twice.
We use All in One SEO. There is an option to use noindex for archives and categories that I do not have enabled, and also an option to autogenerate descriptions, which I also do not have enabled.
I wasn’t concerned about these at first, but I read the questions below yesterday, and I think I'd better do something as these are mounting up. I’m wondering if I should be asking our team for some code changes, but I'm not sure what exactly would be best.
http://www.seomoz.org/q/pages-i-dont-want-customers-to-see
http://www.robotstxt.org/meta.html
Our site is http://www.fateyes.com
Thanks so much for any assistance on this!
-
Thanks so much, Mike. Good to know I can let this go and I've done my due diligence with checking it all out.
I wish our WP would always create the 301's automatically when needed, but it doesn't seem to. I just installed the Redirection plugin today for a URL change I wanted to make.
-
You don't really need to worry or stress about the missing meta descriptions and long titles.
Meta descriptions do not impact your rankings and Google will automatically create a description for your page if it appears in the SERPs.
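As an aside, this is roughly what an SEO plugin's "autogenerate descriptions" option (the one mentioned above in All in One SEO) does when enabled - strip the markup from the post, collapse whitespace, and cut at a word boundary near the 155-character mark. This is a hedged sketch for illustration, not the plugin's actual code:

```python
import re

def auto_description(content, limit=155):
    """Build a meta description from post content, roughly the way an
    SEO plugin's autogenerate option works: strip tags, collapse
    whitespace, cut at the last word boundary before the limit."""
    text = re.sub(r"<[^>]+>", " ", content)   # drop HTML tags
    text = re.sub(r"\s+", " ", text).strip()  # collapse whitespace
    if len(text) <= limit:
        return text
    cut = text.rfind(" ", 0, limit)           # last word boundary
    return text[:cut].rstrip() + "..."
```

Even without this, Google will synthesize a snippet from the page content, so a missing tag on an archive page is cosmetic, not fatal.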
Title tags that are too long do not impact your rankings... at least not directly. If your title tag is over by 10 or even 20 characters, it will not impact whether your page ranks or not. The 70 characters is a suggestion, as that was the number of characters that would display in the SERPs; however, now it is based on pixel width. The only other important thing you need to know about titles is to put your most important keywords towards the beginning of the title.
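If you do want to audit titles in bulk, a character-count check is a reasonable first pass - keeping in mind, as above, that the real cutoff is pixel width, so this is a heuristic and not a hard rule. A minimal sketch:

```python
def title_check(title, max_chars=70):
    """Flag titles likely to be truncated in the SERPs.
    70 characters is only a heuristic; Google actually truncates
    by pixel width, so treat the result as a warning, not a rule."""
    over = len(title) - max_chars
    return {
        "length": len(title),
        "over_by": max(over, 0),
        "likely_truncated": over > 0,
    }
```

Running this over an export of your titles would tell you which archive pages are "over" and by how much, so you can decide whether any are worth chasing.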
If you are unsure about how, or are unable, to edit these pages to add or edit the description and title, it isn't going to make or break your site from a ranking standpoint.
Some CMS will automatically generate 301s if you edit a URL's structure. It does this so that any old links pointing to the old URL will be brought to the edited URL. The CMS will not fix broken links that point to the old URL, but on the server side, if someone clicks on an old, broken link, they will be brought to the edited URL page - if that makes sense.
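Mechanically, that server-side behavior is just a lookup from old path to new path that answers with a 301 status and a Location header instead of the page. A minimal sketch (the redirect table and paths here are made up for illustration - they are not your site's actual redirects):

```python
# Sketch of server-side 301 handling, the way a CMS or a redirect
# plugin resolves an old URL to its new home. Hypothetical table.
REDIRECTS = {
    "/old-page/": "/new-page/",
    "/gina-fiedel/": "/about/gina-fiedel/",
}

def resolve(path):
    """Return the (status, location) the server would respond with."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]  # permanent redirect to new URL
    return 200, path                 # serve the page as requested
```

So a visitor (or crawler) hitting the old URL is handed the new one transparently; the broken link itself still exists in the wild, which is why crawl tools keep reporting it.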
I understand that you want to attack warnings and notices and get things perfect; however, sometimes it just isn't possible. Whether it is a CMS issue or knowing how to fix something complex - what does matter is that you investigate each warning and notice and make sure that it is not negatively impacting your site. From the sounds of it, the handful of warnings and notices you have are just fine.
Hope this helps answer your question.
Mike
-
I'm really sorry to be confusing! It's hard to find the precise language for stuff when you don't really understand it well enough. ;o) I really appreciate that you have stuck with this and are trying to understand my concerns.
Pasted here from my last comment: "I was saying that the metarobots/nofollows were for blog posts, but in looking again, I am realizing these are blog post Comments and Replies, so I understand why WP would automatically put the noindex/nofollow on those. I typo-ed and put "robots" instead of "index". Sorry!"
So, in other words, I found that the noindex/nofollows that SEOMoz is reporting are for the blog comments which means all is well on those. I don't want Google to index comments and my replies to comments.
I'm going to see if I can ask my other question more clearly:
What I am still trying to determine is how to cut down on the number of notices and warnings by fixing or changing the conditions that are causing them.
I do not know what to do programming-wise to either create meta descriptions since they are "missing" and fix title tags that are too long for the archive and author type pages that are generating those notices and warnings. I don't know whether to use noindex, nofollow or block robots so that they won't matter.
I also don't know how/where the 301s were generated as we did not implement those manually or knowingly.
I hope this is better said and more understandable. Crossed fingers as I push "Post Reply".
-
I don't completely understand where you are saying the noindex/nofollow is located. If both are in the head, it applies to the whole page; however, "nofollow" can be used specifically for links (in most cases blog comments).
The easiest thing you can do is ask yourself, "Do I want this page to be indexed by Google?" If no, then you want to use the noindex directive; however, if you want the page indexed, you will want to make sure you are not using the noindex directive.
As far as nofollows are concerned, those can/should be used for blog comments. Nofollow can be used in other instances, but it generally isn't a tag that you throw around much.
This Matt Cutts article talks about how the nofollow directive works in relation to link juice... it is worth a read.
Hope this answers your question Gina.
Mike
-
Thanks much, John! And Mike!
404s:
These are now fixed. Thanks, Mike, for finding them. I tried to subscribe to Screaming Frog awhile back and had a roadblock due to my system. (older MacBook Pro and I can't update the OS any further)
Blog Archives:
I have wanted to use archive pages as alternate ways a user can find posts. I tend to like those on other blogs. But thank you for the article link; I look forward to reading it. I am happy to hear the duplicated descriptions on archive pages are OK. I'm guessing you mean the post excerpt with the thumbnails? But I don't quite understand why SEOMoz is telling me that I am missing descriptions then, AND I don't know how to access archive pages to insert meta descriptions onto them. Or author pages, for that matter.
301's:
We did not implement 301s and I don't have a clue as to why they are there, except that I changed the name of the Gina Fiedel page. So I guess WP automatically created a 301?? That seems odd. And for the others, I have no idea. They are author pages generated from the User page in the admin, and one is our website contact page with an inquiry form.
Noindex/nofollow: "These are blog articles but only some of the posts. Don’t know why we are generating this for some and not all. And half of them are for the exact same page so there are really only 4 originals on this list. The others are dupes."
What the heck did I mean by that? Just kidding- I figured it out. I was saying that the metarobots/nofollows were for blog posts, but in looking again, I am realizing these are blog post Comments and Replies, so I understand why WP would automatically put the noindex/nofollow on those. I typo-ed and put "robots" instead of "index". Sorry!
Mike- I am still wondering which tag(s) is/are recommended for the notices and warnings. I'm not sure what to request from our programming team on this.
Again! Thank you both for all the time you've spent on this. So grateful.
-
Screaming Frog - I usually wait for SEOmoz or Webmaster Tools to identify issues, then use Screaming Frog to verify that I have fixed them. It is a great tool and FREE if your site is under 500 pages.
Here are the SEOmoz definitions of the other warnings you are talking about:
"Meta Robots Nofollow - When the meta robots tag for a page includes 'nofollow', no link juice is passed on through the links on that page.
Blocked by Meta Robots - This page is being kept out of the search engine indexes by meta-robots."
I am guessing someone put something like the following in the head of those blog posts: <meta name="robots" content="noindex, nofollow">
It is just telling Google not to index the page and not to pass PageRank or anchor text for any links on that page.
Typically the "nofollow" is used in blog comments, so commenters cannot provide links back to their personal websites.
"noindex" shouldn't have any affect on rankings. It is just telling Google that certain pages are not worth putting in their index (copyright, terms of use, etc.).
"nofollow" links if not implemented correctly can look kind of spammy to Google, but in most cases you should be fine.
Does that help?
Mike
-
Hi Gina -
Great questions here. Some of these you should worry about, others are just notices and not necessarily an issue.
Fix the 4XX errors if those pages have links, or have a 404 page that redirects users. 404s are not always bad, but if the user isn't supposed to end up there (i.e., your product page has expired), then redirect.
Don't worry about the duplicated meta descriptions on archive pages, but do think about if these pages are needed. Ayima had a good post on pagination recently - http://www.ayima.com/seo-knowledge/conquering-pagination-guide.html
Same as above with the title tags on paginated archives.
Rel-canonicals are fine. Once again, just notices that they are there.
Did you implement those 301s? Moz notifies you of them because they might pass less link equity than straight links, but 301s are not bad.
What do you mean by "These are blog articles but only some of the posts. Don’t know why we are generating this for some and not all. And half of them are for the exact same page so there are really only 4 originals on this list. The others are dupes." It seems that this may have been implemented manually on your side, though I don't know how All In One SEO Pack handles it (I use Yoast).
-
Thanks, Mike.
I agree about 404s! Thank you for locating those. Interestingly, the 404s that SEOMoz is picking up are the ones I was guessing are cached, because those were fixed within minutes of being created. What I didn't realize is that there were additional internal links to these pages within blog posts. How'd you find those?
I would like to fix things so the warnings and notices stop generating. Can you please explain norobots vs. noindex and how I should set those?
Since there are 8 norobots, how will these affect rankings?
thanks again!
-
Hi Gina,
You should try to fix any errors. Errors can impact your users' experience, as well as interfere with web crawlers and even impact your rankings.
404 errors:
- /balancing-seo-with-your-website-design/ links to /on-target-web-design-santa-barbara/ using anchor text "It is best to get a custom design
- /5-steps-to-increase-traffic-to-your-website/ links to /we-create-websites-that-bring-you-more-business/ using anchor text "increasing traffic to your website,"
Warnings are more or less a "if you have time and can, you could fix these". They really do not impact your rankings, but if you are trying to be perfect, you could fix them.
Notices are just a "heads-up". They do not impact rankings, UNLESS you are blocking robots ; )
Long story short, fix Errors, work on Warnings when you have time, verify you already knew about the Notices.
Hope this helps.
Mike