Keri's example of Moz's "top secret project" is a good one.
If you Google "top secret project", they appear at the bottom of the second page of the SERPs.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
Hi Gina,
You should fix any errors first. Errors, such as 404s, can hurt your users' experience, interfere with web crawlers, and even impact your rankings.
Warnings are more or less a "if you have the time, you could fix these." They really do not impact your rankings, but if you are trying to be perfect, you can work through them.
Notices are just a "heads-up". They do not impact rankings, UNLESS you are blocking robots ; )
Long story short, fix Errors, work on Warnings when you have time, verify you already knew about the Notices.
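To show what I mean by a notice that actually matters: a robots.txt that blocks crawlers more broadly than intended is the classic one to double-check. This is just a made-up illustration, not your file:

```
# Hypothetical robots.txt - this blocks EVERY crawler from the
# entire site, which is the one kind of "notice" that really
# does hurt rankings:
User-agent: *
Disallow: /

# A safer version would only block a specific folder, e.g.:
# User-agent: *
# Disallow: /private/
```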
Hope this helps.
Mike
The only thing that can happen is that Google indexes an http and an https version of the same page. It isn't a HUGE deal... it just depends on how obsessed you are about your site structure.
You could also potentially have visitors start linking to the https version instead of the http version, in which case that would be a problem.
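One common way to keep Google from treating the http and https copies as separate pages (just a suggestion on my part, not something you have to do) is a rel=canonical tag pointing at whichever version you want indexed. The domain here is a placeholder:

```html
<!-- In the <head> of BOTH the http and https versions of the page,
     point search engines at the single URL you want indexed
     (example.com is a placeholder domain): -->
<link rel="canonical" href="http://www.example.com/download/" />
```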
Depending on the complexity of your site and code, the rule below may be able to help you. Where it says pattern="download", that means that if someone visits the downloads section of your website, it will allow HTTPS. You can allow additional folders by repeating that condition and replacing "download" with whatever folder names you want to serve over HTTPS. I did a Google search of site:lucid8.com inurl:https and noticed that your download section was indexed as using HTTPS, which is why I used it in this example.
If you are just using simple folder structures, this rule is not too bad to implement. I previously just implemented it with pattern matching and that was not fun.
Anyway, if it helps, great; if not, just try to keep your internal linking as consistent as possible. Sometimes the best way to do this is to use absolute paths instead of relative ones.
Mike
<rule name="AllHTTPexceptSIGNIN" stopProcessing="true">
  <match url="(.*)" />
  <conditions>
    <add input="{HTTPS}" pattern="on" />
    <add input="{URL}" pattern="download" negate="true" />
  </conditions>
  <action type="Redirect" url="http://{HTTP_HOST}/{R:0}" redirectType="Permanent" />
</rule>
You don't need to really worry or stress about the missing meta descriptions and long titles.
Meta descriptions do not impact your rankings and Google will automatically create a description for your page if it appears in the SERPs.
Title tags that are too long do not impact your rankings... at least not directly. If your title tag is over by 10 or even 20 characters, it will not affect whether your page ranks or not. The 70-character limit is a suggestion, as that was roughly the number of characters that would display in the SERPs; however, truncation is now based on pixel width. The only other important thing to know about titles is to put your most important keywords toward the beginning.
If you are unsure about how to edit these pages, or are unable to add or edit the description and title, it isn't going to make or break your site from a ranking standpoint.
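Just to illustrate the character-count rule of thumb (keeping in mind that real SERP truncation is pixel-based, so this is only a rough proxy, and the page titles here are made up):

```python
# Rough title-length check. Google truncates titles by pixel width,
# so counting characters is only an approximation; 70 characters is
# the old rule-of-thumb cutoff mentioned above.
def long_titles(titles, max_chars=70):
    """Return the titles that exceed the character-count cutoff."""
    return [t for t in titles if len(t) > max_chars]

titles = [
    "Top Secret Project | Moz",
    "A very long page title that keeps going and going, well past the "
    "length that search results will usually display to users",
]

for title in long_titles(titles):
    print(f"Over 70 chars ({len(title)}): {title!r}")
```

A crawler like Screaming Frog already flags these for you; this just shows what the check amounts to.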
Some CMSs will automatically generate 301s if you edit a URL's structure. They do this so that any old links pointing to the old URL will bring visitors to the edited URL. The CMS will not fix broken links that point to the old URL, but on the server side, if someone clicks an old, broken link, they will be brought to the edited URL's page - if that makes sense.
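If your CMS does not generate those 301s for you, a single manual redirect in the same IIS URL Rewrite style as the rule earlier in this thread might look like this. The "old-page" and "new-page" paths are made-up placeholders:

```xml
<!-- Hypothetical example: permanently redirect one edited URL.
     "old-page" and "new-page" are placeholders, not real paths. -->
<rule name="OldPageRedirect" stopProcessing="true">
  <match url="^old-page/?$" />
  <action type="Redirect" url="/new-page/" redirectType="Permanent" />
</rule>
```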
I understand that you want to attack warnings and notices and get things perfect; however, sometimes it just isn't possible. Whether it is a CMS issue or knowing how to fix something complex - what does matter is that you investigate each warning and notice and make sure that it is not negatively impacting your site. From the sounds of it, the handful of warnings and notices you have are just fine.
Hope this helps answer your question.
Mike
If you go to your campaign overview page, you will see a little box below the mini overviews that will say something like, "Last Crawl Completed: Apr. 3rd, 2013 Next Crawl Starts: Apr. 10th, 2013"
I personally use SEOmoz PRO tools in combination with Screaming Frog. I verify many of the problems using Screaming Frog, then fix them, then rescan (which is instant), then wait for my SEOmoz PRO tools to reflect my changes. SEOmoz does a great job of warning you and keeping you in the know, whereas Screaming Frog gives you a lot of information but you really have to know what you are looking for and keep on top of it yourself. SEOmoz is more automated... if that makes sense.
Mike
As far as when to submit sitemaps, you do not "need" to even submit one. After your initial sitemap is submitted, as long as you have a strong interlinking structure in place, your new pages should be crawled. Submitting a sitemap when you add new pages is more or less a "heads-up" to Google saying, "I added new pages here... you should crawl them when you have time."
Now, if you remove pages from your site, you should also make sure that your sitemap reflects these changes. If you do not, you will start to get 404 errors in Google Webmaster Tools.
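For reference, a sitemap is just a list of <url> entries, so keeping it current simply means deleting the entry for any page you remove. The domain and paths below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
  </url>
  <url>
    <loc>http://www.example.com/new-page/</loc>
  </url>
  <!-- If /removed-page/ is deleted from the site, delete its <url>
       entry here too, or Google Webmaster Tools will start
       reporting 404 errors for it. -->
</urlset>
```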
I don't think you can overdo it; however, you would probably make less work for yourself if you only resubmitted when you are 100% sure you actually need to.
Does that make sense?
Mike