Hi Daniel,
Whenever I hear "Nothing is redirecting when I add this to the .htaccess file" I have one thought:
Is the site using Joomla, Drupal or some other CMS that will overwrite .htaccess rules externally?
Sha
Hey Ryan,
Just using Screaming Frog SEO Spider. It has proven very reliable and quickly identifies errors, server status for every page and much more.
The caveat is that I always check any Status code errors in the browser as there are quite often situations like this where the server is returning a Status error when the page renders fine in the browser.
You just have to be careful to ensure that if you want to scan the root domain you use the non-www URL as usual.
Hope it's useful,
Sha
Hi Mont,
I think maybe you are thinking of Clicktale or similar.
Here is a quite comprehensive blog post on analytics tools which includes a short review of Clicktale. There are a few tools around like it, but this one actually does have a free account which, while limited, does give you a little ongoing insight. (Look for the text link toward the bottom of the pricing page for the free account.)
Hope this helps,
Sha
Hi again waspman,
No worries...it takes a little getting your head around when you start out
As Chris confirmed, the root domain is waspkilluk.co.uk.
The most common subdomain is the www, which is standard, but you can also create other subdomains if you wish.
Perhaps the most visible real world example of a site that uses subdomains is Google - which uses subdomains to manage each of its specific products within the site. If you take a look at the URLs when you are using Google products you will see examples like places.google.com, maps.google.com, blogger.google.com, webmasters.google.com, www.google.com etc.
A subdomain is referred to as a "third level" domain, created within another domain. As a rule, the levels of a domain are separated by the dots (.) in a URL and are read in reverse order, from right to left.
So, in the Google Maps example:
the Top Level Domain (TLD) is .com
the second level domain is the chosen domain name google
the third level domain (or subdomain) is maps
In your case it is a little more complicated. You have an extra level in your domains because your Top Level Domain is the country designator .uk so your second level is actually .co, but for the purposes of understanding subdomains it won't hurt for you to think of .co.uk as your top level domain.
Generally, most small sites don't require extra subdomains, so often the only subdomain in play is the www.
Matt Cutts has a simple tutorial on the parts of a URL that makes it much easier to understand what is going on at a glance.
Hope that helps,
Sha
Hi Daniel,
The first thing you need to do is to find the definition of the container that “this” is built out of and then add an element called metaDescription.
This will create an empty string until you populate it elsewhere.
Now, where you want metaDescription shown, you just use
<?= $this->metaDescription ?> - the same pattern as the existing <title><?= $this->metaTag ?></title> line, with metaTag replaced by metaDescription.
Next, you need to search your code for the place where metaTag is assigned and add an assignment for metaDescription, which can be whatever you want it to be, e.g. a substring of the page text.
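To make that concrete, here is a minimal PHP sketch of the three pieces (the class, property and variable names are placeholders - your own code will differ):
// 1. In the container definition, add the new element as an empty string:
public $metaDescription = '';
// 2. Wherever metaTag is assigned, add an assignment for metaDescription,
//    e.g. a substring of the page text:
$this->metaTag = $pageTitle;
$this->metaDescription = substr(strip_tags($pageText), 0, 155);
// 3. In the template, output it just like metaTag, e.g.:
// <meta name="description" content="<?= $this->metaDescription ?>" />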
Hope that helps,
Sha
Wow! I'll take that as a compliment
Just don't tell Dr Pete!
Always easier to switch on to something when you see it produce a real result. Glad I could help.
Have a great night,
Sha
Hi Adam,
"Other" is defined by exclusion.
It includes any traffic from sources that do not fit the definitions in force for Referring Sites, Search Engines or Direct Traffic.
This thread on Stack Exchange provides the specific definitions for each of those.
Hope that helps,
Sha
Hi there,
You can do what you want to do as long as the keywords are loaded in your campaign. It's just a little hidden
If you go into the On-page tool you see the Summary view. If you look right above the big heading On-page Optimization, there are two blue text links. Choose Report Card.
Now you will see a selector which allows you to choose any keyword from your campaign list and enter the URL of any page in the site.
Click Grade my On-page Optimization and voila! Roger will fetch your Report Card!
Hope that helps.
Sha
Hi again,
There are a few really helpful posts you can take a look at that come from Duane Forrester. This one on the Bing Webmasters Blog is about developing great content.
Duane also gives some great Bing specific information in these Whiteboard Fridays from March, June and October.
Hope that helps,
Sha
Hi Vjay,
The sitemaps protocol was adopted as a standard by the three major search engines back in 2007.
This is not to say that other methods of submitting a sitemap don't work, but if you follow the sitemap protocol as described at sitemaps.org then you know that you are working within the standard recognized by Google, Bing & Yahoo.
Hope that helps,
Sha
Hi There,
Another nifty little SEOmoz Tool to play with
Historical Pagerank Checker reports the Google (toolbar) Pagerank of a URL and shows a historical view of previous PR scores. Remember that PR is page based, not site wide.
Hats off to Roger & the Moz team once again!
Hope that helps,
Sha
Hi,
You are not the only one experiencing issues at the moment.
Best to report your problem direct to support [at] seomoz.org with details of what you are seeing. The Rankings system is in Beta at present as far as I know, so you just need to let the Moz staff know ASAP.
Sha
Hi Michael,
You do not need to make any changes to your .htaccess file. Actually, if you 301 these URLs you will break your search so that it no longer works.
The solution I would use is to go into Google Webmaster Tools and tell Googlebot to ignore the parameters you are concerned about.
In your code, the ? says "here come some parameters" and the & separates those parameters. So, in the case you have quoted, the parameters are a, b, c, d.
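For illustration, in a hypothetical URL like www.example.com/results?a=1&b=2&c=3&d=4, everything after the ? is the query string, and a, b, c and d are four separate parameters, each with its own value.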
Be aware of course, that Roger will still see these URLs as duplicates since he doesn't know about your private conversations with Google. This means that they will still appear in your SEOmoz report, but as long as you make a note of them so you know they can be ignored, that shouldn't be a problem.
Hope that helps,
Sha
Hi Alexandre,
You will need to look at the code in the .htaccess file generated by All in One SEO to see whether the plugin is just using URL rewriting or creating 301 redirects. As far as I am aware, that particular plugin does not have an option to manually stipulate when you wish to create a 301, but the only sure way is to check the code.
You will need to go into your WordPress /blog directory and download the .htaccess file, then open it in a text editor (like Notepad). This is a separate .htaccess file, specifically relating to what happens within your WordPress installation. The one in the root folder for your site will not tell you what you want to know.
I don't use All In One SEO as I prefer the Yoast plugin, but typically, what you might expect to see is code that looks something like this:
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /blog/
RewriteRule ^index.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /blog/index.php [L]
</IfModule>
The key to identifying whether the plugin is using standard URL rewrites or creating a 301 Redirect is to look at the very last part of the Rule - the part that is enclosed in square brackets [ ]
A plain vanilla flavored standard URL Rewrite Rule will end with [L]
If the Rule is creating a 301 (Permanent) Redirect, it will end with [R=301,L]
and for a 302 (Temporary) Redirect, it will end with [R=302,L]
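For example, the same made-up Rule in its three variants:
# Plain rewrite - the URL in the browser does not change
RewriteRule ^old-page\.html$ /new-page.html [L]
# 301 (Permanent) Redirect
RewriteRule ^old-page\.html$ /new-page.html [R=301,L]
# 302 (Temporary) Redirect
RewriteRule ^old-page\.html$ /new-page.html [R=302,L]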
As for your question, which I understand to be asking essentially "what is the difference between the two":
A standard URL Rewrite is simply instructing the server that any request for a certain URL should be served a different URL. As far as the search engine is concerned, nothing changes. It is simply used to change the ugly URL to a pretty URL (in your case one that contains the keywords you want).
A 301 redirect serves the alternative URL, but also sends a signal to the search engine that the URL requested has been permanently replaced with the one that is served. This indicates to the search engine that the requested URL should be removed from the index and replaced with the URL that is served. A 301 redirect also signals to the search engine that most of the link value being passed to the requested URL should now be passed to the URL that replaces it in the index.
Hope that helps,
Sha
Hey William!
I should say thanks for leaving that open for me
I have so been hanging out for the opportunity to get back in and get a Q&A fix lately, but the support phone always takes precedence. It was so good when I made it back in to find some questions that I could contribute to.
Glad you love rmoov!
That started my day off with a big grin
Sha
Hi,
The Crawl Diagnostics report is divided into three different sections because the impact of the things highlighted in each is completely different.
Warnings are there because it is a known fact that "Pages with crawl warnings are often penalized by search engines." It is up to you whether you take notice of them; if you have specific conditions on those pages, such as the presence of a noindex meta tag, then obviously you may choose not to bother adding the missing meta descriptions.
That doesn't actually make the function spammy. It just means that the warning isn't relevant for your particular situation. Requiring the App to check pages with the noindex meta against pages that have missing meta descriptions and remove them from the reporting would add a whole other layer of complexity to the tool.
In short, the tool is meant to identify pages with missing meta descriptions. It does that very well. If you don't need to use the information then you are free to ignore it.
Sha
Hi searchpl,
If you are worried about "freshness" for ecommerce sites, there is one very important thing to do - eliminate wasted effort.
The fact is that what I call the "freshness effect" does not apply to every keyword term. Google appears to determine, term by term, whether fresh information is or is not more relevant. If you manually check SERPs you will see this easily.
So, eliminating wasted effort while working toward providing new and relevant content all comes back to good old fashioned research. The smart approach is to spend some time manually checking SERPs for your "money" keywords. If you see evidence of the "freshness effect" for particular terms, those are the ones you could consider focusing new content development efforts on.
The keyword terms that might be affected will entirely depend upon the types of products in your stores - for example, I know that "weight loss" is a term where the "freshness effect" seems evident in SERPs.
Of course, if you decide to develop new content you should follow the advice already given by EGOL and James on quality and method. I would say the smart thing to do in this situation is to come up with the type of content that is easy to add on a continuous basis - things like ongoing series, videos, podcasts, and cleverly managed user generated content.
Incidentally, if you listen carefully to information coming out of Bing via Duane Forrester, you may notice that Google is not the only Search engine that takes notice of freshness
Hope that helps,
Sha
Also: Don't stress too much if you are using automated feeds to update your product offerings on a daily basis ... you may already be providing fresh content if products are frequently added. The challenge then is to ensure that quality is up to scratch
Hi GaB,
Sorry, I missed your reference to the URL changes in the original question.
It will depend on exactly what the changes are as to how many Rules will be needed to create the 301 redirects.
If you are retaining file names, but moving whole directories to a new location, then this can be achieved with a single Rule that 301 Redirects every filename in the old directory (say, Useless Folder 1) to the identical filename in the new one (Relevant Folder 2).
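As a hypothetical example (folder names are placeholders), a single Rule of this shape would do it:
RewriteRule ^useless-folder-1/(.*)$ /relevant-folder-2/$1 [R=301,L]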
For URLs where the actual filenames will change, or where only some files are relocated you would need to implement individual Rules for each URL.
However, if your site is large, there is another alternative, provided that your URL structure supports it. That would be the use of Database Rewrites to implement the 301 redirects as URLs are requested from the server.
There are some requirements for this to be a viable option - most importantly, each URL needs to carry an identifier that can be matched against the database.
Basically, what happens with Database Rewrites is that when the server receives a request for a URL, the identifier is matched against those in the database; when a match is found, the 301 Redirect is written and the new URL is served.
For very large sites Database Rewrites would be the most suitable solution as very long lists of Rules in the .htaccess will eventually impact processing and load times.
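To give you an idea of the shape of it, here is a very rough PHP sketch of that lookup (the database credentials, table and column names are invented for illustration):
// Hypothetical lookup table "redirects" with columns old_path and new_path
$pdo = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');
$stmt = $pdo->prepare('SELECT new_path FROM redirects WHERE old_path = ?');
$stmt->execute(array($_SERVER['REQUEST_URI']));
if ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    // Match found - write the 301 Redirect and serve the new URL
    header('HTTP/1.1 301 Moved Permanently');
    header('Location: ' . $row['new_path']);
    exit;
}
// No match - serve the requested URL as normal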
Hope that helps,
Sha
Hi Ryan,
The "lost" post is one of a couple of issues I have been talking to the help team about during the past week. If you haven't already let them know that it happened, might be helpful if you could do that.
Seems there's a little gremlin in there somewhere
Sha
Hi Dennis,
SEOmoz will track rankings for all of the pages crawled within the root domain or subdomain that you have entered for your campaign, so the answer is NO.
If you are seeing different results when you check rankings manually in Google, keep in mind that the SEOmoz Ranking report provides non-personalized results. Also, the way in which embedded local results appear in the SERPs will affect the result: when they are normal results with an enhanced local listing they are counted, but if they appear in a 6, 8, or 12 pack they are not. While Google continues to experiment with local listings, the SEOmoz team continues to respond with updates.
Hope that helps,
Sha
PS - If you want to optimize different pages for your keywords using the On-page Analysis tool, you can click "Report Card" in the On-page tool. You will see a selector at the top where you can type in any URL and choose any keyword from your list.
Hi Ruslan,
If you have a Google Webmaster Tools account, you can use the "fetch as googlebot" feature to notify Google of up to 50 URLs per month. The way it works is that you run "fetch as googlebot" and if the page is fetched successfully, you are given the option to add it to the Google index.
Obviously if it is not fetched successfully then you know that there is a problem with the URL that needs fixing.
Bing also has a similar feature which allows you to manually add up to 10 URLs per day with a maximum of 50 per month.
Hope that helps,
Sha
Hi Salman,
If you haven't already seen it, there is a great Whiteboard Friday video from Rand about Pagerank and what you should and shouldn't use it for. In the video, Rand talks about the things that Ryan mentioned in his answer, the points you asked about and a few other things.
Here is the link: What is Google's Pagerank Good For?
Hope that helps,
Sha
Hi Shebin,
The best thing to do is email the Help team direct - help [at] seomoz.org
I had a campaign with a similar issue a while back and the Help team sorted it out for me very quickly.
Hope that helps,
Sha
Hi Nikos,
A simple 301 redirect should not cause your rankings to evaporate. The 301 Redirect is just signalling to search engines that they should de-index the old domain and replace the link in their index with the new one. It generally doesn't happen in less than a day either
It seems perhaps there is a problem with the way that the redirect has been implemented, or some other completely unrelated issue that has caused a coincidental loss of rankings.
Can you post a link to your site so that we can take a look at what is happening? If you would rather not post the link publicly, you can go to your profile and send me a private message. There are really too many possibilities to guess without taking a look at the site.
Sha
Hi strasshgoa,
Good advice from Calin - my guess would be that you don't have a redirect in place for that, or that you may have some other canonical issue, perhaps caused by having written the same URL differently in a link. An example of this would be using both www.mysite.com and www.mysite.com/index.html in your code. While both call the same page, they are different URLs and therefore seen by the crawler as duplicate pages.
The easiest way to identify the problem is to look at your Report: in the column to the right of any URL flagged with a duplication issue, the number of duplicate URLs appears as a blue link, and when you click the number you will see the full list of URLs.
There is also a help page for each of the tools in the Pro App which you can access by clicking the tiny blue "? Help" link to the right of the page towards the top (directly opposite the summary link on the left of the page). The help page for Crawl Diagnostics is here.
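Incidentally, if the index.html duplication does turn out to be the cause, a 301 in your .htaccess along these lines is the usual fix (a hypothetical example - adjust for your own filenames):
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.html
RewriteRule ^index\.html$ / [R=301,L]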
Hope that helps,
Sha
Hi again,
I would say that both approaches are valid, as long as you are very strategic about targeting your content.
My personal preference is to work first on building better content on-site (I am a member of the "why help someone else when you can help yourself" movement :), but if there is an opportunity to use the domain authority of another site to improve your own site, then that can obviously be of value too (especially if you are trying to build the strength of your own domain).
The key is to target the content you develop for each and NEVER use the same content for both.
When it comes to targeting, I follow two basic rules - articles on other sites should be aimed at an audience seeking "how to" assistance, while on-site content should be the type that will help to "engage" your audience with your company or brand.
For example:
An article on an external directory might be "How Early Should I Book My Wedding Limo", while a new page of on-site content might be "Why You Can Trust Us To Drive Your Daughter To Her Prom"
Hope that helps,
Sha
Hi,
Since this is really about the way that the tool works, the quickest and most accurate way of getting the correct answer would be to email help@seomoz.org.
That being said, avoiding special characters would be our company's preferred option. This thread from Google Webmaster Central would be worth a read.
Sha
Hi Talha,
Perfect answer from MyHolidayMarketing and a big thumbs up!
The benefit of this method of presentation is that it also shows your developer the practical implementation of a menu structure that will make it easier to expose search engine crawlers to all page levels on the site.
Basically, the sub-levels of your dot pointed list represent the sub-levels of your menu. That way, when any page is crawled, the basic menu structure will guide the crawler through all levels of the site.
Keeping the site structure within 3 levels is also a good idea.
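For example, a simple three-level outline might look like this (the page names are invented):
Home
- Services
- - Wedding Cars
- - Airport Transfers
- About Us
- Contact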
Hope that helps,
Sha
Hi again,
We set up an example page for you with working tests and links to example code and zipped version.
Hope that is what you need,
Sha
Glad to help
Just another thought on using video on-site...if it were possible to grab very short video reviews from actual clients while using your services, that could be gold in terms of converting visitors to your site.
Not entirely sure about the detail of how to make it happen, but I'm imagining a happy, smiling bride and groom shooting a 30 second video snippet while in the back of the limo being driven to the reception..."the limo is gorgeous, our day has been wonderful and Joe our driver even brought barley sugar candy to help settle my butterflies on the way to the church!" these are the type of videos that will engage your audience and tell the story of your business.
Best of luck with it,
Sha
Just to clarify what you are doing here, a couple of questions:
These things will make a difference as to how you can approach the issue.
Sha
Hi Robert,
There is a "Getting Started" video intro to the software which will give you a place to start on this page where you will also find information about the weekly Welcome Webinar with Moz staff.
I would recommend that you also sign up for the next Welcome Webinar so you can ask any questions you may have about the different tools in the SEOmoz toolkit and how you can use them.
There is a Welcome Webinar every Friday morning and you will find a button on that page to Reserve your Spot at the next one.
When using the Pro App, if you need a little clarification of how a particular tool works (for example if you are in the On-page Tool area) you will find a tiny blue "Help" link at the top right of the page that will take you to the help page for that tool.
If there is anything that hasn't been explained by the video or help pages, you can email direct to the Help team - help at seomoz.org. They are actually pretty awesome and always happy to help you out.
There are also a lot of great SEOmoz resources that will help you get your head around it all... The Beginner's Guide to SEO is one of the best. It is included in the list of resources in this thread from a while ago.
Hope that helps,
Sha
Hi Chukwuemeka,
First a quick note - it is not possible to write a script that does this using HTML alone, but by converting your normal HTML files to PHP, you can achieve your aim for a standard site without having to use a CMS like WordPress.
This post in the Youmoz Blog Adding "Related Post" Links without a Database should provide everything you need to make it work.
I hope this will help solve your problem ... if there is anything that you don't understand just let me know in the comments or come back to this thread.
I owe my boss a BIG beer gift this Christmas for writing the code that makes this work
Sha
Hi Ralzaider,
Some great recommendations & info here already, especially from Kane, but just wanted to add:
When I hear "online ecommerce sales" I start worrying about duplicated product descriptions fed from manufacturers.
So, if this is an issue for your site, I would put your person to work on writing interesting unique product descriptions for every item in your online catalog. Since you are lucky enough to be paying a tiny hourly rate for their services, I would even consider an investment in product to give to them as samples so that they can actually see, touch and feel the products, try them out etc before writing about them. If you have hundreds or thousands of products, perhaps just identify the most important ones to provide as samples.
If unique content isn't an issue for your site, then I would go with the other recommendations in this thread first.
Hope that helps,
Sha
Hi Mike,
@SEOmoz tweeted a few hours ago that the team had completed the latest Linkscape Update a week ahead of schedule.
Since there is generally a wait time before you can download your Advanced Report from the Keyword Difficulty Tool, it is quite possible that the numbers you obtained from there were Pre-Update, so may have changed in OSE.
There is currently an issue with the Keyword Difficulty Tool producing an error result, but the Help Team are aware of it and working to resolve the issue. If you want to keep up with progress on that you can follow @SEOmoz on Twitter and they will advise when it is resolved. Then you will be able to run your Advanced Report again.
You can run a 5 site comparison in Open Site Explorer using the Advanced Reports to get you through with most of the information if you need to.
If for some reason you don't believe the Update timing was a factor, then the best solution would be to email the Help Team help[at]seomoz.org.
Hope that helps,
Sha
Hi Guido,
I would agree with comments from eyepaq and Saijo with regard to redirects. Since there is a slight loss of link value with a 301 redirect, it is best to always send links to the target URL.
As to why the page is not ranking, I just wanted to add a reminder that there is more to that than just pointing links to the page. From that point of view there are a few things to consider with this page:
We need to be even more careful of search engine sensitivity to signs of over optimization when there is very little text on a page.
Hope that helps,
Sha
Hi kundrotas,
The problem you have is that the code used for .htaccess functions is not terribly "intelligent" - it does not allow for the use of "IF" statements etc. This being the case, the better option is to move the action into the actual code.
This is the Rule in question:
RewriteRule ^([^/]{1,3})/([^/]+)/([^/]+)/$ /index.php?lang=$1&idr=$2&par1=$3&%{QUERY_STRING} [L]
The easiest thing to do is to add a check for par1 in index.php:
// If par1 is 1 and everything else is blank, send it to the root.
if( $par1 == '1' && $par2 == '' && $par3 == '' && $par4 == '' && $par5 == '' && $par6 == '' ) {
    $location = "/$lang/$idr/";
    header('HTTP/1.1 301 Moved Permanently');
    header('Location: ' . $location);
    exit; // stop here so only the redirect is sent
}
Hope that helps,
Sha
...and neither did my first link removal client, but the penalty was revealed when I insisted that he needed to lodge a reconsideration request.
I have heard this story repeated over and over while talking to rmoov users over the past few months...I am quite sure there are way more people out there who are under a manual penalty than anyone realizes.
I have my own theory as to why this has happened, but that's probably for a blog post some time.
In a nutshell, I absolutely agree with Ryan's take on the subject except for one thing...hard earned experience does not in any way amount to bias.
Hope that helps,
Sha
Hi,
Yes, there is a known issue at the moment. The Moz Team are aware of the problem and working to resolve it. Updates will be provided via Twitter, so if you follow @SEOmoz you will know when it has been resolved.
Hope that helps,
Sha
URL rewrites effectively change nothing from the SE point of view - the URL stays the same in the browser & the server just loads the alternative page content.
By the same token, just using 301's for both stages as Stephen suggested is not going to create a problem for the domain either, UNLESS there is already a chain of 301 redirects behind the existing domain.
If your client is super concerned about adding more than one 301 redirect, they might know something that you don't. Matt Cutts talked about chaining 301's together in this webmaster video.
Hope that helps,
Sha
Sorry that my answer appears to have lost all line breaks...seems to be some CSS issues at the moment.
Hoping you can copy it out and separate the lines... I will try to reformat it as soon as I can, but right now it just keeps loading funky.
Sha
Hi James,
My immediate response would be DO NOT place any reliance on the presence or absence of an unnatural links warning from Google!
I have clients who never received an unnatural links warning of any kind and were actually suffering impact from a manual penalty. The penalty was not revealed until I decided that we needed to lodge a reconsideration request because something more than Penguin was going on there. Sure enough, the message that came back from the webspam team was that the site had a manual spam action in place.
Since our company developed the rmoov tool, I have been responsible for customer support and spent many hours talking to site owners who are trying to clean up their backlink profiles. During that time I have heard from many, many other site owners that have big problems with no unnatural links warning message received. A number of these have also found on lodging a reconsideration request, that they have a manual penalty applied to their site.
Basing your assessment of what is happening on whether or not you have received a message from Google is a big mistake, and unfortunately one that is being made by way too many people
Hope that helps,
Sha
Hi Miranda,
Yes, the initial "fast" crawl is limited to 250 pages. It gives you a quick start opportunity for working on your campaign while waiting for a full crawl to complete.
This is an improvement that was added a few months ago. Before that time you had to wait a week for the full crawl if your site was large.
Hope that helps,
Sha
Yes, sorry...I tend to refer to them as "simple" rewrites because a lot of people get confused by the difference
Sha
Hi again Daniel,
Notices are nothing to worry about - if you take a look at the message from Roger and the Mozzers above the blue tabs in the Notices section, you will see a little explanation:
"Notices are interesting facts about your pages we found while crawling."
They are just there to make you aware of things in the background that might not be obvious
You will find a short explanation on each of the other sections too.
Incidentally, lots of 302's is not such a good thing - which is why they are in the "Warnings" section instead of the "Notices".
Another little thing to be aware of is the tiny blue "Help" link toward the top right of each page in the Pro Tool. Whichever page you are on, this link will take you to the help documentation for that particular feature. Some good stuff in there and a great way to get to know the tools when you are starting out.
If you have questions about your campaigns, you can also email the Help Team direct - help at seomoz.org. It's worth it just to see the awesome thank-you video when you answer the feedback survey! Or register for the weekly Welcome Webinar with Moz staff.
Hope that helps,
Sha
Hi James,
Google's Disavow Tool is not a first line of defence, but the last.
Matt Cutts made it quite clear when releasing the tool that Google still expects webmasters to make a "good faith effort" to remove as many links as possible before using the Disavow Tool.
If you read Dr Pete's post using the link that Smart Lock Solutions provided in their post you can see this. If you have not seen the video from Matt Cutts about the Disavow Tool, you can watch it and read some views on the release of the tool here.
The only way to confirm whether a manual penalty has been applied is to lodge a reconsideration request. The webspam team will respond to a request, either advising that there is no manual action in place, or that a penalty exists because the site violates their search quality guidelines.
Hope that helps,
Sha
Hi Brian,
The key here is that it is a "warning" intended to alert you to the fact that the situation exists so that you have the ability to do something about it if you can.
While there are many sites (especially ecommerce sites) which are highlighted by the tool because they have extensive menus, it is also quite possible that some users have pages where the body text is full of plain text links which could be fixed.
As to whether they should be fixed, I would say definitely "yes" if they are the latter. Pages that have way too many links in the text are detrimental to user experience. They are harder to read, look spammy and the message tends to get lost if people are constantly clicking a link and going to another page.
If, on the other hand, you have a site where the large number of links is the result of the menus it is a fairly simple thing to ignore the warning.
I should mention that there is a feature request in the works which would enable users to "switch off" items in the Pro App that can be ignored. When this is implemented you would be able to remove them from the report so you don't have to keep looking at them.
Hope that helps,
Sha
...or use simple rewrites (without a redirect statement) until you are ready to move the site and then apply 301 redirects as normal. Such rewrites simply serve the alternative page content when a request is made for the target URL.
Sha
Hi David,
Having the 301's in place is a good thing rather than a problem. They have been created by Wordpress so that you do not have broken links on your site (because you have created links and then deleted them by changing the permalink).
There are 3 major reasons for using 301 (Permanent) Redirects: to catch incoming traffic arriving via existing links to pages that no longer exist, to signal to search engines that the old URL should be replaced in their index by the new one, and to pass most of the link value from the old URL to the new one.
The 301's have no influence on users as they cannot see them - they are written to the .htaccess file.
There is a good explanation of 301 Best Practice in the Learn SEO section here at SEOmoz.
When working through the things identified in the crawl test, the Errors (red) and Warnings (yellow) are the things to pay attention to first. If you check the single line explanation above the blue tabs you will see this message "Notices are interesting facts about your pages we found while crawling." So they are not really problems that need fixing.
A couple of other great resources if you are just starting out are The Beginner's Guide to SEO and the rest of the Learn SEO section.
Hope that helps,
Sha
http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world