Have you received an Unnatural Links warning in Webmaster Tools?
Posts made by MikeRoberts
-
RE: Too much backlinks [Penalized]
-
RE: Odd URL errors upon crawl
I have a crazy Russian coder who does all of that for me so I'm honestly not 100% sure how to find these errors more easily in order to correct them.
-
RE: Odd URL errors upon crawl
Just looking at the URL you posted, it looks like an open DIV tag or an incorrectly closed link in the source is causing an HTML encoding issue.
-
RE: Should I use canonical?
This sounds to me more like a NoIndex situation. Really you should be adding content to them, but I can understand if it's a lot of work and very tedious. A canonical wouldn't really make sense here... it's thin content and the canonical would likely be ignored by Google. So instead I'd say NoIndex the pages for now.
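A minimal sketch of the tag, placed in the &lt;head&gt; of each thin page (NoIndex to keep it out of the index, Follow so link equity still flows through it):

```html
<!-- Keeps the page out of the index while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```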
-
RE: Reminder: PRIVATE Q&A going away
Looking forward to the more active and robust presence on the Public Q&A.
-
RE: 301 Redirect Dilemma - Website redesign
Best practice would be to 301 all those pages to their most relevant new pages. Now, if you're not worried about the traffic going to some of those old pages, you could let them 404 and die off, but you'd miss out on any link equity pointing at them... and if there are no external links pointing to them, then you only have internal links to worry about (and those are likely all on old pages no one will see anymore anyway, thanks to the 301s or 404s).
As for those products with 5+ URLs on the old site... redirect them all to the relevant new page. It doesn't matter that 5 URLs redirect to one page as long as that page is relevant to all 5 old URLs. I wouldn't worry much about server load with that number of redirects either, and eventually any old listings in the SERPs will update to the newer URL (if relevant for the specific query) instead of constantly sending people to a page they know exists elsewhere.
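For an Apache site, the redirects could be sketched in .htaccess like this (all paths here are invented for illustration):

```apache
# Several old product URLs can all safely 301 to the one relevant new page
Redirect 301 /products/old-red-widget.html /widgets/red
Redirect 301 /products/old-red-widget-sale.html /widgets/red
Redirect 301 /products/old-red-widget-2012.html /widgets/red
```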
-
RE: Empty search results labeled as Soft 404s?
Does the resulting page notify users that there is no relevant information to return and suggest alternatives to search for, or a way to contact user support/customer service? If so, I wouldn't worry about a Soft 404. Considering these internal search pages should also be "NoIndex, Follow", the odds of them showing up in SERPs and/or causing you real problems are low.
-
RE: How different does content need to be to avoid a duplicate content penalty?
Yes, those landing pages sound like they will be viewed as duplicate content with only 10 or so words different... unless you only have 25 words on each page (which would then be incredibly thin content). I've heard people say that a page should be a minimum of 60% different to avoid duplicate errors (no idea how that number was determined, though). At that point it becomes simpler and easier in most cases to write completely new content for every page to avoid any issues.
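Just to illustrate the "percent different" idea (the 60% figure is hearsay, and this crude word-overlap measure is my own simplification, nothing like how Google actually detects duplication):

```python
def percent_different(text_a: str, text_b: str) -> float:
    """Crude measure: share of unique words NOT common to both texts."""
    words_a = set(text_a.lower().split())
    words_b = set(text_b.lower().split())
    union = words_a | words_b
    if not union:
        return 0.0
    return 1 - len(words_a & words_b) / len(union)

# Two landing pages differing by only one word score nowhere near 60% different
page_a = "affordable faux stone panels for your home in Albany"
page_b = "affordable faux stone panels for your home in Boston"
print(round(percent_different(page_a, page_b), 2))  # 0.2
```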
-
RE: Should I claim my site on Alexa.com?
Claiming a site on Alexa isn't a bad thing. I'm not too sure about paying for their certified metrics, though. But if you claim your site, there are a few small sections you can update to make it a bit more visitor-friendly instead of just a bland listing. I use it to keep a quick tab on global traffic rankings, but it's not something I find myself really using on a regular basis or digging deep into. Some others here may have a different opinion of Alexa. I'd assume sites in the 100k-and-better range would be more likely to want to use the certified metrics.
-
RE: Content, for the sake of the search engines
We should always be creating new, relevant content for our sites. Obviously don't overdo it and don't write for the search engines alone... but if you have pages lacking much content that you feel could better serve your users with some copy added, then by all means go ahead and write something up. Maybe look for underdeveloped pages that could be perfect for attracting a longtail term you haven't put much love into, or expand a niche page with something insightful/interesting where you may have taken the page for granted and/or assumed no one needed an explanation.
-
RE: Meta-Robots noFollow and Blocked by Meta-Robots
The meta robots tag set to NoIndex means that the page is blocked by meta robots. Not really an error to be worried about. Since WordPress creates duplicate content via the ?replytocom= parameter, you likely set the backend to noindex those pages.
So the actual page "http://www.fateyes.com/the-effect-of-social-media-on-the-serps-social-signals-seo/" is lacking a robots tag as far as I can see and will therefore technically be indexable, but the ?replytocom= version created by the comments is correctly noindexed.
-
RE: Canonicalization interact with 301 redirects?
Just remember that a canonical is a signal not a directive. Google and other search engines can choose whether or not to listen to your signal. So make sure those "duplicate" pages need to exist as they are currently. In some cases it may make more sense to either update the page with fresh, original, and relevant content or to have the page marked NoIndex depending on the situation.
-
RE: Canonicalization interact with 301 redirects?
I'm not 100% sure why it's throwing 404s because I've never had that exact problem when doing the same thing on any site I work on, but I agree with TextMarketing on updating the canonicals. If you originally had Pages 1, 2, and 3 canonicalized to Page A, and Page A has now been 301'd to Page B, you should update Pages 1, 2, and 3 to point their canonical tags at Page B instead of the redirected page.
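With hypothetical URLs, the fix on Pages 1-3 would look like:

```html
<!-- Before: canonical pointed at the now-redirected Page A -->
<!-- <link rel="canonical" href="http://www.example.com/page-a"> -->
<!-- After: point straight at Page A's redirect target -->
<link rel="canonical" href="http://www.example.com/page-b">
```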
-
RE: Page feedback
The only thing I think really needs adding would be image alt text. Those pictures look blank to me when browsing as Googlebot, so the real Googlebot probably has no idea what's there.
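A quick sketch of what that markup could look like with alt text added (filename and description invented for illustration):

```html
<!-- Alt text gives Googlebot something to read where it otherwise sees a blank image -->
<img src="/images/kitchen-remodel.jpg" alt="Remodeled kitchen with oak cabinets" width="300" height="200">
```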
-
RE: Canonicalization on more than one page?
You can canonicalize pretty much any page to any other page that you would like. If the page is a duplicate of the canon page or a subset of data/copy/info taken from the canon page's superset then Google will use the canonical as a signal to push link equity to the canon page like a 301 without the redirect part.
-
RE: Canonicalization Issue | E-commerce
Strange that you can't update the header to add canonicals. Personally, I'd then add some written content about the products to differentiate them. Otherwise that's a pretty bare-looking page except for all the links. Or, if you don't care about them ranking and don't want to invest in copy, NoIndex them and let people find them through site navigation instead of search. But I'd go with the expanded copy.
-
RE: Do pages with fewer headings rank higher?
I don't think it's fewer heading tags making the page rank better; I think the pages are probably good enough that it doesn't matter there are no subheadings. From my understanding, breaking an article, blog post, or page into subheadings and sub-sections using the various header tags helps with readability. Sometimes people scan the whole page before reading (or leaving) to see if anything catches their interest. Having a pertinent h2, h3, or h4 could get those people to stop and read a bit when they may otherwise have left out of impatience. Too many header tags can be a bad thing, though. If your page has gotten to the point where you have one h1, four h2s, 10 h3s, 27 h4s, and 17 h5s... then you may be doing it wrong.
-
RE: Help: my WordPress Blog generates too many onpage links and duplicate content
Tag Clouds are bad. People need to stay away from them.
Make sure you don't have any useless one-off tags. Far too many people think that since tags point to relevancy and could be useful for SEO, they should add 15 "relevant" tags to a post. But then you've potentially created 16 duplicate pages. Tags help with relevancy by relating one post to other posts. If there's only one post on that tag page, then it's not helping you (unless you plan on eventually adding more posts to that tag in the future).
The author page can get annoying in WordPress, but it's the least of all worries concerning duplicate content. You can always lessen the problem by getting more authors. Otherwise, don't get too worried about that page.
All in One SEO Pack shouldn't be integrating poorly with WordPress and causing title issues. What it may be doing to cause the overly long titles is appending " | [insert Long Blog Name]" after the title of the article. So if you put in a long title, it then also adds, in your case, " | INLINEAR Digital Marketing & Brasilien Blog", which leaves you about 20 characters to work with for your "optimized" title. Anything more than that will trigger the Title Too Long warning.
-
RE: SEO Master accounts?
That sounds to me like they're holding your accounts hostage and/or telling you that your pages are being linked to as part of a linking scheme under their control which would no longer point said links to your social pages if you left them. You should definitely have them elaborate a little further on what a "Master SEO account" is because it sounds like BS.
-
RE: Numerous 404 errors on crawl diagnostics (non existent pages)..
This appears to be your problem... It looks to be part of a widget and is not immediately available as a clickable link on the page but appears in your page source and is crawlable.
<ul id="footer-col-4" class="footer-col footer-non-spanning-col">
  <li id="pp-custom-icon-14" class="widget sc widget_pp-custom-icon">
    <a id="pp-custom-icon-14" href="109,97,105,108,116,111,58,104,116,116,112,58,47,47,109,97,105,108,116,111,58,105,110,102,111,64,114,111,98,101,114,116,115,119,97,110,105,103,97,110,46,99,111,109" class="icon-link">
      <span class="jsobf img">
        <img src="http://www.robertswanigan.com/wp-content/uploads/p4/images/widget_custom_image_1_1321921371.jpg" class="pp-custom-icon" width="170" height="70" />
      </span>
    </a>
  </li>
</ul>
-
RE: Deleting Website Section, Preserve Links with 301?
Bulk redirects to the homepage are usually not the best thing (especially if there is relevant content elsewhere on your site). It's very possible they eventually won't pass the same level of link equity as before. Have you tried instead driving more traffic to your community page in an attempt to grow the community? Perhaps via social media or blogging?
-
RE: Website Spam Backlinks Solution
Have the links actually harmed your site in any way? If the sites are extremely spammy, full of spun content, and/or feature only scraped content, then you may not want them pointing to you... but if they haven't hurt your traffic, rankings, etc., then why remove your page or disavow the link? Google will likely just discount the links instead of pegging you with an algorithmic penalty. In the meantime, why not work on acquiring some relevant backlinks through outreach?
-
RE: Which hreflang tag to use for .eu domain
From my understanding, when using hreflang to denote English speakers in the entirety of the European Union, you wouldn't need a regional tag on there.
E.g. if you're targeting English speakers in Canada it would be "en-ca", whereas targeting English speakers in the UK would be "en-gb". But when targeting English speakers in all 27 (?) EU nations at once, you would just use "en".
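A sketch of those hreflang annotations, with invented URLs:

```html
<!-- Language + region: English speakers in Canada and the UK specifically -->
<link rel="alternate" hreflang="en-ca" href="http://www.example.com/ca/">
<link rel="alternate" hreflang="en-gb" href="http://www.example.com/uk/">
<!-- Language only: English speakers everywhere, which covers the whole EU -->
<link rel="alternate" hreflang="en" href="http://www.example.eu/">
```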
-
RE: Duplicate Content in Wordpress.com
Depending on your theme (sometimes) you can change Tag Archives and Category pages to show a post summary instead of the full article... which can help but will not fully solve your problem.
If you could convince them to self-host instead of using WordPress's hosting then you'd be able to install an SEO related plugin that might help more to fix the problem.
Tag things less. There are a number of tags that have only 1 post associated with them. Take, for instance, the post from February 5th that has 8 tags, 5 of which go to Tag Archives containing only that one post.
Also, consider getting rid of the Tag Cloud because it just adds unnecessary and irrelevant links pointing to those Tag pages that are causing your dupe content problem and probably harming the flow of link equity through the site.
-
RE: My number of duplicate page title and temporary redirect warnings increased after I enabled Canonical urls. Why? Is this normal?
I'd keep the canonicals in place. Always double-check using multiple sources when you have the ability to do so. SEOMoz crawls and error reporting are great, but there are occasional hiccups. When I logged in and saw something to the effect of 1900 new duplicate content errors, I freaked out too and worried something had gone horribly wrong. I looked over a random sampling of canonicals, checked that they were implemented correctly, checked with my web team to see if they had made any big changes without letting me know, looked at Webmaster Tools to see what it said, and tried to find answers here in the Q&A as well. Then I was pointed to this post on the blog: http://www.seomoz.org/blog/visualizing-duplicate-web-pages.
-
RE: My number of duplicate page title and temporary redirect warnings increased after I enabled Canonical urls. Why? Is this normal?
This exact thing happened to me after SEOMoz switched to a new method for determining duplication issues a few weeks ago. After the following crawl my numbers went back to normal, though. Double-check the pages against Webmaster Tools to make sure they aren't actually showing as duplicates; if WMT doesn't show anything wrong, give it a week and you should see your numbers return to normal in the next crawl report.
-
RE: Should I use www in my url when running On-Page Report Card?
Which is your preferred domain URL: with WWW or without? And do you have your site set up to redirect visitors to the correct one?
I'd suggest using whichever is your preferred URL.
-
RE: Disavow links
If the link isn't hurting you then why would you want to hurt yourself by telling Google to disavow it?
As Irving mentioned, the Disavow Links tool is a last resort, not a go-to tool for every occasion. If bad links are really hurting your site, start reaching out to those other webmasters to remove the bad links (or at least nofollow them). If you've been algorithmically penalized due to bad links, then you'll also need to start building a better, more robust link profile through outreach and try to grab some really good links to help counteract the bad. If all works out well, the offending sites will remove the bad links (hopefully without needing to pay), the links from good sites will show that you're working towards improvement, and after a refresh you'll hopefully see rankings start to bounce back. If people won't remove your links, are asking for exorbitant amounts of cash, or are flat-out ignoring you... then you can consider the Disavow Links tool.
-
RE: Someone not removing a link to our site
Are you certain this link is causing a problem? If so, considering they aren't being cooperative have you looked into using the Disavow Links tool?
-
RE: Fix or Block Webmaster Tools URL Errors Not Found Linked from a certain domain?
I get a lot of those as well. Not sure where they get their information from, but they append some of the strangest things to the end of our domain URLs that aren't even vaguely based on real pages. In their case, I leave the 404s alone as they tend to vanish in a short amount of time.
-
RE: To Follow or Not To Follow...... ?
Mike Davis's answer pretty much hits it on the head. If every page is marked NoIndex,NoFollow then no one will ever find your site unless you specifically direct them to it or they know it exists already... which means you're missing out on a large potential customer base from organic traffic that won't ever be able to find you.
Hell, even those content updates won't matter because the search engines aren't going to care (since you told them not to index the pages) and won't be able to see newer content deeper in the site (since you told them not to follow anything on the page).
-
RE: To Follow or Not To Follow...... ?
Just a bit of clarification needed... what pages are marked "noindex, nofollow"? Sometimes NoIndex,NoFollow or NoIndex,Follow can be useful for certain pages.
-
RE: Title Tags: Does having the singular and plural version of the keyword hurt the ranking?
True true. SERPs for a singular term will not be 100% the same as SERPs for the plural in many cases, but there are often overlaps. Keyword research will help determine which may be the better-trafficked and/or more valuable term. Natural inclusion in the body can potentially make up for a lack of inclusion in the title. Also, considering that Google will in some cases change your title and description to better suit a searcher's query for which you are also relevant, you can't rely too heavily on title optimization alone as a ranking factor, though it is a viable signal.
-
RE: 301 redirect all 404 pages
If people are only occasionally typing in "/troussers" instead of "/trousers", then let it 404. It's there to let people know, "I'm sorry, this isn't here. Perhaps you misspelled something." You could always 301 it if you really felt like it, because it wouldn't hurt anything in the long run.
Now, if you found that you're sending 500 people a day to a 404 page for "/troussers" when they're looking for "/trousers", and you find there are relevant inlinks pointing at the wrong page, then by all means go and 301 those people to the correct page. They'll be better served by it. But if you're redirecting all of those people to "All Categories", then you aren't being thoughtful of the customer's needs.
Indiscriminately 301ing everyone to "All Categories" without considering their intentions doesn't help the customer and will likely result in an ever-increasing bounce rate on "All Categories".
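If that misspelling ever does earn a targeted redirect, a one-line .htaccess sketch (Apache assumed) is all it takes:

```apache
# Send the common misspelling to the real page instead of a generic catch-all
Redirect 301 /troussers /trousers
```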
-
RE: Optimizing web page
And to expand on Kevin's answer... results from whitehat SEO (i.e. following best practices) can sometimes take months before they really kick in. We're working towards the long haul, not the short game. Once you have all of your on-page/on-site work done, it's time to start looking at relevant links, guest post opportunities, outreach, social media, etc. Then you can come back to the on-page/on-site stuff, see how it's been working, determine what else can be tweaked/fixed/changed, and continually update your site(s) with new, fresh content on a regular basis. Lather, rinse, repeat.
-
RE: Title Tags: Does having the singular and plural version of the keyword hurt the ranking?
In most cases Google is smart enough to understand that a page relevant for "Wood Desk" could or should show up in searches for "Wood Desks" and vice versa. As such, it's not really necessary to make sure that you shoehorn in all of the plurals and singulars of your core terms. Worry about it more from a Human standpoint. Making the title more human accessible will help with clickthroughs, visits, and so on. Forcing multiple variations of the same word into a title in order to attempt catching every variable will probably make people skip over you. And ultimately, getting the qualified traffic is what much of SEO is about.
-
RE: Heavy Internal Linking Help
Thanks Joram. Part of me was thinking the internal links were an issue of higher priority and the other half of me was thinking it's not that big of a deal if no one is having issues with it.
As for the Search Bar, one of our coders is currently tweaking our auto-complete to make search more robust... so that will be coming down the line in the relatively near future. We've found that most people using our search bar are looking for highly specific terms like model numbers, which, unfortunately, weren't returning the correct information unless entered perfectly (that's being worked on as well). I believe I've suggested to management in the past that we make the search bar more prominent, but with our current setup the only feasible way of doing so involved adding a second line to the top navigation, which then caused some conflicts with the dropdown menus. (It's something I can look into again, though.)
The Samples page... yep, it's a mess. I've actually suggested creating sample category pages to make things easier, but that's been shot down. There was talk of using CSS to hide the various sections until someone clicked to expand, but that was put on hold at some point. It might be worth giving up on lessening the internal linking and looking at that again to make Samples more manageable.
-
Heavy Internal Linking Help
One of the sites I work on is a home improvement ecommerce website that does fairly well for its niche. One of the biggest problems that we're not sure how to adequately handle is a heavy internal linking issue. The homepage (http://www.fauxpanels.com/) has approx. 226 internal links which is mainly due to the navigation structure. There are far worse pages though (the Samples page http://www.fauxpanels.com/samples.php has over 800 internal links).
For the most part, management doesn't want any massive changes to the navigation layout. The Top navigation bar has a number of dropdown menus when you hover, the Left Navigation Bar expands to show more choices, and the Bottom navigation bar in many instances is just repeats of links that can be found elsewhere. Also, the product links in the body of the page can be found linked in the Left Navigation. This is not what I would personally consider the best way to handle navigation but the Customer Service Department has gotten numerous calls and emails over the years about how much people love our navigation and how easy it is to find things.
My thought was trying to lessen the amount of links by having things grouped more often into Category pages/hub pages where applicable so we can remove some of the links. We've also considered NoFollowing links but my understanding is that even if you NoFollow the link equity is still divided by the number of on-page links.
So, any of you much more experienced SEOs have any idea how I can lessen the heavy internal linking without completely re-doing the site's navigation layout and not harming link equity, ranking, etc.? Or, conversely, would you consider having an average 200-300 internal links per page not to be a real issue given the positive effect it has apparently had on user experience?
-
RE: 2013: Top 10 content SEO tips survey
These are all off the top of my head, but I know I have guideline notes somewhere in the mess of my desk that we usually follow. These change on a regular basis as necessary to keep up with best practices and algorithm changes. Everything posted below could be completely different 3 months from now if Google decides to release a Zebra algorithm targeting websites with puppies on them.
- Title tag: How many characters / words? Best keyword positions? Best tricks?
- Over 10 characters and under 70, though there are instances where titles will be truncated due to character width; I usually aim for 56. Keep in mind that Google can choose to replace your title in the SERPs with what they feel is more relevant. We stick to the format "<Sitename> [separator] <relevant core terms, descriptor, or short sentence with terms near the front>", or we flip it.
- Meta description: How many characters / words? Best keyword positions? Best tricks?
- Over 70, under 160... closer to 156, and similar to the title it can be truncated due to overall character width. Core terms nearer the front where applicable. Keep in mind that Google can choose to replace your description in the SERPs with what they feel is more relevant.
- Meta keywords: Use them? How many?
- Bing and Yahoo can still make use of meta keywords; Google does not (unless you're in Google News and using the new news_keywords tag). We often continue to include them, but more as a note on the interrelation of pages on our sites.
- H1 tag: How many characters / words? Best keyword positions? Best tricks?
- Core term or most relevant terms concerning the page. If it includes a keyword we aim for closer to the beginning but that may not always be necessary. Make sure they make sense for the over-arching theme of the page.
- H2-H6 tags: How many characters? Best keyword positions? Best tricks?
- We keep them concise and to the point. Core terms near the beginning where applicable.
- Image alternative text: How many characters / words? Best tricks?
- Short, sweet, concise and relevant.
- Text length: Minimum, maximum? What’s better: 1 long article or split an article in several pages?
- I stick to one long article with relevant h2-h6 tags highlighting important parts. Three sentences in Word look like nothing; on your page they may look like a huge paragraph. Keep paragraphs concise, 3-4 sentences each. Breaking things up with an image is always nice.
- How much links within an article? Min? Max?
- As many as are necessary. You could get away with no links if you really wanted to, but a few can help point out other relevant copy either on your site or off. Too many links will look cluttered and spammy. This is more a personal choice, but if it looks bad then it probably is bad. We had a basic rule of thumb of no more than 1 link per 100 words of copy... though this wasn't strictly adhered to, and often it was fewer links than that.
- Usage of keywords within links?
- Natural-sounding links are best now. Stay away from overusing heavily keyword-laden anchor text. One or two every now and then is fine, but overall you want them simple and natural. More people link to sites using the site name, site URL, or phrases like "Click here" than with terms like "Cheap Red Widgets".
- Your favorite SEO tip?
- Canonicals are your friend.
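The title and description limits above boil down to simple length checks; a rough sketch (function names are my own, thresholds taken from the answers above):

```python
def title_ok(title: str) -> bool:
    """Over 10 and under 70 characters; I aim for around 56."""
    return 10 < len(title) < 70

def description_ok(description: str) -> bool:
    """Over 70 and under 160 characters; closer to 156 is the sweet spot."""
    return 70 < len(description) < 160

print(title_ok("Faux Stone Panels | Easy DIY Wall Covering"))  # True
print(description_ok("Too short to be useful."))  # False
```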
-
RE: Can I 301 Re-Direct within the same site?
Same products, same copy, etc,?
If you want Page A to no longer exist and Page B to replace it in the SERPs then use the 301 to send people to Page B and pass link equity/seo juice.
If you want Page A to continue existing but want Page B to potentially replace it in the SERPs, use rel="canonical" so that Page A can still be found and visited but passes equity to Page B, which should eventually replace Page A in the SERPs for queries where Google feels your canonical suggestion is relevant.
-
RE: Bad use of the Rel="canonical" tag
I would not consider using Canonicals as a means to optimize your rankings in the SERPs. Remember that rel="canonical" is a suggestion, not a directive (http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139394). Google can choose whether they feel your canonical is relevant to use or if it should be ignored. So adding that canonical from your category page to your home page when they are not similar enough and especially if there are no duplication errors will probably lead to Google choosing not to use the canonical suggestion.
-
RE: What can i do about link threaths like these? What does Google do abt these?
True true. Didn't mean to make it sound like I may have been singling you out. I personally wouldn't pay anything for removal but I can see how in some cases it would help things along... especially when your site is your livelihood and you need to get those links down as quickly and painlessly as possible.
-
RE: What can i do about link threaths like these? What does Google do abt these?
Looks like you have more problems than just that one guy's directories. As for what Google does about this: nothing. If you paid for links and got hit by an algorithmic penalty, then Google has already done what they're going to do... i.e. devalue you until you fix everything wrong with your site and wait for a refresh to pick up that you've fixed it.
As for getting the links removed from this specific person's sites, if there really is a legal case there then you can always try it but honestly I'd say that wouldn't be the smartest move. If there's content which falls under the guidelines to request a DMCA takedown then there is that route. Lastly, there is the Disavow Tool but I have not had a need to use it and don't know if that is really the best course of action for you.
Like Irving said, $20 per link is a complete scam... but offering $30 to remove all of them just opens the door for all the other link farms out there to try extorting people further.
-
RE: URL errors in Google Webmaster Tool
An explanation of the priority column from http://googlewebmastercentral.blogspot.com/2012/03/crawl-errors-next-generation.html:
- We’ve ranked the errors so that those at the top of the priority list will be ones where there’s something you can do, whether that’s fixing broken links on your own site, fixing bugs in your server software, updating your Sitemaps to prune dead URLs, or adding a 301 redirect to get users to the “real” page. We determine this based on a multitude of factors, including whether or not you included the URL in a Sitemap, how many places it’s linked from (and if any of those are also on your site), and whether the URL has gotten any traffic recently from search.
-
RE: Has any on else experienced a spike in crawl errors?
This leads me to a problem then. As per Dave (the author of the article), "using canonical tags will result in duplicate errors being suppressed. If one page refers to another as a duplicate, than that pair will not be reported as duplicates. Also, if two pages both refer to the same third page as their canonical, then they will not be reported as duplicates of each other, either."
But now that this change has gone into effect I have 2000+ more duplicate content errors appearing and they are all pages with rel="canonical" pointing to the original page. So, as he stated earlier in the post this has caused "the most negative customer experience we anticipate: having a behind-the-scenes change of our duplicate detection heuristic causing a sudden rash of incorrect "duplicate page" errors to appear for no apparent good reason."
Is this something that will eventually correct itself or is this something that will need tweaking of the new detection method?
-
RE: Tag archives in wordpress
I wouldn't add a canonical from a Tag archive to a post (especially if there are multiple posts in the Tag archive).
The SEO value of Tags (and Categories) comes from them creating a hierarchy in your site as well as creating relevancy signals between all of the posts that appear in a Tag archive. If you have 596 tags and tons of them contain only 1 post, then those one-post Tags aren't helping you. You may need to consider cleaning up your tags: check traffic and rankings for them, re-tag posts to the most relevant tag with good traffic and/or rankings, delete the useless non-relevant tags, and place 301 redirects from the removed Tags to relevant ones.
We're currently going through the same steps on one of the sites I work for but in our case there are only 274 tags to deal with.
-
RE: Do Dashes in Domain names hurt SEO ranking?
Spammy domains have been known to overuse the hyphen... but using hyphens does not make you spammy.
Matt Cutts has previously stated that Google recognizes the hyphen as a separator and the underscore as a connector... i.e. "red-widgets" gets read as "red widgets" while "red_widgets" gets read as "redwidgets". For keyword purposes a hyphen is technically better, but the difference is likely negligible. Also keep in mind the EMD update: if your core term is "cheap red widgets" and your domain URL features "cheap-red-widgets", the EMD update has made that previously positive name correlation a less powerful signal.
Matt Cutts 2011 Underscore vs Dashes in URLs video http://youtu.be/AQcSFsQyct8
Matt Cutts 2009 Underscores or Hyphens in URLs video http://youtu.be/Q3SFVfDIS5k
Matt Cutts 2005 Dashes vs. Underscores blog post http://www.mattcutts.com/blog/dashes-vs-underscores/
-
RE: Has any on else experienced a spike in crawl errors?
I saw a huge spike after the last crawl. In my case, the canonicals we set months ago to handle some duplicate content issues appear not to be seen by SEOMoz's crawl. When I check for duplicate title & meta issues in Webmaster Tools, I don't see the offending pages SEOMoz is showing me. That leads me to believe something is happening with either how the SEOMoz system is reporting or how their bot is crawling.
-
RE: A good META title for a front page....
Here are title tag best practices straight from SEOMoz http://www.seomoz.org/learn-seo/title-tag that may help.
-
RE: Page Speed & SERPS
You might also want to check out https://developers.google.com/speed/pagespeed/ to see if there are any suggestions Google can give you to make your site faster.