Yup - all works for me.
Posts made by matbennett
-
RE: A plea to 64-bit users
What is supposed to be broken? Can't see anything obvious in Chrome, Firefox or IE on Win7 64bit
-
RE: Duplicate content resulting from js redirect?
I'm not great with JS myself - I'm lucky enough to employ people to do that for me! However, here is what the script is doing:
- First check whether "no_direct=true" has been set - presumably to allow users to override the mobile version and view the full desktop version if they choose
- If that hasn't been set then look to see if they are using iPhone/iPod/Blackberry/Android browsers
- Presumably the next line is then redirecting.
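Pieced together from the steps above, the logic probably looks something like this minimal sketch. To be clear, the function and parameter names here are my own for illustration - I haven't seen the actual script:

```javascript
// Returns true when the visitor should be sent to the mobile site:
// the "no_direct=true" override is absent AND the user agent looks mobile.
function shouldRedirectToMobile(userAgent, queryString) {
  // Respect the opt-out flag so users can choose the full desktop version
  if (queryString.indexOf("no_direct=true") !== -1) {
    return false;
  }
  // Simple user-agent sniff for the devices the script checks for
  return /iPhone|iPod|BlackBerry|Android/i.test(userAgent);
}

// In the page it would then be wired up roughly like this
// (m.example.com standing in for the real mobile URL):
// if (shouldRedirectToMobile(navigator.userAgent, window.location.search)) {
//   window.location.replace("http://m.example.com" + window.location.pathname);
// }
```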
That seems fairly logical - no real problem there. However the mobile version is getting picked up and indexed somewhere.
Because you want users to have access to that "duplicate" version, but don't want the search engines to, you don't really want to prevent this URL from existing or override it with .htaccess. It would be smarter to pick a method that targets only the search engines, such as:
- Stop them crawling it (through webmaster tools or robots.txt)
- Add a no-index tag to it
- Canonical it back to the main content
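For reference, options 2 and 3 would look something like this in the head of the mobile page (the URL here is hypothetical):

```html
<!-- Option 2: a no-index tag on the mobile page -->
<meta name="robots" content="noindex">

<!-- Option 3: a canonical tag pointing back at the main content -->
<link rel="canonical" href="http://www.example.com/page/">
```

Option 1 would just be a Disallow line for the mobile URLs in robots.txt. One caveat there: don't combine the robots.txt block with the no-index tag on the same URL - if the page can't be crawled the tag will never be seen.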
-
RE: Duplicate content resulting from js redirect?
The easiest way to fix this is to tell Google to ignore the URL variable no_redirect. You can do this in Webmaster Tools under Configuration > URL parameters. Find where no_redirect is listed, click edit and set it to "used for tracking".
Remember to do similar for bing.
You could also block these in robots.txt
-
RE: Is there a way to specify what SEOmoz classes as duplicate content?
There isn't, no. However I'd also suggest that you definitely wouldn't want to filter out such results anyway.
If your site has a separate page each for the 35, 45 and 55 litre refrigeration units and those pages are largely similar, then this is exactly the sort of issue that the tool is designed to find. It flags up such duplication because it is a problem: Google doesn't like such pages.
In the example that you gave I would be looking to either rewrite the descriptions or to consolidate items that differ only in size (for example) into one page with an attribute selector.
-
RE: Good idea? Dynamic to Perkalink URLs for over 10k pages.
I presume you mean switching to a friendlier URL structure. I'd only do it if there is a clear advantage to your site to change, not for the sake of it.
The key question is whether the site is getting properly crawled or indexed. If it isn't then you need to understand whether the URLs are the cause of that and then change if they are. However, if your site is getting crawled and indexed properly then you are doing it just for the possibility of some advantage in ranking for "nice" URLs. That's a riskier move in my eyes.
Depending on the size of the site it can take months for Google to update its listings and remove all the old URLs. 301ing the pages can be fairly painless if your CMS understands both the old and new URL formats, but you still lose a small proportion of value from each link (in practice I wouldn't be too concerned about that). However it is still a big job to do if you are not clear about the benefit to be had.
-
RE: Increase Search Ranking for CEO
It's a good plan, although if the "competing" picture isn't an exact name match then it might not be needed.
If the CEO has their picture on a well-linked page on the main site ("about us" usually does the trick) that is a good start. Have the image name match the search term and be sure to use the alt text as well. Also have the exact term on the page. Links (mentioning the target phrase) are always helpful as well.
Here is a good example of how it is done: Search for "best looking man in the world". You'll probably get some image results including some predictably polished looking gents. However there is also one pasty looking capped Canadian there - a nice chap called Joel. See what he was up to at bestlookingmanintheworld.com
-
RE: How does your urls age affect your ranking.
Interesting take on the question. If you are talking about buying an "old" domain I'd definitely agree. Google knows when (most) domains change hands and the benefits from old domains that have been resold seem to reset.
However that video is not actually answering this question. That video is about how long you register a domain for (future) as opposed to age (past). There was a flurry of talk a while back about how registering your domain for 10 years rather than 2 would help and that video appears to be addressing that point rather than the point about domain age.
-
RE: Jquery in top of page vs text on bottom page
I think that there is one clear advantage of that slider: It looks a lot less spammy than the stuffed copy at the foot of the page. That has to be a good thing for any number of reasons.
Another effect is that the slider could have an impact on dwell times.
My concern would be that it detracts from the sales process. That particular example makes the slider seem a lot more clickable than the products to me. However some changes to style and presentation could probably improve that balance.
-
RE: How does your urls age affect your ranking.
Hi Feilim,
As with many things in SEO, no-one outside of Google truly knows the answer. It is so difficult to isolate the effects of any one ranking factor that accurate measurement of its impact is often impossible.
However - we get clues. These come from Google themselves, from people doing experiments and from analysis of other information (the work of Bill Slawski analysing search engine patents comes to mind).
Most people agree that domain age is a factor. A lot of low quality sites are created on a "churn and burn" type principle, so it makes sense that a site that has been active for a longer time is more likely to be more trustworthy than a newer one.
Whether that effect comes from the actual age of the domain, how long the site has been in Google's index or the backlink history (or all 3 in different proportions) is harder to tell. However older sites do appear to have an advantage of some sort.
However, don't get age and standing confused. A newer site that is better cited can easily outrank an older site that has few good backlinks.
I hope that answer is helpful. I know it isn't that clear cut, but few things in search are!
A couple of related resources that you might find interesting:
- Search engine ranking factors : Results of an annual survey of "top SEOs" looking at the question of what makes a site rank.
- SEO by the Sea : Bill Slawski's blog (mentioned above). Essential reading if you are serious
-
RE: Why did my site go from 1,000 Impressions to 0 impressions over the past couple days
Can you clarify where you are at? You say you have 0 impressions, but also that you have traffic. That seems contradictory.
If it is 0 - as in absolute flatline - the first thing I would check is your tracking code. If you have server stats as well maybe check one against the other to make sure that you don't just have a tracking error. However your pages are indexed and you have Google Analytics tracking code in place, so a total flatline in traffic seems unlikely.
Check your top entry sources from before the drop. What were your top terms, and what pages were they landing on? Are you still ranking for those terms? Has the URL changed (in which case maybe redirect from the old one)? If not, how do the new and old pages compare?
I've often seen an initial conversion drop when sites are changed around - even when the changes are big improvements. On sites where you have a lot of repeat visitors this can be because people are used to the site as it was.
-
RE: Should we care about the poor for abroad study?
Hi Debal,
Sounds like an interesting subject, but I don't quite get what it is that you are asking. Can you rephrase that?
-
RE: Google Analytics Organic search queries aren't being updated, even though I'm still seeing results in all our typical results pages.
Not redirecting from a URL that doesn't have the analytics tracking code? Maybe check entrance sources for those pages.
-
RE: Can I get your expert opinion?
"If I did that for every question that I answered here at SEOmoz, I am willing to bet that people wouldn't like it."
I think that most approaches would get treated as spam if you over-do them. Probably a good reminder that any technique/tactic that you use that includes the word "every" would probably be better turned down a notch or two!
-
RE: Canonical tags
Google does seem to take a while with canonical tags. However, it sounds like a 301 might be the better choice in these circumstances.
When choosing between 301 & canonical, the issue for me is the user experience. If the 2 URLs show different content and the user would expect to be able to find either set then I'd go with canonical. Otherwise it is 301. If you are just trying to stop issues with capitalisation in URLs then really it should be a 301.
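If it is capitalisation you are fixing, a per-URL 301 is simple enough (the paths here are made up for the example):

```apache
# 301 a wrongly-capitalised URL to its lowercase equivalent (mod_alias)
Redirect 301 /About-Us /about-us
```

A blanket case-insensitive rewrite is trickier - Apache can do it with a RewriteMap using int:tolower, but that directive only works in the main server config, not in .htaccess.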
-
RE: Can I get your expert opinion?
Why does using people's real questions have to mean spamming? I'd agree if you only used the sites in order to drop links, but why should that be the case?
If there is a site where relevant questions frequently come up then why not use that as a source of inspiration for blog topics? Find the common questions, put together an excellent answer, when the question comes up again cite your post within the rules of the forum.
The forum becomes more useful, the user gets their answer, Igor gets a link and the internet is a slightly warmer and nicer place to be. Where's the spam?
-
RE: Link-Building - Directories
The important point is that not all directories are equal. "Directory" really describes the structure of a website. However, for many in SEO it refers to the thousands of directories that exist only to "help" sites rank in Google.
So, if you were promoting a car tyre shop and found "Bob's Auto Resource site" where Bob picked great firms, wrote up what he liked about them and was itself linked from other quality looking sites, then that would likely be a good resource.
However "Bob's cheap SEO friendly directory" that listed virtually any site that submitted itself, added no value to those listings, was full of rubbish and linked from equally rubbish sites would not be as good.
Damn that Bob and his wildly varied site quality.
-
RE: Interesting site migration question.
301ing the old URLs to the new ones will pass the link equity that they had gained from external links. However this doesn't necessarily equate to the new pages having the same rankings: there are other factors that will influence that.
For instance, if the new page had much better on page optimisation than the old one then it could rank better. Likewise if the structure of your site gave more (or less) prominence to a particular URL then that might influence how it ranks.
The definitive answer you are looking for is "probably not". However I would say that the more accurate answer is "probably not, but there is no reason you couldn't equal or better the old rankings if you pay attention to the other factors in play".
-
RE: Can I get your expert opinion?
It's a good idea. I think Rand has previously posted somewhere about how he has done this himself, blogging topics that have been asked about on Quora and the like.
The biggest danger in terms of spam from posting on forums is if you include your email address in the post. Publish an email address on the internet and it will get spammed (trust me - I've been using the same address for 16 years. It's an issue!). However if you don't actually put the address in the post then you won't have a problem. Almost no forum allows its members' email addresses to be seen by other users unless they choose to.
[edited - correction, Rand had mentioned doing this on quora]
-
RE: Link Building - Where to start and how long to see results
Hi Ouji
You asked "Is "link building" or lack of it my main issue?". It certainly sounds like that is the case.
Good search results require that your page is both relevant and authoritative. Making changes on page can only affect how relevant that page is. Site-wide changes can redirect what authority you have towards key pages, but you need to build authority to rank in competitive areas.
Building authority (largely) means building links. If Google doesn't know whether to trust your site it will never really be able to compete against those that it does trust.
However, having taken an (admittedly rather quick) look through your site I think that your site might be at a disadvantage in that you don't seem to have very much in the way of original content: Most of the section pages seem to recycle the same few snippets and then the detail pages seem to be largely empty.
There are certainly many ways to "punch above your weight" in the results. However the goal of search engines is to serve up the best page for any particular search. If your site is way off that mark you are always going to be swimming upstream.
I'd start by trying to write original text on every page then adding 1 really strong piece of content that other local sites will really want to link to (then make sure they know about it!).
-
RE: Is Guest Blogging the Next Link Buying
What worries me is that the easiest way to apply such a penalty is probably to devalue links that appear to be part of an author box (one of several easy to spot footprints of most guest blogging). This isn't the best way, but it would be incredibly easy to do. If that happens then it would really be a case of "throwing the baby out with the bathwater", but we've already seen this year that Google is happy to do this if it discourages manipulation.
I've always assumed that when the spammier end of guest blogging is addressed it would be through a wider algorithm. As Alice suggests, value could be removed from this as a result of a wider effort to tackle low quality sources - and this would be fair.
This year though I suspect that "fairness" in the algorithm has gone down a couple of places in Google's priorities and that the idea of "the greater good" has become a higher priority. If that is true then anything that is a link building strategy becomes a fair target - even if it causes wider damage.
As always - time will tell !
-
RE: Why is different the difficulty of a keyword in Google Spain and Google mexico?
Hard to say without knowing the keywords and probably also better knowing the two markets. However I can hazard at a few of the factors that might be at play.
Difference in competition
The most obvious possibility is that there is a real difference in business competition. The markets are different, so there will be differences in the supply/demand of various products and services as well as geographic terms. Let me pull an example from my big bag of national stereotypes: I would imagine that the competition for terms around tequila would be higher in Mexico than Spain.
Difference in dialect
Despite the common language the phrases used can differ. I've been doing a tiny bit of work for the Brazilian market and we've noticed a number of differences in the terms used by Brazilian Portuguese speakers and Europeans from Portugal. My impression is that Spanish can vary even more greatly.
Sample Size
A less obvious factor is sample size. All of these tools rely on samples of data. The more niche your query the less accurate they become. Spain has a population of around 47 million, Mexico 112 million. That could affect the accuracy of data if the terms you are using are quite specific.
-
RE: Should we Have Our Anchor Text Changed?
Not everyone will agree, however I'd say leave them for now - unless you are seeing anything particularly worrying.
Bad links could cause a ranking drop for two reasons: On one hand you could be getting a penalty - even without a warning in webmaster tools. On the other it could just be that you are not getting the benefit from those links that you were.
The second is the most common. The remedy to that is "build more, better, links". That same remedy solves a lot of other problems too.
If a link is poor quality you may as well get it removed rather than changing the anchor. However I'd still rather put resources in to new links rather than removing old ones unless you are particularly concerned.
-
RE: How accurate and quick does Google pick up on canonical tags?
Oh yes... a touch larger than 10k. Big touch too
Some we've been able to 301, however it is mostly a faceted search issue on the site we are working on - so those pages need to stay live to users.
-
RE: How accurate and quick does Google pick up on canonical tags?
It can be slow - particularly on a big site that is crawled slowly. We've got an over-sized site thanks to some iffy CMS logic that we are trying to get Google to follow canonical instructions on. It's happening slowly, but is taking months.
-
RE: Duplicate page error
What other changes happened at that same time? The site seems different?
-
RE: Duplicate page error
Do you know which day it dropped on? There really isn't any reason that anything above should cause a drop. There have been updates - so let's figure out what is going on first.
-
RE: How to control Artist Info sidebar on Google SERP?
This is what Google are calling "knowledge graph" - their own rather cynical take on the semantic web. The idea is to pull data on known entities from a number of sources and present them in one place.
The main sources of information are:
- Wikipedia
- Freebase
- Google's own search data
- a multitude of data-specific sources
I am afraid I have no idea what the actual sources are for band information. However if something is wrong in the knowledge graph results you might be able to find out: Look for other authoritative sources where the data is wrong.
The important thing to remember is that little (any?) of that data is unique to knowledge graph - it is pulling data from elsewhere. Correct those sources and it should follow. Start with checking wikipedia.com and freebase.com - then look for other high authority sources that are likely to be used. Where would YOU look for that information?
Sorry I can't help more with the specifics.
-
RE: Reached No. 1 for my Keyword, what next?
You might find this thread interesting: http://www.seomoz.org/q/when-keywords-are-on-the-top-of-the-google-search-engine-then-what-to-do
Someone posted the exact same question earlier and there are some good responses there.
-
RE: Why is my office page not being indexed?
You seem to have at least one other link to that page, but that is from http://www.sandersonweatherall.co.uk/office-to-let/ which also doesn't seem to have been indexed. This page doesn't seem to be linked in, although I haven't crawled the site.
So - 2 links. 1 from low down on a very long sitemap page with 1200 links on it. The other from a page that in itself isn't indexed. That is your answer.
However I think it is symptomatic of a potentially wider problem - your structure is rather odd. The structure in the navigation and on-page links doesn't seem to match that in your sitemap. Your structure is also deep - what is that, 6 levels?
In all honesty it looks like loads of pages have been created hoping to pick up long tail search and those are only really being linked from the sitemap and each other. I haven't spent long on it but the instant thought was that it felt like 2 sites: one for the public and one for search - which probably isn't a great idea. Some of that content is rather thin too.
It's all actually an approach that worked pretty well up until the end of last year. Wouldn't be surprised if you were struggling a bit now though.
If you really just want to get that page indexed then get it linked from a page that is indexed (and doesn't have 1200 other links on it).
If you want to improve further I would start looking at the content on your site and figuring out whether it all deserves to be there, if it does whether it is good enough and whether it is structured in the best possible way.
On an unrelated note I'd also look at the choice of font in the main menu. I really struggled with that.
I think you have some good stuff there. Your case studies could be really good content - move the picture into the main page, consider broadening each one with a few bullet points. Likewise your people profiles - you've created them, but only linked them in from the sitemap (I think) - seems a missed opportunity.
-
RE: Best Social Sharing Platform for B2b sites
I've just watched a stunning video about how to de-rag a pump in 2.5 mins! OK - I'll admit you've got a challenge on your hands.
I'll assume for a moment that a link based piece that focuses on all the innuendo surrounding pump deragging is out of the question. However I can't help thinking that there is a slightly naughty tumblr to be had here, pulling off some meme-type images.
The video I just watched seemed to be based entirely on the sales pitch of speed. Is that something that you could do an interesting thing around? Maybe a video of a deragging race? Or one of someone deragging a pump in less time than it takes to boil a kettle (ending with them sitting down for a cup of tea to underline the time saving).
Probably not quite right - but it shows that dull topics don't have to be presented in a dull way.
-
RE: Can't rank with certain keywords for the life of me
Hi Cesar - I've actually used your site in the past! That's cool.
That date sounds suspiciously like you were hit by Panda. Panda is a Google update that mostly targeted 'thin' content - it was updated on the 23rd. If you ever get a sudden drop in rankings like that it is always good to check http://www.seomoz.org/google-algorithm-change which lists when updates happen. That is fast becoming the page I link to most frequently in Q&A answers!
So - why might they think you have thin content? I'll look quickly at 2 pages: http://www.freescrabbledictionary.com/ - ranking for scrabble dictionary & http://www.freescrabbledictionary.com/scrabble-cheat/ - presumably the one that has dropped for scrabble cheat.
OK, the two pages have a few paragraphs of unique content, but it is really what I would call "fluff" - it doesn't really serve any purpose other than to tell the search engines that you wish to rank for that page. That is not an accusation - I've written plenty of fluff myself. Just telling it how it is. This text is also quite a way down the page which indicates that it isn't the most important.
The "meat" of the page is the form. This is identical on both pages. So really we have 2 pages that are only different in terms of the fluff. That is textbook for the sort of thing that Panda hit.
How to fix it then?
I'd consider actually providing information that is more relevant to the search "scrabble cheat". List common cheat methods (and how to spot them) plus a big call to action pointing back to the main search tool. At first glance I can't see any need to replicate that form on every page.
Your scrabble helper page is actually following roughly this process - although I would make the link back to the main tool much more obvious. However the on page optimisation isn't great on that one - tweak that up and get some people to link to it and I'd imagine that could help quite a bit.
Good luck with it.
-
RE: How to create SEO budget?
The numbers by themselves are meaningless. Read my response again - look closely at the sites that are linking to them: Why are they linking? What's in it for them? Could you do the same? Could you apply that same process to another site that they don't have a link from? Can you take that idea further - change it to put a new spin on it?
You can't win at SEO by chasing the links of competitors. You will always be playing catch-up, you will never get every link and you'll never get ahead. However you can learn from those links and take their strategies rather than their links.
When you have a strategy then budgeting starts to make sense.
-
RE: Need to rebuild client's flash website
I'd honestly start with the assumption that you are building a new site, rather than changing the platform from the existing one. If you think that the current content is strong then keep that, but start afresh with the actual build.
-
RE: When keywords are on the top of the google search engine then what to do ?
I'd start thinking about more keywords to target to be honest. It's really risky to be that narrowly focused and you'll bring in more business by broadening your focus anyway. As a bonus, work that you do to introduce new target phrases should also strengthen your hold on your primary terms by attracting more authoritative links.
-
RE: Best Social Sharing Platform for B2b sites
Wow - sounds like a fun project! Are people really going to share anything that dull though? Sounds like the content is the issue not the platform.
I'll assume you mean "niche" rather than "dull". If it is interesting to the target audience then you have a chance. If not then you're stuffed whatever platform you use. So - interesting, but only if you are the right type of geek.
You just need to find where those geeks are hanging out. I tend to find twitter is good for those smaller niche groups, but that is partly because they are easier to find thanks to tools like followerwonk. It is very dependent on the niche though.
-
RE: How to create SEO budget?
Don't be fooled into thinking that it is about the number of links. It never has been and it is probably less so now than ever. You could build 8000 links very cheaply and very easily if that was the only thing that was important. However it probably wouldn't help.
Start looking at what the links are rather than how many of them there are. What are their 10 best links? What would it take to get links like or better than those? What about their top 50 links? At least assume that it is only authentic looking links that are going to help and then start thinking about what you can do to match or better those. The methods used will depend on the market and your website. The cost will depend on the methods and how well you do it.
I know that you were probably looking for an answer in $, but it's impossible to say and pointless to do so.
-
RE: Need to rebuild client's flash website
Flash professional does allow you to export flash content as HTML. If you have flash professional CS6+ and the original source files you can do this. There is a demo of how here: http://tv.adobe.com/watch/cs6-creative-cloud-feature-tour-for-web/exporting-flash-content-as-html-in-flash-professional-cs6/
Whether you would want to use the resulting HTML as your website is debatable. I wouldn't. I'd rather pay someone to rebuild a simple version of the site from scratch and at least know that the structure and on page set-up is all good.
What is the aim of switching? Flash websites are not great for most purposes, however neither are badly made HTML ones. If your aim is SEO success then I doubt that exporting from flash will help you much.
-
RE: Affilliate Marketing for local business? UK
I've no idea what would be a good commission rate in that sector. It's not an area I've worked. However, if competitors are on networks then that might provide some indication.
Tracking sales is very straightforward. You just need to drop a cookie whenever one of your affiliates sends a visitor and check those cookies when enquiries are made. Depending on how complex your systems are and what numbers you are working with reporting can get fiddly though.
There are lots of scripts out there that will do this all for you. We're big on custom building stuff, so I don't really know any of the 3rd party solutions well enough to mention by name. However there are a few and some seem to be quite well respected.
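As a rough illustration of the cookie approach (the cookie name aff_ref and the helper names are invented for this sketch, not taken from any particular product):

```javascript
var AFFILIATE_COOKIE = "aff_ref";

// Build the cookie string to drop when a visitor arrives via an
// affiliate link (e.g. ?aff_ref=partner42), keeping it for `days` days.
function buildAffiliateCookie(affiliateId, days) {
  var maxAge = days * 24 * 60 * 60;
  return AFFILIATE_COOKIE + "=" + encodeURIComponent(affiliateId) +
         "; max-age=" + maxAge + "; path=/";
}

// Later, when an enquiry form is submitted, read the cookie back so the
// lead/sale can be credited to the right affiliate (null if none set).
// The regex matches the same aff_ref name used above.
function readAffiliateId(cookieString) {
  var match = cookieString.match(/(?:^|;\s*)aff_ref=([^;]+)/);
  return match ? decodeURIComponent(match[1]) : null;
}
```

In a browser you'd assign the first function's result to document.cookie and pass document.cookie to the second; the fiddly part in practice is the reporting around it, not the tracking itself.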
What sites would I target?
The regional focus means that I'd definitely start with local sites. I'd dig out a list of every town and city in my area and see which have local community sites and start finding contact details for them.
-
RE: Trying to decide on best domain
If I was starting out now I would opt for a branded domain every time and not worry about the keywords in the domain. You don't get the initial advantage, but it's the better long term approach both on and offline. The 801 could work well for that. However when I see "utahrealestate801.com" it just looks like a spam result to me. I'd rather see "801living.com - the Utah real estate specialists" or something similar that sounds more like a "real business".
(I'm not saying use that one - I only spent 10 seconds on it. However I wanted to include a real example that was available)
-
RE: UK home improvement - building links or social shares?
If people are posting questions about glazing there is definitely an opportunity here. You could start blogging great answers to common questions and then set up alerts for when those questions appear. When they do you can post a short answer to the question but provide a link to the in-depth version.
This also means you start improving the quality / depth of content on your site at the same time, which has to be a good thing.
-
RE: Affilliate Marketing for local business? UK
I definitely wouldn't use one of the networks for something with a regional focus. Despite the sales talk they don't do much to actually help you build the programme, mostly just administer it and allow you to use their existing network of affiliates.
The exception to that would be if you can see that there are lots of national sites with regional pages that you think you could be listed on and getting business from if you were part of an affiliate programme.
However it is quite cheap and easy to provide affiliate/referral tracking without a network. If you can contact the sites yourself and offer a commission on leads/sales you might do much better with a local market. Overhead will be lower and you might get better targeted sites and have an excuse to start building relationships with them.
-
RE: Why my twitter handle link is not getting counted in links, though for other sites it is counting!
What do you mean "not being counted as links". Do you mean that seomoz isn't picking it up in opensite explorer or that it isn't appearing in webmaster tools?
If you just mean Open Site Explorer then don't worry. OSE only crawls a small portion of the web and its purpose is to give you some analysis rather than a definitive link index.
If Google isn't picking it up and it has been there a while you just need to get some links pointing back to it. Drop your twitter handle if you are guestposting / using author boxes or maybe as your link on some blog comments and it'll soon get picked up.
Overall though, if you are worrying about a single link getting indexed then you probably are not using your time in the best way. Focus on what to do to get more links, rather than worrying about the ones you get. The payoff is far bigger.
-
RE: Website accessible on http and https. Is it bad?
Cyril - I honestly wouldn't worry about this. The vast majority of sites with https behave like this and it doesn't cause a problem. Your canonical is extra protection against it.
-
RE: Planning to out source
Oh, I bet you are going to get some good replies to the above here:)
All of those things can actually be quality. However if the proposal is based around those then they are unlikely to be - as they are all cheap, easy "bulk" methods. Exactly the sort of thing that has caused many people to come unstuck this year.
I'd be wary of any claims of success based just on those methods to be honest. Sites using those methods can rank for competitive terms. However in the current climate they are often ranking despite those methods not thanks to them.
-
RE: Anyone have any experience with freelance graphic designer sites?
I'm lucky in having an in-house graphic designer, but I have used these sites for side projects and to help me out of a corner in the past. I can't imagine that my experience is true for everything, as they wouldn't be sustainable otherwise - however I've not had great results.
The problem for me has been that so many of the freelancers using the sites focus on how little they can do to meet their obligation. It's not entirely their fault to be fair: Many of the sites are set up like reverse auctions that only serve to drive prices down and that is never a way to ensure quality.
If I was going down that path I'd find sites that specialise in design freelancers rather than general odesk/freelancer type sites. Particularly places that let people use the site as a way to build their reputation rather than just as a way to generate lots of leads quickly.
I am sure that plenty of great designers use them. If you can find a site where the proportion of the good ones is higher you'll probably save a lot of time.
-
RE: Google caffeine
Caffeine was a back-end improvement that Google implemented in 2010. The idea was to provide more up to date results for topical and fast moving stories.
From a user perspective there is nothing to worry about. You search - stuff happens. Whether caffeine impacts on that stuff isn't important.
There is a user level explanation of Caffeine here
From an SEO perspective, if you have been doing stuff that works for the last couple of years then don't worry explicitly about Caffeine - you've been dealing with its impact anyway. If you are struggling to rank for breaking stories or those that are updated quickly then it is more of a consideration.
-
RE: Do I need robots.txt and meta robots?
If you want the stub listing removed as well, this is quite straightforward once you have it blocked in robots.txt. Instructions here: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1663419
Just checking though: If the content you are trying to remove is something private that should be hidden (as opposed to just low value stuff that you don't want cluttering the SERPS) then this isn't the right way to go about it. If that is the case reply back.
-
RE: Does using Google Loader's ClientLocation API to serve different content based on region hurt SEO?
I think people panic about cloaking/dynamic content too much to be honest.
It would be easy to go overboard and start alarm bells ringing, but if you have a dynamic area on a well structured and balanced page I can't see it being an issue.
Caveat: I can't think of a clear comparison to something I have worked on in terms of serving it geographically. However I've done similar based on countless other criteria and not felt it has harmed anything.