You should really just add that to your disallow list in your robots.txt file. That's the easiest method.
User-agent: *
Disallow: /customer/
More information on RogerBot.
Highly unlikely. Why would CTR matter at all from Google's perspective?
I'd even wager that the CTR of actual Google SERPs isn't used. CTR doesn't tell you much on its own, and relying on it would produce too many false positives.
Yes, Google does not use the keywords meta tag for anything. The Bing team has also said they use it as a spam signal.
Sorry, no source on the latter fact, since that was a random video. But there is this:
http://googlewebmastercentral.blogspot.com/2009/09/google-does-not-use-keywords-meta-tag.html
Google didn't condemn comments. Or comment link building.
The fire and brimstone comes when you use the likes of ScrapeBox to submit to hundreds or thousands of blogs in a small timeframe, with little variation in your phrases.
You're fine adding your domain. I'd even consider it an opportunity lost if you didn't.
Twenty miles is too short a distance to warrant two separate profiles. If you planned on doing community outreach and the offices were in different states, that would make sense.
Just put your main office in the address field and be sure to list the second address somewhere in the information page.
Why not keep Buddypress, but disallow search-bot traffic to the root directory where it resides? You can easily do that in robots.txt.
User-agent: *
Disallow: /buddy-press-directory/
Without more details on your ads, and what they said specifically, I doubt you will find much help.
When a company sends you a generic disapproval message, it's because they decided they don't want to waste human resources on explaining things that are clearly outlined. Google Adwords/Adsense is the same way. I would suggest looking over the terms carefully again.
Sorry, wish I could help more. Maybe link us to some examples.
Hi Thomas!
It depends on your server configuration and what software you are using.
An underscore isn't negative. At one point Matt Cutts recommended using dashes (-) as a delimiter, but that was many years ago and he has since said that Google is smart enough to figure out what delimiters you are using.
I find that there are diehard fans of specific delimiters, which I think is silly. I myself prefer the dash, but again that's just personal preference.
The more concerning matter is that you have no sense of hierarchy in your URLs. I'm assuming you are running an old version of osCommerce or something similar (your website won't load for me). I don't have much experience with osCommerce, but I do remember that its permalink structure was horribly limiting back in the day. I'm not sure how feasible it would be to change your structure, but a quick Google search for your version number should return an answer rather quickly.
Also, both of your links are the same. I think you meant for the second one to be different.
Would be glad to help out further if you can supply more information. Cheers!
P.S. - I'm biased toward Magento, but they have a free Community Edition you can check out if you want an eCommerce solution that isn't antiquated.
It is impossible to comment without more information.
A transition that is done correctly will not cause any decrease in rankings. At least, that is my experience.
I'd be glad to help you further. Look forward to your answers!
Cheers.
Well, there are two categories of SEO and you are covering both already: onsite SEO and external SEO.
If you are looking for a third classification of SEO, you won't find one. If you are looking for more ways to get links, do linkbait, and so forth, SEOmoz actually has some fantastic articles on this:
...and that should keep you busy! I hope I understood your question correctly.
Cheers!
What exactly do you mean by "bad?"
They are both undesirable. You certainly won't be penalized for a 404 specifically, but you may see a decrease in rankings if it affects your site structure or if the pages you lost were passing page authority to other pages.
You should do everything you can to limit both; they are equally bad.
Let me know if I understood the question correctly. Cheers!
Hi there,
I'm assuming you are trying to do PageRank sculpting (or something related), which was made a little tougher in recent years. I'll base my answer on this assumption, so feel free to correct me if that isn't the case.
There are several methods to make a link uncrawlable:
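A few common approaches, sketched with placeholder URLs and paths:

<!-- 1. A JavaScript-based link: crawlers generally won't follow it -->
<span onclick="window.location='http://www.example.com/page'">Visit the page</span>

<!-- 2. A link routed through a URL that robots.txt blocks, e.g. a redirect script -->
<a href="/goto.php?url=http://www.example.com/page">Visit the page</a>
(paired with "Disallow: /goto.php" in robots.txt)

<!-- 3. rel="nofollow" stops value from passing, though the URL may still be discovered -->
<a href="http://www.example.com/page" rel="nofollow">Visit the page</a>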
Let me know if you have questions. I'd be glad to help further.
Cheers!
Hi there!
The specific reason why it ranks higher could be any number of things. I'll go over several items you should be aware of if you aren't already.
Are you currently redirecting all visitors with mobile user agent strings to the mobile version? You should be. There is a video by Matt Cutts saying that, if you do so, Googlebot will be able to differentiate between the mobile and regular pages. He does recommend using the "m" subdomain like you are currently doing. There will be no duplicate content/cloaking issues.
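Here is a minimal .htaccess sketch of that kind of redirect, assuming Apache with mod_rewrite (the user-agent list is illustrative, not exhaustive):

RewriteEngine On
# Don't redirect visitors who are already on the mobile subdomain
RewriteCond %{HTTP_HOST} !^m\. [NC]
# Match the obvious mobile user agents
RewriteCond %{HTTP_USER_AGENT} (iphone|ipod|android|blackberry|mobile) [NC]
# 302 rather than 301, since the destination depends on the user agent
RewriteRule ^(.*)$ http://m.example.com/$1 [R=302,L]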
Now let's think of what would make one website rank over another. Links, right? If we take a look at your mobile site compared to a regular article page, we see what every visitor wants to see: content. There are no advertisements, no busy navigation menus, nothing: just the content they came for. Wouldn't it make sense to share the mobile version, then? This could lead to visitors linking to your mobile version instead of the regular one. (Try optimizing your mobile design more so there is less of a distinction in this case.)
Are you linking to your mobile site in any shape or form? The mobile version has less outbound links. As such, the authority it receives from any links you have up will be retained more easily than your regular article page, which has quite a few outbound links.
Can you draw any conclusions from SEOmoz tools to see a difference between the two pages?
Lastly, use the mobile XHTML doctype instead of the one you currently have. It helps Google differentiate between the two pages:
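That doctype is the XHTML Mobile Profile declaration; version 1.0 looks like this (use the version that matches your markup):

<!DOCTYPE html PUBLIC "-//WAPFORUM//DTD XHTML Mobile 1.0//EN" "http://www.wapforum.org/DTD/xhtml-mobile10.dtd">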
Best of luck! Let me know if you have questions.
Can you link to the webinar you mentioned? I'll keep an open mind, but from everything I've experienced in the last five years, links that don't get clicked still work pretty darn well.
It would be hard for Google to track which links are getting clicked. Besides, the links boost the site's rankings, so the site gains traffic organically either way. (Maybe I misunderstood you.)
I don't consider forum profile links to be spam or even grey hat when used appropriately. If you are using a tool like Dripfeed Blasts, then of course that's another story.
Much of grey hat SEO, and a moderate portion of black hat, still works. I still see competitors from two years ago sporting a large number of links from terrible neighborhoods, and still ranking incredibly well. I've reported them countless times to no avail.
It almost seems like you have to get a competitor featured in a New York Times article for Google to do anything substantial.
Cheers.
Hi John!
Are you using IIS Manager to administer your server? If not, that's probably why the details you listed would not apply.
If you come from a Linux background and just want to modify a file, as you would with the .htaccess file, you will be editing the web.config file. Microsoft has a knowledge base article (albeit short) on this at the following:
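As a rough sketch, a simple 301 redirect in web.config looks like this, assuming the IIS URL Rewrite module is installed (the rule name and paths are placeholders):

<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="Redirect old page" stopProcessing="true">
          <match url="^old-page$" />
          <action type="Redirect" url="/new-page" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>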
Let me know if you have questions. I would be happy to help further.
Cheers!
EssEEmily has fantastic advice. I haven't seen your website, but it sounds like it would do best by contacting journalists directly or trying to get in touch with websites you think would run your story. As long as you have some form of credibility, this will be the best method for you.
Social media is great but you have to really know your target market and have realistic expectations. Again, haven't seen the website and so can't comment further. I can say that paid forms of social marketing have worked out very well to get the fire started. Buying StumbleUpon traffic usually works well. If you can dump $50 into an SU campaign it's usually money well spent.
I had a lot of success with Digg a couple years back. They are starting to make somewhat of a revival so if you can keep an active account going (note: submitting and voting on many stories, not just your own) it will eventually pay off. (On a funny note: I actually got banned because my front page ratio for each story was 30% - ha! Boiled my blood because I wasn't doing anything wrong)
Google News inclusion: find a resource that is similar to your own that syndicates to Google News. If there isn't one, become that resource! Look in the Google News guidelines and submit to get included. I had a buddy that ran a general news source that got me in when I needed it. Haven't messed with GN in ages though so I don't have the domain bookmarked anymore. I'll PM you if I find it, if you want?
Anyway, best of luck on your campaign! And a happy Friday to you!
Hi there!
To better assist you we will need to know what your goals are.
Do you have a budget?
Who do you want to reach through this campaign?
Do you have experience with other press release networks, and if so, what makes you want to seek out others?
My experience with prlog.org is good. They have a very easy-to-use system, and it's free! The only problem is that there is less exposure on prlog.org than you would find with other press release websites. Getting a press release popular will help you out (several tactics exist to do this the evil way).
Prlog.org also helps drive traffic. Not a lot, but press releases tend to get visibility. The highest ranking releases get placed in Google News. There are easier ways to get in Google News if that's what you are after.
I look forward to hearing your goals! Cheers.
It's not blackhat SEO, and it's very common to create separate domains for SEO purposes. You can even use the same IP address (so you don't actually need a new host) and the benefit is still there. While it does help if the domains are hosted at separate locations, it isn't necessary.
Any of the articles that do belong on your company blog should be on your company blog. Everything else can go on the secondary domains. Just be sure that you develop the domains as you would your flagship website: with quality and attention to detail. Otherwise they serve no purpose other than for your SEO (no value to visitors) and they could be considered as grey/black-hat SEO.
Your secondary domains also become guinea pigs. You can test new services or link building ideas on them, and if they lose their rank, it certainly isn't good but it's not going to hurt your main domain. It's a layer of abstraction that will both protect your main website's SEO and allow you to start building case studies.
Personally I like to get WhoisGuard to mask the registrant of the domain, separate them all on different servers, and try to make them as unique as possible. (Tough in a specific industry, though.) I'd recommend you do the same.
Let me know if you have questions about this! Cheers.
It's against the Google TOS. Source
That being said, there are many offenders. I'm not sure if Google lets any of the big marketing companies do it, or even gives them API access in some shape or form, but I know that for the smaller guys it's frowned upon.
To my knowledge there has not been a case of Google pursuing action against someone who has scraped search results.
Scraping Google requires a proxy network because they do indeed block requests that look like scraping bots. There are temporary bans and CAPTCHA tests put in place, but rotating IP addresses gets around them just fine.
When it comes down to it: no, not legal, but you also probably won't get in trouble for it.
(Would love to hear from official SEOmoz staff on this one...)
Ha! I've seen those far too often. I don't know what sales process leads a company to win a contract while building links like that. Frankly, I'd be interested to see their proposal.
I was given the opportunity to look at competitor proposals on several occasions. Most bank on delivering an information overload to show that they have top-secret SEO tools and knowledge of the industry. I saw a 40-page PDF spitting out data from what seemed like a normal SEOmoz account. I think they get the deal-maker to focus on data and "wow" them instead of putting the focus on case studies and guarantees.
I just can't imagine why a proposal would ever be over five pages. I think Alan Weiss also states that all proposals should be under three pages. Harder to do sometimes given large projects, but still a good rule to follow.
Cheers to tasteful back linking.
Well, I'm fine giving broad answers to the question (guest blogging, agreements with webmasters, social/forum-associated link building, etc.), but anything more than that is unnecessary.
I had a client recently that wanted to know exactly how many hours I worked, a spreadsheet containing every link I built, and the list goes on. This was after we had signed a contract stating that I price based on value and that reporting like this would not fit into the contract. Instead I offered a guarantee on rankings and a broad idea of the links I build (white hat only).
When you outsource SEO you aren't just buying links. You are buying the case studies that came from the consultant or company as well. They know what works. It doesn't make sense for them to divulge every detail. Alan Weiss states that at the very least, reporting like this should cost extra. A lot extra. (I agree)
As long as you check out the references, portfolio, and case studies, you will be fine outsourcing to a reputable company or consultant. It just might not be affordable like the OP requested. (And by affordable, I mean not initially affordable. The general stance I take is that all SEO projects should give the client a return, even after counting your fees, within one year.)
Can you recommend any reputable company that's affordable?
You really have to pick if you want reputable or affordable.
I would recommend that you do the process yourself because it's much cheaper than outsourcing in most cases. Any time I have outsourced SEO I ended up getting burned in some shape or form, so I don't have any companies to recommend. Sorry.
If you give more details about your project perhaps the community may give you advice on doing the task yourself.
The only thing that I think a lot of people would have a problem with is your statement that using WordPress is spammy.
I believe WordPress had a recent blog post saying that around 10% of websites on the Internet use their platform. That's not the problem, of course! Using a free theme (I see a link in your footer) degrades a website. The sitewide link is taking value from each page, and a free theme raises the question of whether the business is legitimate.
I'll be honest: I only paid mind to the first Panda update. I haven't been affected by the first or subsequent updates, so I stopped following them and can't give you specific advice on how others have seen it through.
Here is a discussion to get you started, though:
http://www.webmasterworld.com/google/4305793.htm
Here is a poll that shows interesting results about recovery:
http://www.seroundtable.com/google-panda-recovery-poll-13456.html
Juicy content from SEOmoz:
Put yourself in Google's shoes: doesn't your website's structure and content look suspiciously like a content farm? I'm not saying your website is one, but it gives off that vibe for several different reasons.
First, the domain is clearly overly targeted toward keywords. That's a big giveaway that the website is probably not a legitimate business. If you are an attorney, you should instead have a domain registered for your company name. Have branding.
You are using a free WordPress theme. Content farms largely run WordPress because it's so easy to install across many different domains and manage them all with little effort. Content farms grab free themes because, well, they are free. Shouldn't a legitimate business have its own web design?
You don't have a business listing in Google. When I search for your address I don't see your domain come up. Create one and verify your business number.
Give your website a more personal touch. Your admin name is uncontesteddivorceny. Create an author bio if you plan on having multiple authors, but at the very least use your real name.
To sum up the onpage stuff: it looks like you tried too hard to game the system. If I saw this listing in Google when searching for an actual attorney, I would quickly press back and block your website from appearing in my results again. I mean this criticism constructively.
And then of course there is the external SEO. I only took a brief glance, but it looks like all of your links are from directories, video channels, and comment spam. None of this looks good. Get endorsements from any organizations and websites you can that are actually authoritative. I don't know your industry well enough to give specific advice on this, however.
To answer your question, yes people recover from Panda. But if you are doing SEO right you never had to recover in the first place. SEOmoz has great articles to help you out if you aren't sure where to start.
Cheers.
Ask for a link in exchange for allowing websites to embed your graphic.
Check out Creative Commons licenses and pick the one that works best for you at the following link, and make sure the websites are made aware of the license:
Make the link modest so that the websites don't mind keeping it. Also consider asking that the image links to your website instead of a caption text, or give the website an option to do either.
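A rough sketch of what that embed code might look like (example.com and the file name are placeholders):

<a href="http://www.example.com/"><img src="http://www.example.com/images/infographic.png" alt="Your infographic title" /></a>

Since the link sits in plain HTML rather than JavaScript, Google will pick it up when the host page is crawled.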
Crawl your infographic or widget with a GoogleBot simulator if you aren't sure if Google will pick up the link. Although, you shouldn't have a problem doing what I described.
You can hide a link behind the image, but you really shouldn't be trying to hide links. Just stick with my advice and you should be golden.
Good question! Cheers.
Personally I would make the root domain a catch-all for every term. You don't need separate pages for each phrase you want to rank for. If the entire website is about the same thing, there's no need to create landing pages for different phrases. (Google also frowns upon this)
I would limit the usage of the <abbr> tag to once or twice per abbreviation. You also must realize this is onpage SEO, which is great and it does help, but the majority of your ability to rank for your phrases will come from external SEO.
If you take a look at the Adwords Keyword Tool, you can see some phrases get more traffic than others:
My advice would be to continue with alternating anchor and title tags, but give more preference to phrases that get the most searches. Be sure to include all phrases in the copy of the page as well, but pay mind that the page doesn't look spammy.
Adwords Keyword Tool: <cite>https://adwords.google.com/select/KeywordToolExternal</cite>
Using the <abbr> tag will help. Again, don't overdo it. You will need to take time to think about how you can fit all the acronyms into the content body without making it appear like you did so intentionally.
Proper usage: <abbr title="Football Manager 2012">fm2012</abbr>
It's a waiting game at this point. If they don't find problems, then ask for reinclusion again. Wait 24 hours between asking for reinclusion and seeing if Google reports new problems.
In Firefox you are getting personalized results. Make sure you are logged out of your Google account and try again.
I'm getting the same result (page 4) in both browsers.
Alternatively you may turn off these settings as detailed here:
http://www.google.com/support/websearch/bin/answer.py?hl=en&answer=54048
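If I remember right, you can also append the pws parameter to the search URL to turn off personalization for a single query:

http://www.google.com/search?q=your+query&pws=0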
Atul,
You definitely don't need to contact webmasters to change any links.
There is a 301 redirect flag you can apply to your rewrite rule:
RewriteRule ^(.*)\.html$ $1.php [R=301,L]
Do a 301 redirect test to confirm. Best of luck!
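A quick way to confirm, assuming you have shell access, is to check the response headers with curl:

curl -I http://www.example.com/some-page.html

You should see "HTTP/1.1 301 Moved Permanently" with a Location header pointing at the .php version.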
I ran into an issue with malware once and Google was very responsive during the process. Each time I asked for reinclusion the request was responded to within 24 hours.
I say "each time" because this particular piece of malware infected random files across an entire dedicated server hosting a great deal of websites. After I became aware that the problem was impossible to solve manually, I wrote a script to detect and remove all traces of the malware. At this point it was my 5th request I believe, and there was no problem with Google approving my request.
There are scanners you can use, but when I looked at them I didn't find any reliable free ones. Hopefully you got it all and won't need to pay for anything.
Wonderful people, these malware creators. Best of luck.
Hi Ryan.
The 100-link warning is usually best ignored if you are an eCommerce store. Stores often can't help exceeding the limit with large navigations, sub-menus, and product images/titles. This is fine.
For a blog this is less common. I would be interested in knowing why you are in excess of 100 links. Personally I would work towards lowering the link count because it divides the strength the page gives to each link.
Matt Cutts does not recommend using tag clouds. There are several studies on the matter that prove Matt isn't lying: do a quick Google search to get a few of them. When you think about it, they don't really offer the visitor anything. I would suggest removing the tag cloud.
Disregard page strength, acquire links.
...as long as the page is relevant. The fact that cheaper forms of SEO like article marketing are still effective proves that poor page strength shouldn't deter your efforts so quickly. I use page strength more as an indicator of where acquiring a link should sit on my priority list, assuming the link is possible and the page is relevant.
I understand not wanting to be associated with bad neighborhoods, but page strength is not a strong indicator of a bad neighborhood. One common SEO tactic is to create contests, correct? One recent contest I developed required participants to review a product, and naturally that meant linking to the client's website. Most of these reviews came from personal blogs and general websites that did not have a lot of domain authority. Yet it was still a boost to rankings and domain authority.
Use your own judgement by looking at a website and determining if you want to be associated with it. Don't pick and choose based on the website popularity alone.
Take a look at the Lynx web browser to see how Google would see the website:
Alternatively, check out the Google cache of your website or use a googlebot simulator. You will see that everything in your HTML, including your navigation, is read by Google.
If you are running an eCommerce operation you can safely ignore some of the warnings that SEOmoz shows. For example, many eCommerce stores will have an excess of 100 links due to a large navigation, shopping categories in the sidebar, and product images/titles. The same is true for keyword count.
As long as you aren't overdoing it you are fine. If you have a tennis website, a proper navigation might have "rackets" as the parent category, and then list child categories by color. You should not have the children listed by "Blue tennis rackets," "Yellow tennis rackets," and so on. Duplicating a keyword like this is not necessary.
Should brands be concerned with registering or blocking the .XXX TLDs?
Reputation management is one of your strong suits, or so your profile says. You of all people should know that registering all relevant top level domains is necessary in a situation where a business is serious about their reputation.
Should individuals care about .XXX registration?
Some will be interested, some won't. Not everyone wants to develop a .xxx domain. Maybe I don't understand this question?
Will links from .XXX be helpful?
Matt Cutts has stated on more than one occasion that the domain extension does not matter. Yes.
Will a 301 redirect from a .XXX be harmful?
It would be too easy to register .XXX domains and redirect them to your competitors. No.
The only pages you should be using nofollow on are any pages you don't want to be found in Google. PageRank sculpting is long dead (or so Google says), so you won't be able to hoard your page strength.
Matt Cutts does use nofollow on his website, but only on his RSS feed. I go a step further and also nofollow any cart pages for ecommerce stores and any login/auth pages. Just anything that shouldn't be in search results.
I pretty much let PageRank flow freely throughout my site, and I’d recommend that you do the same. I don’t add nofollow on my category or my archive pages. The only place I deliberately add a nofollow is on the link to my feed, because it’s not super-helpful to have RSS/Atom feeds in web search results. Even that’s not strictly necessary, because Google and other search engines do a good job of distinguishing feeds from regular web pages.
--Matt Cutts
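For the cart and login pages I mentioned, that just looks like this (a sketch; your paths will differ):

<a href="/cart" rel="nofollow">View cart</a>

And for pages that should stay out of search results entirely, a noindex meta tag in the page's head is the more direct tool:

<meta name="robots" content="noindex">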
More importantly, why are you getting a 404 when viewing Site.com/widget? That doesn't make sense. If Site.com/widget/ resolves, so too should Site.com/widget.
The page has authority because it's getting linked to. If someone is linking to your page, they aren't going to add a slash unless they know they have to. You will need to properly set up your rewrite rules so that both URLs resolve correctly.
Once you make that update the page authority will transfer correctly, and yes, it will help.
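A minimal .htaccess sketch, assuming Apache with mod_rewrite (adjust the conditions for your setup):

RewriteEngine On
# 301 extensionless URLs missing the trailing slash, e.g. /widget -> /widget/
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !\.[a-z0-9]+$ [NC]
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]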
Easy: noindex the search results so they don't appear in Google.
I realize you want the traffic and are attempting to skirt around the issue, but it's something Google warns against:
Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.
http://www.google.com/support/webmasters/bin/answer.py?answer=35769
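The noindex itself is just a meta tag in the head of your search results template, something like:

<meta name="robots" content="noindex, follow">

The "follow" keeps Google crawling through the result links while keeping the page itself out of the index.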