Posts made by Chris.Menke
-
RE: What's the most accurate free pr checker?
I don't think one is more accurate than another--I just use whichever Firefox or Chrome plugin looks like it has the most downloads. Toolbar PageRank is not a precise measurement and there's no need to focus heavily on it; it's only updated every few months and is not the actual PR that Google uses in its calculation of search results.
-
RE: Does google have trigger words it does not like?
There used to be a concept of "poison words" floating around that I never really understood or believed had any merit, but I haven't heard talk of it for many years now. Today, each individual word used on your page adjusts the page's relevance slightly, but the breadth of the vocabulary used on it matters more. If a word like "sex" is used several times on the page, and in the title, Google will zero in on that word as topical for that page and will lean towards ranking the page in search results for that term. If it's only used once or twice in the body, it's no big deal--it's not going to impact your rankings for anything.
-
RE: Link from Blogspot.com subdomain...
As far as those stats go, you're seeing that the page/subdomain itself doesn't have any links or authority but the root domain (being blogspot) has a lot of both. The link isn't going to do much for you, but if the page looks like it was made to be a valuable resource for people looking for a product like yours--and not made just to be a backlink for someone else's site--there'd be no harm in acquiring such a link.
-
RE: Are URL suffixes ignored by Google? Or is this duplicate content?
As Lesley says, it's not ignored. If the content is exactly the same on both URLs, you can ask your IT folks to include a rel=canonical directive in the header that sets the canonical version of the content to one specific URL or, if a URL isn't needed, 301 redirect it to the proper URL.
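As a sketch, the rel=canonical directive is a single link element in the duplicate page's head; the domain and path here are placeholders, not from the question:

```html
<!-- In the <head> of the duplicate URL; tells engines which version is canonical -->
<link rel="canonical" href="https://www.example.com/widgets/blue-widget" />
```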
-
RE: Is there any downside to have a product name (branded keyword) that has a top keyword in it?
In general, I'd say that if you are already ranking well for it, don't do it. If you weren't already ranking for it, I might say the same thing. However, if you are doing--and will always do--quality link development (vs. somewhat spammy, or worse), and you've got solid social media engagement going on around the product, and you stay away from using the keyword by itself on your product page and internal linking, it could certainly work in your favor.
-
RE: How do you stop Moz crawling a page?
On http://moz.com/help/pro/rogerbot-crawler Moz answers the question "We are still seeing duplicate content on Moz even though we have marked those pages as 'noindex, follow.' Any idea why?"
Moz is not a search engine index, it uses a crawler. If those pages are not blocked by the robots.txt file, then Moz will crawl them. They ignore the noindex tag because they don't index anything. Search engines will honor the noindex tag and not index a page if you specify with the robots meta tag. However, to remove pages from the crawl, disallow them in the robots.txt or metarobots.
Their answer is not exactly clear, but according to it, no, a meta noindex will not block rogerbot from crawling your page.
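So to keep rogerbot off a page, the block has to go in robots.txt; a minimal sketch (the path is a placeholder) looks like:

```text
# robots.txt -- crawlers like rogerbot obey Disallow rules, not meta noindex
User-agent: rogerbot
Disallow: /duplicate-page/
```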
-
RE: Ranking for Competitive Keywords vs. Less Competitive Keyword Variations
I think the reason is backlink anchor text, and I don't think it has to do with social signals. Exact match anchor text can still be a very strong ranking factor. Take note of the pages in the results--you may see that home pages are showing up for the "real estate" searches but interior pages are showing up for the "real estate variations" searches.
Your logic is off a bit. Just because a page shows up for the higher-tier keyword, doesn't mean it will more easily show up for the lower tier keyword.
-
RE: How To Detect Primary Site With Duplicate Domains?
Go to any site that they may be jointly linking to and evaluate the strength of each of the links pointing to that site. If you see one of the domains being dominant in their backlink data, I'd choose that one.
-
RE: Tips for improving google places page
You want to use a unique phone number for each location, and if you can use separate numbers for the Pasco and Pinellas offices, you'd be better off.
-
RE: Domain Consolidation & Proper Linking Strategy?
If the individual sites don't have any external links pointing to their internal pages and those pages are not getting any search traffic, you could just 301 each domain to the appropriate landing page on the new domain.
-
RE: Domain Consolidation & Proper Linking Strategy?
Hi Alex,
If your individual sites have backlinks going to their internal pages and/or are getting search traffic to internal pages, those pages and the domain's default page should be 301'd to appropriate pages on the new site--the homepage for each individual site, as you say, should go to each gym's new landing page, and the internal pages get redirected to pages on the new domain with information similar to what's on the page being redirected. If all of the old sites have an internal "weight lifting" page, for example, all of those pages would be redirected to a common "weight lifting" page on the new site.
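If the old sites run on Apache, the redirect mapping can be sketched in each old domain's .htaccess like this (the domain names and paths are placeholders, not from the thread):

```apache
# .htaccess on the old gym's domain -- most specific rules first
Redirect 301 /weight-lifting https://www.newdomain.com/downtown-gym/weight-lifting
Redirect 301 / https://www.newdomain.com/downtown-gym/
```

Apache's Redirect directive matches by URL prefix and appends the remainder of the path to the target, so the catch-all `/` rule goes last; any internal page without its own rule lands under the gym's new landing path.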
-
RE: Tips for improving google places page
Noah, you're kind of just scratching the surface as far as what you need to do for your local search visibility. Now that you've got your listing straightened out, you need to start working on your citations and reviews.
I'd recommend you go through this list of Local Citation Building Best-Practices | The Super Rad Whitespark Blog as your next step.
-
RE: Getting listed in the Google local result - help!
Martin,
You haven't done everything until you go through all of these best practices:
http://www.whitespark.ca/blog/post/16-local-citation-building-best-practices
-
RE: My Guest Blog: Still A Good Link Building Resource?
Ruban
Guest posting on blogs can be worthwhile if done with the proper knowledge, time, and research--meaning that it is not a shortcut to building good links. Yes, there are sites like myblogguest that work to broker introductions between writers and publishers, and it is possible to use them to "build links," but believe me, the SEO knowledge, time, and research that you're going to have to put into it is more than what you're thinking it's going to be.
The reason is that the technique is so overrun with low quality content being offered and accepted by low quality sites that finding a gem--and knowing that you've found one--takes a lot of digging and a good amount of knowledge of what makes a good guest post opportunity.
If you're just starting out on your blog and on your SEO knowledge and you move full speed into the guest posting strategy, you're likely going to populate your backlink profile with low quality links and not really be aware of it. Currently, you've got about 14 domains linking to your site and a fair percentage of those already aren't of substantial quality. A few guest posts probably aren't going to hurt you, but unless you can choose the best opportunities, they're not likely to help much, either.
Rand did an interesting post on his blog hypothesizing that "If the first 500 links that point to your site aren't authentically earned, you're screwed" that's worth a read--and the comments.
Another good one is Eric Enge's interview with Matt Cutts, which touches on the issue of guest posts: http://www.stonetemple.com/link-building-is-not-illegal-or-bad/
-
RE: Trying to advise on what seems to be a duplicate content penalty
You know I'm all about SEO and originality...
-
RE: Trying to advise on what seems to be a duplicate content penalty
It could have to do with dupe content--in combination with relatively low quality directory back links. They really need to work on diversifying their back link profile and increasing the quality of their links--not that it's all that spammy but there's not much there that's telling google that others on the web are voting for it. They really need to work the social media/content channels. Have you done a comparison between them and some of their competitors in OSE?
-
RE: How do I find a great SEO Mentor?
Hi Daniel,
You're welcome to PM me with your questions.
-
RE: Authorship Photo Not showing in for last 6 months now
Actually, I'm going to revise that answer. When google provides the "Search for similar searches" option on the results page, such as with this search, it will show terms that were used in the query but were not found on the page as struck through.
-
RE: Authorship Photo Not showing in for last 6 months now
Gagan,
I'm going to guess that the search you did to get the results you show in the images had the words that are struck out used in quotation marks. Google will provide results for a search query with multiple words each used in quotations even if there are no pages that include all those words. It represents those results by showing the quoted words that are not on the page with a strike through.
-
RE: Should I offer our free eBook on its own page?
Having the ebook download conversion and the product sale conversion on the same page puts the two objectives in competition with each other and can make it hard to segment your visitors.
Link building to the ebook conversion page will certainly help the authority of your product conversion page and entire domain. In fact, an ebook page is the basis of many successful ecommerce link building strategies.
-
RE: Blogger to Wordpress guide on moving
xoffie,
This is a good, up-to-date guide on moving from Blogger to a custom domain. It will walk you through the main hurdles as you make the move: www.wpbeginner.com/wp-tutorials/how-to-switch-from-blogger-to-wordpress-without-losing-google-rankings/ What that one doesn't cover, this one does: http://blogger2wordpress.com/
There's a plugin that will 301 redirect traffic from your Blogger account to your new wordpress site: http://wordpress.org/plugins/seo-blogger-to-wordpress-301-redirector/faq/
-
RE: Massive URL blockage by robots.txt
Le Fras,
You don't have to change the robots.txt file for Google to indicate that more URLs are being blocked by it. The robots.txt file tells the search engines not to crawl the given URLs, but they may keep those URLs in the index and display them in the search results.
So the search engines do know of the URLs that are being blocked, and they are able to report that more are being blocked as you add pages to your site that are restricted by the robots.txt file.
-
RE: URL removals
Be sure you read through this too: Redirection - SEO Best Practices - Moz
-
RE: URL removals
Yes, you could do that. But again, why not rel=canonical all of them to the same page B is being canonicalized to?
-
RE: URL removals
If you want the pages to remain in place, the best way is to rel=canonicalize one to the other. Here's Moz's guide for that.
-
RE: Is having two blogs bad?
Have you identified how many of those 200 visits are converting in some way (unique phone number being tracked or referral visits from there to your domain)? You could keep what you've got on blogger and stop blogging there, delete the content that's been duplicated on your domain and just continue blogging from your domain. The value of moving the content already hosted on blogger may be minimal.
Your law firm is going to look better to potential clients if those prospects find content on the law firm's domain rather than on a freebie alternative.
-
RE: Any help on best practices to move blog domain?
Laura,
Wordpress.com has an Offsite Redirect service that you can purchase for something like $12 per year, and it provides a 301 redirect to your new site. If you purchase that service and upload your backup to your new domain, you'll be in good shape on the duplicate content issue.
If you just delete the old site and don't pay for the redirect service, there will likely be overlap in Google's index of identical content from two different domains, and unless you get a lot of good links going to your new site immediately, your old site would probably be recognized as the authoritative one for the first 4-6 months or so.
-
RE: Feefo review links
Never heard of Feefo before, but looking at a couple of representative companies listed there, I didn't see any sitewide links pointing at their sites.
From what I did see, it does appear that a company's first Feefo review page is being indexed by Google (good), but the following ones are not (not so good). Since new reviews are added at the beginning of the list (top of the first page) and older ones get pushed down and off to subsequent pages after about 25, the links on the first page will change as reviews are added (not ideal), and the value of the page is very low (not so good), so the links there won't pass on much value. Also, keep in mind that Google only counts the first link on a specific page to a specific page, so if that first review page has five or ten reviews of a single one of your products, each linking to that product with the same anchor text, only the first one is going to pass any link juice (good in this case).
So, in my quick review, the max number of links you can get from them would be one link per product--up to a total of 25 products, so long as you don't get more than one review per product. Total algorithmic value/harm to your site: negligible.
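To illustrate the "first link counts" point above, here's a hypothetical sketch (my own illustration, not Google's code) of de-duplicating same-target links in page order:

```python
# Hypothetical sketch of the "first anchor counts" idea -- not Google's
# actual algorithm. Given (target_url, anchor_text) pairs in page order,
# keep only the first anchor seen for each target URL.
def first_anchor_per_target(links):
    counted = {}
    for target, anchor in links:
        if target not in counted:
            counted[target] = anchor  # later links to the same target are ignored
    return counted

# Ten reviews on one Feefo-style page, all linking to the same product URL:
reviews = [("https://example.com/widget", "Blue Widget")] * 10
print(first_anchor_per_target(reviews))
# → {'https://example.com/widget': 'Blue Widget'}  (one counted link, not ten)
```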
-
RE: Am I doing enough to rid duplicate content?
Heather,
First things: 1. Are they still driving traffic? 2. Rel=canonicals are supposed to be used on identical pages or on a page whose content is a subset of the canonical version.
Those pages are very thin content and I certainly wouldn't leave them as they are. If they're still driving traffic, I'd keep them, but for fear of Panda, I'd 302 them to the main pages while I work steadily on putting real content on them, and then remove the redirects as the content goes on.
If they're not still driving traffic, it seems to me that it wouldn't be very hard to justify their removal (or 301 redirection to their main pages). Panda is a tough penalty and you don't want to get caught in that.
-
RE: Google Analytics Code
Hamid,
Be sure that you have installed your google analytics code on your PPC landing page(s) and be sure that you're not using two different google accounts--one for the code you're looking at for your organic results and code from another account set up for PPC. I've seen both of these things happen before.
-
RE: On Page Local SEO
Sean,
Robert Fisher had some great info on this that I think is going to help you. http://moz.com/community/q/keyword-research-how-best-to-target-keywords-without-using-a-region-as-part-of-the-search-query
For advanced on-page ideas on how to hack the 7 pack, check out Dr. Pete's post, http://moz.com/blog/how-does-google-count-local-results
-
RE: Sub-domains or sub-directories for country-specific versions of the site?
There may be slightly more reason to lean towards using subdirectories, but each case is different. Have you looked these over yet:
-
RE: Why does my aged Yahoo Directory listing not show up as link in OpenSiteExplorer?
It's possible that OSE never crawls that deeply into the directory pages of your niche--OSE doesn't get to every page on the web. In the end, however, it's not what OSE shows, it's whether or not Google knows the link is there and whether Google assigns any value to it. Essentially, a YDir link is a paid link and link juice shouldn't be expected from it. However, you can do a google search on the url (minus the "http://") or a site:dir.yahoo.com search for your company name and see if google's indexed that page.
-
RE: Why does my aged Yahoo Directory listing not show up as link in OpenSiteExplorer?
Bizzer,
It can take a long time before OSE finds a listing in the Yahoo directory--like 6 months or more. As far as the value of that listing goes, my opinion is that your $300 is better spent developing content for your site and finding a few well-curated niche directories in which to place your links.
-
RE: Site Launching, not SEO Ready
Since I'd guess you're only talking about a matter of days or a few weeks, I really don't think it matters, so I would lean towards getting it indexed as early as possible and dealing with the SEO once the site is "live".
-
RE: Rel Canonical and Moz Crawl
Sara, if you're 301ing a page that also includes a rel=canonical directive, the rel=canonical will not be seen because the server redirects the user before they get to the page. If the pages are identical, you can rel=canonical one to the other and both URLs will be available to the user. A 301 will prevent a user from landing on that page.
-
RE: Effect SERP's internal 301 redirects?
wellness,
You'll want to use 301 redirects from the old pages to the new pages to tell search engines that the values assigned to the old pages should now be assigned to the new ones. Here's the info you need to do that:
-
RE: How many keywords?
Your on-page factors provide relevance to the keyword topic, but they're not what gives your site/page the oomph required to rank above other sites. (It used to be that "keyword dense" pages were all that was needed, which is what led to pages with more keywords than information for the visitor.) Off-site authority building is what lifts your site above other strong competitors.
Part of SEO is understanding where to put your efforts for the best impact with the available time/budget/knowledge resources. To do that, you need to know what the marketing budget is, what the SEO skill level is, what the client's goals are, the competitiveness of each potential keyword, the time frame in which the client needs to see a return on their investment, and the strength of the page/site being worked on.
For a one-man band, the more narrowly you define the project and the client's expectations, the better off you're going to be--especially early on.
-
RE: How many keywords?
"In a competitive space" as you say, is the key factor. : )
-
RE: How many keywords?
In the planning phase, ballpark a page of content optimized on-page and off for each keyword, and one page per product. Your results will differ widely based on a number of criteria. Before moving forward, I recommend digesting the following:
-
RE: How many keywords?
Sure, it's possible, just not for one page. 1-7 keywords per page is a good estimate (perhaps a bit high), but a site can target hundreds, even thousands or tens of thousands, with enough pages and enough effort. And you're right: the more keywords being targeted, the more man-hours will have to be put into it.
-
RE: Should I Do a Social Bookmarking Campaign and a Tier 2 Linking?
Zoran,
Looks like you picked up a copy of 1996's Backpacker's Guide to Linkbuilding. As Ricky said, that stuff's not worth much today. Basically, the more control you have over a link, the less it is worth--with bookmarks at the low end of the scale and editorial links from authoritative resources at the other end. Rather than spending your time at the low end, work on building out your social media profiles and figuring out who your audience is so you can publish content for them and announce that content through your social channels.
-
RE: How do I execute the following strategy.......on my own!?
Rob,
- Start with Moz's SEO guide for understanding your SEO.
- Use Google's Adwords Keyword Planner for your keyword research
- Read through these posts to bring you up to speed on building your social media presence
- Open Site Explorer will allow you to benchmark on-page and off-page ranking factors for your competitors. You can do them one at a time or compare them against each other.
- Google Alerts is a free place to start for monitoring your competitors and here are some others to look at
I think once you get through all that, you'll have a basic foundation for moving forward on your new site's search marketing.
-
RE: How to ask Google to remove old pages that don't exist
Mayank,
If, when you type in the URLs, the server gives a response of 404 (page not found), then they will eventually drop out of the index. You may also use the Google Webmaster Tools Remove URLs tool (under the "Google Index" link) to request removal of the URLs.
-
RE: Finding authoritative sources on Google+ - is there a tool?
Steve, here are a few to check out:
- CircleCount www.circlecount.com/ track your followers and analyze your shares. See how many followers you've gained over time.
- FindPeopleonPlus www.findpeopleonplus.com/ research, outreach, and link building. Sort by keywords, profession, country, and more.
- SharedCount API www.sharedcount.com/documentation.php combined statistics of Google+, Twitter, Facebook, and more; the SharedCount API puts a ton of social data at your fingertips.
- Google+ Ripples To view Ripples for a public post in your stream, just click the dropdown arrow at the top of the post you’re curious about and click View Ripples. Google+ Ripples creates an interactive graphic of the public shares of any public post or URL on Google+ to show you how it has rippled through the network and help you discover new and interesting people to follow.
- All my + http://www.allmyplus.com/ is a useful tool for evaluating the activity and engagement of specific profiles
-
RE: When You Add a Robots.txt file to a website to block certain URLs, do they disappear from Google's index?
William, if the pages in question are linked to from external resources, the robots.txt file will not prevent them from appearing in the index. Per Moz's Robots.txt and Meta Robots best practices, "the robots.txt tells the engines not to crawl the given URL, but that they may keep the page in the index and display it in results."
To prevent all robots from indexing a page on your site, place the following meta tag into the head section of your page: <meta name="robots" content="noindex">
-
RE: Is a press release a bad idea?
Yeah, but what you're paying the provider for is the sure knowledge that they're going to spam your press release. On Craigslist you wouldn't even be sure your $500 would go to do even that.