Posts made by JoeAmadon
-
RE: How accurate is Open Site Explorer?
I can't speak to how thorough OSE is. I know they don't find every link, but in my experience, they seemed to find most of the strong links. I can suggest Majestic SEO for a very thorough, but not free, backlink profile.
-
RE: On Page Optimization vs. Anchor Text
Pages can definitely rank for multiple terms, so having anchor text around one term does not necessarily mean you can't rank well for another term (Lots of pages rank for terms other than "click here" despite the majority of their links having that anchor text). Standard practice is to only target one main keyword per page, but that doesn't mean you can't make it work for a page to rank for a couple different major terms.
The main question I have is: how different is the term you're going after from the anchor text you're getting? I don't believe anchor text is an exact-match-or-nothing proposition. If the anchor text is topically relevant to the terms you're targeting, you're in good shape. If that isn't the case, why do you have so many links like these pointing to that page?
-
RE: Are Guest Post Anchor Text Links Evil?
I could be wrong, but from what I remember of the webinar, he was referring to paid text links in her posts, which is manipulative. I believe the emails only offered cash in exchange for links.
But to the point of whether text links in a guest post are manipulative, I think it depends on the relevancy of what is being linked to. Would a user actually want to go to that site from the post they are on? I'm guessing not many people would want to go to a car insurance site from a travel blog.
I think blog owners need to remember that whether they've written the content or not, if they are putting it on their site, they are vouching for it. Your credibility with your readers and with search engines is on the line. So the blog owner should still take responsibility for making sure the links in a guest post go to reputable sites.
-
RE: It has been recommended that we remove the number of links in our footer, should we?
Pruning links from page templates (header, footer, etc.) is generally a good idea if they don't go to important pages. As Albin suggests, listen to the data. What are users not clicking on?
I can see some of these as not being needed in the footer. If your store pages are used as navigation, these are redundant, unless users like using them in the footer. Pages like "returns" and "order tracking" probably aren't making you a lot of money, and can still be easily found from a customer service page that is linked to from the footer. This way users can still find what they need, but you only devote one link instead of four or five.
I don't think removing a handful of links from the footer will diminish the look of the site or the user experience significantly.
-
RE: Two URLs with same content
A 301 redirect is the best solution, as it will point both users and bots to what will then be the only source of the content. A rel="canonical" tag will tell search engines which page is the canonical version, but you will still have users hitting pages on the old domain and potentially creating links to those pages instead of to your new subdomains, which isn't ideal.
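For illustration, here's a minimal sketch of both approaches (the domain and file names are made up):

# .htaccess on the old domain -- a 301 pointing a page at its new home
Redirect 301 /old-page.html http://sub.newdomain.com/old-page.html

<!-- Or the canonical tag, placed in the <head> of the duplicate page -->
<link rel="canonical" href="http://sub.newdomain.com/old-page.html" />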
-
RE: Is there a good tool for finding the outbound links on a domain?
This probably isn't quite what you're looking for, but Bing allows you to do a search that gives these results:
LinkFromDomain:domain.com
I want to say Danny Sullivan mentioned this in a semi-recent post, which is where I'd look for more information.
It won't be the cleanest list to work with, but it's the only thing I know of that will give you this information.
-
RE: Managing Large Regulated or Required Duplicate Content Blocks
Is there anything keeping you from putting the ISI content in an iframe? Search engines don't identify the content in the iframe as part of the page.
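As a rough sketch, assuming the ISI block lives at its own URL (the path and dimensions here are hypothetical):

<iframe src="/legal/isi-block.html" width="600" height="300" title="Important Safety Information"></iframe>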
-
RE: CMS Preferences: Drupal or Joomla!?
Thanks for the response, Ryan. I'm leaning towards Joomla! as they seem to have more options for each feature and a more supportive community, which I will certainly need.
-
CMS Preferences: Drupal or Joomla!?
I already have some experience with WordPress, but for a new site I want to use a different CMS. Anyone care to weigh in on the pros and cons of Drupal or Joomla!?
Thanks,
Joe
-
RE: How Can a Page Have More Unique Pageviews than Total Pageviews?
Magic. Obviously.
I can't give you a definite answer, but I imagine the two data points are being pulled in two different ways. Perhaps unique pageviews are counted from requests for a URL, while total pageviews come from the tracking code, and the code sometimes fails to fire because it's installed incorrectly or the user hits the browser's back button before the page finishes loading.
Which tool are you using?
-
RE: Like my Facebook Page or my Facebook URL?
I agree with Ryan that a Like Button on your homepage is best utilized by that user "liking" your FB page, as this is likely a vote on their part for your brand.
However, for deeper pages, like a product page or a blog post, I think it makes more sense to point the Like at that specific URL, since that's what the user is most likely "liking", and this should help that page rank better. Others would disagree with this advice, though; I believe Ian Lurie from Portent has the Like button on his blog posts pointing to his FB page to build a stronger following there.
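For reference, the iframe version of the Like button takes its target in the href parameter, so pointing it at the post itself looks roughly like this (the URL is a placeholder; check Facebook's developer docs for the current parameters):

<iframe src="http://www.facebook.com/plugins/like.php?href=http%3A%2F%2Fwww.example.com%2Fblog%2Fmy-post"
        scrolling="no" frameborder="0" style="border:none; width:450px; height:35px;"></iframe>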
-
RE: Googles Now Giving More Power To Ranked #1?
These are called sitelinks and Google derives them from your site navigation. From Google Webmaster Tools Help:
"The links shown below some sites in our search results, called sitelinks, are meant to help users navigate your site. Our systems analyze the link structure of your site to find shortcuts that will save users time and allow them to quickly find the information they're looking for."
-
RE: How do 301 redirects affect rankings?
You might still see the red-shoes URL for a brief period, but it will eventually disappear from the rankings. While the brown-boots page will receive most of the link juice from the 301, on-page ranking factors will diminish that page's ability to rank for "red shoes". This is one of the reasons you want your 301 redirects to point to a page with similar content. The redirect should help you rank for terms targeted in the on-page content of the brown-boots page by increasing the link juice passed, but the anchor text of the old links might not always be relevant.
-
RE: Changes in Sitemap Indexation in GWT?
I take the same approach, manually dumping the numbers into Excel on a regular basis. I'll usually increase the frequency if we are making changes that should affect indexation.
-
RE: Changes in Sitemap Indexation in GWT?
Sorry I forgot to specify that this has been over the past week.
Thanks for the reply, Ryan.
-
Changes in Sitemap Indexation in GWT?
I've noticed some significant changes in the number and percentage of indexed URLs for the sitemaps we've been submitting to Google. I've been tracking these numbers directly from Google Webmaster Tools > Site Configuration > Sitemaps. We've made some changes on our end that could explain what we're seeing, but I want to confirm this wasn't just a change in the way Google reports indexation.
Has anyone else noticed major changes, greater than a 30% change, in the indexation of your sitemaps in the past week?
Thanks,
Joe
-
RE: How to avoid duplicate content on ecommerce pages?
Sorry I didn't make that a little clearer. But, yes, it sounds like you understand. Only serve one product page. List the product page in whatever categories seem appropriate.
As to which page to serve, I actually like the URL www.site.com/home-gym-equipment/body-solid-g9s.html as it provides additional keywords as opposed to just repeating keywords. But if Body Solid is a popular enough brand where lots of people shop specifically for that brand, then you might want to go with www.site.com/body-solid/body-solid-g9s.html.
-
RE: How to avoid duplicate content on ecommerce pages?
If you are worried about the category pages having duplicate content, that shouldn't be a big concern, as most of the content should be unique from one page to the next. Some content is almost always going to be repeated to some extent across a website. You need to make sure there is enough unique content on the page to differentiate it as The page on your site for that keyword. Having multiple navigation paths to the same product is going to increase conversion because it makes it easier for customers to find what they want, so I would not shy away from that.
I believe the problem is how you serve the product page for products that are cross-listed. I don't know how your product pages are created, but you won't want more than one URL for any one product. If you use breadcrumbs, you'll have to choose which set of breadcrumbs users see, even if they came to the page via a different path. If any other content is dynamically created based on the category a product is listed in, you'll have to make sure the one page you serve adequately matches what the customer would expect based on their other experiences with your site.
-
RE: How Long For New Keyword Rank Data?
I believe you will have to wait until Friday. Another option, depending on the size of your keyword list, is Rank Checker. It can check lists of keywords for you and returns data that seems to be only a day or two old.
-
RE: Major Website Migration Recovery Ideas?
Neil, after reading through your situation, I think it's remarkable that you've been able to sustain these changes as you have. You could make a great case study out of this.
Taking care of the 301 redirect chains is definitely a good step. Other things to consider include the Panda update, but your content looks really solid. I also considered Google's ever-growing preference for big brands, which could be having an effect, but it looks like you've done everything you can on the site to signal a legitimate brand.
When I started looking a little closer at the site, it seems like the word "travel" has almost intentionally been avoided. I actually like that you don't overuse the term, but possibly using it a little more could help signal to search engines the types of search queries you are relevant for.
Another on site enhancement I'd suggest is re-examining the site's navigation and internal linking structure. For instance, the SEOmoz analysis tool shows almost 5000 internal links on the homepage. When I look at the source code, I don't see nearly that many. But you might want to look into how search engine bots are crawling the mega dropdowns to make sure your internal link juice is flowing to the pages you want it to.
Unfortunately, I can't offer much more advice than that. While I don't think search engines are going to be distrustful of your site for the moves, the 301s still lose some link juice. And I'm a believer that SERPs are somewhat self-reinforcing (i.e. getting good rankings will make it easier for you to get more natural backlinks as you're getting more traffic), so time spent in search engine oblivion is always going to be tough to come back from.
Best of luck.
-
RE: How do you determine a quality link?
In my opinion, these sorts of blog comment links are pretty low quality. A handful of them shouldn't hurt you, but don't expect them to bring a lot of value. It's a pretty common practice because it's such an easy link to acquire, and that's often one of the things to consider when evaluating the quality of a link: how difficult is it to acquire a similar one? Some easy links will be of decent quality, but not many. Other things to consider when evaluating the quality of a link are:
Page Authority.
Number of outbound links on the page. The more links leaving a page, the more diluted the link juice will be.
Relevancy. Would it ever make sense for a user to travel from that page to your page and be satisfied with the experience? Users don't have to use the links, but this is a good way to judge topical relevancy.
Anchor text. The more descriptive the anchor text, the better. Expect poor anchor text ("click here") on some links, but good anchor text is a definite bonus.
-
RE: Why would the PageRank for all of our websites show the same?
Yes. Every tool I know of that reports PR pulls from toolbar PR.
-
RE: Why would the PageRank for all of our websites show the same?
The server shouldn't be an issue at all. Toolbar PR is very unreliable, and Google is going to stop supporting it soon. I wouldn't worry about it at all. Domain Authority and Page Authority from the SEOmoz tools are much more reliable.
-
RE: How often do you refresh meta descriptions? And does refreshing meta descriptions help in ranking?
There is no connection between meta descriptions and rankings, so refreshing the content in the meta description will not help your rankings at all. Meta descriptions are typically used as the text below the page title in the SERP, though, so a good one can help click-through rate. Write your meta descriptions for users, and only change the content if you believe you can create better copy that will drive more click-throughs.
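For anyone newer to this, the tag itself is simple; the value is in the copy (the copy here is invented):

<meta name="description" content="Compare our full line of trail boots, with free shipping on orders over $50." />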
-
RE: How to choose keywords for a small, local business
Go with Option B. Optimizing for "toyota" or "used cars" is going to be super competitive, and most of the people performing those searches aren't likely to be relevant to a brick-and-mortar location. Targeting the more specific phrases will make it easier to rank, and the traffic the site gets will be more likely to convert. Keyword selection requires that balance of search volume and specificity. "toyota anytown", "used cars anytown", and "used toyotas anytown" should be a good place to start.
And if you haven't already, you'll want to get the business listed in Google Places.
-
RE: Is URL appearance defined by crawling or by XML sitemap
Google is going to visit the pages submitted in the sitemap and see that they serve a 301 response code, which Google doesn't want to see in sitemaps. Either find a way to create a sitemap with the URLs you want to use (this is what I'd do) or shorten your URLs so they work with your sitemapping solution (though it's not a good idea to change URL structure because of a software limitation).
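A sketch of what the sitemap entries should look like, listing the final URLs (the ones that return a 200) rather than redirecting ones; example.com is a placeholder:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/category/long-but-final-url/</loc>
  </url>
</urlset>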
-
RE: What is the ideal range of keyword density?
I think the idea of an optimal keyword density is a bit out of date, but I believe, historically, Google preferred 4-6%, while Yahoo and MSN liked a 6-9% range. (For perspective, 6% on a 500-word page means using the keyword 30 times, which already reads as stuffed.) These days natural writing is the best way to go, as search engines have gotten much better at identifying unnatural writing patterns and at factoring in the words that are typically associated with, and expected to appear in, good writing about a topic/keyword.
-
RE: Keyword stuffing
Looking at the page, the use of "Berghaus" seems fairly natural; most of the repetition comes from product names used as anchor text to the products, which is legitimate. I think your best move is to expand the content blurb, which should dilute the keyword density for "Berghaus".
-
RE: Multiple h1 headers in a slideshow
Best practice is a single <h1> per page, using CSS to handle the styling.
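A rough sketch of what I mean for a slideshow (the class names are made up):

<style>
  .slide-title { font-size: 2em; font-weight: bold; }  /* styled to look like a main heading */
</style>
<h1>Summer Collection Slideshow</h1>
<div class="slideshow">
  <h2 class="slide-title">Slide One Heading</h2>
  <h2 class="slide-title">Slide Two Heading</h2>
</div>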
-
RE: Will 301 redirecting a site multiple times still preserve the original site value?
I don't have experience redirecting entire sites to other sites, but when it comes to 301 redirects you want to avoid redirect chains (abc.com redirects to def.com redirects to ghi.com, etc.). This is a pattern too common among malicious sites, so search engines will tend to see it as a negative signal. Avoid this while still passing as much link juice as possible by editing the redirect on abc.com to send traffic directly to ghi.com.
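In .htaccess terms, the fix is to collapse the chain so the first domain skips the middle hop (domains are from the example above):

# On abc.com -- redirect straight to the final destination, not to def.com
Redirect 301 / http://www.ghi.com/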
-
RE: Search Result Page, Index or Not?
Changing the URL as you've done should help, but if it is still a search results page, Google won't want to show it, even though they don't put a lot into identifying internal search landing pages. So you should be fine with what you've done. If you feel these pages are really helpful and you'd like to land users on them, an improvement might be to create navigation to http://www.mysite.com/travelguide/attraction-guide.html and similar pages and redirect internal searches for relevant keywords to that page. This way the page is more than a search results page.
-
RE: Site Usage Statistics and organic ranking
I don't doubt that search engines use bounces back to SERP as a ranking factor. But I'd be hesitant to attribute the statistics you're reporting as a result of that. What I think is a stronger factor, although it's tough to determine with what you've shared, is that the internal linking from the homepage to "content" pages is getting bots to those pages so they actually see the content that exists on your site. If what I suspect is true, I imagine you'd see that a good portion of the new traffic is entering on pages other than your homepage.
-
RE: How to use Schema.org for product listings
Walid,
I've never seen the markup on a subcategory page. All the documentation I'm aware of suggests they only want to see one product per page with markup, so a subcategory page wouldn't work. It might be possible to add the markup to whatever product information exists on the subcategory page and specify the product's URL instead of the subcategory page's URL, but I'm not sure whether any search engine would accept this or whether there is any advantage over having the markup on the product page. If you go this route, I'd be interested to hear how effective it was.
I'm not allowed to discuss what we actually do at Overstock, but semantic markup is viewable in the source code.
-
RE: How to use Schema.org for product listings
I'm not sure what your exact question is. If you simply want to know whether to mark up your product pages or your subcategory pages, add the schema.org markup to your product pages. The information that can be used to describe products can be found here: http://schema.org/Product. Unfortunately, the schema.org site doesn't offer much in the way of examples, and although the rich snippet testing tool is supposed to work for schema.org data, there were bugs early on; I still haven't heard whether those have been fixed. Please let me know if that doesn't answer your question.
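As a sketch of what the microdata can look like on a product page, using properties from http://schema.org/Product (the product details are invented):

<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Body-Solid G9S Home Gym</span>
  <img itemprop="image" src="/images/body-solid-g9s.jpg" alt="Body-Solid G9S" />
  <span itemprop="description">Multi-station home gym with leg press.</span>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price">$2,499.00</span>
  </div>
</div>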
-
RE: How to Implement Massive SEO Modifications
Considering how massive the changes are, I'd say it's best to do them all at once. This will let you start rebuilding as soon as possible. Making one big change, waiting to start ranking again, and then making another big change that could drop the site out of the rankings again would likely mean a longer period of your client not getting traffic. That said, the on-page and metadata changes don't need to be made at the same time if resources are limited.
One problem with doing this all at once is that it will be more difficult to evaluate the effect of each change. This might not be a huge deal to you, but sometimes it's nice to know what return came from each change.
-
RE: Links to commercial pages vs resources.
I understand the concern with having pages that don't convert as well outranking your "more important" pages. Fortunately, this is probably one of the better problems you could have. There are two things you can do to take advantage of this situation, assuming these pages exist in subfolders of your site and not as a subdomain or separate site.
First option: optimize the internal links on your resource pages to direct users and bots to your homepage and to the commercial pages relevant to each resource page's content. The fewer links off these pages, the better, so you don't dilute the PR flowing to the pages you want to rank.
A second option is to move the content to your commercial pages, assuming the information makes sense there, and 301 the resource page to the commercial page. I imagine this will be less ideal for users, depending on your site and content, but it should completely avoid the problem you're concerned about. I would only go this route if option 1 doesn't work and the traffic entering your resource pages isn't converting.
And, although I don't know if search engines consider it at all, you have the option of designating a page's priority relative to the rest of your site in the sitemap. So you could set your resource pages' priority to 0.3 and your commercial pages to 0.7. But again, I don't know whether search engines actually use this.
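For what it's worth, those priority values would look like this in the sitemap, inside the usual <urlset> wrapper (the URLs are placeholders):

<url>
  <loc>http://www.example.com/resources/buying-guide.html</loc>
  <priority>0.3</priority>
</url>
<url>
  <loc>http://www.example.com/products/widget.html</loc>
  <priority>0.7</priority>
</url>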
-
RE: My google rankings went down
Movement from position 1 to position 4 is not that unusual. I imagine this has caused a substantial drop in traffic, as CTR tends to drop sharply after position 3, but I wouldn't consider it a dramatic ranking drop caused by problems with your backlinks. It is possible that Google has devalued some backlinks, causing a small drop.
More likely this is caused by a Panda update. Panda wasn't a one-time update, and this Search Engine Watch article suggests an update around the time of your drop (http://searchenginewatch.com/article/2080631/Google-Quietly-Launches-Panda-Update-Version-2.2). This probably means Google sees your content as a little weaker than your competitors'.
Based on this, I don't think you need to make any major changes. Try to get some more good content up on your site. And I'm not sure how active you are in the social realm, but more social engagement (I suspect Google is a big fan of +1's) could help.
I'm not sure if your on-page efforts are also limited to two keywords, but if so, I'd consider expanding that a bit. Diversification will make your traffic and revenue less susceptible to fluctuations like this. Moving into social media will also give you another way to generate traffic that isn't reliant on SEO, while usually amplifying your SEO efforts as well.
-
RE: My google rankings went down
If it were me, I would shift the spend from linkbuilding to an SEO firm that will not only provide a good SEO audit, but is willing to provide some education as well.
I'd also focus on putting together a few really great pieces of content related to your niche. While researching that content, make note of other good sites and blogs your site aligns with, so you can engage with them. Hopefully, through this engagement, they'll find the content you've created helpful to their readers and link to your site. This is a pretty conservative approach that doesn't yield quick results, but I tend to favor building for the long term.
There are a lot of ways you can start to generate more natural links to your site. I'd recommend doing some research on SEOmoz because my philosophy isn't the only way to go. You've got to figure out what works for your situation.
-
RE: My google rankings went down
Yes, the links could be hurting your rankings, but as Theo points out, a number of things could be causing your rankings to drop. Have you received any notices in Google Webmaster Tools about possible unnatural linking detected? Do you handle your own SEO or have a consultant? Are you relying on the company doing linkbuilding for you to provide you with your SEO analysis? Linkbuilding can sometimes be harmful to the health of your website, so it's a good idea to have someone knowledgeable about SEO to keep an eye on the incoming links you're getting to make sure they're good quality. Or you can avoid this sort of unnatural linkbuilding altogether. Depends on the level of risk you're willing to accept.
-
RE: Multiple anchor text links
You won't lose PR.
This is something that is most likely filtered out by search engines after a crawl. SEOmoz has probably just chosen not to implement such a filter.
-
RE: Multiple anchor text links
I agree with Russ that the second link won't hurt you. I believe that the "first link is the only one that counts" rule refers to the anchor text and not necessarily the passing of link juice.
-
RE: Up-to-date list of search engine bot user agents
Thanks, Keri. I found http://www.botsvsbrowsers.com/ to be very helpful. I'm looking for all the bots from the "major", U.S. search engines (Google, Bing, Ask, AOL, Blekko).
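For anyone who finds this later, the two strings I'm fairly confident about are below; verify them against current documentation, since the others on that list change more often:

Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)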
-
Up-to-date list of search engine bot user agents
Does anyone know of an up-to-date list of search engine bot user agents?
Thanks.
-
RE: Getting page cached
I haven't seen any documentation from Google supporting this, but in my experience it has been successful. I haven't run any significant tests to back it up; it has just worked for me, so I'm hoping it works for others. That said, Fetch as Googlebot was still in Labs when I saw this working, so I'm not sure if anything has changed.
-
RE: Getting page cached
I've heard from others that this can be very effective and I've seen good results getting pages cached quickly (a couple of days) after using it. I've used this very sparingly, so I don't know the period for the allotment. I also used this when it was in the Google Labs. They may have made a few changes when they brought it out of the lab.
-
RE: Getting page cached
Go into Webmaster Tools > Diagnostics > Fetch as Googlebot. Enter the URL for the page you want crawled.
-
RE: Linking out?
You should be fine linking out to a site that links back to you when there is a reason to do so (relevant information, helpful to the user, etc.). Cyrus covered this very well in the latest Whiteboard Friday.
http://www.seomoz.org/blog/external-linking-good-for-seo-whiteboard-friday
-
RE: Website Content
There is absolutely nothing wrong with that, as long as I'm understanding your situation correctly. It makes little difference to search engines whether your pages are HTML or XHTML. Running them through a validator is a good idea, though.
-
RE: Is it necessary to optimize every page of a site
I think it's fine to focus on your few most important pages, but I would take the time to make sure the internal linking and navigation are optimized across the entire site, or at least the majority of it. Letting bots crawl all your pages is the first step; then focus your optimization where you'll get the most return.