Exchanged links are generally devalued. I would not recommend them unless you actually expect to receive traffic from the link.
Posts made by RyanKent
-
RE: Link exchange
-
RE: How would you handle network header links?
Some companies have a lot of sites covering various topics, for example, http://ninemsn.com.au/
The "100 links per page" guideline is a very old rule from the days when crawling web pages was much less sophisticated than it is now. Search engines today can easily crawl hundreds of links on a page.
The example you are using is a site with a DA of 85 that is in the top 1000 most trafficked sites in the world. This site can support the crawling of hundreds of links. Unless your site has an exceptionally high DA, you probably want to reduce your links to the minimum number necessary to ensure a quality user experience. As you examine your links, determine which links are actually used. There are tools such as CrazyEgg that can help you evaluate your site.
Imagine a site with 1000 pages and a link to all 1000 pages from the navigation bar. What you are telling search engines and users is that all 1000 pages are equally important. That probably is not the case. The most important pages and categories should have a link, but the lesser pages would require an additional click.
Should these headers be implemented in javascript?
Search engines can crawl most JavaScript. The best practice would be to reduce the links as mentioned above. Search engines reward sites that improve the user experience with higher rankings. If you offer 200 links and 150 of them are never used, you are bleeding PR and your site will not rank as well overall, and that is by design.
I'd prefer to reduce the number of links, but sometimes company policies don't allow this.
Your role as an SEO is to educate the company on the importance of the changes you recommend. If a company refuses to implement your recommendations, then there is not much you can do about it.
-
RE: Zero visits from keyword in Google Analytics
Thanks for following up Will.
-
RE: Does a +1 or Share appear in SERPs site-wide, or only for the page that is specifically shared?
Presently, the Google +1 indication in SERPs appears only for the specific page that was shared.
-
RE: Noindex all dodgy content?
Noindex will cause search engines to not index the page. If a page is not original, not quality content or you are certain it is not of any value to web surfers, then it would be appropriate to use noindex.
Nofollow is used when you do not trust a link, or cannot vouch for or otherwise support the source. In most cases you would not use nofollow for internal links unless you know the target page is a dead-end which will never be indexed. An example might be the "print" version of a page. When in doubt, do not use nofollow on an internal page.
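For reference, a sketch of how each directive is typically written (URLs are placeholders; the meta tag belongs in the page's head):

```html
<!-- In the <head>: tell search engines not to index this page -->
<meta name="robots" content="noindex">

<!-- On an individual link you cannot vouch for -->
<a href="http://example.com/untrusted-page" rel="nofollow">some link</a>
```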
-
RE: Zero visits from keyword in Google Analytics
Yes.
In this instance the testing I performed this week and the testing Anil performed in 2008 showed consistent results, and are in alignment with Google's explanation of how their sessions work.
A session begins when a user first visits your site. That session is maintained for 30 minutes even if the user leaves your site and returns to Google. Everything makes sense and syncs perfectly with the results.
If you wish to pursue this any further I can only suggest either repeating the test yourself, or contacting Will or Avinash concerning the prior article. I can only presume there was a misunderstanding in Will's article.
-
RE: Will using a service such as Akamai impact on rankings?
My understanding of how a CDN is configured is it's a back-end server change. The HTML will still appear as mysite.com/image.jpg but when a request for that image is made, your server will tell the user's browser to fetch it from cdn.chicago.akamai.com/mysite.com/image.jpg.
Your server still hosts the image and is the primary source of the image. That image is duplicated on CDN servers throughout the country and world, depending on what CDN plan you purchase.
So in short, the images are hosted on mysite.com and images should not be taken out of the mix. You can confirm this by checking well-known sites which use Akamai:
-
RE: Title tag - shorter = better?
Presently SEOmoz is not on the first page of Google for "SEO Software" nor "simple seo software". If they wanted to be, I am sure they could optimize pages for these terms and appear on the first page of Google SERPs.
Titles are not purely about ranking. There are also branding factors and how you wish people to think of your company. From a pure ranking perspective for "SEO software", you are correct that the term "simplified" should be removed. I would bet Rand prioritized presenting his company slogan in the title above having the page rank well.
-
RE: Organising Blog categories for SEO and usability.
I don't care for that category organization myself. It seems a bit spammy. That doesn't mean that it can't work but it really depends on how you present it.
What I would not like to see is www.greatphotography.com/portrait-photography/newborn-photography/ as a URL. I am already on a photography site. It's not helpful to add "photography" twice more in the category names.
Make your titles as useful and user-friendly as possible. I know this answer is a bit vague but only so much help can be offered with a general question in a Q&A format. There are many factors such as how much content you can have to offer, how much traffic is there for the chosen category names, how competitive is that traffic, etc.
-
RE: Redirects
I'm thinking that google may go back to the actual old page in some way.
To the best of my knowledge, that is simply not possible.
Google can choose to ignore the robots.txt. Google can choose to ignore a meta tag. Google can choose to do just about anything it wants with respect to page rankings and SERPs. What Google cannot do is access a page on a web server to which it does not have access.
Google cannot tell the web server "hey, I know you are showing a 301 here but I don't want to be redirected. Show me the original page instead". At least, they can't based on my understanding of how the web works. If I am mistaken, I would love to learn about it so I can improve.
-
RE: Will using a service such as Akamai impact on rankings?
Google effectively crawls all types of sites from around the world. As long as you offer proper navigation with your site, there shouldn't be any issue.
Your content for each region should have a landing page for that region. mysite.com/jp would be your landing page for Japan, etc. Your landing pages would be treated as your home page for Japanese speakers. You should have links from Japanese companies to the /jp page as if it was your site's home page.
-
RE: Redirects
In a normal web page request, the requested page is provided by the host server with a 200 header code.
In a 301 situation, the new page is returned with a 301 header code. This would happen whether the old page is present or not. Even if the old page was present, the hosting web server would not look at nor offer the old page.
If there is no additional information or context, I would stand by my original statement. My question to the person who is the source of the statement would be, what exactly is Google supposed to see on the page before it is redirected? What has changed from the last time Google saw the page?
-
RE: Redirects
The fine people at Bruce Clay said it's important to let the page be seen before deleting it.
Do you have a specific link? Something is wrong with that statement and I feel it must be taken out of context.
Or possibly there are additional details you have not shared? Has the page changed in some way? Let's say your page is crawled by Google every 2 weeks. It was crawled last week, and you decide today you wish to 301 the page. You are suggesting waiting a week to let Google re-crawl the page before 301'ing it. My question is, what has changed on the page since the last crawl? What do you wish Google to see?
-
RE: Is it Panda?, how to deal with AP etc newswire articles
Panda isn't a penalty per se. It is an algorithmic change to how Google ranks sites and pages. If your site has duplicated content on it, you will need to fix all of it. Once your site has been cleaned up, it can take a month or more for Google to fully re-index your entire site and see that all of the duplicated content is gone or properly handled (i.e. noindexed or canonicalized).
It's not as if your site is affected the moment you have 1% duplicated content; no one knows for sure what exact percentage triggers this effect, so your best course of action is to clean it all up.
By using the canonical tag, these pages will be removed from the index for your site. The "harm" would be that if someone searches for those pages, your site won't be listed unless you have relevant comments for the search query.
-
RE: Is it Panda?, how to deal with AP etc newswire articles
If this content is merely duplications of articles which exist elsewhere then yes, you can add the canonical tag pointing to the source.
You would definitely not want to "nofollow" these pages. By adding a nofollow tag you are telling search engines not to flow page rank to the other links they find on the page. That is not the result you desire.
You could noindex the pages as well. Prior to doing so, I would ask whether you are offering comments or other user-generated content on those pages. If you are not, then the noindex tag is fine. If you do offer UGC, then I would recommend the canonical tag.
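A canonical tag for a syndicated article might look like this (both URLs are placeholders), placed in the head of the duplicated page:

```html
<!-- On your site's copy of the newswire article, pointing to the source -->
<link rel="canonical" href="http://www.originalwire.com/original-article">
```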
-
RE: Redirects
when redirecting a page, is it necessary for google to see the old page before it is deleted?
If you are performing the redirect via htaccess, then no. A 301 redirect is simply a header code. It lets Google know the page which is being displayed is not the page requested, but a different URL. Google then understands the need to replace the old URL with the new one.
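As a sketch, a single-page 301 in .htaccess might look like this (assuming an Apache server with mod_alias enabled; both paths are placeholders):

```apache
# Permanently redirect the old URL to the new one
Redirect 301 /old-page.html http://www.mysite.com/new-page.html
```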
-
RE: So I ran into a site that was not ranking 4 days ago and has over 2 million links to it on some keywords. My guess is this is a link bomb, but the issue is this is pushing one of my sites down. Does anyone know a good way to over come a bomb like this?
When I search for "jewelry display" the riogrande.com site shows as the 5th result. The page that appears is actually a search result page. It has a PA of 1 with no links. This type of page can bounce in and out of SERPs.
-
RE: So I ran into a site that was not ranking 4 days ago and has over 2 million links to it on some keywords. My guess is this is a link bomb, but the issue is this is pushing one of my sites down. Does anyone know a good way to over come a bomb like this?
Hello Kate.
The site's logo says the business has been around since 1944. I don't trust that information, but I do take note of it. Next I looked at the site's whois record and noticed it was registered in 1995, which is pretty much the beginning of time as far as the internet is concerned. This record supports the claim that the site has been around for a long time.
Next I took a look in Open Site Explorer. The numbers from OSE show the links were present from at least June, so they didn't just appear. While the site has over a million links, they are only from 2000 domains, so they may have footer links or other site-wide links, which are less valuable than individual content links.
The links appear natural. There is only one directory link on the first page of OSE and that is a nofollow'd link so it's not flowing link juice.
What exactly are you seeing on this site which leads to the "Google bomb" label?
-
RE: Why did our site drop in Google rankings?
akaigotchi, can you please share your specific concern with the footer link for the SEO company?
It is a common practice for web designers, software companies and SEO companies to offer a footer link. Why would you suggest that link would negatively affect a site's rankings? Why would you suggest that the link might be suspicious in any manner?
-
RE: Why did our site drop in Google rankings?
bought advertising with a follow link
Advertisements are not an endorsement of your site and should not be offered with a follow link. It's cheating, and this may have given your site a boost in rankings which has now been corrected. Or worse, you may have been penalized for this activity.
Your non-www re-direct seems to be working fine so that is not an issue. Adding a blog to your site would not be an issue either. Canonicalizing your pages as you described is a good step as well.
It sounds like you have a lot of changes going on at the same time, which makes it harder to diagnose problems, but based on what you shared the advertisements are the only issue which should be corrected. After you fix the problem, use a Google WMT reconsideration request to determine if your site is under a penalty. Do not perform this step until you have resolved the advertising issue.
-
RE: Zero visits from keyword in Google Analytics
My test results support the first article you shared.
I duplicated Anil's test. I searched three times for the same site, each time using a long-tail phrase with four words. I altered the 4th term each time.
The first phrase showed "1" visit, the other two phrases appeared in the report but showed "0" visits.
The other results were combined. Even though I did bounce on my first visit the Bounce Rate showed 0%. My bounce on the first visit was immediate but the Average Time on Site was 1:12 which is clearly the average of my three visits.
You can easily perform this test and have the results the next day. Choose a key phrase where you perform well in SERPs but which is unlikely to be used in a search. Alter the last term and repeat the steps in Anil's test. The next day, check your GA.
-
RE: Organising Blog categories for SEO and usability.
I understand and agree with your thought process Ioan, but not the end result.
There is a need to balance usability with SEO. Also consider CTR. Ideally everything can come together naturally.
I do recommend optimizing your URLs but keep in mind there are 200+ factors used to determine a page's ranking. I think you are over-optimizing in this regard.
You can build relevancy by using other factors such as header tags, content, backlinks from local sites, etc.
Perhaps others will share a different view point.
-
RE: Organising Blog categories for SEO and usability.
Does your blog support tags?
If your site is focused on photography you may want to consider category names such as "weddings", "portraits", "commercial", etc. You can use tags for locations such as "Cheshire", "North Wales", etc.
From a user's perspective, if I visited your site and wanted to view wedding photos, I would hope to find a single area of your site which offered your complete selection. I don't think the location of the wedding has much relevancy to the pictures.
-
RE: Zero visits from keyword in Google Analytics
Hi Atul.
What you have is common in the SEO world: two different SEOs who offer opposing explanations on the same topic. In this case they cannot both be correct. Some things to consider:
The first article was written by Anil Batra. I have never heard of him, but that's OK! He's probably never heard of me either. He lists his credentials at the top of the page which seem satisfactory.
Anil's article was well presented and he offered a screenshot of his results along with a meaningful description of the test he performed to draw the conclusion he ultimately made. The article was written in April 2008.
The second article was written by Will Critchlow of Distilled. Will has written other articles I have read and I find him to be a credible source of SEO information.
Will's article is also well presented. Will specifically shared he contacted Avinash Kaushik from Google who would be considered an expert on the topic. Will's article was written in Jan 2009.
Personally I would choose to accept Will's response being that it is more recent and I offer higher credibility to his Google contact. BUT, I am also hesitant to discount anyone's ideas, especially when they are well presented such as Anil's article.
The great news: we can easily try a new test and find the answer! It's been over 2 1/2 years since the most recent article. I'll go ahead and try some tests and share the results.
-
RE: Stuffing keywords into URLs
Thanks for the kind words Paul.
If you are looking for outstanding SEOs to follow, I would recommend EGOL and Alan Bleiweiss. I merely ride in the wake of their excellence.
Your response jumped around a bit but a few replies I would offer:
-
You are right. The value of most directories has dropped significantly. There are very few that offer any real value nowadays.
-
MVC is the current best practice for web design, but friendly URLs are a separate item. You can achieve them with or without MVC.
-
Most people who complain about their site's ranking drop actually have issues on their site if you look closely. I can't begin to share how many people I have encountered who were insistent their site was outstanding when their site had numerous issues.
-
Likewise, I have worked with clients who were quite upset about other sites that ranked well, referring to them as "junk" sites when those rankings were earned. Yes, there are exceptions and Google still has work to do, but they are doing a reasonable job. The truly bad sites usually disappear in 4-8 weeks.
-
I know nothing about "The Marketing Analysts" but they could have an offline presence or have undergone a name change, which may explain the "Since 1989" claim. Let's remember Al Gore didn't invent the internet until about 1996, and there have been tremendous changes since then.
-
-
RE: Blog and seo
Generally speaking, if your blog covers the same topic as your main website, then you would benefit most from having the blog as part of your site within the main domain. An example URL would be mysite.com/blog.
By keeping your blog as part of your site, you only have to maintain one website and one URL, which is cheaper and less time-consuming than maintaining a separate site.
The primary reason to consider a separate site would be if your blog would cover topics not associated with your main site. If you offered a site which sold vitamins and you wanted to regularly include articles about travel, nature and other topics that did not relate back to your core business, it would be best to use a separate site for your blog.
-
RE: Why does my lousy little blog Rank number 1 on Google?
After checking the various SEO factors there is one glaring difference between your page and 8 of the other 9 results on the first page.
Your page is one of only two results that actually uses the phrase in your content.
The only other page to do so is the YouTube result, whose total content is 3 sentences. I am willing to bet that if the YouTube result offered a transcript, it would outrank your page.
I checked Bing/Yahoo. Your site is indexed in both of their results but the page is not. It would be interesting to see how the page would ultimately rank if you submitted it to Bing in WMT.
-
RE: Deleted campaign?
I suggest you contact the help desk. They normally respond within a day.
-
RE: Will using a service such as Akamai impact on rankings?
Akamai offers numerous services. Are you referring to their Content Delivery Network? If so, then the CDN will provide faster page load times which is a good user experience. If your site was being penalized for slow load times (only a small percentage of sites fall into this category) then yes, by properly setting up your content on a CDN you can remove the penalty which would improve rankings. Otherwise you would not directly benefit in terms of rankings, but your users will likely find your site more usable, explore more, etc. which could benefit your rankings.
With respect to your site design, I would recommend a single .com site with folders for each country: mysite.com/ru for Russia, mysite.com/au for Australia, etc. This method will allow you to collect all your domain authority in a single site and can greatly reduce your software and site maintenance expenses.
-
RE: Ranking local content against English content
Not in a reasonable way.
Search engines provide results based on matching content to the user's query. If a user searches for a term in English then you would need to provide that term on your page in order to rank for it. You would also need to optimize your page for that term (title, header tags, url, etc). You presumably have already done that for the English version of the site.
If a user makes a search query using an English term, they presumably understand English and are seeking a website written in English. If a result was provided in another language that would be a bad user experience, the page would definitely have a higher bounce rate, and that may cause the rankings to drop.
-
RE: Ranking local content against English content
When you say it ranks higher, what exactly are you using for your search term?
If you use an English search term, then yes the English content should rank higher. If the .th domain is Thailand, then the content on the .th section of the site should be in Thai. Try performing a search using Thai and checking the rank. I am willing to bet the Thai content will rank on top. This logic applies to the other countries you list as well.
-
RE: Google International and National Algorithm
Legitimate link building can always offer value. All the standard link building metrics apply. What is the DA/PA of the linking site, is the site relevant to yours, what is the anchor text, etc.
If your site sells artwork, clearly anyone in the world might appreciate and buy from your site. If instead you offer local plumbing services, then a link from another country shouldn't hold as much value.
There are over 200 factors evaluated to determine a page's ranking in search results. PA and DA are two factors. What about the content and the other factors?
-
RE: Google International and National Algorithm
The algorithms used for the various ccTLDs like Google.de are not the same as for .com. For example, Panda has been fully implemented for .com first.
I would highly advise adjusting any site that has been penalized or has otherwise seen ranking issues in Google.com. The other Google ccTLDs will receive the updates eventually.
-
RE: How accurate is Open Site Explorer?
OSE is dependent upon the Linkscape index. The Linkscape database is refreshed about once per month and contains the top 25% of web pages. For SEO, that's mostly what matters. The index was last updated July 25th.
It can take 60+ days for you to have visibility to a link in OSE. It depends upon when the link was created and when Linkscape began crawling the internet.
It's a great tool to use if you understand its limitations.
-
RE: Indexed non www. content
This issue needs to be cleaned up. It can definitely affect your rankings and search results.
Steps to take:
1. Contact your web host and inform them you wish to redirect all "non-www" URLs to their "www" counterpart. This request is quite common and can be done easily by your host.
2. After your host makes the change, verify that it works. Visit a couple pages from your site and remove the "www" prefix. When you hit enter to visit the non-www version of the page, the "www" subdomain should automatically appear.
3. Since your site is only 12 pages, take a close look at your site. Check every page and examine the URL listed in every link. Be certain the URLs use the "www" prefix and you do not have any broken links.
4. When you are finished, try creating a sitemap. You can generate a free sitemap at http://www.xml-sitemaps.com/. If your site has 12 pages, your site map should show 12 pages. If it shows more, try to figure out why.
5. You can also use the SEOmoz crawler to get a detailed look of your site if there are further issues with your sitemap that you cannot figure out.
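If your host uses Apache, step 1 above is often implemented with a rewrite rule like this in .htaccess (mysite.com is a placeholder for your domain; mod_rewrite is assumed to be enabled):

```apache
RewriteEngine On
# Send any non-www request to the www version with a permanent 301 redirect
RewriteCond %{HTTP_HOST} ^mysite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
```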
Once this issue is cleaned up, it will probably take a month for Google to clean up your site in their index. You can log into Google Webmaster Tools to monitor the status. The "number of indexed pages" should decrease a bit each week.
-
RE: Stuffing keywords into URLs
Really great catch on the canonical issue Trevor! The entire time I just knew I was missing something, and that's it.
The www version of the URL has a PA of 53, which puts it as even stronger than the wiki page. The links mostly use "medical translation" as the anchor text, with some "medical translator" and "medical translation service" variances thrown in. The link profile is varied enough to satisfy me that the page has earned its ranking.
-
RE: Getting started
Hi Kimberli.
I do see an issue with your site's setup. Currently your "www" subdomain and your main domain provide the same content.
http://www.kimberliskuties.com
http://kimberliskuties.com
Both of the above links bring up the same page. Your site is duplicated on the internet. It appears your site has all of its links going to the "www" form of your URL. The easiest resolution is to contact your host and let them know you wish all of the traffic for your site to go to the "www" form of the URL. This is a very common request and the change can be made instantly.
After the change is made, try testing one of your internal links such as http://kimberliskuties.com/storm.cfm?funnelaction=865. Notice I offered the "non-www" form of the URL. If the change was made correctly the page should appear and the URL should include the "www".
One other thing I noticed...your URLs are not friendly. They all appear identically with only the trailing number changing. I would suggest working with your web designer to try to offer readable URLs such as www.kimberliskuties.com/photos and similar URLs. This change would be helpful for SEO and users.
-
RE: Stuffing keywords into URLs
I love a great SEO mystery and, for me at least, you have found one. I think this is a case for the famous SEO forensic analyst Alan "Sherlock" Bleiweiss.
I can confirm your overall findings and cannot explain the results. Specifically, on Google.com I searched for "medical translation". The results are listed below.
Result #19: http://en.wikipedia.org/wiki/Medical_translation
PA: 52, DA 98
Title: Medical translation - Wikipedia, the free encyclopedia
H1: Medical translation
First words of content: Medical translation is the translation of technical, regulatory....
Internal links (2): Anchor text on both links is "medical translation". Lowest PA of a linked page is 61. About 1000 links per page.
Result #1 (the Marketing Analysts page):
Title: Medical Translation Services: Pharmaceutical, Equipment, Specifications, Medical Literature, HIPPA, [99 chars in title so display is cut-off]
PA: 12, DA 60
H1: none. H2: Medical Translation: Medical Translation Services: Pharmaceutical, Equipment, Specifications, Medical Literature, HIPPA
First words of content: When it comes to the medical translation, you can trust THE MARKETING ANALYSTS.
Internal links (3): Anchor text on all three "Medical translation". The highest PA from a page is 15. One of the links is from the home page which has 220 links total.
As I try to reach for some other factor that would allow this site to rank so well compared to the wiki page I notice the following:
-
the site has "medical translation" in its navigation bar
-
the site has a link in the left sidebar on the home page directly to the page. The sidebar is a tad spammy with 43 links.
The above two items are factors, but not enough to do it for me.
I still couldn't explain the ranking, so I scanned the page for the term "medical". It only appeared twice visibly, so I performed a "find", which indicated the term was being used many more times on the page but was not visible. After searching the HTML and CSS I determined there was extra hidden content. I could not find anything suspicious in the CSS and was puzzled about how this content was being hidden. Then I realized the "trick" involved.
Please notice the US/UK flag in the upper-right area of the page. Press it. Voila! The home page contains extra content directly related to Medical Translation that no one will ever see. The content includes "Medical Translation" as an H3 tag, a link to the target page, and a nice paragraph.
This technique is squarely black hat. The purpose of a language button is to offer a translation. There is only one button, for the language the page is already being presented in, so no one will ever press it. The content is additional text and links which have nothing to do with a translation.
Even so, I find it interesting this content is enough to yield the #1 ranking in SERPs. Either there is another factor remaining that I could not locate (I really don't think that is the case, but would love to hear from others), or Google is putting more weight on home page content. I have always felt home page content was very strong, but this page is simply not strong enough to blow the wiki page away like this unless Google is weighing that home page content quite heavily.
I like the Yahoo results MUCH better for this search. Wiki is #2 and this page is #13. Bing shows Wiki as #5 with this page as #13. I am OK with those rankings as well.
-
-
RE: Google +1 and ranking effect .
Google made a statement specifically declaring that Google +1 presently does not have any effect on rankings, but they hinted that could change in the future.
-
RE: Submitting site to dmoz.org
Try submitting to a different category. Most areas in dmoz are administered by an individual. These are volunteers, and maintaining a directory section is often not seen as a worthwhile cause. Some people only log in once every few months, and others intentionally ignore submitted sites because they don't want to increase their own competition.
Yes it sounds bad. It is bad. It has been complained about on every level. I am not aware of anything that can be done to demand a fair consideration of your site.
-
RE: How do search engines score "nested" keywords?
Pluralized terms are treated as different words.
When I search for "coin" the wiki article is first. When I search for "coins" that same article is still on the first page but at the bottom. Other pages focused on "coins" have more relevancy due to the identical match.
-
RE: Geolocation: Google only crawls from the US
No, not at all.
It is a common practice for sites to set up geo-location where if a user is from the US the site would appear in English, and if the user is from France the site would appear in French, and so forth.
-
RE: Title tag - shorter = better?
Is that the case even if [the title is] below 64 chars?
Yes. I think you may be confusing Google's title display limit (about 70 chars) with this question. How many characters Google will display is not related to the weighting for each word in the title.
I just went back and re-watched all the Matt Cutts videos where the title was discussed. It was never confirmed by Matt that adding extra keywords to title tags causes the dilution. Even so, I am still convinced that is the case and will act on that understanding.
I base my opinion on a few patents from Google that I have reviewed, other articles, and my logical understanding of how Google works.
The below video discusses titles and Matt describes how Google recognizes when Title tags or Meta description tags contain the same information throughout the site. I have to believe Google recognizes the site name appended to the end of a title tag as a common practice and adjusts accordingly.
In summary, I maintain my understanding that the title tag itself is given a certain weight. Let's say 100 "points". If the title is one word like "SEO" then that word is given 100 points. If the title is "Search Engine Optimization" then each term would be given about 33 points. This is the way I see the weight of title tags being distributed, similar to the way a page's PR is distributed to the links on a page. If anyone agrees or disagrees with this thinking I would enjoy hearing other opinions.
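To make the model concrete, here is a tiny sketch of that scoring idea. This is purely my hypothetical point-splitting formula, not a confirmed Google calculation:

```python
def title_term_weight(title, total_points=100.0):
    """Split a fixed pool of points evenly across the terms in a title.

    Hypothetical model: a one-word title gives that word the full pool,
    while each extra word dilutes every word's share."""
    terms = title.split()
    share = total_points / len(terms)
    return {term: share for term in terms}
```

Under this model, title_term_weight("SEO") gives "SEO" the full 100 points, while "Search Engine Optimization" gives each term roughly 33.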
-
Geolocation: Google only crawls from the US
A question was previously asked about geo-location and specifically if Google crawled from other countries. I could not locate the original question but wanted to share the below information.
As of earlier this year Google only crawls from US IP addresses:
-
RE: From an SEO Standpoint, which is better for my product category URLs?
Presenting your URLs without the technology extension (e.g. .html) is definitely preferable for numerous reasons. This question is frequently asked and I really should write an article on this topic.
A couple examples as to why it is preferable:
-
the .html offers no value to consumers and makes your URLs longer
-
the .html lets the bad guys know what type of technology was used to create your pages
-
the .html means that when you migrate your site to .php pages (a very common change) or any other technology, you will need to 301 redirect your entire site, which risks losing some of the link equity you worked so hard to build.
-
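For the migration point above, extensionless URLs can be served at the web-server level. Below is a hedged sketch for Apache with mod_rewrite enabled; the directives are standard mod_rewrite, but the exact rules would need to be adapted to your server setup, so treat this as a starting point rather than a drop-in config:

```apache
RewriteEngine On

# Permanently redirect old /page.html requests to the clean /page form
RewriteCond %{THE_REQUEST} \s/([^.\s]+)\.html[\s?]
RewriteRule ^ /%1 [R=301,L]

# Internally map the clean /page URL back to page.html on disk
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.*)$ $1.html [L]
```

With rules like these, visitors and search engines only ever see the extensionless URLs, so a later change of back-end technology would not require site-wide redirects.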
-
RE: On Page Optimization vs. Anchor Text
Is it hard to get a page to rank for a particular term if the majority of the anchor text pointing to that page is different from your chosen term?
The question is far too vague to answer.
What EXACTLY do you mean by "rank"? Are you trying to earn the #1 spot? The top 3? Any spot on the first page?
How competitive is the term? On a non-competitive term you can provide relatively poor SEO and still rank in the top three spots.
What kind of variations are taking place with the anchor text? Are the variations other ways to say the same word (i.e. Limousine vs Limo)? Are they pluralized forms of the term? Are they other words that have the same meaning as the primary term ("tissue" vs "kleenex")?
-
RE: It has been recommended that we remove the number of links in our footer, should we?
I strongly agree with Albin and Joe. Check to see what your users think. I'll take the user experience over a group of SEO experts. What do SEOs know?
Your footer links are very well presented and represent your site well. It is a best practice to minimize your links. If you discover your links are not actually being used, the feedback from your users is basically that those unused links are not helpful, and you can consider removing them or possibly altering the anchor text to something users may find more helpful.
I will disagree with Albin about the sitemaps. I usually recommend an HTML sitemap, but I would not recommend placing a link to an XML sitemap on your page. Offering two sitemaps to users does not make sense to me, and an HTML sitemap is clearly the more useful way to present a sitemap. In your case, I probably wouldn't offer an HTML sitemap either. You mentioned that you already offer links to almost all of the pages on your site. If the couple of pages you do not link to are not very popular and you have other links to those pages, you may be better off as-is.
I have the impression many SEOs blindly follow certain "rules" such as eliminate footer links and always go xyz. It's important to view standards as guidelines which need to be flexible and adjusted for each site's needs.
-
RE: Does using <br /> tags instead of <br> good for SEO purposes?
You can see that from the W3Schools article I linked above: "Even if <br> works in all browsers, writing <br /> instead is more future proof."
HTML worked with the idea that certain tags could be opened but did not need to be closed, such as the <p> and <br> tags. The XHTML standard requires all tags to be closed. As I understand the idea, it's just a better means of presenting that every tag is closed.
Functionally there is currently no difference, BUT invalid code can lead to different behaviors in various browsers.
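To illustrate the difference discussed above, here is a small markup fragment showing both forms; this example is mine, not from the original question:

```html
<!-- Self-closed void element: valid in HTML and required by XHTML -->
<p>First line<br />Second line</p>

<!-- Unclosed form: valid HTML, but invalid under the XHTML standard -->
<p>First line<br>Second line</p>
```

Modern browsers render both identically, which is why the choice is about validity and future-proofing rather than any SEO effect.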
-
RE: Has SEOMoz Domain Authority calc been updated recently
The DA figures in your SEOmoz toolbar should have changed on July 25th when the Linkscape index updated. Someone please correct me if I am mistaken but I believe the toolbar only updates when the Linkscape index updates.
Increased DA is a good sign for your site and lets you know your site is measurably improving its link support. The change needs to be taken in context: it's not as if, when you see your DA improve, you should run to Google and check your current rankings. Google rankings change daily, and in fact can change throughout the day. Google updates its metrics constantly, whereas the Linkscape index updates around once per month.