What is the goal of the sitemap? And I'm assuming you are talking about an HTML sitemap, not an XML sitemap.
If it is for users, guide them to the best category pages.
If it is for SEO, don't have one. Focus on building organized XML Sitemaps.
It is alright to have outbound links; just make sure they are relevant to the page they are linking from. I believe I have read that relevant outbound links are actually good for SEO. (But don't go crazy.)
The best way to do this would be at the page level. For the pages you don't want indexed, add a noindex, follow robots meta tag.
This will tell the search engines not to index that page, but to follow and index the other pages.
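A minimal sketch of such a robots meta tag, matching the behavior described above (it goes in each page's `<head>`):

```html
<!-- Tells search engines: don't index this page, but do follow its links -->
<meta name="robots" content="noindex, follow">
```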
But I do have to say, I have never heard of anyone trying to not index the home page. It will be interesting to see how this turns out.
Bing powers Yahoo's search results, so make sure you are using Bing's Webmaster Tools and have XML sitemaps.
But the truth is that they are both different so it's really hard to control them both.
OSE is Open Site Explorer by SEOmoz. http://www.opensiteexplorer.org/
While you are making changes, I would monitor daily.
Your competitor has 1000 different websites linking to them. It could be from any page on their site to any page on your site.
It's not necessarily their home page.
At a quick glance, it looks like you might have over optimized the site.
I would slowly start taking steps backwards and see what happens with each step.
Also, looking at OSE, it looks like you are way too heavy on inbound links with "jobs in kent" (75 domains, 103,647 links) and "jobs around kent" (73 domains, 4,029 links)
Then from there it drops down to 4 domains with 16 links.
The main benefit is that it allows you to control the image, url, title, and description that gets posted on FB.
The robots.txt file should not show up. Sounds like there is something seriously wrong.
You should take a look at the metrics in http://www.opensiteexplorer.org
As you will see you have only 3 linking root domains, 23 total links, Page Authority 30, and Domain Authority 19.
Focus on getting inbound links.
It all depends on the number of pages on the website. I would suspect that you will see results any day now.
Yes, SEOmoz is addictive!
Not sure. What happens if you request a crawl via SEOmoz's Crawl Test Tool?
Also you may want to allow the SEOmoz spider, rogerbot, in your robots.txt file to see if this changes anything.
```
User-agent: rogerbot
Allow: /
```
After I updated my Chrome browser to the beta version, 18.0.1025.140 beta-m, I started having the same issue. Prior to my update it was working fine.
As for now I just uninstalled the SEOmoz plugin and have been using Firefox until the Chrome browser I'm using is out of beta.
But... if this isn't the issue, it would be nice to know.
Yes, this is a problem, since you are able to access the home page of your website via two different URLs, with and without www. This issue can be resolved with canonicalization.
More information about Canonicalization: http://www.seomoz.org/learn-seo/canonicalization
From the article:
For SEOs, canonicalization refers to individual web pages that can be loaded from multiple URLs. This is a problem because when multiple pages have the same content but different URLs, links that are intended to go to the same page get split up among multiple URLs. This means that the popularity of the pages gets split up. Unfortunately for web developers, this happens far too often because the default settings for web servers create this problem. The following lists show the most common canonicalization errors that can be produced when using the default settings on the two most common web servers:
Each of these URLs spreads out the value of inbound links to the homepage. This means that if the homepage has multiple links to these various URLs, the major search engines only give them credit separately, not in a combined manner.
SEOmoz gives you the tools to help you improve the traffic of your website.
Don't use HTML sitemaps; use XML sitemaps instead.
Here is a great post about utilizing XML sitemaps.
http://www.seomoz.org/blog/multiple-xml-sitemaps-increased-indexation-and-traffic
Well it all depends on the errors.
What you may have to do is tell the designer exactly what to do.
Start with all the Errors from your SEOmoz report, broken pages (404), etc. Provide the designer with this complete list (download the file) and have him start fixing them.
Errors are what you need to worry about. The only Warnings I would really worry about are any 302s.
Can you tell us how many of each you have (Errors, Warnings, and Notices), and out of how many pages?
I have had great luck with http://visualwebsiteoptimizer.com/ but they are more of an A/B testing company. What I found useful is their blog. I think you will find your answers in there.
Nofollow does not pass any juice. According to Matt Cutts of Google, Google doesn't count nofollow backlinks.
There is a great answer to that question here. http://www.seomoz.org/q/do-no-follow-links-have-any-seo-value-at-all
I wouldn't suggest buying, but focus on finding better locations to build Follow links.
You have to love Google
I guess my question would be, why are you creating a sub domain to help your main domain? All the effort you are putting into the sub domain could be put into the main domain for more power. This is why everyone suggests "www.example.com/blog" instead of "blog.example.com".
Depending on what you are doing, a sub domain or a different tld are going to have the same effect. (I'm assuming your main domain is www.example.com)
I would suggest reading "Root Domains, Subdomains, Microsites and subfolders" from SEOmoz. http://www.seomoz.org/blog/understanding-root-domains-subdomains-vs-subfolders-microsites
Have you checked out SEOmoz's Directory list? http://www.seomoz.org/directories
2011 was an interesting year in the Google index cleanup. It will be fun to watch 2012 and see how far Google goes.
I'm not sure where to begin here since there are many issues. I think you should start by looking at the Web Developer's SEO Cheat sheet. http://www.seomoz.org/blog/the-web-developers-seo-cheat-sheet
pdf - http://static.seomoz.org/user_files/SEO_Web_Developer_Cheat_Sheet.pdf
and The Beginners Guide to SEO http://www.seomoz.org/beginners-guide-to-seo
Title - Most important; put your main keyword at the beginning.
Meta description - No SEO value, but use it for a SERP call to action to grab attention.
Meta keywords - No SEO value; remove.
H1 - Important; should be one of the first things a spider crawls.
H2 - Not as important.
H3 - H6 - Not important.
Bold or italic - Important; your main keyword should be bold or italic.
Link anchor text - Important; put the most important links near the top. Google will only count the anchor text of the first instance of the same link.
Image alt - All images should have alt text.
Content - Content is extremely relevant, but this section would need a whole page of explanation.
I would make it fun but keep them related to your sports industry.
No, they understand SEO. That is why they are trying to get the links to their site instead of yours.
The issue is that Google won't and shouldn't index pages that are restricted.
This is best for user experience. Most people won't sign in to view the content.
You basically have to create two sites: one that is visible to everyone (including Google), where you show a bit as a preview, and another that is protected.
All links and ads should come to your site. If they want to have a different splash (landing) page, then have them build it to sit on your site.
The only benefit is for them.
I guess my question is, why would you want Google to index something that is only available to registered users?
In order for it to be indexed, it has to be open to everyone.
You will have to figure out what can be shown as a preview and what can't. If you want something to be indexed, then you will have to create a separate section for your preview content (since Google won't index your protected content.)
OSE takes a while to update, so don't expect sudden results. Start analyzing the sites/pages these links are on in OSE.
For example, if www.example.com (where your link is from) isn't in OSE, then OSE won't find your link.
Since SEOmoz doesn't disclose all of the metrics they use in the calculation, it's hard to say just how strong the correlation is. But I would have to say that the correlation would be there, since good crawl penetration and juice distribution leads to better indexing and a better user experience.
SEOmoz Domain Authority
Domain Authority represents SEOmoz's best prediction about how a website will perform in search engine rankings. Use Domain Authority when comparing one site to another or tracking the “strength” of your website over time. We calculate this metric by combining all of our other link metrics (linking root domains, number of total links, mozRank, mozTrust, etc.) into one single score.
To determine Domain Authority, we employ machine learning against Google's algorithm to best model how search engine results are generated. Over 150 signals are included in this calculation. We constantly refine this model over time. This means your website's Domain Authority score will often fluctuate. For this reason, it's best to use Domain Authority as a competitive metric against other sites as opposed to a historic measure of your internal SEO efforts.
I would guess that you are looking at two different metrics. Are you looking at the sub domain, or root domain (not what is coming in, but the page that is being analyzed)?
www.example.com (sub domain)
or
example.com (root domain)
or page specific.
If you are going to work with an outside company to help you build links, then you should thoroughly investigate the company and make sure they are on the up and up. And when they start, really review their reports and watch very closely all metrics with your site.
If the site is built well, then all pages would be accessible via 3 clicks. So html sitemaps really aren't needed.
I feel HTML sitemaps are no longer needed if you correctly set up XML sitemaps and also link to them from your robots.txt file.
But if you still want them for users, then I would put it in the footer, and have Google Index and Follow.
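A minimal robots.txt sketch that points crawlers at your XML sitemap (the sitemap URL is a placeholder; use your own):

```
User-agent: *
Allow: /
Sitemap: http://www.example.com/sitemap.xml
```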
What subdomain are you trying to get link juice for? app.mydomain.com or www.mydomain.com?
Never use a 302. You lose 100% of link juice.
With a 301, you will still lose a little, maybe 10% or so, but you will pass the link juice on to the app.mydomain.com
But if you are trying to get the most Link Juice to the root domain mydomain.com, then I would just suggest linking directly to the app.mydomain.com
You should set up the correct canonicalization rewrites at the server level with IIS or .htaccess. (Not sure which one you have.) If you know what type of server you are on, then you can find all the correct rewrites. (www vs. non-www, lowercase, trailing slash, etc.)
For example, here is a great post if you have IIS. http://www.seomoz.org/blog/what-every-seo-should-know-about-iis
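If you're on Apache instead, here's a sketch of a non-www to www 301 rewrite in .htaccess (example.com is a placeholder, and this assumes mod_rewrite is enabled):

```apache
RewriteEngine On
# 301 redirect any non-www request to the www version of the same URL
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```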
And you should also use rel=canonical tags.
You should definitely setup your site Canonicalization, and you should also utilize rel=canonical tags to help distinguish which page is the actual page.
For example, if you want to identify that www.example.com is the correct url, then you would use the following:
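A minimal sketch of the tag, placed in the page's `<head>` (www.example.com is a placeholder):

```html
<!-- Points search engines to the preferred (canonical) URL for this page -->
<link rel="canonical" href="http://www.example.com/" />
```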
I agree with Ryan. You need to setup 301 redirects in IIS for all the old pages.
Never use a 302.
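As a sketch of what a 301 in IIS can look like, assuming the URL Rewrite module is installed, a single rule in web.config might be (the old/new paths here are placeholders):

```xml
<rewrite>
  <rules>
    <!-- Permanently redirect one old page to its new URL -->
    <rule name="301 old page" stopProcessing="true">
      <match url="^old-page\.html$" />
      <action type="Redirect" url="/new-page/" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>
```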
Even if you make the links nofollow, you will still lose link juice.
Just focus on putting your best links towards the top.
Here is a great post explaining why you still will lose link juice.
http://www.seomoz.org/blog/google-maybe-changes-how-the-pagerank-algorithm-handles-nofollow
I'm sure you have already spent a ton of time on Google, but it looks like there are a lot of options out there with a Google search of "spanish seo". http://www.spanishseo.org/
But I am curious what others have to say, because this would be nice to have as an option for clients.
I would go for the link but not the reciprocal link. On devaluing the link, it all depends on if the site is relevant to yours.
Yes, (well, depending on your brand name), I would use it. If the Title is structured professionally and ends with | The Company, it gives credibility to the page.
Here is a great post about that topic.
http://blog.tamar.com/2011/10/google-rids-keywords-of-apostrophes/
I believe it would be smart to use the year in the URL due to the fact that people may be searching for a 2007 Cellar-pod Adelaide Hills Viognier, as long as the year is also on the page as a header to reinforce what the URL is saying.
This sounds like you will need to email help@seomoz.org to have them take a look.