Best posts made by Theo-NL
-
RE: Title tag on sitemap.xml
I'd say you can safely put this issue aside. As you've mentioned yourself, XML files don't need title tags, and I presume you aren't trying to have this page rank in the SERPs in the first place anyway. Perhaps one of the SEOmoz fellows can look into this?
-
RE: Blog as sub folder or subdomain
"99.9% of the time, if a subfolder will work, it's the best choice for all parties." (http://www.seomoz.org/blog/subdomains-subfolders-and-toplevel-domains).
Even though this is a post made in 2006, I don't know of any relevant changes that would alter this advice.
-
RE: What is the Title Tag length for mobile pages optimized for the iPhone?
That depends a lot on what you're trying to achieve with those titles. Are you looking for maximum exposure of the title to the user? Then I'd suggest ~45 characters, which is the maximum number of characters visible in the Safari browser (my personal favorite, 'Atomic', shows only 10 characters in its tabbed interface). If you're trying to have the page rank in the iPhone version of Google, I'd advise the commonly recommended title tag length of no more than 65 characters, as Google appears to have the same cut-off on mobile and desktop browsers. Or perhaps there is another goal you're trying to achieve?
-
RE: Maximum length of a URL for good SEO?
There isn't really a maximum or optimal length for a URL (well, technically it is 2048 characters [http://www.boutell.com/newfaq/misc/urllength.html], but I don't think that was the kind of limit you were referring to).
In general it is a good idea to keep the URL as short as possible, but at the same time try to include keywords in it that make it clear to both the user and the search engine what the article is about. As with many things in SEO: try to find the right balance.
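To make that a bit more concrete, here is a rough PHP sketch of turning an article title into a short, keyword-rich slug; the function name and the six-word cut-off are my own illustrative choices, not SEO rules:

<?php
// Hypothetical helper: turn an article title into a short, keyword-rich URL slug.
// The name and the six-word limit are illustrative choices, not SEO rules.
function makeSlug(string $title, int $maxWords = 6): string
{
    // Replace anything that isn't a letter or digit with a hyphen, then lowercase.
    $slug = strtolower(trim(preg_replace('/[^a-z0-9]+/i', '-', $title), '-'));

    // Keep the URL reasonably short by limiting the number of words.
    $words = array_slice(explode('-', $slug), 0, $maxWords);

    return implode('-', $words);
}

echo makeSlug('What is the maximum URL length?');
// prints: what-is-the-maximum-url-length

Whether you also strip stop words or trim further is, again, a matter of finding the right balance.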
-
RE: Can I use canonical links outside of the head section?
Unfortunately you can't, as Dr. Pete explained in the following blog post: http://www.seomoz.org/blog/6-extreme-canonical-tricks
-
RE: By paying you guys...?
Even though I'm not an SEOmoz employee, I think I can answer this question on their behalf.
SEOmoz will not make your website more visible or accessible as a direct result of your payment. If you decide to take a PRO membership with the site however, you'll get access to premium resources that will enable you to take your website to the next level with regards to SEO, and keep increasing that level over time.
By using the (premium) tools that are available to PRO members (such as the campaigns that you can use to monitor weekly crawls of your selected websites), you can see very clearly which mistakes you've made on these sites and what you can do to improve your SEO.
-
RE: Paging. is it better to use noindex, follow
From what I've read on the internet, it is best to "noindex,follow" all pages >1. This issue had bugged me for quite some time as well, and I've struggled to find good resources explaining why their solution was the best. Now that I've actually given the subject some thought, and finally managed to read some quality material on the matter, it all makes sense.
It's basically a checklist. Do you want search engines to:
- index your paginated result pages: yes / no
- reach the items that are listed in your paginated result pages: yes / no
In most cases you don't want your paginated result pages to be indexed. With or without Panda, visitors get little or no value from actually viewing 'page 7' of your result pages. However, you DO want the items listed on these paginated pages to be crawled, especially when you don't have any other pages linking to them (which you should have, by the way). This boils down to:
- Don't nofollow your paginated links (because you want search engine spiders to reach them).
- Put "noindex,follow" in the meta robots tag on all pages >1 (thus page 2 and greater), so the engines will not index these paginated results but will crawl on to the pages behind the listings (see the sketch below this list).
Good luck!
-
RE: Why are we not seeing similar traffic patterns in a new market?
Some of the following will be guesswork since you didn't provide any URLs, but I'll try my best. This old (Atlanta-targeted) website, has it been around for quite a bit longer than the newer (Nashville) domain? Besides the number of links the older domain has most likely collected, domain age appears to influence ranking on its own (even though only slightly, #10 in the ranking factors http://www.seomoz.org/article/search-ranking-factors#ranking-factors). Does the Nashville-targeted website have the same number of local (and related) backlinks as the Atlanta-targeted website? You've mentioned that the Nashville website has only been live for about two months, which I'd consider a really short time to draw any real conclusions, to be honest.
With some more time and the same effort as you've put into the Atlanta-targeted website, I'm sure the new one will perform in a similar fashion!
-
RE: Changing server location for a global targetted site
You should be looking at this from two different angles: (1) how will the hosting affect your rankings, and (2) how will it affect the loading time of your website.
In terms of rankings, especially since you're considering purchasing a .info domain (side note: are you sure about this? .info isn't exactly considered a premium TLD), I don't think it'll make much of a difference whether you host in the USA or in Germany. Had it been a .de domain targeting Germany (much like in the articles you're referencing), things would have been different and I would've advised you to keep hosting the website in Germany.
Add to that the fact that fewer than 5% of your visitors seem to be coming from Germany in the first place, and I would say a move to an IP located in the US wouldn't hurt your number of visitors. Perhaps it might even increase your rankings in the USA marginally because Google now finds your IP hosted over there.
In terms of loading time you might want to consider CDN solutions to serve your website as quickly as possible to a global crowd.
-
RE: How much is too much?
You're stating "I tend to link using Branded ...". Are these links internal, or are they coming from external websites? If they are internal, there isn't anything to worry about. If you have actively 'built' those links from other websites, I would be more cautious with the pages you're giving keyword-rich links, and would indeed mix it up a little.
What you want most is a natural overall profile. Not necessarily a perfect profile for each page (which, ironically, would be unnatural), but a pattern that looks perfectly diverse, skewed and chaotic in all its natural perfection.
-
RE: Converting to Joomla - will we lose ranking?
I've recently written a blog about this very subject, you might find it interesting:
http://www.finishjoomla.com/blog/34/google-rankings-dropped-after-switching-to-joomla/
-
RE: Google cache my backlinks two days ago but i did not see results
You'll have to wait a LOT longer. Results from a couple of backlinks like these might take weeks, if not months, to actually translate into higher rankings for specific keywords. Hang in there, keep getting great links, and eventually your rank will start increasing!
-
RE: My nofollow link is showing as a 302. Is this OK?
Not necessarily entirely 'their issue', because if Jeremy is linking with a 302 redirect without wanting to do so, this issue might occur in more places across his website.
To expand on the question by Alan: is this (1) an internal link between two pages on your website, (2) an external link from your website to another website or (3) a link from another website to your website?
-
RE: Is it necessary to choose local server?
It is not absolutely necessary to choose a local server in the UK. However, these servers will serve the content faster to visitors from the UK than servers from (for example) the USA will.
There was quite a good Whiteboard Friday on the subject of international SEO a couple of months ago, you might wanna check it out: http://www.seomoz.org/blog/international-seo-where-to-host-and-how-to-target-whiteboard-friday
-
RE: Can too many high PR links can hurt SEO
As long as these links are acquired in a natural and valid way, I wouldn't worry too much about them.
If you acquire a couple hundred links in a short time when your website only had ten before, that looks unnatural, which isn't good. If you already have thousands of valid backlinks, a couple hundred extra won't hurt you, as this fits the natural pattern. Besides, PageRank isn't a great metric to base any decision on these days anyway, it seems (http://www.finishjoomla.com/blog/33/why-pagerank-lost-its-value-for-seo/).
In regards to the rankings: I personally think these are either natural swings or there is more at play than just a handful of new and naturally acquired backlinks.
-
RE: Is it wise to target different keywords for each page?
I would suggest limiting the number of keywords targeted per page. Having just one (or just a few) keywords per page allows you to target them better on-page, and you'll have more targeted incoming links (both in terms of relevance and anchor text) as well.
Targeting the same keywords on all pages might also result in so-called keyword cannibalization. Read more on that subject here: http://www.seomoz.org/blog/how-to-solve-keyword-cannibalization
-
RE: Panda Update: Isn't a link still a link?
Yes, a followed link is still a followed link. It was like that before Panda and it is like that after Panda.
However, not every followed link has the same value. For example: a link from CNN.com has more value than a link from buy-viagra-online-now.info, this holds true both before and after Panda.
What Panda did was decrease the value that a link from eZineArticles.com passes to you. Whether you're wasting your time is up to you to decide. Given that you know the article submitted to eZineArticles will only generate a low-value link, is that a wise task to spend your time on, or is that time better spent elsewhere?
-
RE: Developer comments in code & SEO
Comments visible in HTML code do increase the file size, but assuming these blocks are of a reasonably normal size, that shouldn't be a problem. Search engines ignore everything that is commented out (mainly because taking comments into account would make gaming the system far too easy).
-
RE: Linking to Adsense heavy sites?
It was all good until you mentioned that you intend to link to 100 'spammy looking' URLs. 'Just a couple' wouldn't have been a problem, but 100 (especially since you'll start linking to them within a short period of time) seems a bit much.
Linking out to bad neighbourhoods is seen as a general negative signal for your position in the search engines. This makes sense from the perspective that in the real world you don't want to be linked to too many bad neighbourhoods either. Why exactly do you want to start linking to those 100 'spammy domains'?
-
RE: Too Many on page links! Will "NoFollow" for navigation help?
Unfortunately it won't. Some time ago (one or two years, I believe) Google announced a change in the way they handle nofollow in combination with link juice. Adding nofollow will only prevent juice from flowing to those pages, but won't redistribute it over the other links like it would if the nofollowed links weren't there.
-
RE: What is the best way to change your sites folder structure?
My gut feeling says #2. I'll spend the rest of this post thinking out loud about why I think that one is the better option (though I don't think there is actually a 'wrong' and a 'right' option here, both have their advantages and disadvantages).
- Both your visitors and the search engines will stop visiting the old URLs as fast as possible (saving you bandwidth on the redirects).
- Less 'code overhead' regarding cases such as 'did I change that one already?'
- You are treating search engine robots and human visitors equally.
Love to see what others have to say about this!
-
RE: Java-script slider & H1 tags
The one(s) that is/are visible when you view the page without JavaScript. You can use http://seobrowser.com/ to view your website the way a search engine spider sees it.
-
RE: Should I use www. or not in my main URL?
Both are equally good and neither is preferred by the search engines. Just choose which one you like more (for whatever reason) and redirect the one you don't choose to the one you do.
My personal choice is www as it looks more 'classic', but various reasons could be given for the non-www version as well (such as the fact that it creates shorter URLs, which is handy when tweeting one, for example).
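If you're curious what that redirect could look like, here's a rough PHP sketch (many people do this in their .htaccess instead; 'example.com' is obviously a placeholder for your own domain):

<?php
// Sketch: 301-redirect the non-www host to the www version (or do it the other way around).
$host = isset($_SERVER['HTTP_HOST']) ? $_SERVER['HTTP_HOST'] : '';
$uri  = isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : '/';

if ($host === 'example.com') {
    header('Location: http://www.example.com' . $uri, true, 301); // permanent redirect
    exit;
}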
-
RE: Meta Robots Tag - What's it really mean?
You can find out your desired meta robots setting by asking yourself two questions:
noindex or index: do you want this particular page to show up as a listing in the Google results for a query? Yes: index, No: noindex.
nofollow or follow: do you want the links on this page to be crawled (followed) by the search engine robots, thereby allowing them access to the deeper pages it links to (even if that particular page itself is noindexed)? Yes: follow, No: nofollow
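As a small illustration (the helper function below is made up for this sketch, not an existing API), you could translate those two answers directly into the tag:

<?php
// Hypothetical helper: build the meta robots tag from the two yes/no answers above.
function metaRobotsTag(bool $index, bool $follow): string
{
    $content = ($index ? 'index' : 'noindex') . ',' . ($follow ? 'follow' : 'nofollow');
    return '<meta name="robots" content="' . $content . '">';
}

// "Don't list this page, but do crawl the links on it":
echo metaRobotsTag(false, true); // <meta name="robots" content="noindex,follow">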
-
RE: Does it sound like a linkwheel to you?
You seem to be linking in and out for the right purposes (providing additional value for your visitors), and don't plan to do it in extreme amounts. Even though all the interlinking between these sites might seem sketchy, I personally don't think this will get you in trouble, especially considering they are on unique C-blocks, and you might indeed get some boost in traffic and authority along the way.
-
RE: Google: show all images indexed on a domain
Using the site: command in Google Images (for example, searching for site:example.com) works as expected for me, as it shows all the images on the domain I entered. Have you tried it?
-
RE: Old articles in a blog
You could use OpenSiteExplorer to see if any of those pages still have incoming links. If so, I'd recommend redirecting those pages to (in order of preference): an updated version of the article, the category page for the article, or your main page. This ensures that as few dead links as possible turn up on the internet (and in Google) and that the most link value is preserved for you in one move.
-
RE: Putting A Blog On A Sub-Domain The Right Thing To Do?
"99.9% of the time, if a subfolder will work, it's the best choice for all parties." (http://www.seomoz.org/blog/subdomains-subfolders-and-toplevel-domains).
Even though this is a post made in 2006, I don't know of any relevant changes that would alter this advice.
-
RE: Changing domain extension to detoxify a domain
I personally wouldn't rely on the advice of just one company telling you that 'your domain is toxic' (whatever that might mean). Before making such a drastic move, I'd get a second opinion or run the site through some tools.
Perhaps the company doing the SEO just isn't capable enough to get you higher rankings? What arguments are they presenting for what might be causing the 'toxicity' of your domain?
-
RE: Uhhh... How do I log in?
You seem to already be logged in. You can click on the icon of the person (with the fancy lines next to it) in the top right corner to see your data. If you were logged out, a 'Log In' button would be visible.
-
RE: Title Suggestions for my New Book?
Inbound Marketing for Beginners
Inbound Marketing for Traditional Business Owners
Search Engines, Social Media and You
-
RE: Keyword Dulication in Tags
I'd say Google isn't seeing this as a partly duplicated title, especially considering both words contain no spaces (and even if it did, I doubt Google would see it as a duplicate if both versions have significant search volume individually).
I must add, though, that I can't recall any hard evidence either way, but perhaps others who reply can point some out.
-
RE: New domain name for existing site
Hey tgraham,
Personally I don't see any objections to sending the traffic from the new url to a subfolder on the old domain. Since this page will probably be more relevant to your redirected visitors than your home page, your visitors will have a better user experience this way as well.
Kind regards,
Theo
PS: I presume you're aware that this way of utilizing the new domain will keep it from ranking in the search engines? By using a 301 redirect only the old domain will rank, not the new one.
-
RE: Does increased adwords traffic boost your search engines rankings?
Short answer: no.
Longer answer: read the following article (http://www.submitawebsite.com/blog/2010/03/does-buying-google-adwords-improve.html) on the relationship between SEO and PPC.
-
RE: Domain Authority Question
You could have a look at the full reports generated by the Keyword Difficulty tool by SEOmoz.
It provides a pretty complete picture of many of the factors at play (on-page optimization, inlinking data, anchor text distribution, etc.) and might help you figure out what is causing these lower-DA domains to outrank you.
-
RE: Stop List and Keywords
Is this a public list? If so, I'd love to have a look.
I don't see why Google would ignore words in the first place, and even more so why they would ignore the word 'self'. In theory I could see why they would ignore words like 'a' or 'the', but not 'self'. What reason does the author of the list give for Google's exclusion of the keywords on it?
As a side note, it is always wise to optimize both incoming anchor text and on-page optimization for closely related variations of words (such as 'self tan' and 'tan' in your case).
-
RE: Expiring URL seo
Why not use the 301 redirect? Even though the content strictly hasn't moved but has 'expired', I'd use a 301 redirect to take both visitors and links from the old, expired page to another page that 'catches' them. That's good for the user experience (perhaps you could place, on the new page, some links to jobs related to the one the visitor was redirected from?) and for your link juice.
See this topic for more information: http://www.seomoz.org/q/what-do-you-do-about-links-to-constantly-moving-pages
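A rough PHP sketch of what that 'catch' could look like (the $expiredJobs array is just a stand-in for however you store which postings have expired and which category page should catch them):

<?php
// Stand-in data: expired posting => related category page that 'catches' it.
$expiredJobs = [
    '/jobs/senior-php-developer' => '/jobs/php/',
];

$path = parse_url(isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : '/', PHP_URL_PATH);

if (isset($expiredJobs[$path])) {
    // 301: take both visitors and link juice along to the catching page.
    header('Location: ' . $expiredJobs[$path], true, 301);
    exit;
}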
-
RE: Reciprocal Link Advice
If the association has explicitly requested a link back, I think it would damage the trust between the client and the association to make that link a no-followed one. Yes, the value of a one way link is greater than that of a reciprocal one in most situations, but in this case the trust of the association (which already links to you!) should be worth more than the 2 milligrams of extra link juice you might earn by no-following the link.
-
RE: Best way to create page title to products catalog
In the example you've presented, I don't see a duplicate title tag. Product name A is different from product name B, which makes the title of page A different from the title of page B.
If you want those titles to be even more unique, you could consider something such as:
Title page (A): Product name A | Unique Selling Point for product A with Good Keywords
Title page (B): Product name B | Different Selling Point for Product B with Different Keywords
-
RE: IP block in Google
Is this happening every day at 10AM? What happens between midnight and 10AM? Before midnight? Is this static IP on a blacklist? Does this static IP change over night? Where are you getting these 'impossible CAPTCHAs'? Where is 'Google blocking you'?
-
RE: Can backlinks negatively influence your ranking, or worse, cause a penalty?
The (in)famous Google Panda update (thousands of posts on Google about this) struck international websites on the 24th of February, the very month in which you report the problems started. Perhaps the drop in traffic is a result of this update and not of the backlinks?
Regarding those links: I think a single website that links to you (be it with millions of links) probably won't be the cause of this drop. Unless, of course, these backlinks were put in place shortly before the date the traffic started dropping. Have you tried contacting the website that has those links up and requesting that they remove your site from their system?
-
RE: Do links from a Tumblr blog count as inbound links?
I think they do, as I can't find any 'nofollow' on their links. However, unless you have a really popular (and well-linked-to) Tumblr blog, I don't think the value will be large.
-
RE: Why do I rank so differently in Bing and Yahoo?
A search on Google (how ironic) for 'google vs bing seo' will bring up several dozen articles, including a semi-recent one by SEOmoz: http://www.seomoz.org/blog/google-vs-bing-correlation-analysis-of-ranking-elements
-
RE: All page files in root? Or to use directories?
I would personally place the keywords at the end for clarity. It indeed seems unnatural to have the id as the final part of the URL. Even if that does cost you a tiny bit of 'keyword power', I would gladly sacrifice it in exchange for a more user-friendly URL.
Limiting the number of words in the URL does indeed make it look slightly less spammy, but slightly less user-friendly as well. I guess this is just one of those 'weigh the pros and cons and decide for yourself' situations. Just make sure the URLs don't get ridiculously long.
-
RE: Is there a way to tell if a directory is passing Page Rank or is devalued?
I don't think there is a trustworthy tell-tale sign of which directories are still passing link juice and which aren't. You might want to try looking at the 'Domain mozTrust (DmT)' as reported by the SEOmoz toolbar. The higher this value, the greater the chance the directory is still passing value.
-
RE: OnPage Issues with UTF-8 and ISO-8859-1
Getting a web page to display your content as TRUE UTF-8 requires everything to be set to the UTF-8 encoding. 'Everything' includes: your database, your database tables, your database fields, your connection from PHP to your database, your header as set by PHP, your header as set by HTML, your content itself, etc.
The following resources were extremely helpful to me when I was switching to UTF-8 (which is by far the better encoding of the two):
http://www.phpwact.org/php/i18n/utf-8
http://www.phpwact.org/php/i18n/utf-8/mysql
Bonus tip: make sure your content and files are saved as UTF-8 without a BOM (Byte Order Mark); this will save you lots of trouble later!
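To give an idea of a few of those pieces in one place (a sketch only; the credentials are placeholders and your own setup may differ, e.g. PDO instead of mysqli):

<?php
// Connection from PHP to the database set to UTF-8 (placeholder credentials).
$db = new mysqli('localhost', 'db_user', 'db_pass', 'my_database');
$db->set_charset('utf8');

// Header as set by PHP.
header('Content-Type: text/html; charset=utf-8');
?>
<!-- Header as set by HTML -->
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">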
-
RE: Robots.txt disallow subdomain
You could use environment variables (for example in your env.ini or config.ini file) that are set to DEVELOPMENT, STAGING, or LIVE based on the environment the code finds itself in.
With the exact same code, your website would then either limit IP addresses (in the development environment) or allow all IP addresses (in the live environment). With this setup you can also vary other settings per environment, such as the level of detail shown in your error reporting, or connect to a testing database rather than the live one.
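A minimal sketch of that idea (the file name, key, and IP addresses are all placeholders; assume env.ini contains a line like ENVIRONMENT = DEVELOPMENT):

<?php
// Read the environment from an ini file that differs per server (placeholder path and key).
$config      = parse_ini_file(__DIR__ . '/env.ini');
$environment = isset($config['ENVIRONMENT']) ? $config['ENVIRONMENT'] : 'LIVE';

if ($environment !== 'LIVE') {
    // Development/staging: only allow known IP addresses (placeholder values).
    $allowed = ['127.0.0.1', '203.0.113.10'];
    $remote  = isset($_SERVER['REMOTE_ADDR']) ? $_SERVER['REMOTE_ADDR'] : '';
    if (!in_array($remote, $allowed, true)) {
        header('HTTP/1.1 403 Forbidden');
        exit('This environment is not publicly accessible.');
    }

    // Other per-environment settings, e.g. verbose error reporting or a test database.
    ini_set('display_errors', '1');
    error_reporting(E_ALL);
}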
[this was supposed to be a reply, but I accidentally clicked the wrong button. Hitting 'Delete reply' results in an error.]