Not exactly:) Read here: http://microformats.org/wiki/hcard
Posts made by sesertin
-
RE: Developing my footer area
-
RE: Should I interlink all the websites my company owns?
And that is even better advice than mine.
-
RE: How does Google determine freshness of content?
I think there is one thing that overcomes freshness of content: regularly updated content. So it is not enough just to write a single page about a topic, publish it, and wait for Google to index it. It is considered fresh, but it is not updated and probably will not make it to the top. Look at news portals on the other hand: they are publishing constantly and getting indexed several times a day.
So the consequence is that it is not enough to have fresh content once a year; you need regular fresh content, the more the better. This way you are getting indexed frequently, so you don't need to worry about Google finding fresh material, and your rankings will improve as well, as Google sees your site is up to date and always contains the latest news (you would not want to learn from a 10-year-old SEO book either).
Two things you can do besides regular writing to get indexed more often are to implement Google Site Search as the search engine inside your site and to keep your pages as light as possible. Google always spends a planned amount of time on your site. It can make a big difference whether Google can index 5 or 10 pages in that given amount of time.
-
RE: Developing my footer area
Hello,
I would definitely place your primary site-wide keywords there in a large, bold font and make them links to the relevant pages. If you are going for local SEO, then placing your address there may carry additional benefits; just make sure to do it in hCard format. If you place an hCard, your company name will definitely be needed there. Keep in mind that the footer area forwards the least PageRank of any website area, but it can be an important navigational aid for users.
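A minimal hCard sketch for a footer address; the company name, street and phone number below are placeholders, and the class names come from the microformat spec linked in the other thread:

```html
<div class="vcard">
  <span class="fn org">Example Company Ltd.</span>
  <div class="adr">
    <span class="street-address">1 Example Street</span>,
    <span class="locality">Springfield</span>,
    <span class="postal-code">12345</span>
  </div>
  <span class="tel">+1-555-0100</span>
</div>
```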
-
RE: Should I interlink all the websites my company owns?
If I get it right, the sites are based on the same topic, so interlinking them carries both user and search engine value (maybe a little less for engines than for users, as Google may know that you own both sites and so give less value to the links). When you do your linking, just keep in mind which pages you would like to rank with and how PageRank flows, and do your job based on that: http://perthseocompany.com.au/seo/tutorials/a-simple-explanation-of-pagerank (thanks Alan). Good luck!
-
RE: Robots.txt usage
Others are right; by the way, canonical may be better, but if you insist on a robots restriction you should add two rules for each parameter:
Disallow: /*?view=m
Disallow: /*?view=m*
so that you block the URLs that contain the parameter at the end and block the ones that have it in the middle as well.
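Rules like these only take effect under a User-agent line, so a minimal sketch of the file could look like this (the view value is taken from the question):

```
User-agent: *
Disallow: /*?view=m
```

Since Googlebot treats Disallow values as prefix matches once a * wildcard is used, this single pattern may already cover the parameter both at the end and in the middle of a URL.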
-
RE: Compare URLs with 302 redirects
As I see it, your problem goes further than a redirection issue. When I first clicked http://www.theprinterdepo.com/catalog/product_compare/add/product/100/uenc/aHR0cDovL3d3dy50aGVwcmludGVyZGVwby5jb20vcHJpbnRlci1wYXJ0cy5odG1sP3A9NA,,/ it took me to http://goo.gl/XMaZg but when I clicked it for the second time it added the product to the comparison list.
So when I was first redirected, that was because the page uses cookies and it assumed that cookies were not enabled on my computer, so it redirected me to http://goo.gl/XMaZg to tell me to turn cookies on. But it got it wrong, because the second time I could add the product to the list without changing any cookie settings, so there must be some sort of trouble with this feature, and this causes the redirection to http://goo.gl/XMaZg. If the program correctly detected whether cookies are turned on, or if you used session IDs besides cookies, the redirections would not be present anymore.
-
RE: Robots.txt usage
You can do the restriction you want, but if I get it right, m stands for map view, g stands for gallery view and l stands for list view. So if you want list view to be indexed and map and gallery view not to be indexed, you should add two lines of restriction:
Disallow: /*?view=m
Disallow: /*?view=g
If these parameters are not at the very end of the URL, you should add a * after the letter of the parameter in the restriction as well.
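Put together, a minimal robots.txt sketch, assuming the view parameter appears right after the ? in these URLs (adjust the patterns to your own URL scheme):

```
User-agent: *
Disallow: /*?view=m
Disallow: /*?view=g
```

Note that robots.txt paths must start with / or *, and Googlebot treats the patterns as prefix matches, so a trailing * is optional.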
-
RE: Are multiple embeds of the same video considered content duplication?
Repeating just a snippet of information is not considered duplicate content. So if you have a few lines of text appearing repeatedly somewhere on your site, or multiple occurrences of a video, that is not duplicate content as long as the page contains enough individual information.
The other thing is that engines can't really analyze video content; they just draw conclusions based on the descriptions you give them. Vary the descriptions and titles.
-
RE: Historical search engine results
I don't know of such a tool besides Webmaster Tools that shows rises and falls in rankings, but maybe you can get some use out of this one: http://www.archive.org/web/web.php It is actually a web page archive, so you can get a view of the former look and content of a webpage. Maybe this can give you some clues about the rankings.
-
RE: Meta keywords
It is absolutely true. In the old days it was enough just to stuff your keywords section with related keywords to rank high in Google, so it is not a surprise that a lot of people abused this functionality. First Google just gave less significance to the keywords section, but now it uses more robust metrics in its algorithm. You could stuff your keywords with "seo audit", for example, while your page was about webpage development, and gain some extra traffic; but it is much harder to put "seo audit" in the title of the page, in the URL and in picture names while having totally different content. So Google stopped using it. Here is the official video from Matt Cutts: http://www.youtube.com/watch?v=jK7IPbnmvVU
As far as I know, Yahoo still uses this part to some extent.
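For reference, the tag being discussed is the keywords meta tag in the head section; the terms below are just placeholders:

```html
<meta name="keywords" content="seo audit, webpage development" />
```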
-
RE: Duplicate content? Split URLs? I don't know what to call this but it's seriously messing up my Google Analytics reports
Hello Dave!
As I see it, the redirection is in place and www.enf.org/camp redirects to www.enf.org/camp/ and so on with the other URLs: all of them redirect to the one ending with /. I think you should simply neglect the ones that do not end this way, as anyone opening them ends up on the www.enf.org/camp/ type, but that is not true the other way around. So if I type www.enf.org/camp/ I will never see www.enf.org/camp.
Consequence: if you examine only www.enf.org/camp/ you get the number of people visiting both www.enf.org/camp/ and www.enf.org/camp. This is true for all the other ones. It is enough to examine the one the users are redirected to.
-
RE: How can I see if my website was penalize by Google?
If you don't purchase links in bulk for money, don't link your content from suspicious pages dealing with porn and sexual stuff, and obtain links from relevant pages, you have got nothing to fear.
A good one to read: http://www.seobook.com/archives/001792.shtml
-
RE: How can I see if my website was penalize by Google?
I am more and more convinced that you are not penalized. I copied and pasted a sentence from your home page into Google and you came up in first place; if you were penalized this would not happen. This is the strongest sign. I saw you only have 7 backlinks. If I were you I would spend my time on building more backlinks and not worry about PR; it will surely climb. This algorithm is really slow. I have seen sites with 30,000 visitors a month and no PageRank. You are not penalized, that is for sure. Keep working; you are earning your money on visitors, not on PageRank. The number of visitors is increasing. Your PageRank surely will as well; it is just a question of time.
-
RE: How can I see if my website was penalize by Google?
You are not penalized, I am sure. Did you look at the tool I suggested? It is a PageRank prediction tool. As I said, maybe the PageRank value itself simply has not been updated yet, and that is the issue. Did the prediction tool give any result?
-
RE: How can I see if my website was penalize by Google?
Hello Pedro,
If you are climbing fast in the SERPs and the number of visitors coming from search engines is increasing, you are not penalized for sure. Just think it through: if you did not like somebody you worked with earlier because he did something serious against your firm's rules, would you have him do part of your jobs, or would you totally ignore him? If your site were penalized, Google would not let it rank high for any terms at all. It definitely would not let it climb even higher.
PageRank is an interesting figure to look at, but don't get lost in it. The algorithms are updated periodically, not continuously. Maybe the PageRank of your site has not been updated since you launched it. If you are really interested you can have a look at this tool: http://www.selfseo.com/google_pagerank_prediction_tool.php
I think overall number of visitors, time on page, pages per visit and actual ranking are much more important factors than PageRank. You should give those more attention, and your PageRank will climb sooner or later.
-
RE: Does keyword at the very front of meta description have impact?
It has some impact, but not a very serious one; I would say converging to zero. I think writing a good description that answers the searcher's query and generates clickthroughs is more important than stuffing your keyword into the first few words.
-
RE: Title tag question
Surely the first one is better. If you look at the second one, "football boot" makes up nearly 50% of the text, which is close to keyword stuffing. One other option could be to use something like "cheap football boots - football boots" to indicate to Google that those are two totally different queries you are targeting, but I would definitely use the first one only. In addition, you have both words in your domain as well.
-
RE: Usage of HTTP Status Code 303
It probably will not pass your link juice, if any. This is the difference: 301 status codes pass on roughly 90% of the link juice that inbound links are giving to your pages.
For users it is good to redirect them to something else. The fact that a product's lifecycle is over does not mean that it will not be searched for anymore. Keeping the old pages at least in the sitemap will not bloat your site at all. I would do that; however, technically, if there are no inbound links pointing to the pages that you want to 303 redirect, it will not hurt your SEO.
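If the products are gone for good and a 301 fits better, on Apache it can be set up with a single line; the paths here are placeholders:

```apache
# .htaccess - pass (most of) the old page's link juice to the replacement page
Redirect 301 /old-product.html /new-product.html
```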
-
RE: Experience with 307 HTTP status code
Hello,
It is definitely a bad idea, simply because Google cannot detect that your page is absent due to your supply shortage; it just sees that the page is present sometimes and sometimes it is gone. Now ask yourself: would you refer somebody to a friend who sometimes does the work that he is asked to do and sometimes just quits in the middle? I think you would rather refer one who is always there and can be trusted. The same goes for webpages. No matter how good your page is, if it is unavailable for shorter or longer periods on a regular basis, then it is not trustworthy. You should rather simply write on the product page that it is out of stock. Not to mention that this is much easier to solve in your CMS.
-
RE: Multiple country site versions and hosting
Hello Gary,
The better solution for sure is to host sites in their own county like mentioned here: http://www.dirjournal.com/articles/multilingual-seo/
At the same time you should take the product into consideration as well. If it has low search traffic and is not really competitive in the target country, you can give it a try with one page, as a TLD for English sites does not prohibit it from ranking in France, for example; so traffic and competition matter, but a different TLD and hosting are the better choice in the long run.
If you are doing it with one site you can implement a content delivery network so that your pages load faster in the target areas, and that is good for SEO; however, it does not replace the signal the local server gives Google about the target area.
-
RE: Usage of HTTP Status Code 303
Hello,
Are your products gone forever for sure? If you place a 301 or 303, the visitors clicking your pages from the SERPs will see new content instead of a 404 error page, that is for sure, so it has its user-side benefits. However, if you are ranking for these products in Google and these words are bringing traffic to your site, I would think twice before deleting those pages. If you delete the actual content you are ranking with, the users and the engines will see totally new content, so if you lose your product-specific pages you will also lose your rankings sooner or later.
I would leave those pages but do a little reorganization on the landing page. I would push the current content a bit downwards and place a one- or two-line convincing text about why you have stopped selling those products (why users should not search for them any longer), and offer an alternative, better solution for the product type they are searching for. Something like: we have stopped selling lithium batteries, as the new XY technology has a 2x longer life and takes half the time to charge. You can look at these astonishing products here.
-
Blocking robots.txt
Do you know any methods to block the robots.txt file from external users?
-
RE: Block all but one URL in a directory using robots.txt?
According to my knowledge this possibility does not exist. One fast way to get around this is to have a crawler program crawl your URLs, so that you can quickly copy out all the URLs in the folder, paste them into the robots.txt, and leave out the one that you want in the index.
-
RE: How can I verify who links to me?
In Analytics you can actually see the links that send you visitors. You can have a look at them under Traffic Sources and Linking Sites. However, this is not a complete set of the sites linking to you.
-
RE: Mobile sites link strategies
An interesting thing to know would be whether the site has already got several inbound links. If not, I would definitely start by rewriting the URLs with the targeted keywords and redirecting the old ones to the new ones.
This is hard work for sure, but it has its benefits.
The actual link building strategy is another question, and it is different in each and every case depending on the subject of your site. You should try directory listings, blogs and forums to start with.
-
RE: How can I verify who links to me?
Open Site Explorer here by SEOmoz, or you can try Yahoo Site Explorer. Here you can find a list of some more if you would like to try multiple ones: http://www.toprankblog.com/2009/11/1-seo-tools-for-tracking-inbound-links/
-
RE: Multiple URLs for the same page
Hello,
This is seriously hurting your SEO. The best way to get over it is to implement canonical links in the head section of each page, so that they tell Google which page to display in the SERPs. You should choose the all-lowercase one with no .aspx ending.
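A sketch of such a canonical link for the head section; the URL is a placeholder standing for your preferred all-lowercase version:

```html
<link rel="canonical" href="http://www.example.com/products/widget" />
```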
-
RE: Wordpress for e-commerce
I am not interested in the payment itself but rather in the basket function and that kind of webshop stuff.
-
Wordpress for e-commerce
What plugin should I use to make a webshop that is good for SEO as well?
Should I use WordPress at all, or should I use some other open source CMS?
-
RE: 7 years old domain sandboxed for 8 months, wait or make a domain change?
Great question; honestly, I haven't got a good answer.
-
RE: 7 years old domain sandboxed for 8 months, wait or make a domain change?
The case you mention is not sandboxing but a penalty due to bad links. The sandbox only occurs with totally new domains in the first few months. Maybe you or the previous owner of the domain purchased bulk links and Google discovered it. You should remove those links and ask for reconsideration in Google Webmaster Tools.
-
RE: How to change URL of RSS Feed?
I think then you had better consider legal steps. If they have access to your content from a third domain or IP, they can also do that from a fourth or fifth one. So no matter how many IPs you block, if they know your feed address they can subscribe with a completely new one. In my opinion, if this is the case, then a legal solution would be the best for you. Copyright your articles.
-
RE: How to let Search engines index login-first SNS sites?
hello Boson,
I can help you with Google. The thing you are looking for is First Click Free. You let anyone access the first page of your content when they come to your site from the SERPs, but require them to log in for the second one. You can do this with a session ID. In depth: http://googlewebmastercentral.blogspot.com/2008/10/first-click-free-for-web-search.html
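A rough sketch of the decision logic in Python; the function name and the simple referrer check are my own assumptions for illustration, not Google's specification:

```python
def allow_free_click(referrer: str, views_this_session: int) -> bool:
    """First Click Free sketch: grant access without login when the
    visitor arrives from a Google SERP and has not used the free click
    yet in this session (tracked via a session ID on the server)."""
    came_from_search = referrer.startswith("http://www.google.")
    return came_from_search and views_this_session == 0

# First page view coming from a Google SERP: no login required.
print(allow_free_click("http://www.google.com/search?q=example", 0))  # True
# Second view in the same session: require login.
print(allow_free_click("http://www.google.com/search?q=example", 1))  # False
```

In a real implementation the session view counter would live server-side, keyed by the session ID mentioned above.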
-
RE: Search for signed in users
Hi Atul,
It would be interesting to see the whole article, but I think this refers to Google's algorithms: when you are logged in with your Google account you get a totally different SERP for a term you have already searched than when you are not logged in. This is because if you have already visited a site, spent time on it, and maybe returned a few times, Google considers that site more important to you, so it places it further forward in your own SERP, maybe even in first place; however, when you log out, the page can still be on the second or third page of the SERP.
This will become even more relevant with Google+, as it can consider your friends' likes and interests as well.
-
RE: How to change URL of RSS Feed?
Maybe one of the simplest solutions is not blocking those sites from accessing your content. Make your h1 tag a link to the actual post. So if you have a post titled post1 at the URL domain.com/post1, the post1 heading at the top of the page should point to domain.com/post1. Then if anybody steals your content, they will link back to the original content on your site, so of all the copies your site will be the strongest one, with the most links.
I would also place links in the content body pointing to my other pages, so anybody copying my content would be appreciated, as they are giving a handful of backlinks in return. I would also write a little info panel at the bottom: this article was originally posted on www.domain.com and written by xy.com. Find similar articles here: domain.com/relatedposts.
In the meantime I would post a legal statement that copying my content is all right, but only with the links included.
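A sketch of such a self-referencing heading; the domain and title are placeholders:

```html
<h1><a href="http://domain.com/post1" title="post1">post1</a></h1>
```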
-
RE: How to change URL of RSS Feed?
You have to contact your programmer to change the URL for you, or in some CMS systems you can do it in the backend.
What exactly do you mean by scraping? If they steal your content, then using a new URL is not the best solution for you.
RSS and rankings: an RSS feed usually contains the same information that is already available on your site at some URL (not in all cases, of course, but usually). If that is the case, then the feed only has negative effects on your rankings, as it duplicates the content: the exact same text that you can find at domain.com/xy can be found at domain.com/feed/xy. So if that is the case, you should not worry about your rankings.
If you change your URL you should also redirect the old one at the same time; if you do not, then all of those who are already subscribed will lose your feed, and you do not want that. If you redirect, then anybody who knew the old URL will get to the new one anyway, so I think changing it is pointless.
I would block the IP addresses I do not want to access my content. You could also try legal means: state that nobody is allowed to use your copy on their own sites. It is easy to find out if anybody does.
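If you do end up moving the feed, the redirect can be a one-liner on Apache; the feed paths here are placeholders:

```apache
# .htaccess - keep existing subscribers when the feed URL changes
Redirect 301 /old-feed.xml /new-feed.xml
```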
-
RE: Reviewing good answers
I am just simply interested in which answers were marked as good.
-
Haven't seen anything like this
I personally haven't seen anything like this before. I am optimizing my first worldwide page. The rank checker at SEOmoz, and another rank checker as well, says that I achieve rankings in the US and in the UK for my 2-3 word terms. However, if I copy and paste a whole sentence from either of my pages (8-10 words), Google cannot find my site. If I put the terms in quotes then it is OK, but it still cannot find the home page. Not for any sentence, with or without quotes, from the home page, although this page has the most incoming links.
Does anybody have any idea how this can be?
-
Reviewing good answers
Can I somehow see which of my answers were marked as good answers?
-
Speed checker
Is there any speed checker to check your page load time on localhost, so that bandwidth does not affect the result?
Preferably I'd like one that lists critical issues or things that can be improved.
-
RE: Should I make All My "Non-Money" Pages No-Follow?
You should not. You can mark the inbound links nofollow. Despite this, Google will probably find the page, and if you keep the outbound links, the page still passes some PageRank to the other pages.
Another consideration is that non-money pages can still hold important keywords that bring people into your buying funnel. If you have important keywords there, you shouldn't nofollow those links at all. You should definitely keep the contact us page, as it contains valuable information for local SEO.
-
RE: Triple listing in rankings
Hello.
First: your competitor's better ranking may have good reasons, as Google's algorithm is pretty complicated and works with numerous data points. However, if all of your factors are better, I think your competitor's domain must be older; when all factors are better, this is usually the case, but without examining the page this is just a guess.
I don't think you really want to get rid of the triple listing; it is a pretty good thing. To change the order for vca cursus, simply apply internal links that point to the page you want to rank first for vca cursus. The anchor and the title of the links have to be vca cursus, of course. Use that term in the heading tags of the desired page. To reinforce this, and to beat your competitor, obtain external links to the page you want to rank first for that term. Again: the link anchor, alt and title have to be vca cursus.
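A sketch of one such internal link; the URL is a placeholder:

```html
<a href="http://www.example.nl/vca-cursus" title="vca cursus">vca cursus</a>
```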
-
RE: Page with h1 and h1 class=
You can use two h1 tags, but if the first one is empty then you surely don't need it.