They are not suggesting you avoid alt tags; they are saying not to render your text as a picture. Use readable text instead.
Best posts made by AlanMosley
-
RE: SEO Info Bing - Logo and Name
-
RE: Microdata and dynamic data.
Yes, they index dynamic data, but try to keep the URLs friendly. You can dynamically generate microdata just like any other markup, so this should not be a problem.
-
RE: Why SEOmoz bot consider these as duplicate pages?
You are correct, a canonical will take care of it, and using a canonical does not tell the search engine the pages are identical. It works much like a 301, except that it does not physically move users to the canonical page.
But does the search engine take the content from all the URLs and credit it to the canonical page? I am not sure it does; I have never tested it, so I would rather do something with JavaScript or maybe use the prev and next tags.
-
RE: To create extra pages, or not to create extra pages?
There is a case for more pages. Depending on how you structure your internal links, you can boost your home page rank simply by having more pages on your site; see http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank
I would also mark up your questions and answers with schema.org: http://schema.org/Question
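A minimal microdata sketch of what that Question markup could look like (property names are from schema.org/Question; the question and answer text are illustrative placeholders):

```html
<div itemscope itemtype="http://schema.org/Question">
  <h2 itemprop="name">How do I fix canonical issues?</h2>
  <div itemprop="text">The full question body goes here...</div>
  <!-- each answer is nested as a suggestedAnswer (or acceptedAnswer) -->
  <div itemscope itemprop="suggestedAnswer" itemtype="http://schema.org/Answer">
    <div itemprop="text">The answer body goes here...</div>
  </div>
</div>
```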
-
RE: DNS(Name servers), IP and domain.. All Showing Same Site...
Yes, it is correct if it's on the same IP, but you should 301 redirect to the canonical domain.
1.1.1.1 and ns1.hosted.com should 301 redirect to www.xyz.com
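If you are on Apache (Linux), a sketch of that canonical-domain redirect in .htaccess could look like this, with the host names taken from the example above as placeholders:

```apache
RewriteEngine On
# Any request whose Host header is not the canonical domain gets a 301 to it
RewriteCond %{HTTP_HOST} !^www\.xyz\.com$ [NC]
RewriteRule ^(.*)$ http://www.xyz.com/$1 [R=301,L]
```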
What type of server are you hosted on, Linux or Windows?
-
RE: Why might Google be crawling via old sitemap, when the new one has been submitted and verified?
You may still have links pointing to those 404 pages, on your site or externally. If not, they will eventually fall out of the index.
-
RE: How do URL's influence Google Rankings?
Having 2 or more domains will not help at all; it does not work like that. You would have to 301 redirect them to a single domain, and that single domain is the only one that will help.
-
RE: Bing/Yahoo! Updates
Your rank on Bing can certainly change without you making changes. One of the signals Bing uses is user action: if people click on your listing but then return to the search page and click on another result, it is assumed your page did not answer the query.
There is also something where, if your listing gets more clicks than the one above it, you will take its place.
Also, because these signals make it hard for a new site to get exposure, they give any page submitted through BWMT a run, but if the user action is not there it will drop again.
I have also had pages take big drops and come back again. Sometimes this can take weeks.
-
RE: No back links showing in site explorer but..
Site Explorer does not crawl the whole internet. Could it be that the 2000 are low-quality links from obscure sites? I think SE tries to get the best of the web.
-
RE: Duplicate page content due to Sort By dropdown
You need to use a canonical tag to show that they are all the same page.
-
RE: 301 on ALL 404 for Pagerank Recovery, bad idea?
They will be ignored if you do this.
For large redirect projects, you'll need to get dirty and redirect all that you want to keep the value from. There's no simple way around this. Skip mass redirects to a single page, as most won't end up passing value, and remember: keep all redirects pointed to pages which are relevant to the original. Skipping this step negates the value.
http://www.bing.com/community/site_blogs/b/webmaster/archive/2011/10/06/managing-redirects-301s-302s-and-canonicals.aspx
-
RE: Bing/Yahoo! Updates
Can we get a URL? It could be so many things.
-
RE: 301 or canonical for multiple homepage versions?
Agree with Paul here.
A 301 is a directive to the crawler, while a canonical tag is only a hint and is not always followed; Bing, for one, will ignore canonical tags if it believes they are misused.
As for the mention of "multiple 301 redirects":
You do not need a 301 redirect for every URL, just follow the logic:
if HTTP_Host is not myPreferredDomain then redirect to myPreferredDomain
-
RE: 301 on ALL 404 for Pagerank Recovery, bad idea?
It depends on what sort of 404. If it 404s because it is in the index, I would let it 404 and fade away. If it 404s because of an external link, I would 301 to a relevant page. If they are internal links, I would correct the links, so as not to leak link juice through a 301.
-
RE: New Keywords stealing juice?
Egol, have you thought about marking them up with schema.org?
They have a schema for data tables also.
I use the HTML5 article tag, also the Article schema, and relate the images to the article by using the ImageObject representativeOfPage property.
-
RE: Ranking better
Do you have access to the Microsoft SEO Toolkit?
I found 646 errors or warnings scanning your site. Don't be too alarmed, as very few sites pass; in fact, I have found none that pass. If you have IIS7 you can install it for free; if not, I can send you the report. I would fix these first. Your efforts are discounted while these problems remain; it's a bit like fishing with holes in your net.
-
RE: 404-like content
You could try using either the Global.asax file or an HTTP module to do the rewriting; Global.asax would be the easiest.
From memory, the Begin_Request event would be the one to use.
The thing is, you need to do the rewriting early in the event cycle.
-
RE: New Keywords stealing juice?
It's a lot of work, but you can copy and paste or use other ways of reusing code.
One day, hey.
-
RE: Google Docs Paranoia
I remember Matt Cutts saying that no GA data is used. But then he also denies involvement in the Kennedy thing, as well as his Freemason membership.
-
RE: What to do with Deleted Posts?
What do you mean by low quality?
Unless it was duplicate or very thin content, I would not worry.
Since you have deleted them, you need to do nothing; those errors will eventually go away. But if you have links to those posts, you might want to 301 redirect to a relevant page to keep the link juice. If they were low quality, the chances are they did not have any links pointing at them, so do nothing.
You can go to WMT and remove them from the index immediately, but it is not necessary.
-
RE: Why are new pages not being indexed, and old pages (now in robots.txt) remain in the index?
SEs don't take notice on the first crawl; they need to see results a few times before they act. This, I think, is to make sure that you mean what they see, and it is not just a mistake.
I would not exclude any pages with robots.txt; this is a very crude way of doing things. If the page does not exist, it will drop out of the index; if it does exist, you would lose the link juice from all the links that point to those pages. If you have to no-index a page, use the meta tag "noindex,follow" so that links are still followed back out of the page.
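For reference, that meta tag sits in the page's head, and the standard token is written without a hyphen:

```html
<head>
  <!-- keep this page out of the index, but let the crawler follow its links -->
  <meta name="robots" content="noindex,follow">
</head>
```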
-
RE: Duplicate content? Split URLs? I don't know what to call this but it's seriously messing up my Google Analytics reports
Zsolt is correct. If you 301 redirect to the slash version, you must also have your internal links point at the slash, or vice versa; if you 301 to the non-slash version, your internal links should do the same.
This is a common problem I find in WordPress sites.
Why this is important: each redirect leaks a bit of link juice, and site-wide this adds up to a lot.
Your site has many HTML errors, another thing common to CMSs.
Unclosed tags can confuse a bot as to what is visible to the user.
-
RE: Domain name with separated/non-separated keywords
Simon has said it all; hyphens are hard to communicate and look spammy.
-
RE: Merging three sites to one
The only things you are 301 redirecting are requests, such as incoming links, so it is whether the page has links that is important.
You can 410 what is left, as it will get rid of the other pages quicker, but just letting them 404 will do the same.
-
RE: Basic purpose of SEOmoz's subscription
It won't do it alone, but there are a lot of tools and information that will enable you to do a better job.
While not all the information is hard fact, Rand and the crew do a good job collecting data and running tests. Bing and Google are the best sources for the information they want you to know; I think SEOmoz is the best source after that.
-
RE: What to do with Deleted Posts?
If they have links pointing to them I agree with you, but if they don't, you should let them 404 or remove them through WMT.
-
RE: How much (%) of the content of a page is considered too much duplication?
James gives a good response.
I have a few tutorial pages where a lot of the instructions are the same, but they are still indexed and rank.
It may be a guide to what you can get away with:
http://thatsit.com.au/seo/tutorials/how-to-fix-canonical-domain-name-issues
http://thatsit.com.au/seo/tutorials/how-to-fix-canonical-issues-involving-the-trailing-slash
http://thatsit.com.au/seo/tutorials/how-to-fix-canonical-issues-involving-the-upper-and-lower-case
-
RE: SEOMoz & Google Webmaster Tools crawl error conflicting info
You can add more users to your account in both WMT and GA, so that's not a problem.
Delete the errors and see if they come back.
If you want to give us a URL, I will tell you if you have any errors.
-
RE: Is Blogger good for building links
You could always pass a few links to your site, but you would need to promote your blog first, so you're not really getting anywhere.
What I would do is add a blog to your main site or a subdomain of it. You can then shape your link juice with internal linking, and any links you attract will help your main site also.
-
RE: Could ranking problem be caused by Parked Domain?
Could very well be.
Google has stated that they penalized many sites because they wrongly thought they were parked.
-
RE: 500 errors and impact on google rankings
Yes, 500 errors will affect ranking. SEs like a site that is well maintained; obviously they don't want to send users to an error page.
Can we get a URL? I will take a look and may be able to advise you.
-
RE: How do I clean up this 301 disaster?
I agree with Ryan, your problem is not so great. Just point all the 301s at the new site in one hop. Don't be lazy and 301 everything to the home page, as Bing for one will dismiss them.
Also, if the pages have no links, there is little to gain by 301ing them.
As for hosting WordPress on a Microsoft server, you certainly can; if you use the Web Platform Installer it will do it all for you.
-
RE: Bounce rate and average visit time in an e-commerce site
Bounce rates should be seen as relative; you may have a site that ranks for a lot of keywords that don't convert. There are many reasons why you can have a high bounce rate.
I would not worry so much. If it suddenly moves I would wonder why, but if it is constant, so be it. You can look at which keywords are causing the bounce; that may explain it.
-
RE: Setting up Goals in Analytics when no common 'order confirmation' type page
That's a joke.
Do it yourself:
$("input[data-paypal-button]").click(function () {
    _gaq.push(['_trackEvent', 'PayPal', 'Button', 'domain.com.au', 5]);
});
In the above code, all you need to do is add the attribute to the button, then include the script in an external file; job done.
"data-paypal-button" can be anything, but should start with data- to keep with convention.
'PayPal', 'Button' and 'domain.com.au' can also be anything.
See https://developers.google.com/analytics/devguides/collection/gajs/eventTrackerGuide?csw=1
-
RE: Using alt-codes such as ? in META title / description
I think it looks spammy myself. Just my opinion.
-
RE: Google Algo Update Aftermath
I am reaping the benefits.
I have one client with an original product but many competitors selling cheap or rip-off imitations. He was ranking above them anyhow, but now he is the only selling site on page one; he is competing only with Wikipedia, the NYT and other information sites. Sales are up 25%.
Others have also seen some improvement, and some competitors are gone. I can't complain.
-
RE: Index.php and /
Yes, you should 301 redirect instead.
But you must have links that point to the index.php URL, or the crawler would not have found it; fix those links so that they point to "/".
Links that go through a 301 lose a bit of link juice; if you can make your links go directly to the correct URL, you save that juice.
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /.*index\.php\ HTTP/
RewriteRule ^(.*)index\.php$ /$1 [R=301,L]
-
RE: 301 for "index.php" in Web.config?
Yannick seems to have it right, but here is another way:
<system.webServer>
  <rewrite>
    <rules>
      <rule name="DefaultRule" stopProcessing="true">
        <match url="^index\.php$" />
        <action type="Redirect" url="/" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
-
RE: Can you be penalized by a development server with duplicate content?
I would not worry at all; there is no duplicate content penalty for this sort of thing. All that will happen is one site will rank and one will not. The original site with the links will obviously be seen as the site to rank. Block off the dev site anyhow if you are worried, but this seems like a deeper problem than a bit of duplicate content.
-
RE: Index.php and /
No use having the canonical as well.
I would be looking for links within your site that point to https://findmover.in
Also, do you need to use HTTPS? If your site does not need to be encrypted, it would run faster as HTTP.
-
RE: 404-like content in webmaster tools
I assume you are talking about the soft 404 errors in GWMT.
A soft 404 is a page that returns a 200 when it should return a 404.
Can we get a URL?
There is a quick test you can do: see if this returns a 200 or a 404.
Some websites use the 404 page to do URL rewriting or for some other reason; this causes a soft 404.
From what I have read, Google can test if you are doing this by requesting a nonsense URL and seeing if it returns a 404 or a 200. If it returns a 200, they look at your pages more closely.
-
RE: How should I go about cleaning up link spam from an acquired domain?
Are you getting a warning in GWMT? If not, I would not be too concerned. Directories will not hurt you; they may not give you any value, but they will not harm you.
Have a look for signs of link farms, such as links in blog posts that don't seem natural.
-
RE: Block IP from analytics
Go to Admin, then look at Filters; it is pretty easy from there.
-
RE: Blocking robots.txt
There is no setting or feature to do this, but you can do it with code.
You could detect the user agent and generate a robots.txt file dynamically, depending on the user agent.
But I can't see why you would want to do this.
-
RE: How to properly link to products from category pages?
Don't use no-follow; you will just leak link juice.
One way around this is to use an anchor # in the URL for the image link, like page.html#someterm.
This will in fact give you link text relevancy for both; Google will see these as 2 different pages.
Make sure you have alt text for the image.
This tactic, as well as what x-com suggested, may in the future be seen as over-optimization, so it may be better to do something like this:
Your link text
You can just link the whole lot in the one link.
Or move your text to above the image.
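A sketch of that single-anchor approach, with the image and the text inside one link (file names, alt text and link text are illustrative placeholders):

```html
<a href="product.html">
  <img src="product.jpg" alt="Product name">
  Your link text
</a>
```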
-
RE: URL Parameters
Assuming you have your canonicals done correctly, the pages will disappear in time.
The pages you want to de-index should have a canonical tag that points to the original.
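For example, a parameterized URL might carry a canonical tag like this in its head (the domain and path are illustrative placeholders):

```html
<head>
  <!-- on page.html?sort=price, point the engines back at the original -->
  <link rel="canonical" href="http://www.example.com/page.html">
</head>
```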
-
RE: Will frequently adding and frequently removing pages from my site hinder any SEO?
Then your link to /ford-mondeo-from-6995 would no longer exist.
Call the page /offers.
One lost link will offset the advantage of the keyword in the URL.
-
RE: How to properly link to products from category pages?
I should not have said 2 pages, but it has been shown that both links will give link text relevancy.
The JavaScript link will be followed; it will not help.