Yes, but that link juice circulates around the website and amasses on certain pages: the pages with the most links. What is important is that it ends up on the pages you want to rank.
http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank
Posts made by AlanMosley
-
RE: Will Nofollow in Nav Cause a Problem?
I did not want to get too technical, but you seem to really want to do this, so I will show how to do it with jQuery in a way that search engines will not find.
I would suggest having a real link to your contact page from your homepage so that your address and contact details are found.
From every other page do something like this
Instead of a real link, give the "Contact Us" element a data attribute (for example data-contact-page); then you need some JavaScript, and you will need jQuery:
$(document).ready(function () {
    $("[data-contact-page]").click(function () {
        document.location = "/contactpage.html";
    });
}); -
RE: Will Nofollow in Nav Cause a Problem?
Paul is correct. Most websites have this problem, and there is not a lot you can do about it, because most of the time you need a contact page link on every page.
But don't think of it as one-way: remember that link juice flows back out of the contact page. The PageRank calculations are done many times, not once, until the PageRank settles. For this reason I have to disagree that internal links can only pass a little PageRank; the more external links you have, the more PageRank the internal links spread. So you are correct, the way you sculpt your links is very important.
-
RE: 301 or canonical for multiple homepage versions?
Yes, that's classic ASP.
What sort of server is it on? Windows?
What sort of web server? IIS? If so, what version? If it is 7 or greater, it is very easy to do your redirects if you have access to the control panel.
-
RE: 301 or canonical for multiple homepage versions?
You are correct, it won't handle the index.html or default.asp case.
But this rule will solve all your domain problems. Note the "!" means not: if the host is not the desired domain, then redirect to the desired domain, no matter what the domain is. This will fix non-www, or any other secondary domain you may have, such as oldDomain.com or mergedSite.com.
Options +FollowSymLinks
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
But if you have a page called default.asp, your site is a classic ASP site. ASP is a Microsoft technology and would be on a Microsoft web server, IIS. If so, 301 redirects are very easy to do.
Is your site ASP? Is it on a Microsoft IIS server?
-
RE: How long takes to a page show up in Google results after removing noindex from a page?
Depends on the site. If the site is Microsoft.com with a link from the home page, you can expect it to appear the same day.
If it's on boringoldsite.com then it could take a week or more.
But mostly a few days. -
RE: Penguin hit | Immediate?
I would expect Penguin to be sudden.
What your problem sounds like to me is that your links are slowly being discounted as Google finds them. Do you have a lot of rubbish links? -
RE: 'key word' SEO question
No, but bold does, albeit not much.
Use real bold markup, not CSS, but overdoing this will look spammy to search engines as well as users.
-
RE: Google stripping down Page Titles
There are many reasons why Google does this, but it may be that they think you are keyword stuffing.
-
RE: 301 and Canonical - is using both counterproductive
I think the point is:
mydomain.com/Page.html 301s to mydomain.com/page.html,
but mydomain.com/page.html?x=y canonicals to mydomain.com/page.html, so in this case both have a function.
Having said that, I would fix the links to mydomain.com/Page.html, as using a 301 leaks link juice. 301s are good when correcting an external link, but an internal link should be fixed by fixing the link itself.
-
RE: Will Nofollow in Nav Cause a Problem?
If a page has 5 links, the PageRank will be split between those 5 links and will flow to the pages they point to. There are some modifiers to this, but generally it's 20% per link.
If you nofollow one of those links, 20% of your PageRank will be lost; it will be wasted. It is better that your contact page gets it. If you have a link back to your home page from your contact page, you will get some back.
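To make that arithmetic concrete, here is a rough sketch of the split in JavaScript. The page names, the 0.85 damping factor, and the function itself are my own illustration of the general idea, not any search engine's actual code:

```javascript
// Sketch of one PageRank iteration: each page's rank is split evenly
// across its outgoing links, with a damping factor applied.
function pageRankStep(ranks, links, damping = 0.85) {
  const pages = Object.keys(ranks);
  const next = {};
  // every page keeps a small base share
  pages.forEach(p => { next[p] = (1 - damping) / pages.length; });
  pages.forEach(p => {
    const share = ranks[p] / links[p].length; // rank splits evenly per link
    links[p].forEach(target => { next[target] += damping * share; });
  });
  return next;
}

// Home links to contact and about; both link back to home.
const links = { home: ["contact", "about"], contact: ["home"], about: ["home"] };
let ranks = { home: 1 / 3, contact: 1 / 3, about: 1 / 3 };
for (let i = 0; i < 20; i++) ranks = pageRankStep(ranks, links);
// After repeated passes the rank settles, with home accumulating the most.
```

Note how linking back to home returns juice to it on the next pass, which is the point above about the contact page linking back to the home page.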
How PageRank works: http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank -
RE: Wordpress shortcode or plugin for current time
It wasn't me either.
<?php
$thetimeis = getdate(time());
$thehour = $thetimeis['hours'];
$theminute = $thetimeis['minutes'];
$thesecond = $thetimeis['seconds'];
if ($thehour >= 12) { $dn = "PM"; if ($thehour > 12) { $thehour -= 12; } }
else { $dn = "AM"; if ($thehour == 0) { $thehour = 12; } }
echo "$thehour:$theminute:$thesecond $dn";
?>
-
RE: Angular.js + Crawlers
What do you mean by "We have deployed a couple of updates to render pages for the bots"? That sounds like cloaking.
-
RE: 301 or canonical for multiple homepage versions?
Agree with Paul here.
A 301 is a directive to the crawler, while a canonical tag is only a hint and is not always followed. Bing for one will ignore canonical tags if it believes they are misused.
As for the mention of "multiple 301 redirects":
you do not need to have a 301 redirect for every URL, just follow the logic:
if HTTP_HOST is not myPreferredDomain, then redirect to myPreferredDomain. -
RE: Should we NOINDEX NOFOLLOW canonical pages?
Depends on how unique the pages are.
If the pages using params are quite unique then I would just leave them be; if they are not very unique then using canonicals is the best bet.
-
RE: XML Sitemaps - how to create the perfect XML Sitemap
1. Yes. Bing have stated that they will ignore the sitemap if it is not accurate.
2. No.
3. No.
4. Yes.
5. No, use priorities for that.
6. Looks OK.
I would look into using the Google and Bing sitemap generators, problem solved (if you have access to the server)
http://www.bing.com/blogs/site_blogs/b/webmaster/archive/2013/05/23/bing-sitemap-plugin-1-0-launch.aspx
http://googlesitemapgenerator.googlecode.com/svn/trunk/doc/gsg-installation.html -
RE: XML sitemap latest best practices?
Yes, you cannot force indexing; it is always up to the search engine.
On a small site a sitemap is next to useless, really. Sitemaps help the search engine find and decide what to crawl; on a small site with all pages linked, it is going to find them anyhow. On a large site it can be a good thing to show what has changed, where to find new links, and maybe pages that are not linked.
-
RE: XML Sitemap on another domain
I read a post by Bing's Duane Forrester where he explained that they will lose trust in your sitemap if it has more than 2% errors. All entries should return status code 200: no 404s or redirects.
He also said that you should not include all pages, just places where you want Bing to start crawling from, or where you want it to discover links.
-
RE: 301 page into a 404
If you can do a 301 you should be able to do a 404
-
RE: Is 307 the best way to handle temporarily disabled items ?
302 and 307 are the same except for one small thing: with a 307, the redirect must use the same method as the original request. If the original was a GET then the redirect must be a GET; if it was a POST, the redirect must be a POST.
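A sketch of that choice as a tiny helper (the function name and paths are hypothetical, not from any framework):

```javascript
// Hypothetical helper: choose the redirect status for a temporarily disabled URL.
// 307 forces the client to repeat the original method (a POST stays a POST);
// 302 allows the follow-up request to become a GET, which is almost always fine.
function temporaryRedirect(path, preserveMethod = false) {
  return {
    status: preserveMethod ? 307 : 302,
    headers: { Location: "/category" + path }, // e.g. send users to a category page
  };
}

// A disabled product page: method preservation does not matter, so 302 is enough.
const plain = temporaryRedirect("/widget-123");
// A form endpoint where the POST body must be replayed: 307.
const strict = temporaryRedirect("/checkout", true);
```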
A 302 allows for a change in method, but who ever changes the method? Just use a 302. -
RE: XML Sitemap on another domain
I agree with Mike, but I am curious why you can't have an XML sitemap; maybe you need to change hosting.
-
RE: Spooky goings on from bots?
I don't think GA counts Bingbot or other bots; not sure what that article is talking about.
Could it be that there is a time zone problem here, and that the hits are not really in the early hours? Or maybe they are from another country?
-
RE: Do we need to update our sitemaps each time our content changes?
I have to disagree. If you include the <lastmod>2012-07-24T08:36:20Z</lastmod> tag, then yes, you should update it. Bing, for example, will ignore your sitemap if it is not accurate; they have a tolerance of about 2% errors.
If your site is only small, and all your pages are linked, then I would not even bother with a sitemap. Bing also states that you should not list all your pages in a sitemap, but only your key pages.
Sitemaps don't help rankings, only indexing. On large sites it is good to tell the search engine what pages have changed, or what pages are good pages to find links to other pages.
-
RE: Duplicate Domain Listings Gone?
Maybe.
I have the #1 position for a corner of the market, but I could not get a second page on the front page. Seeing I had #1, I made a second site and then had #1 and #2. So I made another, and I now have 3 on the first page. Once you have #1, this seems the way to go.
-
RE: Duplicate Domain Listings Gone?
Some time ago Google made a change where they did just this: tried to get more domains onto the front page rather than many pages from the same domain.
This was a few years back, so I am not sure what you are seeing today; it may be that the domains were penalized some other way. -
RE: Should we NOINDEX NOFOLLOW canonical pages?
Devanur is correct, the canonical will never be seen if you noindex.
Also, you should use noindex,follow, not noindex,nofollow (that is, <meta name="robots" content="noindex,follow">). Using nofollow means all link juice pointing to those pages will be lost; using follow means the link juice can flow back out of the pages.
-
RE: Lazy loading images effect image seo?
When I wrote the post above I hosted my sites on premises and used a CDN for images, but cloud hosting has got so cheap I just host the lot in the cloud. With bandwidth these days I really don't worry about images any more; I optimize them and host them in the cloud. Plus, most people have a good download speed these days, so I don't see it as a problem.
Also, for user experience it is more important that the page starts loading soon; when it completes is less important.
I would look into Microsoft Azure; they host all sorts of sites, Windows and Linux, and are cheap.
-
RE: Blocking some countries and redirecting that traffic
You sound like you have an expensive cdn.
Try Microsoft Azure; their CDN will cost you pennies. Or use YouTube for free.
-
RE: An affiliate website uses datafeeds and around 65.000 products are deleted in the new feeds. What are the best practises to do with the product pages? 404 ALL pages, 301 Redirect to the upper catagory?
OK I understand a bit better now, yes I agree with what you are doing. All makes sense
-
RE: How cloudflare might affect "rank juice" on numerous domains due to limited IP range?
For a small number of sites I would not be concerned, but if you are worried, try Microsoft Azure: you get a unique IP for each website, and they are very cheap with a great interface.
-
RE: An affiliate website uses datafeeds and around 65.000 products are deleted in the new feeds. What are the best practises to do with the product pages? 404 ALL pages, 301 Redirect to the upper catagory?
A few points. I would not put all these pages in the sitemap.xml. If you are talking about an HTML sitemap visually showing on your site, then I would not have any links on my site that do not go directly to a page: you don't want redirects from internal links, and you definitely don't want 404s.
You say you have a customized 404 page. Does this page return a 404 status code? If not, you will be getting soft-404 problems. Also, a 404 page really should not have any images, JS, or CSS files on it, especially if they are changing. If you get a broken link on your 404 page, such as a missing image, you will get an internal loop, and search engines really don't like that. -
RE: Why are some pages now duplicate content?
Remus said it well: they are obviously thin content pages, almost identical. Search engines don't want to list every variation of a similar page. You need to add more unique content to each page.
But when I visit, using IE 11 on Windows 8.1, I get a big warning about cookies. This can't be good for conversion; do you really need this warning?
-
RE: I want to put 65.000 productpages on NOINDEX, FOLLW at once! Would Google mind?
That should be fine. I am glad you said noindex,follow and not noindex,nofollow.
Remus presents a link by Matt Cutts, but that is talking about adding pages in great numbers. -
RE: Does Moz Analytics need Google Analytics installed?
Yes and no. No, it's not needed, but there is some extra data that they get from GA, such as traffic data: number of visits, keywords sending visits.
-
RE: Experience/suggestions in redirecting old URLs (from an existing site) to new URLs under a new domain
If the site structure is the same, then a simple redirect from oldDomain.com to newDomain.com will do the trick.
If the site structure is different, then I would look at your external links. If a page has links, redirect it to a page with the same or similar content on the new site; those that don't have external links I would not bother with, as there is no value in redirecting them.
-
RE: 301's & Link Juice
A 301 simply redirects a request to a new request with a new URL. If the page has no external links then a 301 will do nothing for you. If you don't want the page, delete it, remove any internal links, and you're done.
Each request leaks link juice.
If you have links pointing to page A, and you 301ed page A to page B, then any link juice will go to page B, but will lose a bit of link juice. In fact you lose it twice: once for the link, and once for the 301 redirect. If the only links are internal links, why not just link to page B in the first place? But I would not remove the page: all pages have PageRank to start with. The more pages on your site, the more PR, but also the more pages to share it with. With smart linking you can sculpt the PR to fall more on the pages you want it to, and less on the ones you don't.
Read this simple explanation: http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank
-
RE: Setting up Goals in Analytics when no common 'order confirmation' type page
That's a joke.
Do it yourself:
$("input[data-paypal-button]").click(function () {
    _gaq.push(['_trackEvent', 'PayPal', 'Button', 'domain.com.au', 5]);
});
In the above code, all you need to do is add the attribute to the button, then include the script in an external file. Job done.
"data-paypal-button" can be anything, but should start with data- to keep with convention.
'PayPal', 'Button', 'domain.com.au' can also be anything.
See https://developers.google.com/analytics/devguides/collection/gajs/eventTrackerGuide?csw=1
-
RE: Importance of 301 Redirects
You have a point. But if your page has no links, then it is likely search engines see it as poor quality: it has been around so long, yet has no links. So for that reason the page age may be a bad signal, at least not one worth keeping.
If, in fact, page age is even looked at.
It may also be that the age is not transferred. This link is all I know about age:
http://www.youtube.com/watch?v=-pnpg00FWJY&feature=player_embedded
It talks about domain age, not page age, but Matt talks about the first time they crawl a domain. If we adapt that to a page, I would suggest they look at the first time they crawl a URL, so changing a URL would be enough to lose the age.
OK, I don't have inside knowledge of Google, but I do know how a 301 works and I can give you a pretty good guess. A 301 tells a request to make a new request to a new URL. I think that Google follows it just like that: here is a new page, it has no relevance to the old page, it's just a new URL. To imagine that they keep a record forever of all 301s, of where a page used to be, I find hard to believe. When they crawl an external link, they get returned a new URL, and the link juice is redirected again through a second request (hence the second loss of link juice) and finds the new page where the link juice lands.
Think of a page that has many 301s from many old pages; it would then have many ages. I would suggest Google keeps it simple: they just follow the 301 to a new URL, treat it as a newly found page, and distribute the link juice as they would on any other page they land on.
For a bunch of pages that don't have links, I can't see any difference, but what I would worry about is page speed. Reading a long set of 301 rules and then trying them can slow a site down (remember we are talking about a large number here). If you can do all your 301s with a few lines of code then that's not a problem, but if you start to get a long list then it's going to be a problem for every page on the site.
Back to your point. I have also thought about this, but for a slightly different reason, and that is original content. If your page was from 2001 and you were scraped in 2003, will the 2003 scrape now get the original-content credit? But then this may be the case anyhow, even with a 301. And again, I don't think this is a big problem, because sites that scrape are not likely to be given credit: they will have lots of scraped content, will be known as a scraping site, and will not be awarded credit.
-
RE: What are we doing wrong with Rich Snippets?
A bit late here.
But how I understand it, price cannot be a meta tag; it must be shown to the user, like it is on the schema.org page:
<span itemprop="price">$55.00</span>
http://schema.org/Product -
RE: Will using 301 redirects to reduce duplicate content on a massive scale within a domain hurt the site?
Yes, that's correct.
The pages will be found just as quickly through their new links as they would have been through their old links with 301s.
-
RE: CDN image links passing SEO benefit?
No, I would have thought they were outgoing links.
-
RE: Search instead of 404 - Vtex Brazil
It can be good to redirect users to similar content, but not how they have done it; and I think there are much better things to spend your time on.
What they have done is return a status code of 200, which will cause problems with soft 404s.
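A minimal sketch of the fix (the page list and handler are my own invention, not Vtex's code): keep the friendly "not found" content, but return a real 404 status:

```javascript
// Hypothetical request handler: same custom "not found" content,
// but with a genuine 404 status code so crawlers do not record a soft 404.
const pages = { "/": "Home", "/contact": "Contact us" };

function handle(url) {
  if (Object.prototype.hasOwnProperty.call(pages, url)) {
    return { status: 200, body: pages[url] };
  }
  // Returning status 200 here instead would create a soft 404.
  return { status: 404, body: "Page not found - try searching our catalogue" };
}
```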
Also, when you create a 404 page, you really should not have any external files such as images, CSS, or JS files; this can cause an eternal loop if any of these files goes missing. A missing image will create a 404, which will call the 404 page, which will cause it all to happen over again. Crawlers will not like this. Bing once stated that they see your site as low quality if this is the case.
-
RE: Setting up Goals in Analytics when no common 'order confirmation' type page
Look at events in GA. All you need to do is fire an event in JavaScript, and that event can be your goal. A goal does not have to be a URL; it just needs to be an action, a button click for example.
Add an attribute to a button or link, such as data-ga-event,
then add code such as:
$(document).ready(function () {
    $("[data-ga-event]").click(function () {
        // your Google Analytics event here
    });
}); -
RE: What makes high quality content?
Is it interesting enough that people want to view it? If so, people will link to it. Another measure that search engines look at is clicks from search: if a user clicks on your result, then returns to the search and chooses another result, the search engine assumes your result was not a good one for that search term.
Technically, the content should crawl well and have semantic meaning; you need to use schema.org markup to describe the content to the search engine, to make sure it understands what you are trying to say.
-
RE: Will using 301 redirects to reduce duplicate content on a massive scale within a domain hurt the site?
I would look at your links coming into those pages, and 301 only the URLs that have links; I doubt all 100,000 pages have links.
A 301 only redirects link juice, so if there are no external links, no problem. Of course you need to change your internal links: you should never 301 an internal link, as 301s leak link juice; you should fix your internal links instead.
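One way to act on that (a hypothetical sketch, not a real tool): if you keep a map of your known 301s, you can rewrite internal links to point straight at the final URL instead of going through the redirect:

```javascript
// Hypothetical map of known 301s on the site: old URL -> new URL.
const redirects = { "/old-page": "/new-page", "/new-page": "/final-page" };

// Follow any redirect chain so internal links can be fixed to the final URL.
function finalTarget(url) {
  const seen = new Set();
  while (redirects[url] && !seen.has(url)) {
    seen.add(url); // guard against redirect loops
    url = redirects[url];
  }
  return url;
}
```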
-
RE: How important is w3c validation for mobile sites???
I would look at it the other way around: am I concerned about what it fails me for?
Yes, I would go with responsive design; Bootstrap is good for layout.
Yes, I would try to get a good score on PageSpeed.
-
RE: Block bad crawlers
Chris gives a good answer, but is it really a problem? Bandwidth is very cheap these days; in fact, here in Australia most accounts are unlimited.
I host with Microsoft Azure and bandwidth is very cheap.
-
RE: Headers question (H1,H2..)
Sounds like nonsense to me. Do they give any evidence to back this up?
I can't see how you can dilute a keyword, besides linking out with the keyword.
The reason people like to keep to one keyword per page is that there is only one page title, one H1 (there should be only one), one URL; not because more than one will dilute the other. And especially in your case, I can't see how the same keyword can dilute itself.
Don't keyword stuff, of course, but you are far from doing that.