Posts made by JaneCopland
-
RE: What is your opinion on link farm risks and how do I explain this to a client?
The other responders here are right - that activity needs to be stopped right away. It's highly unlikely that they'll get away with it for much longer, and when they're hit it won't be pretty.
-
RE: Site Penalized - 301 Redirect Question
It's generally accepted that penalties pass through redirects (this wasn't always the case - up until a couple of years ago, a 301 solved a lot of penalty issues). I have to guess that Silkstream's experience is uncommon in the long term - a bad penalty will probably transfer over sooner or later. However, there are no set rules for this and perhaps several people who try this will get lucky.
I wouldn't rely on that though.
-
RE: Repeating Content Within Code On Many Pages
Hi,
Agreed that if the pop-up draws the content from a single page / URL, there's no duplicate content problem. JavaScript would still be required to pull the content up and produce the pop-up: as we understand it, Google and other search engines are not too keen on executing JS functions, so often won't be served the content. And even if they are, drawing that content from its own URL will mean it is not included in the source code or seen as being on the page.
Whether this content could hurt you as it is currently included is hard to prove either way, but drawing it from a different source with JS would alleviate the potential problem.
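For illustration, here's a minimal sketch of that approach (the element IDs and content URL are hypothetical) - the pop-up content lives at its own URL and is only fetched when the user clicks, so it never appears in the page's own source code:

    // Fetch the pop-up content from its own URL at click time,
    // rather than including it in the page's source on load.
    document.getElementById('terms-link').addEventListener('click', function (e) {
        e.preventDefault();
        var xhr = new XMLHttpRequest();
        xhr.open('GET', '/popups/terms.html'); // the shared content's own URL
        xhr.onload = function () {
            var popup = document.getElementById('popup');
            popup.innerHTML = xhr.responseText;
            popup.style.display = 'block';
        };
        xhr.send();
    });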
-
RE: Switching to HTTPS YES/NO ?
Hi Remko,
If the entire site sits on HTTPS, definitely ensure that you have a valid certificate.
As far as it being necessary to have an entire site sitting on HTTPS, opinion on this has changed over the years. It previously was not a great idea; it's now much more common and many well-ranked websites use HTTPS as standard. I would not change your entire site including pages that do not need to be secure based solely on this session at SMX, especially as it was indicated that there's opposition to making this a ranking factor. I'd wait for more evidence before I did something like that. However, if making a site load solely on HTTPS URLs becomes absolutely necessary, then I'd be more comfortable doing so now than ever before.
-
RE: Should I continue adding new content on the website penalized by Google?
Hi John,
If the penalty is solely related to inbound links and you are cleaning these up, the content will suffer while your penalty exists (i.e. it won't start ranking well, as it's housed on the bad domain), but once the penalty is lifted, the content should have no trouble ranking, even though it was added when the site was under penalty. The same goes for the on-page optimisation: you'd be doing this for the future, not for the present. It won't help you in the short term, but you should still benefit from the improvements when the site is no longer suffering because of its inbound links.
Cheers,
Jane
-
RE: WebMaster Tools keeps showing old 404 error but doesn't show a "Linked From" url. Why is that?
Hi,
How long have these errors been appearing since you fixed the issue? It could be a case of Google looking for URLs on the site that it has seen in the past, even though there is no path to them anymore. With the pathway gone, it should eventually stop looking, but I'm curious how long ago the issue was fixed.
-
RE: Seo site architecture - how deep?
Agree with Lesley - there's little to no benefit in stuffing keywords into a URL (which was a "traditional" reason why people added multiple subcategories), and excessive categorisation / siloing shows diminishing returns. I would stick to as flat a structure as possible whilst keeping a sensible hierarchy of information.
Cheers,
Jane
-
RE: Selecting a PPC agency
My former SEO agency had a good relationship with Periscopix in the UK (they're based down by Tower Bridge, south of the river and a few streets to the east). You can also check out the Recommended Agencies list here on Moz. Most of the recommended companies have an SEO focus, but several work in design, PPC, conversion rate optimisation, etc.
-
RE: Does omitted results shown by Google always mean that website has duplicate content?
Hi Prashant,
Yes - any URLs that are different are different in Google's eyes, unless the modifier is a # symbol.
So if you have www.example.com/key#value12345 and www.example.com/key#valuexyzabc, then Google sees these as the same, i.e. www.example.com/key. They will ignore everything after the # character.
All other query strings, etc., mean that the URL has changed and if the pages on those URLs are the same, it's duplicate content.
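If some of those query-string variants have to exist, the usual fix is a canonical tag on each variant pointing at the clean URL, e.g.:

    <!-- In the <head> of www.example.com/key?sort=price (and every other variant): -->
    <link rel="canonical" href="http://www.example.com/key" />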
I hope this helps.
Cheers,
Jane
-
RE: Can a 100% bounce rate page hurt whole website?
Just adding to this thought - it really is about context. I suspect that bounce rate is used very carefully and that Google tries to add context, given that a bounce on an informative article and a bounce on an opening hours page are very different. If you believe the pages should attract further clicks, conversions or longer periods spent on page, then 60% is a worry.
I would never go as far as to say that you will be "penalised" for bounce rate - keep in mind that bounce rate is also easy to fake, for positive or negative ends, on your own website or a competitor's. It would not be very hard to fake a high bounce rate from real-looking visitors on a competitor's website if you wanted to - you could do it with automated activity or use something like Mechanical Turk. As such, Google can't put too much weight on signals that are this easy to manipulate, which is why it puts more weight in backlinks than in on-page mark-up, etc.
-
RE: Transferring link juice on a page with over 150 links
Hey Flo,
Good news! This went up literally yesterday: http://www.seroundtable.com/google-link-unlimited-18468.html
See the longer discussion here: https://productforums.google.com/forum/#!topic/webmasters/alde4GNOWp0/discussion
This is the first time a Googler has confirmed the lack of a limit on the crawling of internal links.
-
RE: Duplicate content from pagination and categories found in multiple locations
You should be able to use canonicalisation here, but for a more in-depth guide to pagination including rel="next", "prev", etc., check out this blog post by my former agency. It's a great resource on the subject.
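As a quick illustration (the URLs are hypothetical), page 2 of a paginated series would carry tags like these in its <head>:

    <link rel="prev" href="http://www.example.com/category?page=1" />
    <link rel="next" href="http://www.example.com/category?page=3" />

This should help Google treat the series as a sequence and consolidate its signals, rather than seeing competing near-duplicates.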
-
RE: Could posting on YouMoz get you penalized for "Guest Blogging?"
I know Moz has talked about this internally and perhaps sought clarification from Google on how Youmoz "should" work, so hang tight for Keri to get back to this one, everyone.
-
RE: Do deep pages issues affect homepage chances of ranking?
The only way I could imagine this hurting the home page is if the site structure is bad enough that the Panda penalty is extremely severe. I can't think of anywhere I've seen this "in the wild", though, so that's entirely theoretical.
-
RE: Why do these links violate Google's Quality Guideline?
Hi John,
As the others have said, there are issues with all three types of links. Number 1 is obvious.
Number 2 comes from a site that appears to blog about absolutely anything, as long as they're paid to do so. The site has posts about buying a car in Philadelphia, passport photos, saving money on gas, business protection... and piano lessons. It's pretty obviously a source of income for the blog owner, with these posts placed in between personal updates.
Number 3 is from a piano website, but they even list "reciprocal links", which was already an outdated link building technique in 2006.
I would say that Google is well within its own guidelines to suggest that these links are bad.
-
RE: How to block search bots in crawling my site except for homepage?
Robots.txt exclusion is definitely the easiest way to go. The URLs within the site might still be "indexed", but they will not be crawled, and if they ever showed up in a search, they would look something like this: http://i.imgur.com/xU6mDYA.png
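A minimal robots.txt sketch for this - note that "Allow" and the "$" end-of-URL anchor are extensions that Google and Bing support, but not every crawler respects:

    User-agent: *
    Allow: /$
    Disallow: /

The "Allow: /$" line permits the homepage URL alone; everything else on the site is disallowed.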
-
RE: Transferring link juice on a page with over 150 links
Hi Flo,
I think it totally depends on the usefulness of the page and perhaps the ratio of links to supporting text / resources. A page with 150+ links certainly isn't automatically useless, and many big sites have that number of links on most of their pages due to navigational elements, long blog posts with multiple citations, outbound links from comments (albeit nofollowed), etc.
The 100 links per page limit is (hopefully) rather outdated and stems from when Google visited each page with a limit of how much data it could process. Very long, heavy pages would not be properly crawled, so their links wouldn't be properly crawled either, and PR would not pass properly out of all the links on the page. Google is now much more advanced and is used to dealing with a wide variety of page sizes.
A page with over 150 links can certainly perform well and there should not be a problem with appearing spammy or overdone if each link is contributing to the aim of the page. As I said previously, I would also try to include a fairly good amount of supporting text (or whatever type of supporting resource is appropriate - images, video, etc.) so that the page is not solely a list of links.
Hope this helps!
Jane
-
RE: Www2 vs www problem
Hi,
I doubt you will see too many SEO detriments to this, but that depends on how the site is configured re: the non-www version of the site. If you access http://domain.dk/, what happens? Are you redirected to www.domain.dk, www2.domain.dk, or does one of the two categories' content load on the non-www URL?
Google should simply treat www and www2 as different subdomains. I have not heard of ranking / indexing confusion based on using www1, www2, etc., but it's the usability issue that would really bother me. It's definitely worth convincing the client to hurry up with the complete redesign so you can get it all back on the www.
-
RE: Why blocking a subfolder dropped indexed pages with 10%?
Hi,
If nothing significant has changed, and there's no noticeable loss in rankings (e.g. no pages that were bringing in legitimate traffic were affected), I would wait this out and keep an eye on indexed pages. I've definitely seen similar rises / falls in indexed pages, but if the activity doesn't coincide with "real world" traffic / ranking consequences, it tends to be Google removing unnecessary pages (pagination, etc.) or even a reporting error.
-
RE: Www2 vs www problem
Hi Tihomir,
Is there a way you can rename the subdomains? E.g., name the old design / category something like http://categoryname.domain.dk/ and have the new content on http://www.domain.dk/?
-
RE: How to remove the certain backlinks completely ?
I would wait another few weeks to a month - if the links are still showing up in WMT, use the disavow tool.
If you are penalised (manually or algorithmically) and are sure that those links are the reason, that's all the more reason to use the disavow tool.
-
RE: Manual Webspam Error by Google!
Hi Marek,
Sadly, very few people get anywhere with the tweet-Matt option.
If you received a manual penalty, this has little to do with Penguin updates - the penalty has been handed out by a member of the Webspam team rather than by the algorithm.
What concerns me about your links is firstly how many links point to the site using commercial terms rather than brand terms as anchor text. This is one of the red flags Penguin looks for, but it's also amazingly easy for a person to discover: http://i.imgur.com/INcW11X.png
No backlink profile created "naturally" (and I realise how hard it is to create a natural backlink profile) would look like that. A Googler would take a dim view of that anchor text spread.
Secondly, I'm curious about the sites that link to you using those anchors. I tried visiting them and many of them returned the exact same 500 database error: http://i.imgur.com/lQHEk3p.png + http://i.imgur.com/zpw6YC7.png
All these sites have the same IP address. The other sites hosted on this IP are all down as well: http://www.bing.com/search?q=ip%3A176.67.167.170&go=&qs=n&form=QBLH&filt=all&pq=ip%3A176.67.167.170&sc=0-3&sp=-1&sk=&cvid=fd590e3d130749f290febb6a76973ced
If links were placed on this network of sites all hosted on the same IP, this would absolutely be grounds for a penalty. The weird thing to me is not the penalty but the fact that you were later told you didn't have one.
It's also worth noting that I'd recommend removing those links, penalty warning / loss of rankings or not. There are some other low-quality pages linking to you multiple times with competitive anchors, like http://www.lanaintl.com/all-about-desks-and-its-types. This just looks ridiculously unnatural and manipulative: http://www.lanaintl.com/ - it starts off talking about Albuquerque pest control, then keeps linking out to a UK furniture store.
You've also got commercial links from sites with identical themes: http://www.house2homefurniture.com/lc-140-xx.html
Link removal is absolutely necessary here, I'm afraid. These bad links all have to go.
-
RE: Does building multiple websites hurt you seo wise? Good or bad strategy?
Agree with everyone else here - this is a pretty horrible strategy when done solely for SEO purposes, and usually ends up being very spammy. Most businesses don't have a broad enough catalogue to warrant multiple sites, and the benefit of having keyword-rich domain names (even, or especially, exact-match domains like carinsurance.com) is negligible nowadays. Google had to crack down on this, as domain names used to be a very easy way to rank.
Google is good at figuring out who owns which websites, so unless you are incredibly dedicated in your efforts to hide details, a network of sites like this is likely to be grouped together - Google will probably know they're all yours. Whilst that alone isn't a terrible thing (lots of businesses own more than one domain), Google has seen networks created for SEO purposes like this so many times that it will take a dim view of yours. Something as simple as the same Local / Places / telephone information would be more than enough to make that connection.
-
RE: Does omitted results shown by Google always mean that website has duplicate content?
Hi Prashant,
This sounds like removal due to duplication rather than DMCA - the omission is usually noted as being because of DMCA notices if they are the reason, e.g. http://img.labnol.org/images/2008/07/googlesearchdmcacomplaint.png
Google likely sees these as duplicates, or near-dupes, as David has said.
-
RE: Www2 vs www problem
Hi Tihomir,
Are you planning to give the remaining category a facelift too?
It would be best to include all three categories under the same subdomain (e.g. the "www." subdomain) and place them in folders, e.g. www.domain.dk/category1, www.domain.dk/category2 and www.domain.dk/category3. www2 isn't technically damaging, but it's bad from a usability point of view: it's incredibly unlikely to be remembered, for one, and very likely to be mistyped as www.
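When you do consolidate, 301 redirects from the old www2 URLs will carry users and link value across - a minimal sketch, assuming Apache (the folder name is hypothetical):

    # In the www2 site's .htaccess: send everything to its new www home
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^www2\.domain\.dk$ [NC]
    RewriteRule ^(.*)$ http://www.domain.dk/category2/$1 [R=301,L]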
-
RE: Why blocking a subfolder dropped indexed pages with 10%?
Hi,
The drop could be unrelated to your disallowing the account pages (but perhaps check if the CMS allows random query strings, and look into whether it could have created any upon user action, etc. just in case). It's pretty common to see fluctuations in the number of indexed pages, especially with numbers of pages in the thousands or higher. Have you noticed a decrease in traffic from search that you can match with deindexation of pages that were previously bringing in visitors?
-
RE: Blog Location
Hi Sara,
If you are keen to get more traffic through the news stories and the main blog link, it's definitely a mistake to move it, just from a usability perspective. I agree with Andy that the stats you're seeing for these links in Analytics suggest you can improve the page to increase the click-through rate.
The effect of a link being above or below the fold is probably negligible when it comes to PageRank passed (with the exception of footer links, which appear to be devalued).
-
RE: What should I do with a large number of 'pages not found'?
Hi Claire,
If you really can't 301, consider serving a page providing alternative products, a search function and an explanation of why the page's former content is no longer available. Many real estate websites are quite good at this: some maintain the URLs of properties that regularly go on the market (big city apartments, for example) but grey out the information to show a user that the property is not currently for lease. Other URLs will show properties in the former listing's post code.
Your robots.txt file is going to get out of control if you are having to add millions of pages to it on a regular basis, so I would personally not pursue that route.
-
RE: Manual Webspam Error by Google!
Hi Marek,
Agree with William that doing the reconsideration request in the same week is too soon, barring exceptional circumstances. By that, I mean that if you had been actively removing bad links in the weeks / months leading up to receiving the spam action notice / penalty, you could submit a reconsideration request and cite this. However, in general Google does like to see significant effort on a webmaster's part to get rid of bad links before asking for reconsideration. What this means is that your request should show the activity you've engaged in to try and remove links: how many emails you have sent to the websites hosting the bad links, how many replies you've had, how many of those links were removed as a result of your efforts, and how many you feel you cannot remove due to inaction on the part of the webmaster or your inability to find a real person to contact.
It's confusing that you received a message saying that you did not in fact have manual action against your site if you were previously told that you did - this could just be a glitch, but if that first message coincided with a ranking problem that is persisting, I would say that it is necessary to remove the poor quality links pointing to the site, including those from low-quality sites, and those with overly-optimised anchor text.
Cheers,
Jane
-
RE: Paid Directory Links
Ha, in that case, it's pretty much every second new client you speak to!
-
RE: How to remove the certain backlinks completely ?
Hi Ivan,
Here's information on how to file a disavow file and a reconsideration request. These should really only be used if you have been penalised, but you can base the decision on the severity of the problem too.
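For reference, the disavow file itself is just a plain-text (.txt) list with one URL or domain per line; lines starting with # are comments (the domains below are made up):

    # Pages we asked to have removed, with no response
    http://spam-directory.example/our-listing.html
    # Entire domains
    domain:low-quality-links.example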
Cheers,
Jane
-
RE: Sitemap for multilanguage website
Hi there,
Ah, I see what you mean.
You can certainly add these sitemap links to the .com's robots.txt file. I don't believe there is syntax to specify language for this though - if you look at large multinational sites like Adobe.com, you can see their multitude of sitemaps below their disallowed files: http://www.adobe.com/robots.txt
Microsoft's is similar: http://www.microsoft.com/robots.txt
I'd say you're safe enough to link like this - the sitemap will be accessed, plus if you have used hreflang tags and geo-targeted the subfolders in Webmaster Tools, Google will understand that it's looking at maps / content for a specific language.
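To illustrate (the paths are hypothetical), the robots.txt entries are just one Sitemap line per file - the language signalling itself comes from the hreflang tags and geo-targeting rather than from robots.txt:

    Sitemap: http://www.example.com/sitemap-en.xml
    Sitemap: http://www.example.com/de/sitemap-de.xml
    Sitemap: http://www.example.com/fr/sitemap-fr.xml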
Cheers,
Jane
-
RE: Is it possible we are being penalized for doorway pages?
Hi Neil,
What I mean by category URLs is that a product sits on a URL like http://www.trophycentral.com/5x7blacmarpl.html rather than http://www.trophycentral.com/plaques/insertplaques/5x7blacmarpl.html. But as I said, the flat structure you are using can work as well. Putting products in category structures like that can get confusing if products exist in multiple categories, and it opens the door to duplicate content (i.e. a product is found under multiple different URLs). It's just worth mentioning because it's not common to see such a flat structure nowadays with the ecommerce platforms a lot of folks are using, like Magento, etc.
I wouldn't worry too much about tabs. If the content behind tabs is a) not incredibly long, b) relevant to the page, and c) available in the source code on page load (i.e. it doesn't require the execution of a JavaScript function to pull the content into the tab / onto the page), Google can see this content and should treat it much the same as if the content wasn't tabbed.
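A minimal sketch of the "safe" kind of tab - both panels are present in the HTML on page load, and the script only toggles visibility rather than fetching anything (the IDs and copy are hypothetical):

    <a href="#specs" onclick="showTab('specs'); return false;">Specifications</a>
    <a href="#reviews" onclick="showTab('reviews'); return false;">Reviews</a>

    <!-- Both panels are in the source code on page load; the inactive one is only hidden. -->
    <div id="specs" class="tab-panel">Dimensions: 5" x 7", black marble...</div>
    <div id="reviews" class="tab-panel" style="display:none">"Great plaque, fast delivery..."</div>

    <script>
    function showTab(id) {
        var panels = document.getElementsByClassName('tab-panel');
        for (var i = 0; i < panels.length; i++) {
            panels[i].style.display = 'none';
        }
        document.getElementById(id).style.display = 'block';
    }
    </script>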
Cheers,
Jane
-
RE: Paid Directory Links
I don't take these risks on purpose - if a link shows up on what looks like a high quality "directory" as a result of a paid posting for a client or a personal site, I won't worry too much about it but that's on a case-by-case basis. Links like that show up in the wild all the time, but I wouldn't personally seek them out. It depends on how much of a risk you're willing to take, but if you're otherwise good at link development, you don't need to take the sort of risks you see people taking when they acquire 20, 30, 40 paid directory links.
I am not sure what you mean by clients - are you asking if I have been responsible for clients being penalised? If so, no, I have not. Thanks.
-
RE: How to remove the certain backlinks completely ?
Hi Ivan,
The links no longer physically exist, so Google should end up picking up on the fact that they are gone soon. It surprises me that it's been several months and they're still showing up, however. That said, if the links are from a source that Google has no reason to regularly crawl, it can take a long time before they re-visit and find that those links are gone.
If the links are causing you ranking problems, consider a disavowal / reconsideration request submission in which you say that the links are gone and that Google simply has not picked up on this yet.
Cheers,
Jane
-
RE: How is this site ranked so high on Google?
Agreed with the other guys here - you often see Google place sites like this in the SERPs for queries of this level of competitiveness. It's not the internet's most competitive keyword, but it carries a lot of interest and can be profitable if the travel blog is operating as an affiliate. This example is very closely related to the query, and Google is known to place "smaller" websites like this in the mix as a change from giants like MSN, newspapers, travel guides on big sites, etc.
-
RE: GWT shows 38 external links from 8 domains to this PDF - But it shows no links and no authority in OSE
Hi Dana,
I'm inclined to think that this is a problem with how the tools process links to and from PDFs, rather than with the passing of PageRank through PDFs itself, although I am not sure. I would certainly not say no to a link to or from a great PDF if that was the only option, although I have also actively encouraged clients to re-publish quality content housed in PDF format as HTML and redirect the PDF file to the new source. We're just a lot more sure about how authority passes through HTML files.
-
RE: Moving to New Domain - Ranking impact
Hi Conrad,
Unfortunately every migration is going to yield different results in terms of how well the redirection goes and how long you have to wait to get your rankings back if they suffer. Thomas' experience is fairly typical (and the resources he cites here are good too). It's impossible to say what will happen - a particularly large site (let's say a big e-commerce site for a high-street retailer) might suffer due to the sheer volume of URLs that need to be moved and picked up by Google; a small website may have an easier time. However, metrics like the age and authority of the moving website may well play into how successful the move is as well. As such, it's really hard to say exactly how a migration will go without seeing the sites (and even then, ranking problems can crop up that were unexpected).
Cheers,
Jane
-
RE: Sitemap for multilanguage website
Hi there,
Sorry, I'm a little lost - what is it that you're preparing to do ("add in .com/robot.txt a sitemap with language")? Would you be able to rephrase? I'm not quite sure what you mean by this.
Cheers,
Jane
-
RE: Paid Directory Links
William and Gary sum it up really well here - directory listings can be useful, but they should be sought for other reasons than SEO. Noah and Eduardo are also right - the safest thing to do is to acquire these links with the nofollow tag.
If you want to take a risk or two, the highest quality directories probably won't hurt you if they're followed links. Google's stance on these things is pretty hard line though, so if you're working on behalf of a larger company or client, perhaps it's best to play it safe.
-
RE: Google images
Hi Richard,
I'm not a Google Images expert (otherwise I'd offer to help!) but you could check out the companies / individuals listed on Moz's Recommended Providers page. Many are agencies that prefer to work with people on longer retainers, but you may find someone to work with there, or the people listed may be able to point you to a trusted contact who specialises in images.
I haven't heard of people specifically specialising in image search but that doesn't mean those people don't exist, or that you won't find someone who has a lot of experience with image-heavy photography / art websites.
Cheers,
Jane
-
RE: New un-natural links to my website that i didnt create.. and lots of them!
Hi,
The guys here have already left some great comments. It's hard to guess why / where the links are coming from without seeing them, but I agree with William that five per week seems low for an automated bot. Those things usually do thousands, unless it's being run by someone who doesn't want you to notice the activity. Either way, a person is probably behind the activity at its root - do you have any idea who might want to maliciously hurt your website?
There is a possibility that the activity is not being carried out in order to directly hurt you, but that your site is part of someone's larger negative SEO attack against someone else. Some negative SEO attacks will target a business (let's say Nordstrom, for the sake of example). They'll point thousands of links to http://shop.nordstrom.com/ from some terrible pages, but will sometimes mix in links to other sites to muddy the waters a bit. Are there any other sites linked to from the same pages as yours? Any trends you can spot?
Moz and OSE sometimes miss the real underbelly of the internet - the stuff that's available if you know where to look for it (and Google usually knows where to look for it) but for the purposes of regular marketing / SEO, is fairly irrelevant. My former agency built its own backlink analysis tool and did the same thing. We could do deep crawls to find this crap if we needed to (and so can Moz) but it's thankfully rare that situations like this come along that require such a deep look at links. Moz, Majestic, Ahrefs, etc. will show you links from the part of the internet that they believe counts. Frustrating when things like this happen, but if you are finding these in WMT then you should have sufficient information to go through with a disavowal and / or reconsideration request.
-
RE: How to fix Overly Dynamic Urls
Hi Robert,
The canonical tag is in place on the dynamic URL, meaning that Google assigns the value of that "page" to the correct, static URL. As such, this shouldn't be a problem for Google. Not sure why Moz is still flagging this, but my guess is that the Moz system is not set up to disregard overly dynamic URLs when they are properly canonicalised elsewhere. The Moz system likely understands that this isn't duplicate content, however, due to the canonical tag. The dynamic URL warning is just being triggered regardless.
If all your dynamic URLs are canonicalised like this to their proper canonical, static version, you should not have a problem with Google.
Cheers,
Jane
-
RE: Why am I getting links in my link report from sites that no longer exist?
Hi Benjamin,
Eduardo and Kevin are right - Link Detox's cache will be a bit old, which is why it's showing you data that's out of date. Google crawls much faster than most commercial tools (Moz's OSE included), so it's likely that those bad sites / pages have dropped out of its index already. Search for those pages' URLs to make sure they're not still cached in Google's index. If they are, you'll have to wait it out (but it shouldn't take long for them to disappear if they're returning 404 or 410 status codes).
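For example (hypothetical URL), you can check Google's index directly with its search operators:

    site:spammy-links.example/bad-page.html
    cache:spammy-links.example/bad-page.html

If neither returns the page, Google has already dropped it.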
-
RE: Unnatural inbound links message from Google Webmaster Tools!
Gary and Marty have some great points here. Every recovery is somewhat different and I have recently seen sites recover a portion of their rankings quite quickly, but the link issue wasn't bad to begin with. Depending on how bad your backlink profile is, you could be left waiting / working a very long time.
Documentation of what you've done to fix the situation is one of the primary things you need to do. Google doesn't take reconsideration requests very seriously if that intricate documentation isn't included, so Gary's advice on that is absolutely key.
Perseverance, patience and not losing hope are also pretty key to making it through this process if your situation is bad - definitely use tools like OSE, Ahrefs, Majestic SEO and the link download within Webmaster Tools to get a full look at your backlink profile. I like to use multiple data sources too - it ensures you're seeing the full picture.
Cheers,
Jane
-
RE: Is it possible we are being penalized for doorway pages?
Hi there,
The duplication of the products is not highly likely to be causing an issue here given that many ecommerce sites operate like this, but duplicate content was one of the primary issues Panda sought to weed out. It seems as if Panda can be very hard to get rid of, even if you have cleaned up 99% of the issue: you're doing the same or better than similar sites that are not under a penalty but the penalty remains because a certain amount of duplication (or another issue) remains.
Is the problem uniform - i.e. all products that rank well are not duplicated, and all products that have ranking problems are duplicated?
The structure without category URLs is a little abnormal too but what amounts to a flat website shouldn't hold you back completely either.
-
RE: Doing a re-design but worried about my new navigation affecting rankings
Hi there,
Links do pass PageRank, but they don't drain a page of the PageRank it already has. If you link out from a page 100 times, it doesn't make that page 100 times weaker. Think of PageRank as coming in two forms: that which you accumulate via being linked to, and that which you can pass on because of your accumulated authority.
The only way in which adding more links to a page can be seen to be "damaging" is that if you link out 100 times from a page, each linked-to page receives roughly 1/100th of the passable strength from that page (links in footers or similar won't receive as much). However, if you link out 1000 times, each linked-to page receives 1/1000th of the passable link strength. Therefore, if you want a page to rank better, you need to consider whether you're diluting the amount of PageRank it receives due to the high number of links on other pages, not on the page itself.
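This is explicit in the originally published PageRank formula - a page's own score doesn't shrink as you add outbound links; only the share each target receives does:

    PR(A) = (1 - d) + d \sum_{i=1}^{n} \frac{PR(T_i)}{C(T_i)}

Here T_1 ... T_n are the pages linking to A, C(T_i) is the number of outbound links on T_i, and d is the damping factor (usually quoted as ~0.85). Note that C(T_i) only divides what T_i passes on - it never appears as a deduction from PR(T_i) itself.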
Is this clearer? Sorry it's hard to explain, but we basically believe that PageRank comes in two forms: one which a page accumulates and one which it can pass on, and passing PR on doesn't weaken the page itself.