Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Link Age as SEO factor?
-
Hi Guys
I have a client who ranks well within a competitive sector of the travel industry. They are planning a CMS move which will involve changing from .cfm to .aspx. We will be doing the standard redirects etc.
However, Matt's statement here on 301 redirects got me thinking:
http://www.youtube.com/watch?v=zW5UL3lzBOA&t=0m24s He says that, basically, you lose a bit of PageRank when you do a 301 redirect.
Now, we will potentially be redirecting 1000s of links, and my thinking is 'a lot of a little adds up to a lot'. In other words, 1000s of redirects may have a big enough impact to lose some rankings in a very competitive and aggressive space.
So I recommended that we contact the sites whose links have the highest value and ask them to manually change the links from .cfm to .aspx. This would mean there is no loss of value, as there is with a 301 redirect.
But now I have another dilemma which I'm unsure about.
So the main question:
Is link age a factor in rankings? If I update any links, this will make said links new to Google, so if link age is a factor, would this also lessen the value passed initially?
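For what it's worth, the "standard redirects" for the remaining 1000s of links don't need to be mapped one by one. As a minimal sketch, assuming the new site runs on IIS with the URL Rewrite module installed (the rule name and pattern here are illustrative):

```xml
<!-- Hedged sketch: web.config fragment issuing a 301 from any .cfm URL
     to its .aspx equivalent. Assumes IIS with the URL Rewrite module. -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="cfm-to-aspx" stopProcessing="true">
        <!-- Capture everything before the .cfm extension -->
        <match url="^(.*)\.cfm$" ignoreCase="true" />
        <!-- Permanent (301) redirect; the query string is appended by default -->
        <action type="Redirect" url="{R:1}.aspx" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

A single pattern rule like this also keeps the config maintainable if the CMS later renames pages again.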
-
Do you have the option of not displaying the extension on your URL? That way no matter what underlying language you use, you have the same URL and don't have to worry about updating links in the future.
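On IIS that could be sketched with a rewrite (not redirect) rule, so visitors only ever see the clean, extensionless URL while ASP.NET serves the real page behind it. Again, this assumes the URL Rewrite module, and the rule is illustrative:

```xml
<!-- Hedged sketch: publicly expose /page while internally serving /page.aspx.
     The conditions avoid clobbering requests for real files and directories. -->
<rule name="extensionless" stopProcessing="true">
  <match url="^([^.]+)$" />
  <conditions>
    <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
    <add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true" />
  </conditions>
  <!-- Rewrite, not Redirect: the browser keeps the clean URL -->
  <action type="Rewrite" url="{R:1}.aspx" />
</rule>
```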
-
The dev team is aware of the duplicate posting issue. I delete duplicate posts when I see them, but occasionally get errors myself.
-
Link age is not a factor.
The strength of the domain/page the link is coming from is a factor. (Would you want a day-old link from the front page of SEOmoz, or a two-year-old link from your buddy's blog?)
The only reason link age is thought to be a factor is that the older the link, the older the page it sits on; and the older the page, the more time it has had to build authority, and thus the more juice it passes.
Great idea on getting those links manually changed!!!
-
Ho ho ho! Very whimsical indeed
For your sanity, you should know there have been issues with this for all of us recently - and Delete Reply doesn't work.
(Hmmm, I wonder if SEOmoz will get penalised for all of this duplicate content???)
-
From a developer's point of view: if you do not already have the new system in place, I would suggest an MVC move rather than .aspx on the .NET platform, and put a .cfm handler in place to map the pages at the controller level. Google will not know there has been a change, and your site will perform much faster. Microsoft is tending to move away from .aspx to the more structured MVC approach anyway.
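The handler mapping described above could be sketched as a legacy route in ASP.NET MVC. This is an assumption-laden sketch, not the poster's actual code: the route name and `LegacyController` are hypothetical, and it presumes IIS has already been configured to hand .cfm requests to the ASP.NET pipeline:

```csharp
// Hedged sketch: an ASP.NET MVC route that catches legacy .cfm URLs and
// hands them to a controller, so old links keep resolving without a redirect.
// Assumes .cfm requests reach ASP.NET (e.g. via a handler mapping in IIS).
public static void RegisterRoutes(RouteCollection routes)
{
    // Legacy route must be registered before the default route,
    // otherwise "{controller}/{action}" would never see .cfm paths.
    routes.MapRoute(
        name: "LegacyCfm",                               // hypothetical name
        url: "{page}.cfm",
        defaults: new { controller = "Legacy", action = "Serve" }
    );

    routes.MapRoute(
        name: "Default",
        url: "{controller}/{action}/{id}",
        defaults: new { controller = "Home", action = "Index", id = UrlParameter.Optional }
    );
}
```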
-
Apparently I was quick enough to answer it twice before you could answer once.
I tried to delete the 2nd post but I get a site error. C'est la vie!
-
Update your links to get 100% of your link juice back.
See here for more info: http://www.seomoz.org/qa/view/48932/link-age-vs-domain-age
EDIT: DANG! How quick are you Ryan ;o) Were you at Vivid Lime's house when he started writing the question?!?
-
If I update any links, this will make said link new to Google, so if link age is a factor, would this also lessen the value passed initially?
My answer to you is NO. But you should be aware there is at least some dispute on the topic.
The SEOmoz point of view (which I agree with): http://www.seomoz.org/blog/age-of-site-and-old-links-whiteboard-friday
Another point of view: http://www.the-system.org/2011/01/google-algorithm-change-attacks-spam-or-does-it/