Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
How long for authority to transfer from an old page to a new page via a 301 redirect? (& Moz PA score update?)
-
Hi
How long, approximately, does Google take to pass authority via a 301 from an old page to its new replacement page?
Does Moz Page Authority reflect this in its score once Google has passed it?
All Best
Dan
-
I wouldn't get too hung up on the Moz timeline, since that's only correlated with what Google does, to build a broader model. If Google has crawled/cached the 404 and the page actually is no longer in the index, then that page should stop inheriting and passing link equity. It can get complicated, because sometimes 404s have inbound links and other issues tied to them that can confuse the crawlers. So, I'd say it's situational.
Moz (specifically, OSE) can help you determine what links still exist to those URLs, which really should guide whether you let them stay 404s or 301-redirect them to something relevant. The other aspect of the decision is just whether something relevant exists. If you clearly have built a page to replace the old one then 301-redirect it. If the old page is something that ceased to exist for a reason, then a 404 is probably fine unless that old page had a ton of inbound links. In that case, the 404 has essentially cut off those links.
The problem is that those inbound links are still out there, so it's not that the authority has ceased to exist. It's that you've basically cut the pipe through which the authority flows.
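To act on this, it helps to know which of the old URLs are still returning a 404 and which already redirect. A minimal status-check sketch using only the Python standard library (the URLs below are placeholders; substitute the old URLs from your own site):

```python
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects, so a 301 surfaces as-is."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def check_status(url):
    """Return (status_code, redirect_target) for a single URL."""
    req = urllib.request.Request(url, method="HEAD")
    opener = urllib.request.build_opener(NoRedirect())
    try:
        resp = opener.open(req, timeout=10)
        return resp.status, None
    except urllib.error.HTTPError as e:
        # With redirect-following disabled, 3xx/4xx/5xx all land here.
        return e.code, e.headers.get("Location")
    except urllib.error.URLError:
        return None, None  # DNS failure, timeout, etc.

# Placeholder URLs -- substitute the old URLs from your own migration.
old_urls = [
    "https://www.example.com/old-page-1",
    "https://www.example.com/old-page-2",
]

if __name__ == "__main__":
    for url in old_urls:
        status, target = check_status(url)
        print(url, status, target or "")
```

A 404 in the output is a candidate for a new 301 (or an intentional dead end); a 301 line also shows where it currently points, so you can confirm the target is actually the relevant replacement page.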
-
No, I want to 301 a page that became a 404 after the devs missed redirecting it during a site migration where URLs changed. Moz is saying a few of these 404 URLs still have authority. I know I should 301 them anyway, even if they don't, but are they likely to still have authority according to Google? In other words, how long does an old page/URL with authority retain it after it becomes a 404?
-
Sorry, I'm a little confused - are you 301'ing to a 404? I'm not really sure I understand the situation.
-
Thanks Dr Pete!
What I meant was: approximately how long do you have to set up a 301 for an old page resolving in a 404 before it loses its Page Authority?
If you leave it a month, say, will it have lost the PA, or will it retain it and still transfer the PA once you 301 it, say 4 weeks or more after it became a 404 page?
Or is it more likely that Moz Analytics hasn't updated yet and is still attributing an authority score to the URL, when in fact Google has probably dropped its authority since it's been a 404 for 4 weeks?
Cheers
Dan
-
I think our data is getting refreshed every couple of weeks at this point, but I'm not sure if a 404 will drop a page from MozScape/OSE right away. I suspect 301s may be updated more quickly, since a 404 could be a temporary issue. Once the target of a 301 gets passed the PA, the original page should lose it.
-
And how long does it take for the PA on an old page to 'expire' if it hasn't been transferred?
Say the devs missed some pages in the migration (4 weeks ago) which are now resolving in a 404, and which the Moz Dashboard is still reporting as having PA.
Will that authority still transfer if we set up a 301 four weeks later?
I know we should 301 anyway, to reduce 404s, but I'm just interested in knowing if the page's old authority expires at any point (which I'm sure it must do eventually).
Thanks
Dan
-
Great, thanks Dr Pete!
-
It can vary quite a bit. The page has to be recrawled/recached, which can take anywhere from hours to weeks, depending on how much authority the page has. That's usually the big delay. After that, Google may on occasion delay passing authority, but we don't have proof of that (there are just cases where it seems like they do).
If it's just a handful of pages, re-fetch them through Google Webmaster Tools. It never hurts to kick the crawlers.
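If it's more than a handful of pages, it can be quicker to generate the redirect rules than to write them by hand. A rough sketch, assuming an Apache `.htaccess` setup; the old/new path mapping here is hypothetical, so substitute your own:

```python
# Hypothetical old -> new path mapping from the migration; substitute your own.
URL_MAP = {
    "/old-category/widget-guide": "/guides/widgets",
    "/old-category/pricing-2013": "/pricing",
}

def apache_redirects(mapping):
    """Emit one Apache 'Redirect 301' line per pair, ready for .htaccess."""
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())
    )

print(apache_redirects(URL_MAP))
```

Paste the output into `.htaccess` (on Apache with mod_alias enabled); on nginx or IIS the equivalent rules look different, but the one-old-path-to-one-new-path mapping is the same idea.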
-
Thanks Keri
Any idea, though, approximately how long Google takes to update/pass the authority of an old page to a new one via a 301?
-
Moz Page Authority is a separate metric. Sadly, we have no pipeline from Google where they tell us exactly what they think of a site. We update our metrics about every month, so it may take a couple of months to see authority changed to a new page in Moz.