Experiencing page authority issues after a 301 redirect
-
We just completed a build of a new site and used 301 redirects to retain our page authority. In the first week, all of the interior pages reported a page authority of 1. After two or so weeks, the page authority began to look more accurate, but it was still not as high as on the original pages. The strange thing is that when you click a link to a page, the page authority populates correctly, but when the page finally finishes loading, the PA goes back down. Has anyone ever experienced this, and if so, how did you fix it?
Thanks!
-
Hey!
That sounds like odd behavior and I don't think I've heard of that happening before. I'd love to dig a bit deeper to see what's going on.
Would you be able to send me the pages you are searching? I assume you are experiencing this in Open Site Explorer?
If you would prefer not to post the URLs in this forum, feel free to email me directly at carin@seomoz.org!
Thanks,
Carin
-
301 redirects do not pass 100% of page equity/authority. I have no idea exactly how much is passed through a 301, but you're always going to be in for a wee drop during a rebuild.
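For anyone setting these redirects up during a rebuild, the mapping itself can be generated mechanically. A minimal Python sketch that emits Apache-style `Redirect 301` rules from an old-to-new path map; the paths are invented placeholders, and the exact directive syntax depends on your server:

```python
# Sketch: generate Apache-style "Redirect 301" rules from an old->new path map.
# The paths below are hypothetical placeholders, not from the original post.
redirect_map = {
    "/old-about.html": "/about/",
    "/products/widget.php": "/shop/widget/",
}

def build_redirect_rules(mapping):
    """Return one 'Redirect 301 <old> <new>' line per entry, sorted for stable output."""
    return ["Redirect 301 {} {}".format(old, new) for old, new in sorted(mapping.items())]

for rule in build_redirect_rules(redirect_map):
    print(rule)
```

Keeping the map in one place like this also makes it easy to audit which old URLs still lack a target.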
Related Questions
-
Reclaim lost links to old pages?
We recently moved our site to a new CMS and did a complete redesign, with new content. It's really hit our SEO. Open Site Explorer is telling me there are lots of 404 web pages, which were basically old product pages that no longer exist, since the way we have structured the site has changed. So is it worth trying to reclaim these links? If so, how can I do that without building pages at those URLs? Will a 301 redirect be enough?
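If you can export the 404ing URLs (e.g. from the Open Site Explorer report mentioned in the question), one practical step is diffing them against your redirect map so every linked-to old URL has a 301 target. A rough Python sketch, with invented URLs standing in for the real export:

```python
# Sketch: find old 404ing URLs that still lack a 301 target.
# Both collections are invented examples, not real URLs from the post.
urls_404 = {
    "/old-products/blue-widget",
    "/old-products/red-widget",
    "/old-category/widgets",
}

redirect_map = {
    "/old-products/blue-widget": "/shop/widgets/blue",
    "/old-category/widgets": "/shop/widgets/",
}

def missing_redirects(four_oh_fours, mapping):
    """Return the 404ing URLs that have no redirect target yet, sorted for readability."""
    return sorted(four_oh_fours - mapping.keys())

print(missing_redirects(urls_404, redirect_map))
```

Any URL this prints is a linked-to page whose equity is currently being lost to a 404.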
Moz Pro | jbritchford0
-
Moz crawl duplicate pages issues
Hi

According to the Moz crawl on my website, I have in the region of 800 pages which are considered internal duplicates. I'm a little puzzled by this, even more so as some of the pages it lists as being duplicates of another are not. For example, the Moz crawler considers page B to be a duplicate of page A in the URLs below. (Not sure on the live link policy, so I've put a space in the URLs to 'unlive' them.)

Page A: http:// nuchic.co.uk/index.php/jeans/straight-jeans.html?manufacturer=3751
Page B: http:// nuchic.co.uk/index.php/catalog/category/view/s/accessories/id/92/?cat=97&manufacturer=3603

One is a filter page for Curvety Jeans and the other a filter page for Charles Clinkard Accessories. The page titles are different and the page content is different, so I've no idea why these would be considered duplicates. Thin, maybe, but not duplicate. Likewise, pages B and C are considered duplicates of page A in the following:

Page A: http:// nuchic.co.uk/index.php/bags.html?dir=desc&manufacturer=4050&order=price
Page B: http:// nuchic.co.uk/index.php/catalog/category/view/s/purses/id/98/?manufacturer=4001
Page C: http:// nuchic.co.uk/index.php/coats/waistcoats.html?manufacturer=4053

Again, these are product filter pages which the crawler would have found using the site filtering system, but again, I cannot find what makes pages B and C duplicates of A. Page A is a filtered result for Great Plains Bags (filtered from the general bags collection), page B is the filtered results for Chic Look Purses from the Purses section, and page C is the filtered results for Apricot Waistcoats from the Waistcoat section.

I'm keen to fix the duplicate content errors on the site before it goes properly live at the end of this month (that's why anyone kind enough to check the links will see a few design issues with the site). However, in order to fix the problem I first need to work out what it is, and I can't in this case.

Can anyone else see how these pages could be considered duplicates of each other, please? Checking I've not gone mad! Thanks, Carl
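For filter-generated URLs like these, one common mitigation (independent of why the crawler flags them as duplicates) is deriving a canonical URL by stripping the filter parameters. A sketch using Python's `urllib.parse`, where the parameter list is an assumption based on the parameters visible in the question's URLs:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Filter parameters to drop when deriving a canonical URL.
# This set is an assumption based on the parameters seen in the question.
FILTER_PARAMS = {"manufacturer", "dir", "order", "cat"}

def canonical_url(url):
    """Strip known filter parameters, keeping any remaining query string."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in FILTER_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

print(canonical_url("http://example.com/index.php/bags.html?dir=desc&manufacturer=4050&order=price"))
# -> http://example.com/index.php/bags.html
```

The derived URL could then feed a rel=canonical tag on each filtered page, pointing back at the unfiltered category.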
Moz Pro | daedriccarl0
-
Page Authority and Google updates favouring websites with black hat practices?
Can someone explain how it is that most of the competitors I have online, which rank on the first page of the search results, get links almost entirely in the thousands and still have higher or equal domain/page authority than mine? I went 1 by 1 checking all their links, and they mostly come from sex pages and unrelated sites. I say stop creating angry pandas and penguins, and start taking out of the game the people who just play dirty. Thanks.
Moz Pro | AbellSEO0
-
301 or canonical for multiple homepage versions?
I used 301 redirects to point several versions of the homepage to www.site.com. I was just rereading Moz's Beginner's Guide to SEO, and it uses that scenario as an example for rel=canonical, not 301 redirects. Which is better? My understanding is that 301s remove all doubt of getting links to the wrong version and diluting link equity.
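Whichever signal is used, the consolidation itself is just a normalization rule: every homepage variant maps to one target URL. A hedged Python sketch of that rule; the hostnames and the variant list are placeholders, not a definitive set:

```python
from urllib.parse import urlsplit

# Hypothetical canonical homepage; swap in your real preferred URL.
CANONICAL = "https://www.example.com/"

def normalize_homepage(url):
    """Map known homepage variants (bare domain, index files, trailing slash) to the canonical URL."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    path = parts.path.rstrip("/")
    if host in ("example.com", "www.example.com") and path in ("", "/index.html", "/index.php"):
        return CANONICAL
    return url  # not a homepage variant; leave untouched

print(normalize_homepage("http://example.com/index.html"))  # -> https://www.example.com/
```

A server-side 301 implements exactly this mapping at request time; rel=canonical states the same preference in the page markup instead.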
Moz Pro | kimmiedawn0
-
On-Page Report Card Questions
First post here, and a couple of questions as I work through some of the SEOmoz tool reporting, specifically the On-Page Report Card. I just received the report results yesterday, so I'm working through the data now. There are two issues categorized as critical by the tool:

1. The grader is stating I don't have any instances of the target keyword in my page title, yet it's there. (The page title is too long, but I'm in the process of hacking the blog software to fix this; it's auto-generated by the CMS.)
2. It's also saying under "Broad Keyword Usage in Document" that I have zero instances of the keyword in the body text, and while I certainly don't have enough, there is at least one instance at the bottom of the blog post. All the text is contained within tags.
3. Related to #2: what's the difference between "Appropriate Keyword Usage in Document" under "High Importance Factors" and "Broad Keyword Usage in Document" under "Critical Factors"?
Moz Pro | webranger0
-
Google Hiding Indexed Pages from SERPs?
Trying to troubleshoot an issue with one of our websites, and I noticed a weird discrepancy. Our site should only have 3 pages in the index: the main landing page with a contact form and two policy pages. Yet Google reports over 1,100 pages. (That part is not a mystery; I know where they are coming from. Multi-site installations of popular CMSs leave much to be desired in actually separating websites.)

Here is a screenshot showing the results of the site: command: http://www.diigo.com/item/image/2jing/oseh

I have set my search settings to show 100 results per page (the maximum). Everything is fine until I get to page three, where I get the standard "In order to show you the most relevant results, we have omitted some entries very similar to the 122 already displayed." But wait a second: I clicked on page three, and now there are only two pages of results, and the number of results reported has dropped to 122: http://www.diigo.com/item/image/2jing/r8c9

When I click on "show omitted results" I do get some more results, and the reported count jumps back up to 1,100. However, I only get three pages of results, and when I click on the last page the number of results returned changes to 205: http://www.diigo.com/item/image/2jing/jd4h

Is this a difference between indexes? (The same thing happens when I turn Instant Search back on: it shows over 1,100 results, but when I get to the last page of results it changes to 205.) Is there any other way of getting this info? I am trying to go in and identify how these pages are being generated, but I have to know which ones are showing up in the index for that to happen. Only being able to access 1/5th of the pages indexed is not cool. Anyone have any idea about this or experience with it?

For reference, I was going through with SEOmoz's excellent toolbar and exporting the results to CSV (using the Mozilla plugin). I guess Google doesn't like people doing that, so maybe this is a way to protect against scraping, by only showing limited results in the site: command. Thanks!
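With whatever slice of the index the export does return, grouping the URLs by their first path segment is a quick way to see where the unexpected pages cluster. A small Python sketch, with invented URLs standing in for the real CSV export:

```python
from collections import Counter
from urllib.parse import urlsplit

# Invented example of URLs exported from a site: query;
# in practice these would be read from the toolbar's CSV export.
indexed_urls = [
    "http://example.com/",
    "http://example.com/privacy",
    "http://example.com/blog/post-1",
    "http://example.com/blog/post-2",
    "http://example.com/blog/post-3",
]

def count_by_first_segment(urls):
    """Count indexed URLs by their first path segment to spot where stray pages cluster."""
    def segment(url):
        path = urlsplit(url).path.strip("/")
        return path.split("/")[0] if path else "(root)"
    return Counter(segment(u) for u in urls)

print(count_by_first_segment(indexed_urls).most_common())
```

Even with only a fifth of the index visible, the counts usually make it obvious which section of the CMS is generating the stray pages.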
Moz Pro | prima-2535090
-
Question about a dramatic decrease in domain authority.
My personal website http://Bsalva.com has started to drop in both page authority and domain authority. This seems to be largely due to Open Site Explorer not picking up all of my inbound links. Right now it says I only have one linking root domain, but I know this to be crap. I have quite a few followed links from well-trusted and reputable sites. My PageRank typically bounces between a 5 and a 6, and I should also have hundreds of inbound links. The other sites in my campaign monitor haven't taken such a hit, so I'm starting to get a little worried. Does anyone know what might cause incoming links to not be recognized? They were just fine last month, and I can go find them on the web as we speak.
Moz Pro | Bsalva0
-
What do I do when all pages are grade A?
I've used the on-page grader and now have all my pages at a grade A for relevant keywords. Most of them are cool, achieving first-page rankings, apart from a few massive keywords. So the question is: what's next? What do I do now that I'm at grade A but perhaps not #1 yet... Cheers, Dan
Moz Pro | spytunes0