Will "internal 301s" have any effect on page rank or the way in which an SE see's our site interlinking?
-
We've been forced (for scalability) to completely restructure our website's hierarchy.
For example, the old structure was:
country / city / city area
where we had about 3,500 nicely interlinked pages for relevant things like taxis, hotels, apartments etc. in that city.
We needed to change the structure to:
country / region / area / city / cityarea
So as part of the change we put in place lots of 301s for the permanent movement of pages to the new structure, and then we set about changing the physical on-page links too.
Unfortunately we have left a good 600 or 700 links that point to the old pages and are picked up by the 301 redirect, so we're slowly going through them to ensure the links go to the new location directly (not via the 301).
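For anyone in a similar position, a small script can find these stale links automatically rather than by eye. Here is a minimal sketch - the domain and URLs are placeholders rather than our real ones, and it assumes Python with the `requests` library:

```python
# Illustrative sketch: crawl one page, collect its internal links, and flag
# any that still resolve via a 301 so they can be repointed directly at the
# new URL. The domain and example URLs are hypothetical.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

import requests

SITE = "https://www.example.com"  # placeholder domain

class LinkCollector(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def stale_links(page_url):
    """Return internal links on page_url that answer with a 301."""
    html = requests.get(page_url).text
    parser = LinkCollector()
    parser.feed(html)
    stale = []
    for href in parser.links:
        url = urljoin(page_url, href)
        if urlparse(url).netloc != urlparse(SITE).netloc:
            continue  # external link, not our concern here
        resp = requests.head(url, allow_redirects=False)
        if resp.status_code == 301:
            stale.append((url, resp.headers.get("Location")))
    return stale

# Hypothetical page from the old country/city/city-area structure
for old, new in stale_links(SITE + "/uk/london/soho/"):
    print(f"update link {old} -> {new}")
```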
So my question is (sorry for the long waffle):
Whilst it must surely be "best practice" for all on-page links to go directly to the 'right' page, are we harming our own interlinking and even 'page rank' by being tardy in working through them manually?
Thanks for any help anyone can give.
-
Thanks Everett - sorry about the delay in coming back to your response.
This 301 issue was one of the things we were worried about (along with a ton of others), so we can at least be a little reassured that we're progressing on all fronts and not leaving a gaping problem that will continue to dog us.
Cheers
W
-
I'm just going to answer your question directly. This was your question:
"Whilst it must surely be "best practice" for all on-page links to go directly to the 'right' page, are we harming our own interlinking and even 'page rank' by being tardy in working through them manually?"
Short answer: As long as you are working to update those internal links, and you have 301 redirects in place in the meantime, you should be fine.
Technically speaking, it is best practice to link directly to the page internally rather than relying on 301 redirects. Yes, it is true that a very small (very, VERY small, so as to be virtually undetectable) amount of PageRank is lost when redirecting, but it only becomes an issue when you begin adding redirect on top of redirect. Keeping your house clean, so to speak, by not relying on redirects to fix your internal links will keep this from happening - and that is exactly what the tiny PageRank loss is said to exist for (to discourage webmasters from relying on redirects to fix broken internal links), if you believe Matt Cutts.
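For illustration only, here is a minimal sketch of how you might spot those redirect chains before they pile up - the URL is made up, and it assumes Python with the `requests` library:

```python
# Illustrative sketch (not from the original post): follow each redirect hop
# manually and report the full chain, so any chain longer than one hop can
# be collapsed into a single 301 to the final URL.
from urllib.parse import urljoin

import requests

def redirect_chain(url, max_hops=10):
    """Follow 3xx responses hop by hop and return the chain of URLs."""
    chain = [url]
    for _ in range(max_hops):
        resp = requests.head(url, allow_redirects=False)
        if resp.status_code not in (301, 302, 307, 308):
            break
        # Location may be relative, so resolve it against the current URL
        url = urljoin(url, resp.headers["Location"])
        chain.append(url)
    return chain

# Hypothetical old URL from the structure described in the question
chain = redirect_chain("https://www.example.com/uk/london/soho/taxis/")
if len(chain) > 2:  # more than one hop: a 301 pointing at another 301
    print(" -> ".join(chain) + "  (collapse to a single 301)")
```

Anything longer than a single hop is worth collapsing so the old URL 301s straight to the final destination.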
With that said, you may indeed have many other issues to deal with, as do most sites that have a geotargeted, deep URL structure like the one you have outlined. Panda slammed a lot of sites like that pretty hard. But all of that is beyond the scope of this question.
I hope you find whatever is wrong and get your traffic back. Good luck!
-
Hi Chris
Thanks - I 'love' those noncommittal Matt Cutts videos - "it is - but it isn't - an issue".
That matched my gut feeling: there may be a temporary loss of link juice, but it should readjust after a period. Which means we have other issues.
Cheers
W
-
Thanks for your advice - I've amended the question so it is simpler to read. Sorry about that.
Well, that's what I thought - but anecdotal evidence (as well as past experience) is making me wonder whether a significant amount of link juice is being lost in the pass-through. We put the 301s in place about 6 or 7 months ago, so any loss of link juice between pages should have come back by now.
Maybe we have some other issues?
W
-
Agree with Chris, thumbs up. I would just add that "ideally" you would have manually gone through all the links ahead of time and had the 301s in place prior to launch. That way there is no downtime or confusion for Google about what it is supposed to do with these pages. If you think about it, you have 600 pages in limbo, and after a while Google will just say, "Well, I guess those pages are dead," start to crawl them less often, and eventually drop them.
I would make it a priority to go through those pages and set up the new 301s ASAP. Google will keep trying an old page for a while (a few months), whether it 404s or has a 301 on it - it knows that mistakes happen. So in the case of a 301, it will still crawl the old URL for a while even after it sees the 301 the first time, just to make sure the 301 really is permanent. You have a bit of a grace period, so take advantage of it to get things cleaned up quickly.
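To make that clean-up go quickly, a script along these lines can audit the whole redirect map in one pass - a rough sketch with made-up URLs, again assuming Python and the `requests` library:

```python
# Rough sketch for auditing an old-to-new redirect map: confirm each old URL
# answers with a permanent 301 (not a 302 or a 404) and points at the
# intended new page. The mapping below is illustrative, not the real URLs.
import requests

REDIRECT_MAP = {
    "https://www.example.com/uk/london/soho/":
        "https://www.example.com/uk/england/greater-london/london/soho/",
    # ...one entry per moved page
}

for old_url, expected in REDIRECT_MAP.items():
    resp = requests.head(old_url, allow_redirects=False)
    location = resp.headers.get("Location")
    if resp.status_code != 301:
        print(f"{old_url}: expected 301, got {resp.status_code}")
    elif location != expected:
        print(f"{old_url}: redirects to {location}, not {expected}")
```

Anything that comes back as a 302, a 404, or the wrong destination is a page still in limbo.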
-
Hiya,
First off, let me post this video from Matt Cutts regarding 301 redirects: http://www.youtube.com/watch?v=Filv4pP-1nw
As long as the 301 points towards either the same page or a page of equal value (content-wise), you should be good. Whilst going through them manually may lose you a bit of rank in the short term, at least you know you are directing to the correct pages.
Short answer:
Manual - short-term rank loss, long-term benefit.
Auto - vice versa.
Hope this helps
-
Hello,
I don't quite understand your question - if you are adding more category pages, you should end up with more pages, not fewer. Just make sure to 301 redirect every single old page and you shouldn't have a problem.
I had to do something similar on one of my sites about 3 months ago, and I did lose PageRank on some pages, but rankings got better, so I wouldn't worry much about PageRank.
Cheers