301s & Link Juice
-
So let's say we have a fairly new site with zero PageRank and only a few incoming links, nothing significant compared to other sites.
Now, from what I understand, link juice flows throughout the site. This site is a news site that writes sports previews, predictions, and the like. After a while, a preview for a game from two months ago gets zero hits and zero search queries; nobody cares. Wouldn't it make sense to take that kind of expired content and 301 it to a different page? That way the more relevant content gets the juice, giving it a better ranking.
Just wondering what everybody's thoughts are on this link juice question, and what I might be missing.
-
Lots of interesting ideas. Thank you, everyone.
-
A 301 simply redirects a request for one URL to a new URL. If the page has no external links, a 301 will do nothing for you. If you don't want the page, delete it, remove any internal links to it, and you're done.
Each redirect leaks a little link juice.
If you have links pointing to page A and you 301 page A to page B, the link juice will flow to page B, but a bit is lost along the way. In fact, you lose it twice: once for the link and once for the 301 redirect. If the only links are internal, why not just link to page B in the first place? That said, I would not remove the page. All pages start with some PageRank; the more pages on your site, the more PR, but also the more pages there are to share it among. With smart internal linking you can sculpt more PR onto the pages you want and less onto the pages you don't.
Read this simple explanation http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank
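To make the "leak" concrete, here is a rough back-of-the-envelope sketch. In the classic PageRank model each link passes roughly the damping factor d ≈ 0.85 of a page's value; treating a 301 hop the same way is an assumption for illustration, not a figure Google has confirmed:

```python
# Rough sketch: PageRank passed through a chain of links / 301 hops,
# assuming each hop applies the classic damping factor d = 0.85.
# Counting a 301 as a full hop is an assumption, not a confirmed figure.

DAMPING = 0.85

def juice_after_hops(start_pr: float, hops: int, d: float = DAMPING) -> float:
    """PageRank remaining after `hops` links/redirects in a row."""
    return start_pr * (d ** hops)

# A direct link to page B: one hop.
direct = juice_after_hops(1.0, 1)        # 0.85
# A link into page A, then a 301 from A to B: two hops.
via_redirect = juice_after_hops(1.0, 2)  # 0.7225

print(f"direct link keeps {direct:.2%}, link + 301 keeps {via_redirect:.2%}")
```

Under these assumptions a link that passes through one extra 301 delivers about 72% of the original value instead of 85%, which is why linking straight to page B beats linking to A and redirecting.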
-
When I first read your question, the first thing I thought of was simply recycling the URLs each sports season...
For instance, each year the Bears play the Packers, so in 2013 you write up your prediction on the page mypredictionsite.com/bears-vs-packers.html, and the page hangs around until 2014, when you rewrite and republish it for that year's game. Be sure to use schema and other tags to assign a recent date for the page update, and put something big and bold at the top of the page so people know which season the prediction is for. (Maybe also a tally of how well you did predicting their previous matchups.)
That way, any links the page picks up over the years stay pertinent to the Bears playing the Packers, which could help with ranking. You also don't have to keep track of a perpetually growing collection of 301s.
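The "schema and other tags" piece can be as simple as a NewsArticle JSON-LD block whose dateModified gets bumped each season. A hypothetical sketch (the URL, headline, and dates are made up for illustration) that generates the markup in Python:

```python
import json

# Hypothetical example: JSON-LD for a recycled prediction page, with
# dateModified bumped each season. URL, headline, and dates are made up.
article = {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "Bears vs. Packers: 2014 Prediction",
    "url": "http://mypredictionsite.com/bears-vs-packers.html",
    "datePublished": "2013-09-01",
    "dateModified": "2014-09-01",
}

snippet = '<script type="application/ld+json">\n%s\n</script>' % json.dumps(
    article, indent=2
)
print(snippet)
```

Dropping the resulting snippet into the page head each time you republish signals the update date to crawlers; the big bold season label then does the same job for human visitors.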
Just a thought...
-
For me, I would need to know that the links had variation. Say, for pages A, B, C, D, and E, you have a week or two between them: page A runs on the 1st and picks up a couple of links, page B runs on the 15th and gets a couple, page C runs on the 30th and gets a couple, and so forth.
For the links to be truly helpful, they cannot be the same couple of links to each page (at least beyond some point I could not pin down for you). If they are varied, I can see the idea having validity, but if you keep getting links to these pages from the same site/page/person, I think it has to ring a spam bell at some point. PLEASE NOTE: I cannot point you to documentation for this anywhere that I am aware of, so you are free to test it out; I am just going on gut here.
Thanks
-
Hi, thanks for your response. I agree with what you're saying. My idea, though, was to 301 all the articles that are no longer relevant to the "This week's previews" page, so all the old articles would 301 to a single URL. It won't be a 301 to one page, and then that page 301ing to another, and so on.
You know what I'm getting at? Sorry if I'm making it confusing.
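A minimal sketch of what "all the old articles 301 to one page" might look like in application code (the paths and the expired-article list are hypothetical, and most servers would do this in their rewrite config instead):

```python
# Hypothetical sketch: route requests for expired preview articles to a
# single current-previews page with one 301. Paths are made up.
CURRENT_PREVIEWS = "/previews/this-week/"
EXPIRED_ARTICLES = {
    "/previews/2013-09-bears-vs-packers/",
    "/previews/2013-09-lions-vs-vikings/",
}

def handle(path: str):
    """Return (status, location) for a request path."""
    if path in EXPIRED_ARTICLES:
        return 301, CURRENT_PREVIEWS   # permanent redirect, single hop
    return 200, path                   # serve the page as usual
```

Because every expired URL maps directly to the one target, each request involves at most a single redirect hop rather than a chain.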
-
ravashjalil
With a news/sports site in particular, you are going to be writing stories continuously. When you start chaining one 301 to another to another to another... sooner or later the site is going to look like THE SPAM CITY GAZETTE. You do not want a site like that. With a news or sports story, unless it is huge and under your byline, you are not going to have enough juice on any given page to really benefit you. You would ultimately be moving juice from internal pages to other internal pages.
You are better served doing it the old-fashioned way: just keep writing great content, archive the older stuff, and let the visitors do their bit. If a page has a lot of links coming to it, you might want to leave it alone, since people seem to want to read it, not get sent somewhere else. Best,
Robert
-
Check the Page Authority of that page. If you have created new content and the old page has some Page Authority, I would suggest redirecting it to the new one; the 301 will pass link juice.