Link reclamation and many 301 redirects to one URL
-
We have many incoming links pointing to non-existing pages on a sub-domain, which we are planning to take down or redirect to a sub-directory. But we are not ready to lose PageRank or link juice, as many of this sub-domain's pages are referenced by external links. It will obviously be a double redirect. What is the best thing we can do to reclaim these links without losing link juice or PR?
Can we redirect all these links to the same sub-domain and then redirect that sub-domain to the sub-directory? Will this double redirect work?
Or
Can we redirect all these links to the same sub-domain and ask visitors to visit the sub-directory themselves, a manual redirect? How fair is it to redirect visitors manually?
Any other options?
Thanks,
Satish
-
Hi,
No, they shouldn't penalise you. A lot of companies are bought out by bigger companies and then have only one page on a new domain to redirect all of their pages to. I would say there is little to no chance of any penalisation.
All the best,
Sean -
Hi Sean,
My doubt is: if we redirect all incoming links to one page, will Google penalise us? What's the best way to reclaim incoming links pointing to non-existing pages? We have redirected some links and dropped in rankings.
-
Hey Satish,
Historically, updating redirects by amending the URL at their source DID pass on extra SEO value, as each hop in a redirect chain caused a link to lose a small portion of its equity.
A few weeks ago, Google announced that this is no longer the case and that 301 redirects no longer lose any equity. This update means that webmasters no longer have to spend time updating links at their source, rendering that outreach practice pretty much redundant.
You can check out the blog here - https://moz.com/blog/301-redirection-rules-for-seo
In short, I wouldn't bother with any outreaching as it'll provide zero SEO benefit.
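One practical note either way: it's cleaner to send each old sub-domain URL straight to its final sub-directory URL in a single 301 hop, rather than chaining sub-domain -> sub-domain root -> sub-directory. A minimal Python sketch of that mapping logic, using hypothetical hostnames and a hypothetical `/blog` prefix purely for illustration:

```python
from urllib.parse import urlsplit, urlunsplit

OLD_HOST = "blog.example.com"   # hypothetical sub-domain being retired
NEW_HOST = "www.example.com"    # hypothetical main domain
NEW_PREFIX = "/blog"            # hypothetical sub-directory replacing the sub-domain

def redirect_target(url: str) -> str:
    """Return the one-hop 301 target for an old sub-domain URL.

    Maps https://blog.example.com/some/post directly to
    https://www.example.com/blog/some/post, preserving the path and
    query string so no intermediate redirect is needed.
    """
    parts = urlsplit(url)
    if parts.netloc != OLD_HOST:
        return url  # not one of ours; leave untouched
    return urlunsplit(("https", NEW_HOST, NEW_PREFIX + parts.path,
                       parts.query, ""))

print(redirect_target("https://blog.example.com/some/post?ref=x"))
# -> https://www.example.com/blog/some/post?ref=x
```

In practice you'd express the same rule in your server config (e.g. a rewrite rule), but the idea is the same: compute the final destination once, and 301 there directly.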
All the best,
Sean
Related Questions
-
Google Country Redirection Change
Analytics is showing a substantial decrease in referring traffic from Google's regional domains (.ca, .co.uk, .de, etc.) versus an uptick from .com, starting as of March 2018. Did anyone note when Google stopped directing traffic to its regional domains? Was there any press about it (I couldn't find any)? Using a VPN for different countries, I compared regional domain SERPs vs. .com and they're pretty much identical. Thanks!
-
Does using non-https links (not pages) impact or penalise the website rankings?
Hi community, We have a couple of pages where we have given non-https (http) hyperlinks by mistake. They will redirect to the https versions anyway. Does using these http links on a page hurt rankings? Thanks
-
Placement of /p/ in URL structure for ecommerce site product URLs
Hi, We're having a discussion about how to structure a client's ecommerce site product page URLs, where 12345 represents the product SKU/number:
https://domain.com/Item--i-12345
https://domain.com/product-name/p/12345
https://domain.com/p/12345
It's a toss-up between the second and the third URL, but the SEO company says the third is best because the placement of the /p/ creates a silo for "products" that helps search engines recognize it as a product. Does anyone have thoughts on this? Thanks!
-
Why do some URLs display in the SERPs with > separators between subfolders, and others display with a /?
Why do some URLs display like this: cargurus.com › Used Cars › Jeep Wrangler, and others display like https://www.carmax.com/cars/jeep/wrangler? Is there a significance to having the subfolders separated with an arrow vs. a forward slash?
-
Are SEO-Friendly URLs Less Important Now That Google Is Indexing Breadcrumb Markup?
Hi Moz Community and staffers, Would appreciate your thoughts on the following question: Are SEO-friendly URLs less important now that Google is indexing breadcrumb markup in both desktop and mobile search? Background that inspired the question: Our ecommerce platform's out-of-the-box functionality has very limited "friendly URL" settings and would need some development work to set up an alias for more friendly URLs. Meanwhile, the breadcrumb markup is implemented correctly and indexed, so it seems there's no longer an argument for improved CTR with SEO-friendly URLs. With that said, I'm having a hard time justifying the URL investment, as well as the 301 redirect mapping we would need to set up, and am wondering whether friendlier URLs would lead to a significant increase in rankings for the level of effort. Sidenote: We already rank well for non-brand and branded searches, since we are a brand manufacturer with an ecommerce presence. Our breadcrumbs are much cleaner and more concise than our URL structure. Here are a couple of examples (the "categories" contain actual keywords; I'm just hiding them here):
Category URL: http://www.mysite.com/browse/category1/subcat2/subcat3/_/N-7th
Breadcrumb: www.mysite.com > category1 > subcat2 > subcat3
Product URL: http://www.mysite.com/product/product-name/_/R-133456E112
Breadcrumb: www.mysite.com > category1 > subcat2 > subcat3 > product name
According to my devs they can't get rid of the "_" but could possibly replace it with a letter. They also said it's an easy fix to make the URLs always lowercase. Lastly, some of our product URLs contain non-standard characters in the product name, like "." and ",", which is also a simple fix according to my developers. Looking forward to your thoughts on the topic! Jesse
-
Duplicate Page Content - 404's or 301's?
I deleted about 100 pages of stale content 6 months ago and they are currently returning 404's. The crawl diagnostics have pointed out 77 duplicate pages because of this. Should I 301 redirect these to get rid of the error, or keep them as 404's? Most of the pages still have some page authority, but I don't want to get penalized. Just looking for the best solution. Thanks!
-
Excessive internal links. Should I remove the footer links?
Hi guys, I have an ecommerce site selling eco-friendly items online. I ran some on-page optimisation reports from SEOMoz PRO and discovered that I have at least 120 internal links per page. 32 of these are in the footer, designed in part to aid user navigation but perhaps also to have a positive impact on SERPs and SEO in general for the ecommerce site. Will removing these links be beneficial to my search engine rankings, since I would then have fewer than 100 internal links per page? Or is it a major change which may be dangerous for my site's rankings? Please help, as I'm not sure about this! I've attached an image of the footer links below. I won't be removing the Facebook/Twitter links, just the 3 columns on the left. Thank you, Pravin
-
Why does Google say they have more URLs indexed for my site than they really do?
When I do a site search with Google (i.e. site:www.mysite.com), Google reports "About 7,500 results" -- but when I click through to the end of the results and choose to include omitted results, Google really has only 210 results for my site. I had an issue months back with a large # of URLs being indexed because of query strings and some other non-optimized technicalities - at that time I could see that Google really had indexed all of those URLs - but I've since implemented canonical URLs and fixed most (if not all) of my technical issues in order to get our index count down. At first I thought it would just be a matter of time for them to reconcile this, perhaps they were looking at cached data or something, but it's been months and the "About 7,500 results" just won't change even though the actual pages indexed keeps dropping! Does anyone know why Google would be still reporting a high index count, which doesn't actually reflect what is currently indexed? Thanks!