Will thousands of redirected pages have a negative impact on the site?
-
A client site has thousands of pages with unoptimized URLs. I want to change the URL structure to make them a little more search-friendly.
Many of the pages I want to update have backlinks and good PR, so I don't want to delete them entirely. If I change the URLs on thousands of pages, that means a lot of 301 redirects. Will thousands of redirected pages have a negative impact on the site?
Thanks,
Dino
-
I've never had a problem with creating a large number of redirects on a site before. It's something that happens quite a bit, for instance when a site is moving to a new domain or a new CMS, where it can often be very difficult to recreate exactly the same URL structure.
There's no limit to the number of redirects, just the number of hops. If the site had existing redirects in place, you might want to update those existing redirects as well, to point to the new final destination.
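To make the "update existing redirects" point concrete, here is a minimal .htaccess sketch with hypothetical URLs (not from the thread): instead of letting an old URL hop through an intermediate redirect, both legacy URLs point straight at the final destination.

```apache
# Hypothetical URLs for illustration only.
# Before: /old-page -> /interim-page (existing redirect), then
#         /interim-page -> /new-page (new redirect) = 2 hops.
# After: both legacy URLs go straight to the final destination, 1 hop each.
Redirect 301 /old-page /new-page
Redirect 301 /interim-page /new-page
```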
-
I'm doing the same thing with a site I'm rebuilding. The page structure is changing to make it more logical to users and, hopefully, Google. I've changed the URLs on my site a couple of times over the years, and I've noticed little change in the short term and a considerable boost in the long term.
The site I'm building has hundreds of pages with tons of 301 redirects as well.
-
There is some loss in PR with a 301, but it's about linking in the future too. Is it easier for someone to link to example.com/red-apples or example.com/?page_id=53?
I personally don't think it's a waste of time and it certainly helps with UX.
-
I do not know how much an optimized URL is worth. I also do not know how much link juice would be lost by redirecting; I wasn't aware that any would be lost. If so, then I need to consider whether leaving them alone is the best option at this point. I definitely do not want to do more harm than good.
-
You hope to get a tiny bump out of changing the URL?
But you are going to waste some link juice in redirecting...
and create thousands of htaccess lines that must be processed...
and you are worried that tons of redirects are going to cause a problem...
How much do you think an optimized URL is worth in the search engines?
I don't think that it's worth a whole lot.
-
There won't be any issue, but you need to check with your system admin whether your server can handle thousands of redirects. Most of the time, reducing duplicates on the current site and using regular expressions for the 301 redirects will cut the number of redirect rules considerably.
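As a sketch of the regular-expression approach, assuming (hypothetically) that the old URLs followed a pattern like /articles/123/some-slug and the new ones are /blog/some-slug, one rule can stand in for thousands of individual redirect lines:

```apache
# Hypothetical pattern: /articles/<numeric-id>/<slug> -> /blog/<slug>
# One RedirectMatch covers every URL that matches the pattern.
RedirectMatch 301 ^/articles/[0-9]+/(.+)$ /blog/$1
```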
-
Simple answer is no, but.....
Make sure that your 301s are no more than 3 redirects from the original page. For example, if you have a page called pageid=44 and it's about red apples, then make sure that the 301 goes straight to /red-apples (meaning the exact page you want it to go to).
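One wrinkle with the pageid=44 example: that is a query string, and mod_alias's Redirect directive can't match query strings, so a rule like this needs mod_rewrite. A hedged .htaccess sketch, assuming the old URL was /index.php?pageid=44:

```apache
RewriteEngine On
# Match only the exact query string pageid=44...
RewriteCond %{QUERY_STRING} ^pageid=44$
# ...and send it straight to the final page. The trailing "?" strips
# the old query string from the redirect target.
RewriteRule ^index\.php$ /red-apples? [R=301,L]
```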
Matt Cutts has a great video on this on WMT.
This should really answer all your questions but if you have any more then please feel free to ask.
Related Questions
-
Paginated Pages Page Depth
Hi Everyone, I was wondering how Google counts page depth on paginated pages. DeepCrawl is showing our primary pages as being 6+ levels deep, but without the blog, or with infinite scroll on the /blog/ page, I believe it would be only 2 or 3 levels deep. Using Moz's blog as an example, is https://moz.com/blog?page=2 treated as being on the same level, in terms of page depth, as https://moz.com/blog? If so, is it the <link rel="prev" href="https://site.com/blog" /> and <link rel="next" href="https://site.com/blog?page=3" /> markup that helps Google recognize this? Or does Google treat page depth the same way DeepCrawl shows it, with the blog posts on page 2 being +1 in page depth compared to the ones on page 1, for example? Thanks, Andy
Intermediate & Advanced SEO | AndyRSB
How to speed up transition towards new 301 redirected landing pages?
Hi SEOs, I have a question about moving local landing pages from many separate pages towards integrating them into a search results page. Currently we have many separate local pages (e.g. www.3dhubs.com/new-york). For both scalability and conversion reasons, we'll integrate our local pages into our search page (e.g. www.3dhubs.com/3d-print/Bangalore--India). Implementation details: to mitigate the risk of a sudden organic traffic drop, we're currently running a test on just 18 local pages (Bangalore = 1 of 18). We applied a 301 redirect from the old URLs to the new URLs 3 weeks ago. Note: we didn't yet update the sitemap for this test (technical reasons) and will only do this once we 301 redirect all local pages. For the 18 test pages I manually told the crawlers to index them in Webmaster Tools. That should do, I suppose. Results so far: the old URLs of the 18 test cities are still generating > 99% of the traffic while the new pages are already indexed (see: https://www.google.nl/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=site:www.3dhubs.com/3d-print/&start=0). Overall organic traffic on the test cities hasn't changed. Questions: 1. Will updating the sitemap for this test have a big impact? Google has already picked up the new URLs, so that's not the issue. Furthermore, the 301 redirect on the old pages should tell Google to show the new page instead, right? 2. Is it normal that search impressions will slowly shift from the old page towards the new page? How long should I expect it to take before the new pages are consistently shown over the old pages in the SERPs?
Intermediate & Advanced SEO | robdraaijer
How to 301 Redirect /page.php to /page, after a RewriteRule has already made /page.php accessible by /page (getting errors)
A site has its URLs with .php extensions, like this: example.com/page.php. I used the following rewrite to remove the extension so that the page can now be accessed from example.com/page:
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.*)$ $1.php [L]
It works great. I can access it via the example.com/page URL. However, the problem is the page can still be accessed from example.com/page.php. Because I have external links going to the page, I want to 301 redirect example.com/page.php to example.com/page. I've tried this a couple of ways but I get redirect loops or 500 internal server errors. Is there a way to have both: remove the extension and 301 the .php to no extension? By the way, if it matters, page.php is an actual file in the root directory (not created through another rewrite or URI routing). I'm hoping I can do this, and not just throw an example.com/page canonical tag on the page. Thanks!
Intermediate & Advanced SEO | rcseo
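One common way to avoid the loop is to trigger the external 301 off %{THE_REQUEST} (the browser's original request line) rather than the already-rewritten URL. A sketch, not tested against the asker's exact setup:

```apache
RewriteEngine On

# Externally 301 /page.php -> /page, but only when the client's original
# request line contained ".php". THE_REQUEST is never changed by later
# internal rewrites, so this cannot loop with the rule below.
RewriteCond %{THE_REQUEST} ^[A-Z]+\s/([^\s?]+)\.php[\s?] [NC]
RewriteRule ^ /%1 [R=301,L]

# Internally map the extensionless URL back onto the real .php file.
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.*)$ $1.php [L]
```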
Will Using Attributes For Landing Pages In Magento Dilute Page Rank?
Hello Mozzers! We have an ecommerce site built on Magento. We would like to use attribute filters in our layered navigation for landing-page purposes. Each page will have a unique URL, meta title and meta description. For example: URL: domain.com/art/abstract (category is Art, attribute is Abstract); Title: Abstract Art For Sale; Meta: Blah Blah Blah. Currently these attribute pages are not being indexed by Google, as they are set in Google's URL parameters. We would like to edit the parameters to start indexing some of the attribute filters that users search for, so they can be used as landing pages. Does anyone have experience with this? Is this a good idea? What are the consequences? Will this dilute PageRank? Could this destroy the world? Cheers! MozAddict
Intermediate & Advanced SEO | MozAddict
Multiple 301 Redirects for the Same Page
Hi Mozzers, What happens if I have a trail of 301 redirects for the same page? For example:
SiteA.com/10 --> SiteA.com/11 --> SiteA.com/13 --> SiteA.com/14
I know I lose a little bit of link juice by 301 redirecting. The question is, would the link juice look like this for the example above: 100% --> 90% --> 81% --> 72.9%? Or just 100% --> 90%? Does this link juice refer to juice from inbound links or links between internal pages on my site? Thanks!
Intermediate & Advanced SEO | Travis-W
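For what it's worth, the gap between the two models is easy to sketch. The 10%-per-hop loss below is purely illustrative (Google has never published a real figure, and has more recently said 301s pass full PageRank):

```python
def value_after_hops(hops: int, retained_per_hop: float = 0.9) -> float:
    """Value surviving a redirect chain if each 301 hop retains
    `retained_per_hop` of the incoming value (illustrative assumption)."""
    return retained_per_hop ** hops

# SiteA.com/10 -> /11 -> /13 -> /14 is three hops:
chained = value_after_hops(3)   # compounding model: 0.9 ** 3
direct = value_after_hops(1)    # single-redirect model
print(f"chained: {chained:.3f}, direct: {direct:.1f}")
```

Under that assumption the chained version lands at 72.9% versus 90% for a single redirect straight to the final URL, which is the usual argument for collapsing chains.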
External links point to 403 page - how to 301 redirect if no file extension?
Hi guys, After moving from an old static .htm site to WordPress, I 301'd all old .htm URLs fine in htaccess to the new trailing-slash folder-style /wordpress-urls/, no problem. But Google Webmaster Tools tells me I still have hundreds of external links pointing to a similar version of the old URLs (but without the .htm), giving lots of not-founds and 403s. Example of a URL linked to that 403s: http://www.mydomain.com/filename So I'm wondering, how do I 301 redirect from a non-existing URL that also has no file extension, as above, and is not like a folder? This seems like a lot of possible external link juice to lose. Thanks!
Intermediate & Advanced SEO | emerald
Can changing dynamic url of over 2000 pages site after a year will change its ranking
Hi - I have built a site in Joomla. The URLs are dynamic in nature and, over a year, all pages have become well indexed and backlinks have been built to these dynamic URLs. I need to know: if I hire an agency to change these 2000 pages from dynamic URLs to static URLs, will it also change the search engine ranking positions of the existing URLs? Will all the SEO effort and backlinks built over 15 months still hold, or will this go back to square one due to the change of URLs? Is it advisable to get the URLs changed from dynamic to static, especially when the site is receiving over 75,000 visitors every month? Thanks in advance. Looking for expert suggestions.
Intermediate & Advanced SEO | Modi
Reducing pages with canonical & redirects
We have a site that has a ridiculous number of pages. It's a directory of service providers that is organized by city and sub-category of the vertical. Each provider is on the main city page; then, when you click on a category, it will only show those folks who offer that subcategory of the service. Example: colorado/denver - main city page; colorado/denver/subcat1 - subcategory page. There are 37 subcategories. So, 38 pages that essentially have the same content - minus a provider or two - for each city. There are approx 40K locations in our database. So rough math puts us at 1.5 million results pages, with 97% of those pages being duplicate content! This is clearly a problem. But many of these obscure pages do rank and get traffic - a fair amount when you aggregate all these pages together. We are about to go through a redesign and want to consolidate pages so we can reduce the dupe content, get crawl budget allocated to more meaningful pages, etc. Here's what I'm thinking we should do with this site, and I would love to have your input. Canonicalize: before the redesign, use the canonical tag on all the sub-category pages and push all the value from those pages (colorado/denver/subcat1, /subcat2, /subcat3... etc.) to the main city page (colorado/denver). 301 Redirect: on the new site (we're moving to a new CMS) we don't publish the duplicate sub-category pages and do 301 redirects from the sub-category URLs to the main city page URLs. We'd still have the sub-categories (keywords) on-page and use some JavaScript filtering to narrow results. We could cut to the chase and just do the redirects, but we would like to use canonicalization as a proof of concept internally at my company that getting rid of these pages is a good thing, or at least won't have a negative impact on traffic - i.e. by the time we are ready to relaunch, traffic and value have been transferred to the /state/city page. Trying to create the right plan and build my argument. Any feedback you have will help.
Intermediate & Advanced SEO | trentc
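The canonicalize phase described in that question amounts to one tag in the head of each of the 37 sub-category pages, pointing at the city page. A hypothetical sketch for the Denver example (example.com stands in for the real domain):

```html
<!-- On every colorado/denver/subcatN page: -->
<link rel="canonical" href="https://example.com/colorado/denver" />
```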