301 Redirects on Large Real Estate Website
-
Hi guys,
We are about to move over to a new website and need advice on handling the 301 redirects. We have a large real estate website with around 12,000 pages; about 10,000 of these are properties.
On our old website, the URL structure for each property is as follows:
domainname.com/property/view?property=14863
On our new site, the URL structure is:
domainname.com/properties/view/6137
The property ID number is always different from the old site to the new. The way we see it, we have two options:
a) A manual redirect of each and every property URL. A very, very long job.
b) A folder-level redirect, i.e. redirect the 'property' folder on the old site into the 'properties' folder on the new (roughly sketched below). The con with this one is that we are not sure if it is the best route to take, and if it is, how we would go about it.
Some advice would be really appreciated, guys. I know there are some hyper-intelligent SEOs in here and we need to make sure we handle this right!
Many thanks in advance.
Mark
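To show what I mean by option (b): a minimal sketch, assuming Apache with mod_rewrite. Because the property IDs differ between the two sites, the most a single pattern rule could do is catch every old property URL and send it somewhere generic, such as the new listings index:

    RewriteEngine On
    # Catch the old dynamic property URLs, e.g. /property/view?property=14863,
    # and send them all to the new listings index. The trailing "?" on the
    # target drops the old query string. Note this is NOT a 1-to-1 mapping:
    # every old property lands on the same page.
    RewriteCond %{QUERY_STRING} ^property=\d+$
    RewriteRule ^property/view$ /properties/? [R=301,L]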
-
This is true: you can wait for Google to deindex them, but that can take six months or more.
You could also wait for the 404s to show up, check the referrer, and then manually set up the redirect, but if you miss seeing them, you also risk the linking site removing the link (one way to pull these out of your server logs is sketched below).
Another thing you could do is pull reports from Google Webmaster Tools, Bing Webmaster Tools, and Majestic to discover who is linking to which pages, start with those redirects, then watch for the 404s and pick them up as you discover them.
If you do want to push Google along with removing the old pages, you can do it by requesting removals in Webmaster Tools. 12,000 isn't really that many, and the last time I tried it you could request 1,000 per day, but you have to do them one at a time. That means either a slow manual process or doing it with a macro. I think I've had 20,000 or more deleted that way.
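If you can get at the new server's raw access logs once the old URLs start 404ing, you don't have to spot them by eye. A rough Python sketch, assuming a combined-format access log; the log path and URL patterns are placeholders:

    import re
    from collections import Counter

    # Rough combined-log-format parser; the log path is a placeholder.
    LOG = "/var/log/apache2/access.log"
    line_re = re.compile(
        r'"[A-Z]+ (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ "(?P<ref>[^"]*)"'
    )

    hits = Counter()
    with open(LOG) as f:
        for line in f:
            m = line_re.search(line)
            if not m:
                continue
            # Keep 404s on old property URLs that arrived via an external referrer.
            if (m.group("status") == "404"
                    and m.group("path").startswith("/property/")
                    and m.group("ref") not in ("", "-")
                    and "domainname.com" not in m.group("ref")):
                hits[(m.group("path"), m.group("ref"))] += 1

    # Most-linked dead pages first: these are the redirects to build by hand.
    for (path, ref), n in hits.most_common(50):
        print(n, path, ref)

Anything that surfaces repeatedly with the same external referrer is a redirect worth building by hand.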
-
Hi Mark,
Considering that the old property IDs and new property IDs don't match up and you'd have to configure 1-to-1 redirects (with what sounds like a lot of manual work to get it right and potentially a very large .htaccess file), I'm going to ask a dumb question: why do you need to redirect all of the properties?
In cases like this, I invariably pull some data in to prioritize URLs. Namely, inbound link and direct/referral traffic data.
If a page is not linked to from any external subdomains and gets little or no direct or referral traffic, it's usually best to simply let it return a 404 once you've updated the site - Google will hit the 404 and de-index the page in due time, while the new page will (provided the new site has sound architecture and some authority to justify a deep crawl budget) get picked up.
The only justifiable reason to do a 1-to-1 301 redirect across the board for this many URLs, in my opinion, is if there is enough link equity / traffic to justify the work. Otherwise, Google knows how to handle 404s and they'll crawl/index the new property URLs in due time.
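For illustration, a rough sketch of that prioritization in Python, assuming two hypothetical CSV exports: an inbound-link report (e.g. from Majestic, with a Target URL column) and a landing-page report from your analytics package (Landing Page and Sessions columns). Only URLs that clear either bar earn a hand-built redirect; the rest are left to 404:

    import csv

    # Hypothetical exports -- column names differ by tool; adjust to match.
    LINKS_CSV = "majestic_backlinks.csv"       # one row per inbound link
    TRAFFIC_CSV = "landing_page_sessions.csv"  # one row per landing page
    MIN_SESSIONS = 10                          # traffic bar; tune to taste

    redirect_worthy = set()

    # Any URL with at least one external link earns a redirect.
    with open(LINKS_CSV, newline="") as f:
        for row in csv.DictReader(f):
            redirect_worthy.add(row["Target URL"].strip())

    # So does any URL with meaningful direct/referral traffic.
    with open(TRAFFIC_CSV, newline="") as f:
        for row in csv.DictReader(f):
            if int(row["Sessions"]) >= MIN_SESSIONS:
                redirect_worthy.add(row["Landing Page"].strip())

    # Everything else can 404 and drop out of the index naturally.
    for url in sorted(redirect_worthy):
        print(url)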
Best,
Mike
-
Hey Alan,
Thanks loads for the advice there. Makes a lot of sense.
The problem I have is that we do not have any kind of access to the old site, nor does the client have a good relationship with the agency who made the previous site.
I have run multiple crawls of the old site with Screaming Frog and Moz and I just can't get all the properties spidered. Of the total number of properties, I have about one third, which of course can be redirected.
We made a final change to the URL structure so the property address is added. The URLs now look like the following:
OLD - domainname.com/property/view?property=14863
NEW - domainname.com/property/street-name-postcode/propertyid
The main problem we have, and why I think it is not possible using mod_rewrite, is that the property IDs are different on the two sites. There is really nothing in common between the two URLs at all aside from /property/ and the page title.
Any further advice would be very much appreciated, Alan, as it's clear you have done jobs like this before.
Thanks,
Mark
-
If you have Unix and shell access, it should be a snap.
But as you're asking this question, you probably don't even know what "grep" is.
1. Get a list of titles and URLs from each site.
2. Mix them together.
3. Sort by title; this will tell you if there are duplicates or if you missed any.
4. If the domain names are different, search and replace them so they are the same.
5. Manipulate the list so it is in redirect format (a rough sketch of these steps is below).
12,000 is not a lot; I've worked on sites with several million.
Don't do a folder-level redirect.
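To make the title-matching concrete, a minimal Python sketch, assuming each crawl has been exported to a hypothetical two-column title,url CSV (a Screaming Frog export cut down to two columns would do). It pairs old and new URLs by exact title and emits mod_rewrite rules, flagging anything that doesn't pair cleanly; plain Redirect directives won't work here because the old property ID lives in the query string:

    import csv
    from collections import defaultdict
    from urllib.parse import urlsplit

    def load(path):
        """Map normalized page title -> list of URLs from a title,url CSV."""
        pages = defaultdict(list)
        with open(path, newline="") as f:
            for title, url in csv.reader(f):
                pages[title.strip().lower()].append(url.strip())
        return pages

    old = load("old_site.csv")  # hypothetical crawl-export filenames
    new = load("new_site.csv")

    for title, old_urls in sorted(old.items()):
        new_urls = new.get(title, [])
        if len(old_urls) == 1 and len(new_urls) == 1:
            # Unambiguous 1-to-1 match. The old URLs carry the property ID
            # in a query string, which Apache's plain Redirect directive
            # cannot match, so emit a mod_rewrite pair instead.
            o, n = urlsplit(old_urls[0]), urlsplit(new_urls[0])
            print(f"RewriteCond %{{QUERY_STRING}} ^{o.query}$")
            print(f"RewriteRule ^{o.path.lstrip('/')}$ {n.path}? [R=301,L]")
        else:
            # Duplicate or missing titles need a human decision.
            print(f"# CHECK: {title!r} old={old_urls} new={new_urls}")

The CHECK lines are the point: with 10,000 properties, duplicate titles are likely, and those are the only ones that need hand-matching.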
Related Questions
-
Forced Redirects/HTTP<>HTTPS 301 Question
Hi All, sorry for what's about to be a long-ish question, but tl;dr: has anyone else had experience with a server-level 301 redirect between the HTTP and HTTPS versions of a site in order to maintain accurate social media share counts? This is new to me and I'm wondering how common it is. I'm having issues with this forced HTTP/HTTPS redirect as outlined below and am struggling to find any information that will help me troubleshoot it or better understand the situation. If anyone has any recommendations for things to try or sources to read up on, I'd appreciate it. I'm especially concerned about any issues this may be causing at the SEO level, and the known unknowns.
A magazine I work for recently relaunched after switching platforms from Atavist to Newspack (which runs on WordPress). Since then, we've been having some issues with 301s, but they relate to new stories that are native to our new platform/CMS and have had zero URL changes. We've always used HTTPS. Basically, any post we make linking to the new site, including these new (non-migrated) pages, previews on Facebook with "301" as the title and no image. This also overrides the social media metadata we set through Yoast Premium. I ran some of the links through the Facebook debugger and it appears that Facebook is reading these links to our site (using HTTPS) as redirects to HTTP that then redirect back to HTTPS.
I was told by our tech support person on Newspack's team that this is intentional, so that Facebook will maintain a single share count rather than separate counts for HTTP and HTTPS; however, this forced redirect seems to be failing us if we can't post our links with any metadata. (The only reliable fix is adding a query parameter to each URL, which, obviously, still gives us inaccurate share counts.) This is the first time I've encountered this intentional redirect, and I've asked a few times for more information about how it's set up just for my own edification, but all I can get is that it's managed at the server level and is designed to prevent separate share counts for HTTP and HTTPS.
Has anyone encountered this method before, and can anyone either explain it to me or point me toward a resource where I can learn how it's configured, as well as the pros and cons? I'm especially concerned about how this may impact the way search engines read our site. So far, nothing's come up on scans, but I'd like to stay one step ahead of this. Thanks in advance!
-
Robots.txt on pages with a 301 redirect
We currently have a series of help pages that we would like to disallow in our robots.txt. The thing is that these help pages are located on our old website, which now has a 301 redirect to the current site. What is the proper way to go about this? 1) Add the pages we want to disallow to the robots.txt of the new website? 2) Break the redirect momentarily and add the pages to the robots.txt of the old one? Thanks
-
Duplicate Page Title for a Large Listing Website
My company has a popular website that has over 4,000 crawl errors showing in Moz, most of them coming up as Duplicate Page Title. These duplicate page titles come from pages whose title is the main keyword followed by a location, such as "main keyword" North Carolina, "main keyword" Texas, and so forth. These pages are ranked and get a lot of traffic. I was wondering what the best solution is for resolving these types of crawl errors without it affecting our rankings. Thanks!
-
301 redirect blog posts from old URL to new one
I moved a WordPress blog from domain.com to domain.com/blog. I want to redirect the links in Google from the old domain.com to the new one, but I also want to put a new site/application at domain.com, so I'm thinking an .htaccess 301 redirect at the root wouldn't work. Any tips?
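One common approach to the situation described above is to scope the redirect rules to the old blog's URL patterns instead of redirecting the whole root. A minimal .htaccess sketch, assuming Apache with mod_rewrite and the default WordPress /year/month/slug permalinks (both are assumptions; adjust the patterns to the old blog's actual structure):

    RewriteEngine On
    # Redirect only URLs that match the old blog's permalink patterns,
    # leaving the rest of the root domain free for the new application.
    RewriteRule ^(\d{4}/\d{2}/[^/]+/?)$ /blog/$1 [R=301,L]
    RewriteRule ^((?:category|tag|author)/.+)$ /blog/$1 [R=301,L]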
-
Is 301 redirecting all old URLs after a new site redesign to the root domain bad for SEO?
After a new site redesign, would it hinder our rankings if we 301 redirected all old URLs that are returning 404 error codes to the root domain (home page)? Would this be a good temporary solution until we are able to redirect the pages to the appropriate corresponding pages? Thanks so much!
-
Will my microsite rank better if I 301 redirect it to my main site?
This is my first time asking, so I will try to be as clear as possible. OK, I have a microsite that is an exact-match domain; the domain is 3-4 years old and ranks very well for several search terms. The main two terms it ranks for are like this:
houses for rent in XXXXX
XXXXX homes for rent
(XXXXX equals a city name.) The issue is this site has no backlinks and zero advanced SEO; I only did basic optimization when I set the site up. Even the site structure and URL structure are not good. The only page I have ever even seen rank is the main root URL. But with all that, the site does really well in the top 1-2 results for key search terms.
Now, I have a main site, a very big site that has steadily been climbing for search terms every month, with great backlinks, optimized for the city and all. It currently ranks on the second page for the search terms listed above. What I want to do is 301 redirect this microsite to the city page on my main site that is much better optimized for the key city terms. The 301 redirect would point this "root domain" (mymicrosite.com) to my city page, which looks like this: www.mymaindomain.com/city/XXXXXXX
If I do this, will Google rank my main site's city page as well as it ranks this microsite with zero links, SEO, etc.? What happens if it does not? Will I be able to turn off the 301 redirect and keep the microsite rankings? My main reason for wanting this is I want this city page to rank well and I only want to optimize one site instead of both. Any help would be great!
-
Trailing Slashes in URL: Use Canonical URL or 301 Redirect?
I was thinking of using 301 redirects from trailing slashes to no trailing slashes for my URLs. E.g., www.url.com/page1/ 301 redirects to www.url.com/page1. I've already got a redirect for non-www to www. Just wondering, in my case would it be best to continue using .htaccess for the trailing-slash redirect or just go with canonical URLs?
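For reference, a minimal .htaccess sketch of the trailing-slash redirect described above, assuming Apache with mod_rewrite (the directory check keeps real folders, which need their slash, from redirecting):

    RewriteEngine On
    # Strip a trailing slash from anything that isn't a real directory,
    # e.g. /page1/ -> /page1, with a single 301.
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule ^(.+)/$ /$1 [R=301,L]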
-
Will 301 redirecting a site multiple times still preserve the original site value?
Hi, All! If site www.abc.com was already 301 redirected to site www.def.com, and now the site owner wants to redirect www.def.com to www.ghi.com - is there any concern that it's not going to work, and some of the original linkjuice, rank, trust, etc. is going to vanish? Or as long as the 301s are set up right, should you be able to 301 indefinitely? Does anyone have any experience with actually doing this and seeing good/bad/neutral results? Thanks in advance! -Aviva B