Best way to get SEO-friendly URLs on a huge old website
-
Hi folks
Hope someone may be able to help with this conundrum:
A client site runs on old tech (IIS6) and has circa 300,000 pages indexed in Google. Most pages are dynamic with a horrible URL structure such as
http://www.domain.com/search/results.aspx?ida=19191&idb=56&idc=2888
and I have been trying to implement rewrites + redirects to get clean URLs and remove some of the duplication that exists, using the IIRF ISAPI filter: http://iirf.codeplex.com/
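To give a flavour, each per-URL mapping I've been writing looks roughly like this (a hand-written sketch rather than my exact config: the clean path is invented, and the exact directive syntax should be checked against the IIRF docs, since IIRF matches the query string via a RewriteCond on %{QUERY_STRING} rather than in the URL pattern):

```apache
# Sketch only - check the IIRF documentation for exact directive syntax.
RewriteEngine ON

# 301 the legacy dynamic URL to an invented clean path
RewriteCond %{QUERY_STRING} ^ida=19191&idb=56&idc=2888$
RedirectRule ^/search/results\.aspx$ /products/example-product [R=301]

# Internally map the clean path back to the real dynamic page
RewriteRule ^/products/example-product$ /search/results.aspx?ida=19191&idb=56&idc=2888 [L]
```

Multiply that pair of rules by hundreds of thousands of URLs and you can see where the config volume comes from.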
I managed to get a large sample of URLs rewriting and redirecting (on a staging version of the site), but the site then slows to a crawl. Implementing all URLs would mean 10x the volume of config. I am starting to wonder if there is a better way:
- Upgrade to Win 2008 / IIS 7 and use the better URL rewrite functionality included?
- Rebuild the site entirely (preferably on PHP with a decent URL structure)
- Accept that the URLs can't be made friendly on a site this size and focus on other aspects
- Persevere with the IIRF filter config, and hope that the config loads into memory and the site runs at a reasonable speed when live
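On option 1, part of the appeal is that the IIS 7 URL Rewrite module supports regex back-references in web.config, so a single pair of rules could in principle cover every ida/idb/idc combination instead of needing a rule per URL. A rough sketch (the rule names and the clean URL scheme are invented for illustration):

```xml
<system.webServer>
  <rewrite>
    <rules>
      <!-- Serve the real dynamic page behind an invented clean URL scheme -->
      <rule name="CleanProductUrl" stopProcessing="true">
        <match url="^products/([0-9]+)/([0-9]+)/([0-9]+)$" />
        <action type="Rewrite"
                url="search/results.aspx?ida={R:1}&amp;idb={R:2}&amp;idc={R:3}" />
      </rule>
      <!-- 301 the legacy dynamic URLs to that clean scheme -->
      <rule name="RedirectLegacyUrl" stopProcessing="true">
        <match url="^search/results\.aspx$" />
        <conditions>
          <add input="{QUERY_STRING}"
               pattern="^ida=([0-9]+)&amp;idb=([0-9]+)&amp;idc=([0-9]+)$" />
        </conditions>
        <action type="Redirect" url="products/{C:1}/{C:2}/{C:3}"
                appendQueryString="false" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

That would sidestep the config-volume problem entirely, though it obviously hinges on the server upgrade.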
None of the options are great, as they either involve lots of work/cost or they mean keeping a site which performs well but, with its poor URLs, could do so much better.
Any thoughts from the great minds in the SEOmoz community appreciated!
Cheers
Simon
-
Many thanks Ben - and sorry for the slow response!
I'm now planning on doing a simple hand-coded rewrite for some key terms, and monitoring the results/impact. Good call re: a slow site being much worse than ugly URLs - totally agree on that. A migration is inevitable; it's a case of 'when', not 'if' (the CMS is bespoke and ageing), and I'm hoping rewrites/redirects on some of the higher-traffic pages may help reduce the hit when the migration happens.
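In case it helps anyone else: rather than hand-editing every rule, I'm planning to generate the rule lines from a spreadsheet of old-URL-to-clean-slug mappings. A Python sketch of the idea (the mapping data and slugs are made up, and the exact IIRF RedirectRule/query-string syntax should be double-checked against the IIRF docs):

```python
# Sketch: generate IIRF redirect rules from legacy query-string parameters
# mapped to clean slugs. The mappings and slugs below are hypothetical, and
# the rule template is an assumption - verify it against the IIRF docs.

def redirect_rules(ida, idb, idc, slug):
    """Return the IIRF lines that 301 one legacy results.aspx URL to a clean slug."""
    cond = "RewriteCond %%{QUERY_STRING} ^ida=%d&idb=%d&idc=%d$" % (ida, idb, idc)
    rule = "RedirectRule ^/search/results\\.aspx$ /%s [R=301]" % slug
    return cond + "\n" + rule

# Hypothetical mappings, e.g. loaded from a CSV export of the key pages
mappings = [
    (19191, 56, 2888, "widgets/blue-widget"),
    (19192, 56, 2889, "widgets/red-widget"),
]

config = "\n".join(redirect_rules(*m) for m in mappings)
print(config)
```

At least then extending the batch is a spreadsheet edit rather than more hand-typed regex.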
Cheers
Simon
-
I am going to be watching the responses here because determining the value of different ranking factors seems so subjective. We all know what elements are important, but determining the level of priority in the whole scheme of things is a judgement call based on the bottom line, skill sets, and a company's ability to invest the time and resources.
From the sounds of it, you aren't only dealing with hours of billable time but also with the possibility of losing sales because of the slowdown that would take place while making the changes. I would say a slower site would have a much more drastic effect than ugly URLs. I would also say that pages with ugly URLs still do OK in search as long as there is good site architecture, quality, and unique content. That is what I would concentrate on under your current system. Then I would probably look at weighing the options of moving CMS. That isn't easy either: migrations always take a hit on rankings, visitor loyalty, and page authority. You will probably come out much stronger, but it would be an investment. (Experienced first hand.)
Just my 2 cents.