Best way to get SEO-friendly URLs on a huge old website
-
Hi folks
Hope someone may be able to help with this conundrum:
A client site runs on old tech (IIS6) and has circa 300,000 pages indexed in Google. Most pages are dynamic with a horrible URL structure such as
http://www.domain.com/search/results.aspx?ida=19191&idb=56&idc=2888
and I have been trying to implement rewrites + redirects to get clean URLs and remove some of the duplication that exists, using the IIRF Isapi filter: http://iirf.codeplex.com/
I managed to get a large sample of URLs rewriting and redirecting (on a staging version of the site), but the site then slows to a crawl. To implement all URLs would take roughly 10x the volume of config. I am starting to wonder if there is a better way:
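For reference, the kind of generic pattern rules I've been experimenting with look something like this (IIRF's mod_rewrite-style syntax; the clean URL scheme here is just an example, and flags/directives should be checked against your IIRF version):

```ini
# IIRF.ini sketch -- patterns are illustrative, not production rules

# 301 the old dynamic URL to a clean equivalent (IIRF includes the query
# string in the URI it tests, so it can be matched in the pattern)
RedirectRule ^/search/results\.aspx\?ida=(\d+)&idb=(\d+)&idc=(\d+)$ /products/$1/$2/$3 [R=301]

# Internally map the clean URL back onto the real dynamic page
# (watch for redirect loops -- IIRF can re-apply rules to the rewritten
# URL, so rule order and the iteration limit matter)
RewriteRule ^/products/(\d+)/(\d+)/(\d+)$ /search/results.aspx?ida=$1&idb=$2&idc=$3 [I,L]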
- Upgrade to Windows Server 2008 / IIS 7 and use the built-in URL Rewrite functionality, which is much better?
- Rebuild the site entirely (preferably on PHP with a decent URL structure)
- Accept that the URLs can't be made friendly on a site this size and focus on other aspects
- Persevere with the IIRF filter config, and hope that the config loads into memory and the site runs at a reasonable speed when live
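On the first option: from what I've read, the IIS 7 URL Rewrite module would let a single pattern rule in web.config cover the whole lot, rather than per-URL entries. Something like this (the pattern and target path are illustrative, not from the actual site):

```xml
<!-- web.config sketch for the IIS 7 URL Rewrite module -->
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="Friendly product URLs" stopProcessing="true">
          <match url="^products/([0-9]+)/([0-9]+)/([0-9]+)$" />
          <action type="Rewrite"
                  url="search/results.aspx?ida={R:1}&amp;idb={R:2}&amp;idc={R:3}" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```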
None of the options are great: they either involve a lot of work/cost, or they mean keeping a site which performs well but could do so much better, with poor URLs.
Any thoughts from the great minds in the SEOmoz community appreciated!
Cheers
Simon
-
Many thanks Ben - and sorry for the slow response!
I'm now planning on doing a simple hand-coded rewrite for some key terms, and monitoring the results/impact. Good call re: a slow site being much worse than ugly URLs - totally agree on that. A migration is inevitable, it's a case of 'when', not 'if' (the CMS is bespoke and ageing), and I'm hoping rewrites/redirects on some of the higher-traffic pages may help reduce the hit when the migration happens.
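In case it helps anyone else, I'm not planning to type those one-off rules out by hand - the idea is to generate them from a CMS export of the key pages. A rough sketch (the (ida, idb, idc, slug) layout and the slugs themselves are hypothetical, so adapt to your own export format):

```python
# Generate one IIRF RedirectRule per key page from a CMS export.
# Column layout and rule syntax are illustrative -- adjust to match
# your own data and IIRF version.

def make_rules(pages):
    rules = []
    for ida, idb, idc, slug in pages:
        # Escape the literal dot and question mark in the old URL pattern
        pattern = r"^/search/results\.aspx\?ida=%s&idb=%s&idc=%s$" % (ida, idb, idc)
        rules.append("RedirectRule %s /%s [R=301]" % (pattern, slug))
    return "\n".join(rules)

if __name__ == "__main__":
    # One sample page mapped to a hypothetical clean slug
    print(make_rules([(19191, 56, 2888, "blue-widgets")]))
```

That keeps the config itself small and regenerable whenever the list of key pages changes.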
Cheers
Simon
-
I am going to be watching the responses here, because determining the value of different ranking factors seems so subjective. We all know which elements are important, but determining their level of priority in the whole scheme of things is a judgement call based on the bottom line, skill sets, and a company's ability to invest the time and resources.
From the sounds of it, you aren't only dealing with hours of billable time but also with the possibility of losing sales because of the slowdown that would take place while making the changes. I would say a slower site would have a much more drastic effect than ugly URLs. I would also say that pages with ugly URLs still do OK in search as long as there is good site architecture, quality, and unique content. That is what I would concentrate on under your current system. Then I would probably look at weighing the options of moving CMS. That isn't easy either: migrations always take a hit to rankings, visitor loyalty, and page authority. You will probably come out much stronger, but it would be an investment. (I've experienced this first-hand.)
Just my 2 cents.