Best way to get SEO-friendly URLs on a huge old website
-
Hi folks
Hope someone may be able to help with this conundrum:
A client site runs on old tech (IIS6) and has circa 300,000 pages indexed in Google. Most pages are dynamic with a horrible URL structure such as
http://www.domain.com/search/results.aspx?ida=19191&idb=56&idc=2888
and I have been trying to implement rewrites + redirects to get clean URLs and remove some of the duplication that exists, using the IIRF ISAPI filter: http://iirf.codeplex.com/
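To give a flavour, a rewrite + redirect pair in the IIRF ini file looks roughly like this - the clean /product/ URL shape below is just an illustration, not our real structure:

    # 301 the old dynamic URL to a clean form (IIRF patterns match the query string too)
    RedirectRule ^/search/results\.aspx\?ida=(\d+)&idb=(\d+)&idc=(\d+)$ /product/$1/$2/$3 [R=301]
    # Serve the clean form from the real handler, and stop processing this request
    RewriteRule ^/product/(\d+)/(\d+)/(\d+)$ /search/results.aspx?ida=$1&idb=$2&idc=$3 [L]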
I managed to get a large sample of URLs rewriting and redirecting (on a staging version of the site), but the site then slows to a crawl. Implementing all URLs would mean 10x the volume of config. I am starting to wonder if there is a better way:
- Upgrade to Win 2008 / IIS 7 and use the better URL rewrite functionality included (see the web.config sketch after this list)?
- Rebuild the site entirely (preferably on PHP with a decent URL structure)
- Accept that the URLs can't be made friendly on a site this size and focus on other aspects
- Persevere with the IIRF filter config, and hope that the config loads into memory and the site runs at a reasonable speed when live
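In case it helps anyone weighing up the first option: the IIS 7 URL Rewrite equivalent of the IIRF pair above lives in web.config, under <system.webServer>, and would look something like this (the /product/ shape is again just an illustration):

    <rewrite>
      <rules>
        <!-- 301 the legacy dynamic URL to the clean form -->
        <rule name="Redirect legacy search URLs" stopProcessing="true">
          <match url="^search/results\.aspx$" />
          <conditions>
            <add input="{QUERY_STRING}" pattern="^ida=(\d+)&amp;idb=(\d+)&amp;idc=(\d+)$" />
          </conditions>
          <action type="Redirect" url="product/{C:1}/{C:2}/{C:3}" appendQueryString="false" redirectType="Permanent" />
        </rule>
        <!-- Serve the clean form from the existing handler -->
        <rule name="Rewrite clean product URLs" stopProcessing="true">
          <match url="^product/(\d+)/(\d+)/(\d+)$" />
          <action type="Rewrite" url="search/results.aspx?ida={R:1}&amp;idb={R:2}&amp;idc={R:3}" />
        </rule>
      </rules>
    </rewrite>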
None of the options are great, as they either involve lots of work/cost, or they involve keeping a site which performs well but could do so much better, with poor URLs.
Any thoughts from the great minds in the SEOmoz community appreciated!
Cheers
Simon
-
Many thanks Ben - and sorry for the slow response!
I'm now planning on doing a simple hand-coded rewrite for some key terms, and monitoring the results/impact. Good call re: a slow site being much worse than ugly URLs - totally agree on that. A migration is inevitable; it's a case of 'when', not 'if' (the CMS is bespoke and ageing), and I'm hoping rewrites/redirects on some of the higher-traffic pages may help reduce the hit when the migration happens.
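For anyone following along, the hand-coded version is just a short, explicit rule list for the top pages - something like the below in IIRF, with the clean slug invented for illustration:

    # One pair per key page: 301 the dynamic URL, then serve the clean slug internally
    RedirectRule ^/search/results\.aspx\?ida=19191&idb=56&idc=2888$ /blue-widgets [R=301]
    RewriteRule ^/blue-widgets$ /search/results.aspx?ida=19191&idb=56&idc=2888 [L]

A few dozen of these should stay fast, unlike the full 300,000-URL config.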
Cheers
Simon
-
I am going to be watching the responses here because determining the value of different ranking factors seems so subjective. We all know what elements are important. But determining the level of priority in the whole scheme of things is a judgement call based on the bottom line, skill sets, and a company's ability to invest the time and resources.
From the sounds of it, you aren't only dealing with hours of billable time but also with the possibility of losing sales because of the bloat that would take place while making the changes. I would say a slower site would have a much more drastic effect than ugly URLs. I would also say that pages with ugly URLs still do OK in search as long as there is good site architecture and quality, unique content. That is what I would concentrate on under your current system. Then I would probably look at weighing the options of moving CMS. That isn't easy either. Migrations always take a hit on rankings, visitor loyalty, and page authority. You will probably come out much stronger, but it would be an investment. (Experienced first-hand.)
Just my 2 cents.
-
Related Questions
-
New SEO manager needs help! Currently only about 15% of our live sitemap (~4 million URL e-commerce site) is actually indexed in Google. What are best practices for sitemaps on big sites with a lot of changing content?
In Google Search Console, 4,218,017 URLs are submitted and 402,035 URLs indexed. What is the best way to troubleshoot? What is the best guidance for sitemap indexation of large sites with a lot of changing content?
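For context on the scale involved: the sitemaps.org protocol caps each sitemap file at 50,000 URLs, so a ~4 million URL site needs 80+ child sitemaps referenced from a sitemap index, roughly like this (file names and dates are illustrative):

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>http://www.example.com/sitemaps/products-0001.xml.gz</loc>
        <lastmod>2017-01-15</lastmod>
      </sitemap>
      <sitemap>
        <loc>http://www.example.com/sitemaps/products-0002.xml.gz</loc>
        <lastmod>2017-01-15</lastmod>
      </sitemap>
      <!-- ...one <sitemap> entry per 50,000-URL chunk... -->
    </sitemapindex>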
-
Some of my website URLs are not getting indexed when checking (site: domain) in Google
-
Google Crawling Issues! How Can I Get Google to Crawl My Website Regularly?
Hi Everyone! My website is not being crawled regularly by Google - there are weeks when it's regular, but for the past month or so it has gone without being crawled for seven to eight days at a time. There are some specific pages that I want to get ranked, but of late they are not being crawled AT ALL unless I use the 'Fetch as Google' tool! That's not normal, right? I have checked and re-checked the on-page metrics for these pages (and the website as a whole), and backlinking is a regular and ongoing process as well! The sitemap is in place too, and I have resubmitted it once! This issue is detrimental to website traffic and rankings! Would really appreciate insights from you guys! Thanks a lot!
-
What is the optimum Schema for a website, and how important is it really for SEO?
Hey everyone,
As you have all probably seen, Google has changed its structured data testing tool and, alongside that, has also changed what is considered valid and not valid. I have been struggling with this question for quite a while now, and opinions are really split.
1. How important is Schema for SEO?
2. How far should you go with Schema on your website pages?
On the one hand, I can see how it can be easier for a bot to read code that has proper "road signs" (our Schema markup); on the other hand, Google is already extremely clever at working out what is the header, sidebar or footer, as well as a review and/or a blog post (especially for those of us who use Wordpress). Would love to know if someone has seen a like-for-like showcase with Schema, and/or has some factual information regarding what should or shouldn't be done when it comes to Schema.
Dan.
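For anyone weighing the same question, the lowest-effort way to experiment is a JSON-LD block in the page head, checked against the testing tool - a minimal sketch for a blog post, with placeholder values:

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "BlogPosting",
      "headline": "Example post title",
      "author": { "@type": "Person", "name": "Author Name" },
      "datePublished": "2016-02-01"
    }
    </script>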
-
What is the best way to handle links that lead to a 404 page
Hi Team Moz, I am working through a site cutover with an entirely new URL structure and have a bunch of pages that could not, would not, or just plain don't redirect to new pages. Steps I have taken: submitted multiple new sitemaps with the new URLs (the indexing looks solid); used Webmaster Tools to remove URLs with natural result listings that did not redirect; completely built out new PPC campaigns with the new URL structures; contacted a few major link partners. Now here is my question: I have pages that produce 404s and are linked to in forums, Slickdeals and the like, and they will not be redirected. Is disavowing these links the correct thing to do?
-
International SEO - Canada
Our organization is currently only operating in the USA, but we will soon be entering the Canadian market. We did a lot of research and decided that for our needs it would be best to use a subfolder for Canada. Initially we will be targeting the English-speaking community, but eventually we will want to expand to French-speaking Canadians as well. The question is: is there a preferred version when setting up the subfolders?
www.website.org/ca/ (default English) and www.website.org/ca/fr/ (French)
www.website.org/en-ca/ (English) and www.website.org/fr-ca/ (French)
www.website.org/ca/en/ (English) and www.website.org/ca/fr/ (French)
Thanks
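Whichever folder scheme is chosen, the language/region variants are usually tied together with hreflang annotations in the head of every page - the hreflang approach isn't from the question itself, and the URLs are illustrative, but for the /en-ca/ + /fr-ca/ option it would look roughly like:

    <link rel="alternate" hreflang="en-us" href="http://www.website.org/" />
    <link rel="alternate" hreflang="en-ca" href="http://www.website.org/en-ca/" />
    <link rel="alternate" hreflang="fr-ca" href="http://www.website.org/fr-ca/" />
    <link rel="alternate" hreflang="x-default" href="http://www.website.org/" />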
-
Website stability and its effect on SEO
What is the best way to combat previous website stability issues? We had page load time and site stability problems over the course of several months, and as a result our keyword rankings plummeted. Now that the issues have been resolved, what's the best/quickest way to regain our rankings for specific keywords? Thanks, Eric