Best way to get SEO-friendly URLs on a huge old website
-
Hi folks
Hope someone may be able to help with this conundrum:
A client site runs on old tech (IIS6) and has circa 300,000 pages indexed in Google. Most pages are dynamic with a horrible URL structure such as
http://www.domain.com/search/results.aspx?ida=19191&idb=56&idc=2888
and I have been trying to implement rewrites and redirects to get clean URLs and remove some of the duplication that exists, using the IIRF ISAPI filter: http://iirf.codeplex.com/
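To give a flavour, a single mapping in the IIRF ini looks roughly like the sketch below - the clean /widgets/ path is invented for illustration (only the ida/idb/idc parameters come from the real URL above), and the exact directive and flag syntax should be checked against the IIRF docs:

    # Serve the clean URL by rewriting it internally to the real dynamic page.
    # Illustrative only - path segments and flags need checking against the IIRF docs.
    RewriteEngine ON
    RewriteRule ^/widgets/([0-9]+)/([0-9]+)/([0-9]+)$ /search/results.aspx?ida=$1&idb=$2&idc=$3 [I,L]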
I managed to get a large sample of URLs rewriting and redirecting (on a staging version of the site), but the site then slows to a crawl. To implement all URLs would take 10x the volume of config. I am starting to wonder if there is a better way:
- Upgrade to Win 2008 / IIS 7 and use the better URL rewrite functionality included?
- Rebuild the site entirely (preferably on PHP with a decent URL structure)
- Accept that the URLs can't be made friendly on a site this size and focus on other aspects
- Persevere with the IIRF filter config, and hope that the config loads into memory and the site runs at a reasonable speed when live
None of the options are great: they either involve lots of work and cost, or they mean keeping a site which performs well but could do so much better, with poor URLs.
Any thoughts from the great minds in the SEOmoz community appreciated!
Cheers
Simon
-
Many thanks Ben - and sorry for the slow response!
I'm now planning on doing a simple hand-coded rewrite for some key terms and monitoring the results/impact. Good call re: a slow site being much worse than ugly URLs - totally agree on that. A migration is inevitable - it's a case of 'when', not 'if' (the CMS is bespoke and ageing) - and I'm hoping rewrites/redirects on some of the higher-traffic pages may help reduce the hit when the migration happens.
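For anyone following along, the hand-coded rules will just be one-to-one 301s for a handful of the highest-traffic pages, something along these lines (the target paths and the second set of IDs are made up for illustration, and the RedirectRule syntax should be double-checked against the IIRF docs for the version in use):

    # One-off 301s for a few high-traffic dynamic pages only - illustrative targets.
    RedirectRule ^/search/results\.aspx\?ida=19191&idb=56&idc=2888$ /widgets/blue-widgets/ [R=301]
    RedirectRule ^/search/results\.aspx\?ida=20310&idb=12&idc=104$ /widgets/red-widgets/ [R=301]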
Cheers
Simon
-
I am going to be watching the responses here because determining the value of different ranking factors seems so subjective. We all know what elements are important. But determining the level of priority in the whole scheme of things is a judgement call based on the bottom line, skill sets, and a company's ability to invest the time and resources.
From the sounds of it, you aren't only dealing with hours of billable time but also with the possibility of losing sales because of the slow-down that would take place while making the changes. I would say a slower site would have a much more drastic effect than ugly URLs. I would also say that pages with ugly URLs still do OK in search as long as there is good site architecture, quality, and unique content. That is what I would concentrate on under your current system. Then I would probably look at weighing the options of moving CMS. That isn't easy either. Migrations always take a hit on rankings, visitor loyalty, and page authority. You will probably come out much stronger, but it would be an investment (experienced first hand).
Just my 2 cents.
-
Related Questions
-
SEO Content Audit Questions (removing pages from the website, extracting data, organizing data)
Hi everyone! I have a few questions - we are running an SEO content audit on our entire website and I am wondering about the best FREE way to extract a list of all indexed pages. Would I need to use a mix of Google Analytics, Webmaster Tools, AND our XML sitemap, or could I just use Webmaster Tools to pull the full list? Just want to make sure I am not missing anything. As well, once the data is pulled and organized (helpful to know the best way to pull detailed info about the pages as well!), I am wondering if it would be a best practice to sort by highly trafficked pages in order to rank them for prioritization (i.e. pages with the most visits will be edited and optimized first). Lastly, I am wondering what constitutes a 'removable' page. For example, when is it appropriate to fully remove a page from our website? I understand that it is best, if you need to remove a page, to redirect the person to another similar page OR the homepage. Is this the best practice? Thank you for the help! If you say it is best to organize by trafficked pages first in order to optimize them - I am wondering if it would be an easier process to use Moz tools like Keyword Explorer, Page Optimization, and Page Authority to rank pages and find ways to optimize them for the most relevant keywords. Let me know if this option makes MORE sense than going through the entire data extraction process.
Technical SEO | PowerhouseMarketing
-
What will the SEO and conversion implications of an ecommerce website not moving to https be?
Client has an ecommerce website on Adobe Business Catalyst and they are currently not able to move websites onto https. They have announced this won't be available until late June. What are the expected implications of this in terms of site visibility and conversions etc?
Technical SEO | Kerry_Jones
-
One-Pager and SEO
We're building a page that is going to feature over 31 people as difference makers in their field. We're unveiling one a day for an entire month. The very early mockup of the page has name, pic, some bio info, and a link to open up a new window with the full bio. I would love to have all of the bio content for all of the people on the page (and indexable), but I'm not sure how to do that while still being able to hide the full bios until they are expanded. Anybody have any tips that are SEO-friendly and/or examples of a page that is built like this and ranks well? Thanks!
Technical SEO | spackle
-
AJAX and SEO
Hello team, Need to bounce a question off the group. We have a site that uses the .NET AJAX toolkit to toggle tabs on a page. Each tab has content and the content is drawn on page load. In other words, the content is not from an AJAX call; it is there from the start. The content sits in DIV tags which the JavaScript toggles - that's all. My customer hired an "SEO Expert" who is telling them that this content is invisible to search engines. I strongly disagree and we're trying to come to a conclusion. I understand that content rendered asynchronously via an AJAX call would not be spidered; however, just using AJAX (JavaScript) to switch tabs will not stop spiders from finding the content in the markup. Any thoughts?
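For clarity, the pattern is roughly the sketch below (simplified, not the actual Toolkit output) - both panels are already present in the HTML the server sends, and the script only flips their visibility:

    <!-- Both tab panels are in the server-rendered markup. -->
    <div id="tabs">
      <a href="#" onclick="showTab('tab1'); return false;">Overview</a>
      <a href="#" onclick="showTab('tab2'); return false;">Details</a>
    </div>
    <div id="tab1">Full overview copy lives here in the markup...</div>
    <div id="tab2" style="display:none">Full details copy also lives here...</div>
    <script>
      // Hide both panels, then reveal the one that was clicked.
      function showTab(id) {
        document.getElementById('tab1').style.display = 'none';
        document.getElementById('tab2').style.display = 'none';
        document.getElementById(id).style.display = 'block';
      }
    </script>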
Technical SEO | ChrisInColorado
-
Redirects - How Best to Do This?
Hi, I am looking to close Website A, which has many pages. I would like to keep the home page and add some great content to it with a link pointing to Website B. As for all the other pages excluding the home page, how is it best to approach them on Website A? Should I redirect them all to the home page of Website A, which will tell Google those pages are no longer needed and prevent visitors from seeing a 404? My main aim here is to not lose any of the visitors to Website A by sending them to Website B, but also to hopefully pass any page strength from Website A to Website B. Thanks, Adam
Technical SEO | AMG100
-
Explain the SEO impact when a website has more internal links compared to fewer internal links
A website that I am working on has more than 200 internal links (it's because of the design and the various kinds of services that we offer). I want to know its SEO impact. I also want to know the SEO impact when a website has fewer internal links compared to more internal links.
Technical SEO | BoniSatani
-
All in One SEO weirdness
For some reason, I'm getting extra words in my title tags. For example, I wrote "Washing Machine Widgets | Acme Widgets, Inc." but the title comes out as "Washing Machine Widgets | Acme Widgets, Inc. | Acme Widgets Inc." Anyone have any idea why I'm getting the extra " | Acme Widgets Inc."? Thanks!
Technical SEO | PGD2011
-
SEO Tomfoolery
Oh Hai, I recently changed the permalink structure on my WordPress-based site, southwestbreaks.co.uk, from the standard ?p=123 to the more SEO-chummy /%postname%/. As a result, my site has completely dropped off the board for all my previously well-ranked search phrases. Having since gotten into SEOmoz a bit more, I can see there are WP plugins available that apparently would've done this a lot more smoothly. I'd be most grateful if someone could explain whether this drop-off is just temporary, or have I somehow entered Google's shun book? The site has been like this for about 48 hours. Thanks, Tim
Technical SEO | Southwesttim