Best way to get SEO-friendly URLs on a huge old website
-
Hi folks
Hope someone may be able to help with this conundrum:
A client site runs on old tech (IIS6) and has circa 300,000 pages indexed in Google. Most pages are dynamic with a horrible URL structure, such as:
http://www.domain.com/search/results.aspx?ida=19191&idb=56&idc=2888
and I have been trying to implement rewrites + redirects to get clean URLs and remove some of the duplication that exists, using the IIRF ISAPI filter: http://iirf.codeplex.com/
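To illustrate, a pair of IIRF rules for one URL pattern might look roughly like this (the /items/... clean-URL scheme is a made-up example, and the exact flag/matching semantics should be checked against the IIRF docs):

```
# 301 the old dynamic URL to a clean one
RedirectRule ^/search/results\.aspx\?ida=(\d+)&idb=(\d+)&idc=(\d+)$ /items/$1/$2/$3 [R=301]
# Internally map the clean URL back to the real handler
RewriteRule ^/items/(\d+)/(\d+)/(\d+)$ /search/results.aspx?ida=$1&idb=$2&idc=$3 [L]
```

A handful of pattern-based rules like these can cover whole classes of URLs, whereas one rule per URL is what blows up the config size.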
I managed to get a large sample of URLs rewriting and redirecting (on a staging version of the site), but the site then slows to a crawl. To implement all URLs would take 10x the volume of config. I am starting to wonder if there is a better way:
- Upgrade to Win 2008 / IIS 7 and use the better URL rewrite functionality included?
- Rebuild the site entirely (preferably on PHP with a decent URL structure)
- Accept that the URLS can't be made friendly on a site this size and focus on other aspects
- Persevere with the IIRF filter config, and hope that the config loads into memory and the site runs at a reasonable speed when live
None of the options are great, as they either involve lots of work/cost, or they mean keeping a site which performs well but could do so much better, with poor URLs.
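For context on option 1, the equivalent under IIS 7's URL Rewrite module is declarative XML in web.config, roughly like the sketch below (the rule name and /items/... URL scheme are assumptions, not the real site structure):

```xml
<system.webServer>
  <rewrite>
    <rules>
      <!-- Map clean URLs back to the dynamic handler internally -->
      <rule name="CleanItemUrls" stopProcessing="true">
        <match url="^items/(\d+)/(\d+)/(\d+)$" />
        <action type="Rewrite" url="search/results.aspx?ida={R:1}&amp;idb={R:2}&amp;idc={R:3}" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

The module compiles these rules natively inside IIS, which is part of the appeal versus a third-party ISAPI filter.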
Any thoughts from the great minds in the SEOmoz community appreciated!
Cheers
Simon
-
Many thanks Ben - and sorry for slow response!
I'm now planning on doing a simple hand-coded rewrite for some key terms, and monitoring the results/impact. Good call re: a slow site being much worse than ugly URLs - totally agree on that. A migration is inevitable; it's a case of 'when', not 'if' (the CMS is bespoke and ageing), and I'm hoping rewrites/redirects on some of the higher-traffic pages may help reduce the hit when the migration happens.
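In case it helps anyone else: the hand-coded rules for the key pages can be generated rather than typed out one by one. A minimal sketch in Python, emitting IIRF RedirectRule lines (the /items/... clean-URL scheme is an assumed example):

```python
# Sketch: generate IIRF 301 redirect rules for a list of high-traffic
# pages, instead of hand-writing each line of config.
def redirect_rule(ida, idb, idc):
    """Build one RedirectRule line mapping the old dynamic URL to a clean one."""
    old = rf"^/search/results\.aspx\?ida={ida}&idb={idb}&idc={idc}$"
    new = f"/items/{ida}/{idb}/{idc}"
    return f"RedirectRule {old} {new} [R=301]"

# IDs of the key pages to redirect (example values only)
key_pages = [(19191, 56, 2888), (20001, 12, 104)]
rules = [redirect_rule(*p) for p in key_pages]
print("\n".join(rules))
```

The output can be pasted straight into the IIRF config, and the same list of IDs reused later to build the redirect map for the eventual migration.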
Cheers
Simon
-
I am going to be watching the responses here because determining the value of different ranking factors seems so subjective. We all know which elements are important. But determining their level of priority in the whole scheme of things is a judgement call based on the bottom line, skill sets, and a company's ability to invest the time and resources.
From the sounds of it, you aren't only dealing with hours of billable time but also with the possibility of losing sales because of the slowdown that would take place while making the changes. I would say a slower site would have a much more drastic effect than ugly URLs. I would also say that pages with ugly URLs still do OK in search as long as there is good site architecture, quality, and unique content. That is what I would concentrate on under your current system. Then I would probably look at weighing the options of moving CMS. That isn't easy either. Migrations always take a hit on rankings, visitor loyalty, and page authority. You will probably come out much stronger, but it would be an investment. (Experienced first hand.)
Just my 2 cents.