Need to move our highest-traffic content pages onto a subdomain and want to minimize the loss of traffic - details inside!
-
Hi All!
So the company that I work for owns two very strong domains in the information security industry. Each site has a section that draws a ton of long-tail SEO traffic.
On our corporate site we have a vulnerability database where people search for vulnerabilities to research them and find out how to remediate. On our other website we have an exploit database where people can look up exploits in order to see how to patch an attacker's attack path.
We are going to merge these into one "super database" under our corporate domain, and I want to ensure that we maintain the traffic or at least minimize the loss. The exploit database, which is currently on our other domain, yields about three quarters of the traffic to that domain. It is obviously OK if that traffic goes directly to the new subdomain instead.
What are my options to keep our search traffic steady for this content? There are thousands and thousands of these vulnerabilities and exploits, so it would not make sense to 301 redirect all of them. What are some other options, and what would you do?
-
Hello Pat,
I do not have experience merging a Ruby site with another type of site, but I think we are conflating issues here anyway. You can have content in a database that gets served up anywhere; that is, you could pull that content into ten different websites if you wanted to. The database issue is almost irrelevant to the SEO issues, which mainly have to do with loss of PageRank from URL changes and possible duplicate content. A 301 redirect from each old URL to its new one would take care of both of these issues.
If you are unable to redirect all of the old content, my suggestion would be to figure out which URLs have external links pointing at them and redirect those. Let all of the other ones return a 404 or 410 status code so those URLs get removed from the index; the content will exist on the new URLs, and you don't want two URLs with the same content indexed simultaneously.
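To illustrate that logic, here is a rough sketch in Python/Flask, purely as an example - the real databases run on other stacks, and every domain name, file name, and URL pattern below is a hypothetical placeholder, not your actual setup. The idea is simply: 301 the old URLs that have external links, 410 everything else.

```python
# Rough sketch only: 301 externally linked URLs, 410 the rest.
# All names and patterns here are hypothetical placeholders.

from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical new home for the merged database.
NEW_BASE = "https://vulndb.corporate-example.com"

# Assume linked_urls.txt was exported from a backlink tool, one old path
# per line (e.g. /exploits/12345) - only these URLs earn a 301.
with open("linked_urls.txt") as f:
    REDIRECT_MAP = {
        path.strip(): f"{NEW_BASE}/entry/{path.strip().rsplit('/', 1)[-1]}"
        for path in f
        if path.strip()
    }

@app.route("/exploits/<path:slug>")
def old_exploit_url(slug):
    old_path = f"/exploits/{slug}"
    if old_path in REDIRECT_MAP:
        # Externally linked URL: permanent redirect preserves most link equity.
        return redirect(REDIRECT_MAP[old_path], code=301)
    # Everything else: 410 Gone, so the old URL drops out of the index and
    # the new URL is the only indexed copy of the content.
    return "Gone", 410
```

The same two-bucket approach (redirect map plus a catch-all 410) can be expressed in whatever web server or framework actually serves the old URLs.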
Please let us know if we have misunderstood anything or if we can provide more help with your original question. You may want to post your Ruby question in a separate thread to ensure the right people see it.
Thanks!
-
Hi Chris,
Sorry for the confusion. The plan is to merge both databases (the vulnerability database on our corporate site and the exploit database on our other website) into one and place it on a subdomain off of our corporate site. Right now the exploit database on our second website gets a LOT of traffic; it contributes about three quarters of that domain's traffic. I would like to minimize the loss of traffic when moving this to the subdomain and am looking for ways to do that.
@ryan - I am not sure exactly why, but our web producer told me that we need to use a subdomain and cannot put this on the main domain itself. I will follow up with her to find out why.
Update - I guess one of the databases is written on a different platform (Ruby), so it cannot be hosted on the same server, and changes are harder to make as a result. I guess this could still be done, but it may be a little harder to update - does anybody have experience with this?
Thanks for the help guys!
Pat
-
Would like to offer an opinion but can't quite figure out what you're saying in paragraph 3.
-
Not quite sure that I understand the need to put these on a subdomain. Why not have both of these live on the corporate domain itself? One of them already exists on your corporate site, so you can keep that database/search there and then move the other over to a similar location. Yes, that would require a ton of 301 redirects, but that should be OK given the scope of the project.
In my experience, when moving to a new domain or even a subdomain, you always experience some traffic loss that never really comes back (unless you are naturally growing anyway). Keep the main company domain going, put everything under a folder off the root, and don't worry about the subdomain at all.
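And the "ton of 301 redirects" part is usually just a bulk mapping job, not thousands of manual edits. As a rough sketch (Python only for illustration; the file names, column names, and URL patterns below are hypothetical placeholders), you could generate the old-URL-to-new-URL map straight from a database export and feed it to whatever redirect mechanism your servers use:

```python
# Rough sketch: build a bulk 301 redirect map from a database export.
# File names, columns, and URL patterns are hypothetical placeholders.

import csv

OLD_BASE = "https://exploit-site.example.com/exploits"
NEW_BASE = "https://corporate-site.example.com/exploit-database"

# Assume exploit_ids.csv is an export from the old database with an "id" column.
with open("exploit_ids.csv", newline="") as src, \
        open("redirect_map.csv", "w", newline="") as dst:
    writer = csv.writer(dst)
    writer.writerow(["old_url", "new_url"])
    for row in csv.DictReader(src):
        # One permanent (301) redirect per record: old URL -> new folder URL.
        writer.writerow([f"{OLD_BASE}/{row['id']}", f"{NEW_BASE}/{row['id']}"])
```

Most servers have some form of rewrite-map facility that can consume a table like this, so the sheer number of URLs shouldn't by itself be the blocker.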
Related Questions
-
301 Redirect Only Home Page/Root Domain via Domain Registrar Only
Hi All, I am really concerned about doing a 301 redirect. This is my situation: Both Current and New Domain is registered with a local domain registrar (similar to GoDaddy but a local version) Current Domain: Servers are pointing to Wix servers and the website is built and hosted with Wix I would like to do a 301 redirect but would like to do it in the following way with a couple of factors to keep in mind: 99% of my link are only pointed to the home page/root domain only. Not to subdirectories. New Domain: I will register this with wix with a new plan but keep the exact sitemap and composition of current website and launch with new domain. Current Domain: I want to change server pointing to wix to point to local domain registrar servers. Then do a 301 redirect for only the home page/root domain to point to the new domain listed with wix. So 301 is done via local registrar and not via Wix. Another point to mention is it will also change from Http to Https as well as a name change. Your comments on the above will be greatly appreciated and as to whether there is risk in trying to do a 301 redirect as above. Doing it as above it also cheaper if I do the 301 via the wix platform I will need to register a full new premium plan and run it concurrently to the old plan whereas if I do it as mentioned above will only have the additional domain annual fee. Look forward to your comments. Mike
Intermediate & Advanced SEO | | MikeBlue10 -
Parameter Strings & Duplicate Page Content
I'm managing a site that has thousands of pages due to all of the dynamic parameter strings that are being generated. It's a real estate listing site that allows people to create a listing, and is generating lots of new listings everyday. The Moz crawl report is continually flagging A LOT (25k+) of the site pages for duplicate content due to all of these parameter string URLs. Example: sitename.com/listings & sitename.com/listings/?addr=street name Do I really need to do anything about those pages? I have researched the topic quite a bit, but can't seem to find anything too concrete as to what the best course of action is. My original thinking was to add the rel=canonical tag to each of the main URLs that have parameters attached. I have also read that you can bypass that by telling Google what parameters to ignore in Webmaster tools. We want these listings to show up in search results, though, so I don't know if either of these options is ideal, since each would cause the listing pages (pages with parameter strings) to stop being indexed, right? Which is why I'm wondering if doing nothing at all will hurt the site? I should also mention that I originally recommend the rel=canonical option to the web developer, who has pushed back in saying that "search engines ignore parameter strings." Naturally, he doesn't want the extra work load of setting up the canonical tags, which I can understand, but I want to make sure I'm both giving him the most feasible option for implementation as well as the best option to fix the issues.
Intermediate & Advanced SEO | | garrettkite0 -
Which is more valuable in a landing page, content or functionality?
I have two possible landing pages to focus off page links and paid ad links to, one page has space for content but basically only serves as a springboard to a map view style listing page. The idea is to use this page full of good content to build search engine value. The map view page is the most functional and is what visitors would ultimately be seeking, but has no real room for content. Are these content landing pages useful? Would it be better to focus on user functionality even though there is no space for content, and would search engines naturally apply for value to these pages? Are these landing pages necessary? The url's in question are http://www.rentcollegepads.com/marquette/search and http://www.rentcollegepads.com/marquette Thanks guys!
Intermediate & Advanced SEO | | Dom4410 -
Domain forward to landing page - good or bad for SEO?
Hi Mozzers, Just recently we acquired a domain (www.nhacaribbean.com) for marketing purposes. Our technical staff used a frame forward to redirect the domain to the landing page http://www.nha.nl/alles-over-nha/Caribbean.aspx, which is only linked in the sitemap (not in the navigational structure of the site). Now, I'd personally just redirect the domain with a 301. But our CEO really wanted to keep the domain www.nhacaribbean.com visible in the URL bar. My question is: could this (potentially) really hurt rankings for our web site one way or the other? I'd love to hear from you guys. Thanks in advance.
Intermediate & Advanced SEO | | NHA_DistanceLearning0 -
Why are these pages considered duplicate content?
I have a duplicate content warning in our PRO account (well several really) but I can't figure out WHY these pages are considered duplicate content. They have different H1 headers, different sidebar links, and while a couple are relatively scant as far as content (so I might believe those could be seen as duplicate), the others seem to have a substantial amount of content that is different. It is a little perplexing. Can anyone help me figure this out? Here are some of the pages that are showing as duplicate: http://www.downpour.com/catalogsearch/advanced/byNarrator/narrator/Seth+Green/?bioid=5554 http://www.downpour.com/catalogsearch/advanced/byAuthor/author/Solomon+Northup/?bioid=11758 http://www.downpour.com/catalogsearch/advanced/byNarrator/?mediatype=audio+books&bioid=3665 http://www.downpour.com/catalogsearch/advanced/byAuthor/author/Marcus+Rediker/?bioid=10145 http://www.downpour.com/catalogsearch/advanced/byNarrator/narrator/Robin+Miles/?bioid=2075
Intermediate & Advanced SEO | | DownPour0 -
Is traffic and content really important for an e-commerce site???
Hi All, I'm maintaining an e-commerce website and I've encountered some related keywords that I know will not convert to sales but are related to the subject and might help becoming an "authority". I'll give an example... If a car dealership wrote an amazing article about cleaning a car.
Intermediate & Advanced SEO | | BeytzNet
Obviously it is related but the chances of someone looking to clean his car will go ahead and buy one now are quite low. Also, he will probably bounce out of this page after reading the piece. To conclude, Would such an article do GOOD (helping to become an authority and having more visitors) or BAD (low conversion rate and high bounce rate)? Thanks0 -
"Duplicate" Page Titles and Content
Hi All, This is a rather lengthy one, so please bear with me! SEOmoz has recently crawled 10,000 webpages from my site, FrenchEntree, and has returned 8,000 errors of duplicate page content. The main reason I have so many is because of the directories I have on site. The site is broken down into 2 levels of hierachy. "Weblets" and "Articles". A weblet is a landing page, and articles are created within these weblets. Weblets can hold any number of articles - 0 - 1,000,000 (in theory) and an article must be assigned to a weblet in order for it to work. Here's how it roughly looks in URL form - http://www.mysite.com/[weblet]/[articleID]/ Now; our directory results pages are weblets with standard content in the left and right hand columns, but the information in the middle column is pulled in from our directory database following a user query. This happens by adding the query string to the end of the URL. We have 3 main directory databases, but perhaps around 100 weblets promoting various 'canned' queries that users may want to navigate straight into. However, any one of the 100 directory promoting weblets could return any query from the parent directory database with the correct query string. The problem with this method (as pointed out by the 8,000 errors) is that each possible permutation of search is considered to be it's own URL, and therefore, it's own page. The example I will use is the first alphabetically. "Activity Holidays in France": http://www.frenchentree.com/activity-holidays-france/ - This link shows you a results weblet without the query at the end, and therefore only displays the left and right hand columns as populated. http://www.frenchentree.com/activity-holidays-france/home.asp?CategoryFilter= - This link shows you the same weblet with the an 'open' query on the end. I.e. display all results from this database. Listings are displayed in the middle. There are around 500 different URL permutations for this weblet alone when you take into account the various categories and cities a user may want to search in. What I'd like to do is to prevent SEOmoz (and therefore search engines) from counting each individual query permutation as a unique page, without harming the visibility that the directory results received in SERPs. We often appear in the top 5 for quite competitive keywords and we'd like it to stay that way. I also wouldn't want the search engine results to only display (and therefore direct the user through to) an empty weblet by some sort of robot exclusion or canonical classification. Does anyone have any advice on how best to remove the "duplication" problem, whilst keeping the search visibility? All advice welcome. Thanks Matt
Intermediate & Advanced SEO | | Horizon0 -
Google consolidating link juice on duplicate content pages
I've observed some strange findings on a website I am diagnosing and it has led me to a possible theory that seems to fly in the face of a lot of thinking: My theory is:
Intermediate & Advanced SEO | | James77
When google see's several duplicate content pages on a website, and decides to just show one version of the page, it at the same time agrigates the link juice pointing to all the duplicate pages, and ranks the 1 duplicate content page it decides to show as if all the link juice pointing to the duplicate versions were pointing to the 1 version. EG
Link X -> Duplicate Page A
Link Y -> Duplicate Page B Google decides Duplicate Page A is the one that is most important and applies the following formula to decide its rank. Link X + Link Y (Minus some dampening factor) -> Page A I came up with the idea after I seem to have reverse engineered this - IE the website I was trying to sort out for a client had this duplicate content, issue, so we decided to put unique content on Page A and Page B (not just one page like this but many). Bizarrely after about a week, all the Page A's dropped in rankings - indicating a possibility that the old link consolidation, may have been re-correctly associated with the two pages, so now Page A would only be getting Link Value X. Has anyone got any test/analysis to support or refute this??0