Large Site SEO - Dev Issue Forcing URL Change - 301, 302, Block, What To Do?
-
Hola,
Thanks in advance for reading and trying to help me out. A client of mine recently created a large-scale company directory (500k+ pages) in Drupal v6, while the "marketing" pages of their site were still hand-coded HTML. They redesigned their marketing pages, but built them in Drupal v7. They're now experiencing server conflicts: the two Drupal instances can't communicate or coexist on the same server. Eventually the directory will be upgraded to Drupal v7, but that could take weeks to months, and the client does not want to wait for the re-launch. The client wants to push the new marketing site live, but also does not want to ruin the overall SEO value of the directory. They have a few options, and I'm looking to guide them down the path of least resistance:
- Option 1: Move the company directory onto a subdomain and keep the "marketing site" on www. The client gets to push their redesign live, but large-scale 301s to the directory shake up the structure of the site, with ripple effects: pages get pulled out of the index for days to weeks, rankings and traffic drop, subdomain authority gets lost, and the directory's health looks bad for weeks to months. However, a 301 maintains partial SEO value, and some long-tail traffic survives. Once the directory is upgraded to Drupal v7, the 301s to the subdomain would be removed and the original www URLs restored.
- Option 2: Block the company directory from search engines with robots.txt and meta robots directives, essentially closing the floodgates from the established marketing pages. No major 301 ripple effect at scale, and the directory takes a few weeks to filter out of the index, but its search traffic is completely lost. Once Drupal v7 is in place and the directory is re-opened, it would slowly regain SEO value and work back toward its old rankings, traffic, etc.
- Option 3: 302 redirect? Lose all accumulated SEO value temporarily... hmm
- Option 4: Something else?
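For reference, Option 1 is usually implemented with a single wildcard rewrite rule rather than 500k individual redirects. Here's a minimal sketch assuming Apache with mod_rewrite; the hostnames and the `/directory/` path prefix are placeholders, not the client's actual URLs:

```apache
# Sketch only - assumes Apache + mod_rewrite; hosts/paths are hypothetical.
# On www.example.com: send every old directory URL to the subdomain as a 301.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^directory/(.*)$ http://directory.example.com/$1 [R=301,L]
```

Changing `R=301` to `R=302` would make the same rule a temporary redirect (Option 3). For Option 2, the block on the directory host would look something like:

```
# robots.txt on the directory host - Option 2 sketch
User-agent: *
Disallow: /
```

One caveat if you combine robots.txt with meta robots: if robots.txt blocks crawling, search engines never fetch the pages and so never see a `<meta name="robots" content="noindex">` tag. If the goal is actually removing pages from the index, allowing crawling and serving the noindex meta tag tends to work more predictably than a robots.txt block alone.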
As you can see, this is not an ideal situation. However, a decision has to be made, and I'm looking to choose the lesser of the evils. Any help is greatly appreciated. Thanks again
-Chris
-
I would heartily agree with this. The workaround is going to be a nightmare and may cause him a lot more pain.
-
I don't envy your situation.
I think that I might try another phone call to the client advising that a two week (or whatever) delay is in the best interest of his business.....
.... I would rather risk his temper than do something that I don't recommend. Lots of clients would cuss you now but thank you down the road.
... just saying what I would do.. not trying to argue.
-
Thanks for the ideas.
I'm really looking for an option that works if upgrading Drupal on either the marketing site or the directory is NOT possible. What is the best temporary solution that will cause the fewest major short- and long-term problems?
-
As I see it they have three options:
1. Upgrade the directory to D7 fast! If there aren't a lot of custom modules this shouldn't be a huge deal, no matter the number of nodes. As long as contributed modules are disabled during the upgrade, it isn't a seriously big deal.
2. If upgrading is really a big issue due to a lot of custom coding, then perhaps downgrade the marketing site to D6 if there aren't tons of pages?
3. I would suggest a combination of both - upgrade the directory to D7 and merge both sites into one D7 install - no sense running two installs when one would do just fine.
-
As an SEO, I agree with you.
Unfortunately, as a consultant I need to provide an answer, since changes are going to be made regardless. My job is to make the "lesser of two evils" decision on how best to preserve the SEO value. Any help there?
-
Thanks... that is nice traffic...
So, if this was my site I would tell the dev guys that we need the version conflicts solved ASAP.
I would not move the directory or the marketing pages.
I would put pressure on the dev guys and not allow impatience to compromise the long-term success of the site.
-
The directory is getting well over 200k visits per month and is highly competitive for a number of mid- to long-tail terms.
-
Is the directory getting any traffic from any search engine?
Has the directory gotten any valuable links that were not created by you?
If the answer to those questions is "no" or "very little" then I'd say that it has very little SEO value and could be a weight on the rest of the site.