Best way to handle deletion of a forum subdomain?
-
Hello All
Our site www.xxxx.com has long had a forum subdomain forum.xxxx.com.
We have decided to sunset the forum. We find that the 'Ask a Question' function on product pages and our social media presence are more effective ways of answering customers' product & project technical Qs.
Simply shutting down the forum server is going to return thousands of 404s for forum.xxxx.com, which I can't imagine would be helpful for the SEO of www.xxxx.com, even though my understanding is that subdomains are handled somewhat differently than the main site. We rely tremendously on natural search traffic for www.xxxx.com, so I am loath to make any moves that would hurt us.
I was thinking we should just keep the forum server up but return 410s for everything on it, including the roughly 3,000 indexed pages, until they are removed from the index, and then shut it down.
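To make that concrete, here's a minimal sketch of the "keep the server up, 410 everything" idea. I'm using Python/Flask purely as a stand-in; in reality a one-line rule in whatever web server the forum actually runs on would do the same job:

```python
from flask import Flask

app = Flask(__name__)

@app.route('/', defaults={'path': ''})
@app.route('/<path:path>')
def gone(path):
    # Answer 410 Gone for every URL on the subdomain, signalling to
    # crawlers that the pages were removed on purpose and won't return.
    return 'This forum has been retired.', 410

if __name__ == '__main__':
    app.run()
```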
The IT team also gave the option of simply pointing the subdomain at our main URL, which sorta scares me because it would then return a 200 and serve the same experience at forum.xxxx.com as at www.xxxx.com, which sounds like a very bad idea. (Yes, we do have canonicals on www.xxxx.com.)
In your opinion, what is the best way to handle this matter?
Thank You
-
Hello
Thank you for the detailed, helpful response. I should note that most of our SEO traffic does NOT come from forum pages. The overwhelming majority of the natural search traffic we receive comes from product detail, category, subcategory, and related pages (how-to articles etc.) on the main www.xxxx.com site.
I am mainly concerned with the potential fallout of Google seeing 3,000 or more 404 pages if we just delete the forum and kill the server, and I'm looking for the best way to handle that. I am OK with returning a 410, or with redirecting anything that tries to hit forum.xxxx.com to www.xxxx.com.
What do you think?
Thanks
-
Something really important to note before you make any decision (of any kind) is that rankings are earned by web-pages, not (usually) by domains or websites. As such, if a large volume of your organic search traffic comes through your forum pages - prepare to lose that! Or at least... to lose a chunk of it, even if 301 redirects are handled correctly.
You want to look at your SEO traffic data in Google Analytics. Either go to Acquisition → All Traffic → Channels → Organic Search, or view your traffic via a different dashboard and apply the "organic traffic" segment to filter your data down. Specifically, look at landing pages for organic search. Whilst traffic is usually represented by a line graph along the top of most traffic-centric analytics reports, the table underneath adapts based upon your chosen primary and secondary dimensions.
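If it's easier to answer that question outside the GA interface, here's a rough sketch of the same check in Python, assuming you've exported the organic landing-page report to CSV; the file name and column names are hypothetical:

```python
import pandas as pd

# Assumes an export of the GA landing-page report with an organic
# segment applied; 'landing_page' and 'sessions' are hypothetical
# column names - adjust to whatever your export actually contains.
df = pd.read_csv('organic_landing_pages.csv')

forum = df[df['landing_page'].str.startswith('forum.xxxx.com')]
share = forum['sessions'].sum() / df['sessions'].sum()

print(f"Forum pages account for {share:.1%} of organic landing sessions")
```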
If most of your SEO ('organic') traffic is landing on forum pages, maybe closing the forum isn't such a great idea. If that's not the case, you can shut it down and implement 301 redirects to handle the fall-out. Note that 301 redirects won't insulate 100% of a lost page's SEO equity. Some amount of that authority will be transferred, but not all (in fact, none if the new content is irrelevant to the search queries connected with the old page).
My preference is to draw up a giant spreadsheet of live and historic forum URLs. Historic ones can come out of GA or GSC if you extend the date ranges, and there are also some clever scripts to export every URL which the Wayback Machine holds for a given domain, though you need some knowledge of JSON arrays (a sketch follows below).
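As an illustration of that Wayback Machine export, here's a sketch against the publicly documented CDX API; the only assumption is that the forum lives at forum.xxxx.com:

```python
import requests

# The Wayback Machine's CDX API returns a JSON array-of-arrays:
# the first row is a header, then one row per captured URL.
resp = requests.get(
    'https://web.archive.org/cdx/search/cdx',
    params={
        'url': 'forum.xxxx.com',
        'matchType': 'host',    # everything captured on the subdomain
        'fl': 'original',       # just the URL field
        'collapse': 'urlkey',   # de-duplicate repeat captures
        'output': 'json',
    },
    timeout=60,
)
rows = resp.json()
urls = [row[0] for row in rows[1:]]  # skip the header row

with open('historic_forum_urls.txt', 'w') as f:
    f.write('\n'.join(urls))

print(f"{len(urls)} historic URLs exported")
```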
Once you have that, you can fetch metrics for all the URLs. Export traffic stats for those pages from Analytics, and pull the other figures en masse (Moz PA/DA, Majestic CF/TF, Ahrefs URL Rating etc.) from a tool like URL Profiler. Note that URL Profiler won't fetch those metrics for you unless you plug in the various tokens and secret keys (which require subscriptions) from the data sources such as Moz or Ahrefs. It's a great tool, but it doesn't get you free access to paid data...
Once you have all the URLs alongside their associated SEO metrics, you can write a formula to 'balance' and 'normalise' those figures, boiling them down into a single "SEO Auth." metric. The URLs with high to moderate SEO authority all need 1-to-1 redirects, pointing them to **relevant** resources throughout the rest of the website. The weak URLs, those with very poor SEO authority, can be 301 redirected to the homepage or the closest relevant containing category (one rough way to do the normalisation is sketched below).
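Here's one hedged sketch of that 'balance and normalise' step, assuming the spreadsheet has been exported to CSV. The column names are hypothetical, and min-max scaling plus a simple average stands in for whatever weighting you prefer:

```python
import pandas as pd

df = pd.read_csv('forum_url_metrics.csv')  # hypothetical export

# Hypothetical metric columns; assumes each one actually varies
# across URLs (a constant column would scale to NaN).
metrics = ['sessions', 'moz_pa', 'majestic_tf', 'ahrefs_ur']

# Min-max scale each metric to 0-1 so no single scale dominates,
# then average them into one "SEO Auth." score per URL.
scaled = (df[metrics] - df[metrics].min()) / (df[metrics].max() - df[metrics].min())
df['seo_auth'] = scaled.mean(axis=1)

threshold = df['seo_auth'].quantile(0.75)  # arbitrary cut-off; tune to taste
needs_1to1 = df[df['seo_auth'] >= threshold]  # map these to relevant pages by hand
long_tail = df[df['seo_auth'] < threshold]    # blanket-redirect the rest

needs_1to1.to_csv('redirects_one_to_one.csv', index=False)
long_tail.to_csv('redirects_to_category_or_home.csv', index=False)
```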
Once you have done all of that, you should experience minimal losses.
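Finally, once the redirects (or 410s) are live, it's worth spot-checking that every old forum URL really answers with the status you intended. A small sketch, reusing the URL list from the Wayback export above:

```python
import requests

with open('historic_forum_urls.txt') as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls[:100]:  # spot-check a sample; some servers may refuse HEAD
    try:
        r = requests.head(url, allow_redirects=False, timeout=10)
    except requests.RequestException as e:
        print(f"ERROR {url}: {e}")
        continue
    # Anything other than a 301 or 410 here deserves a closer look.
    if r.status_code not in (301, 410):
        print(f"{r.status_code} {url}")
```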