Best way to handle deletion of a forum subdomain?
-
Hello All
Our site www.xxxx.com has long had a forum subdomain forum.xxxx.com.
We have decided to sunset the forum. We find that the 'Ask a Question' function on product pages and our social media presence are more effective ways of answering customers' product & project technical Qs.
Simply shutting down the forum server will return thousands of 404s for forum.xxxx.com, which I can't imagine would help the SEO of www.xxxx.com, even though my understanding is that subdomains are treated somewhat separately from the main site. We rely tremendously on natural search traffic for www.xxxx.com, so I am loath to make any moves that would hurt us.
I was thinking we should keep the forum server up but return 410s for everything on it, including the roughly 3,000 indexed pages, until they are removed from the index, and then shut it down.
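For what it's worth, a blanket 410 at the server level is a one-liner in most web servers. Here is a hedged sketch for nginx (assuming the forum vhost runs nginx; the Apache equivalent would be a `RewriteRule` with the `G` flag):

```nginx
# Hypothetical vhost for the retired forum subdomain.
server {
    listen 80;
    server_name forum.xxxx.com;

    # Return 410 Gone for every path, signalling to crawlers
    # that the content was removed deliberately and permanently.
    location / {
        return 410;
    }
}
```

Google tends to drop 410s from the index slightly faster than 404s, since 410 is an explicit "gone" rather than an ambiguous "not found".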
The IT team also offered the option of simply pointing the forum URL at our main domain, which scares me a bit: forum.xxxx.com would then return a 200 and serve the same experience as www.xxxx.com, which sounds like a very bad idea. (Yes, we do have canonicals on www.xxxx.com.)
In your opinion, what is the best way to handle this matter?
Thank You
-
Hello
Thank you for the detailed, helpful response. I should note that most of our SEO traffic does NOT come from forum pages. The overwhelming majority of the natural search traffic we receive lands on product detail, category, subcategory, and related pages (how-to articles, etc.) on the main www.xxxx.com site.
I am mainly concerned with the potential fallout of Google seeing 3,000 or more 404 pages if we simply delete the forum and kill the server, and I am looking for the best way to handle that. I am OK with returning a 410, or with redirecting anything that tries to hit forum.xxxx.com to www.xxxx.com.
What do you think?
Thanks
-
Something really important to note before you make any decision (of any kind) is that rankings are earned by web pages, not (usually) by domains or websites. As such, if a large volume of your organic search traffic comes through your forum pages, prepare to lose that! Or at least to lose a chunk of it, even if 301 redirects are handled correctly.
You want to look at your SEO traffic data in Google Analytics. Either go to Acquisition → All Traffic → Channels → Organic Search, or view your traffic via a different dashboard and add the "Organic Traffic" segment to filter your data down. Look specifically at landing pages for organic search. While traffic is usually represented by a line graph along the top of most traffic-centric analytics reports, the table underneath adapts based on your chosen primary and secondary dimensions.
If most of your SEO ('organic') traffic is landing on forum pages, maybe closing the forum isn't such a great idea. If that's not the case, you can shut it down and implement 301 redirects to handle the fall-out. Note that 301 redirects won't preserve 100% of a lost page's SEO equity. Some amount of that authority will be transferred, but not all (in fact, none if the new content is irrelevant to the search queries connected with the old page).
My preference is to draw up a giant spreadsheet of live and historic forum URLs. Historic ones can come out of GA or GSC if you extend the date ranges, and there are also some clever bash scripts to export all the URLs which the Wayback Machine holds for a given domain - though you need some knowledge of JSON arrays.
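You don't strictly need bash for the Wayback export; the Internet Archive's CDX API returns the same data, and a short Python sketch (the domain here is a placeholder) can pull it:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

CDX_ENDPOINT = "http://web.archive.org/cdx/search/cdx"

def cdx_query_url(domain):
    """Build a CDX API query listing every URL the Wayback
    Machine holds for `domain`, collapsed to one row per
    unique URL key."""
    params = {
        "url": domain + "/*",
        "output": "json",
        "fl": "original",
        "collapse": "urlkey",
    }
    return CDX_ENDPOINT + "?" + urlencode(params)

def fetch_archived_urls(domain):
    """Fetch the archived URL list (network call). The JSON
    response is an array of rows whose first row is a header,
    so it is skipped."""
    with urlopen(cdx_query_url(domain)) as resp:
        rows = json.load(resp)
    return [row[0] for row in rows[1:]]

# Example (hypothetical domain):
# urls = fetch_archived_urls("forum.xxxx.com")
```

Paste the resulting list straight into the first column of your spreadsheet.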
Once you have that, you can fetch metrics for all the URLs. Export traffic stats for those pages from Analytics, and pull the other data en masse (Moz PA/DA, Majestic CF/TF, Ahrefs URL Rating, etc.) from a tool like URL Profiler. Note that URL Profiler won't fetch those metrics unless you plug in the various tokens and secret keys (which require subscriptions) for the underlying data sources such as Moz or Ahrefs. It's a great tool, but it doesn't get you free access to paid data...
Once you have all the URLs alongside their associated SEO metrics, you can write a formula to 'balance' and 'normalise' those figures, boiling them down into a single "SEO Auth." metric. The URLs with high to moderate SEO authority all need 1-to-1 redirects, pointing them to **relevant** resources throughout the rest of the website. The weak URLs, or those with very poor SEO authority, can be 301 redirected to the homepage or the closest relevant containing category.
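As an illustrative sketch of that balancing formula (the metric names here are assumptions; swap in whichever columns your spreadsheet actually has), a simple min-max normalisation and average in Python:

```python
def normalise(values):
    """Min-max scale a list of numbers to 0..1; if every value
    is identical there is nothing to rank, so return all zeros."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

def seo_auth(rows, metric_keys):
    """rows: one dict per URL holding raw metric values, e.g.
    {"url": ..., "pa": 34, "tf": 12, "sessions": 890}.
    Returns {url: score} where score is the mean of the
    min-max normalised metrics, scaled to 0-100."""
    scaled = {k: normalise([r[k] for r in rows]) for k in metric_keys}
    scores = {}
    for i, r in enumerate(rows):
        mean = sum(scaled[k][i] for k in metric_keys) / len(metric_keys)
        scores[r["url"]] = round(100 * mean, 1)
    return scores
```

Sort the output descending, pick a cut-off, and everything above it gets a hand-mapped 1-to-1 redirect while everything below goes to the nearest category page.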
Once you have done all of that, you should experience minimal losses.