Old subdomains - what to do SEO-wise?
-
Hello,
I wanted the community's advice on how to handle old subdomains.
We have https://www.yoursite.org. We also have two subdomains directly related to the main website: https://www.archive.yoursite.org and https://www.blog.yoursite.org.
As these pages are not actively updated, they trigger a large number of errors in the site crawl (missing meta descriptions and much more). We have no particular intention of keeping them up to date for SEO. What do you think is the best way of handling them?
I considered de-indexing, but the content of these pages is still relevant and may be useful - it just isn't up to date and never will be again.
Many thanks in advance.
-
Thanks for replying, Will.
You've mentioned a few ways to deal with this, and they all seem to point to the same conclusion: this shouldn't really be a high-priority issue for us at the moment - especially if sub-domains don't have a major effect on the main site. To be honest, I don't even think de-indexing is worth it, since the content may still be relevant to some people and we can just let Google carry on indexing it as it is.
So, all things considered, we've concluded that we won't be doing any SEO-related work on these pages.
Given that, how do I set up Moz to ignore these two sub-domains and only show crawl errors for the main site? We'd rather Moz didn't crawl these pages at all, since we won't be doing any work on them.
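For reference, here's the kind of rule I had in mind if blocking Moz's crawler at the subdomain level turns out to be the way to go - a rough sketch only, assuming rogerbot is still the user-agent Moz's crawler announces and that each subdomain can serve its own robots.txt (the URL below is made up):

```python
from urllib import robotparser

# Hypothetical robots.txt the two subdomains could serve: it disallows
# Moz's crawler (rogerbot) while leaving every other user-agent alone,
# so Google can keep indexing the pages as before.
ROBOTS_TXT = """\
User-agent: rogerbot
Disallow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

url = "https://archive.yoursite.org/some-old-page"  # illustrative URL only
print(parser.can_fetch("rogerbot", url))   # False - Moz's crawler is disallowed
print(parser.can_fetch("Googlebot", url))  # True  - Google is unaffected
```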
Thanks
-
Hi there. Sorry for the slow follow-up on this - there was an issue that meant I didn't get the email alert when it was assigned to me.
There is increasing evidence that culling old or poorly-performing content from your site can have a positive effect, though I wouldn't be particularly confident that this benefit would transfer across sub-domains to the main site.
In general, I suspect most of the effort expended here would be better spent elsewhere, so I would lean towards the lowest-effort option.
That said, I think the "rightest" long-term answer would be to move the best content to the main domain (with accompanying 301 redirects) and remove the remainder with 410 status codes. That lets you focus on the content that is genuinely valuable and get the most benefit from it, while no longer spending effort on the material that has outlived its usefulness. The harder that is to do, though, the less inclined I'd be to do it - in which case I'd lean towards simply de-indexing the lowest-quality pages and taking whatever benefit remains from the better content for as long as it is a net positive, with an eye to eventually removing it all.
Hope that helps - I don't think it's a super clear-cut situation unfortunately.