Our main domain has thousands of subdomains with the same content (expired hosting). How should we handle them?
-
Hello,
Our client lets users create free-trial subdomains, and once the trial expires, all of those subdomains serve the same page. If people stick around, their own websites are hosted on the subdomain.
Since all these expired-trial subdomains have the same content and link to the homepage, should those links be nofollowed?
Has anyone dealt with something similar?
Thanks very much in advance,
-
Thanks everyone, really helpful!
-
If it's essentially a duplicate, thin-content login screen, noindex and nofollow the page.
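For reference, one common way to apply that recommendation at scale is an `X-Robots-Tag` response header rather than editing each page. The sketch below is purely illustrative Python; the `EXPIRED_TRIALS` lookup and function name are hypothetical placeholders, and any web framework can set the equivalent header.

```python
# Sketch: emit a noindex, nofollow robots directive for expired-trial
# subdomains. The trial-status data here is a placeholder; in practice
# this would come from the hosting database.

EXPIRED_TRIALS = {"oldshop", "defunct-store"}  # hypothetical example data

def robots_header(subdomain: str) -> dict:
    """Return extra response headers for the given subdomain."""
    if subdomain in EXPIRED_TRIALS:
        # Keep the expired placeholder page out of the index and
        # drop its outgoing links from consideration.
        return {"X-Robots-Tag": "noindex, nofollow"}
    return {}  # active customer sites stay indexable

print(robots_header("oldshop"))     # expired: gets the directive
print(robots_header("activeshop"))  # active: no extra headers
```

The header has the same effect as a `<meta name="robots">` tag but can be applied in one place for every expired subdomain.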
-
I would say it is unnatural, and therefore any links you have should be nofollowed.
-
No. Thin content carries an algorithmic penalty as well. Careful. You might want to stick with option 2.
-
Hi everyone,
Thanks for your recommendations. In a way, every subdomain is custom, since every client's name is different, and the page prompts the client to log back into cPanel to reactivate the hosting.
The subdomains are not exactly 'expired'; they are expired from the customer's point of view, but they are still active, and there is no strict duplicate content, since the page says 'You are trying to access x.y.com', which is different for every subdomain.
But they are all very low-quality, one-page subdomains. Is it useful to have them link to the main site?
-
Custom 404 page?
-
Options:
1.) Set rel="canonical" tags on all of them, with the main page as the href target.
2.) Block crawling and noindex the pages.
3.) Both of the above.
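For illustration, the two tags those options refer to look like this. The sketch below just renders the HTML snippets in Python; `MAIN_PAGE` and the function names are hypothetical placeholders, not part of any real setup described in the thread.

```python
# Sketch: render the <head> tags for options 1 and 2 above.

MAIN_PAGE = "https://www.example.com/"  # placeholder for the main page URL

def canonical_tag(href: str) -> str:
    # Option 1: point each expired-trial page at the main page.
    return f'<link rel="canonical" href="{href}">'

def noindex_tag() -> str:
    # Option 2: keep the page out of the index and drop its links.
    return '<meta name="robots" content="noindex, nofollow">'

print(canonical_tag(MAIN_PAGE))
print(noindex_tag())
```

One caveat worth noting: the canonical tag only helps if crawlers can actually fetch the page, so blocking crawling in robots.txt and relying on a canonical or noindex tag at the same time work against each other.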