Parking page for domain names
-
Hi all,
I represent a hosting company with thousands of domain names that are parked for clients until they start using them. Currently we present clients and visitors with information about the situation at the top of each page, and we have placed information about all our main products at the bottom of the page. You can see an example here:
http://prodesign.no/
Would you recommend utilizing these pages in a better way than we are doing today (SEO-wise, towards our own website)? We have the ability to instantly change all of these pages at once, and we are also able to present a different page for every single parked domain name if we want to.
Best regards,
Jon
-
Even though there are thousands of domain names involved, I wouldn't expect this to have any positive effect on your site's SEO (or rankings). The problem is that these domain names aren't trusted: they aren't going to have any Domain Authority, they're most likely hosted on the same server (or the same class C block of IPs), and the content isn't unique to each site. For SEO and ranking purposes, unfortunately, all of those things count against you.
For a site or a domain to help, it needs unique content, some Domain Authority, and links pointing to it.
This shouldn't stop you from using those domains to your benefit, though. I would either put up an ad that contains a link, or put a text link on the pages, and I would also make sure those are "nofollow" links. If you suddenly have 20,000 sitewide links to your site from low-quality, thin-content domains, that could throw up a red flag and hurt your site's current rankings.
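A nofollowed text link on a parked page would look something like this (the URL is taken from the example site above; the anchor text is a made-up placeholder):

```html
<!-- Footer link from a parked page back to the hosting company's site.
     rel="nofollow" asks search engines not to pass link equity through it. -->
<a href="http://prodesign.no/" rel="nofollow">Hosting by ProDesign</a>
```

The same markup without the `rel="nofollow"` attribute is what creates the 20,000 sitewide followed links described above.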
-
OK, so here is how I understand it.
Google knows when domains are parked, doesn't index them, and may not pay any attention to those pages at all (at least that would make sense). Here is a link to a Matt Cutts video: https://www.youtube.com/watch?v=eF8i6rKojXQ
So there is a good chance that those parked websites don't have any influence either way. However, if for whatever reason Google doesn't recognize some of them as parked (let's say 1%, which would equal 200+ websites) and "assigns" a duplicate-content "tag" to those, you might get hit by correlation; a sort of "a friend of my enemy is my enemy" system. Therefore, if unique content is not possible, nofollowing all those links might not be too bad of an idea.
I would recommend sample testing: nofollow the links on a large enough number of those domains and watch how that affects rankings. In a month or so, put those links back to follow, wait, and watch the effect again; then do the same with another sample. This should let you figure it out pretty accurately within a couple of months.
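The sampling setup above can be sketched in a few lines of script. This is a minimal sketch, not a real implementation: the domain list, group size, and seed are all made-up placeholders, and the actual follow/nofollow switch would happen in the parking-page template, not here.

```python
import random

# Hypothetical list of parked domains; the real list would come from the
# hosting company's own records (these names are made up).
parked_domains = ["parked-%05d.example" % i for i in range(20000)]

def make_test_groups(domains, sample_size=1000, seed=42):
    """Split domains into a nofollow test group and an untouched control group."""
    rng = random.Random(seed)   # fixed seed so the split is reproducible
    shuffled = list(domains)
    rng.shuffle(shuffled)
    # Links on the test group get rel="nofollow"; the control group is left as-is.
    return shuffled[:sample_size], shuffled[sample_size:]

test_group, control_group = make_test_groups(parked_domains)
print(len(test_group), len(control_group))  # prints: 1000 19000
```

Because the seed is fixed, the same domains land in the test group every run, which matters when you later flip the same sample back to followed links and compare.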
If you do that, write a case study and post it here; it would be an interesting case to look at.
Cheers:)
-
Hi Dimitrii,
There are around 20,000 domains with the same content at the moment, and the number is constantly growing. Currently there are only random variations in the title and meta description; that data changes each time a page is loaded.
We need the visible content on these pages to stay more or less the same as it is now, since it gives useful information to clients while also marketing our products. We can change any content that is not visible and make it permanent on each domain (if desirable).
Given those requirements and that information, would you recommend putting nofollow on the links to our website? Is the current setup hurting our SEO efforts, since so many of the domains are guaranteed to have duplicate content?
Thank you so far for valuable tips.
Best regards,
Jon
-
Hi there.
Well, since these are just one-page websites without really any content, you won't be able to get much out of them, so the only way you might utilize them is with backlinks. I'd have two followed backlinks: one with the company name as anchor text, and one with a matched keyword anchor. Also make sure that you have unique content on those one-page websites, so you don't run into duplicate content issues. Other than that, unless you create full-content one-page websites for each of those parked domains, I don't really see much benefit.
Hope this helps.
P.S. Do make sure that you are not using exactly the same pages for those.
Related Questions
-
Domain name change
Here's the scenario... Client has two domain names: domain.com, targeting one country (Australia), and otherdomain.com, targeting all other countries. Both have identical products, but different currencies (AU$ and US$). The problem (as most of you will know) is that without using a sub-domain or country-code top-level domains, Google has no idea which domain should be served for which country. Furthermore, because the root domain is different, Google doesn't see any connection between the two, other than the fact they have identical products!
My recommendation to the client is to change domain.com to domain.com.au, and otherdomain.com to domain.com. Arguably, we could leave the second one alone, but I think it's better for the brand to use the same root domain for each. Obviously this means both will need to be redirected. Since NONE of the pages within the sites will change, do we need to redirect every page, or just the root domain? Any other risks or concerns we should know about?
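Since none of the paths change in the scenario above, every page can be covered by a single wildcard rule rather than thousands of individual redirects. A hedged sketch for Apache's mod_rewrite, using the placeholder domain names from the question:

```apache
# .htaccess on the old otherdomain.com site (placeholder names from the question).
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?otherdomain\.com$ [NC]
# 301 every request to the same path on the new domain.
RewriteRule ^(.*)$ https://domain.com/$1 [R=301,L]
```

The `$1` back-reference carries the original path through, so deep pages redirect to their exact counterparts instead of the homepage.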
Intermediate & Advanced SEO | muzzmoz
-
How does Googlebot evaluate performance/page speed on Isomorphic/Single Page Applications?
I'm curious how Google evaluates pagespeed for SPAs. Initial payloads are inherently large (resulting in 5+ second load times), but subsequent requests are lightning fast, as these requests are handled by JS fetching data from the backend. Does Google evaluate pages on a URL-by-URL basis, looking at the initial payload (and "slow"-ish load time) for each? Or do they load the initial JS+HTML and then continue to crawl from there? Another way of putting it: is Googlebot essentially "refreshing" for each page and therefore associating each URL with a higher load time? Or will pages that are crawled after the initial payload benefit from the speedier load time? Any insight (or speculation) would be much appreciated.
Intermediate & Advanced SEO | mothner
-
Can noindexed pages accrue page authority?
My company's site has a large set of pages (tens of thousands) that have very thin or no content. They typically target a single low-competition keyword (and typically rank very well), but the pages have a very high bounce rate and are definitely hurting our domain's overall rankings via Panda (quality ranking). I'm planning on recommending we noindex these pages temporarily, and reindex each page as resources allow us to fill in content. My question is whether an individual page will be able to accrue any page authority for its target term while noindexed. We DO want to rank for all those terms, just not until we have the content to back it up. However, we're in a pretty competitive space up against domains that have been around a lot longer and have higher domain authorities. Like I said, these pages rank well right now, even with thin content. The worry is: if we noindex them while we slowly build out content, will our competitors get the edge on those terms (with their subpar but continually available content)? Do you think Google will give us any credit for having had the page all along, just not always indexed?
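The temporary noindex described above is typically a one-line robots meta tag in each page's head; for example:

```html
<!-- Keeps the page out of the index while still letting crawlers follow its
     links. Remove the tag once the real content is in place. -->
<meta name="robots" content="noindex, follow">
```

Because the tag is per-page, it can be removed page by page as content gets filled in, rather than all at once.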
Intermediate & Advanced SEO | THandorf
-
Parked Domain question
Hi, If a domain has been parked for more than 12 years and has never been used for a project so far, does this have an impact on SEO, or is it like having a fresh new domain? Sebi
Intermediate & Advanced SEO | TheHecksler
-
Company name doesn't have keyword: use domains instead?
Good Morning! Now, I'll admit I may be obsessing a little too much over this, and it may not make that big of an impact in the long run, but in a world with Google, if I were to start a business today I would try to include my keyword in its name. For example, Dollar Shave Club: at least they got the word "shave" in there. My business doesn't have a keyword in its name, so is it beneficial to structure our URLs to include a keyword, so that all of our URLs contain that word?
So if I sell organic bananas, but my company is called Evananas, is it worth it to have all pages become children of Evananas.com/organic_bananas? That way at least we have the keyword "organic bananas" in the path. So I could then have things like:
evananas.com/organic_bananas/recipes
evananas.com/organic_bananas/benefits
evananas.com/organic_bananas/taste_really_freeking_good
Vs.
evananas.com/recipes
evananas.com/benefits
evananas.com/taste_really_freeking_good
I'm not sure it makes a difference. The other problem is that I want to keep our URLs as short as possible. I feel like less is always more, but I was always under the impression that URL-based keywords were rather powerful. What is the best practice in this case? Thanks Guys! Evan(ana)
Intermediate & Advanced SEO | HashtagHustler
-
Show parts of page A on page B & C?
Good afternoon,
A quick question. I am working on a website which has a large page with different sections. Let's say:
Page 1: SECTION A, SECTION B, SECTION C
Now they are adding a new area where they want to show only certain sections, so it would look like this:
Page 2: SECTION A
Page 3: SECTION C
Page 4: SECTION D
So my question is, would a rel='canonical' tag back to Page 1 be the correct way of preempting any duplicate content issues? I do not need Pages 2-4 to even be indexed; it is just a matter of usability and giving the users what they are looking for without all the rest of the extra stuff. Gracias. Teşekkürler. Salamat ko. Thanks. (bonus thumbs up for anybody who knows which languages each of those are) 🙂
Intermediate & Advanced SEO | rayvensoft
Sub Domain
Hi everybody, My competition has started to use subdomains heavily: they have created one subdomain for every single city and keyword. Is this something I should be worried about? Is it a good idea for me to start doing the same thing? Thanks for your help.
Intermediate & Advanced SEO | Armin666
-
Multiple domains expiring that have 301 redirects to my primary domain. Am I in trouble?
I recently took on the SEO of a large website, http://example.com. My predecessor bought 40-plus domains for specific cities, like Jacksonvilleexample.com, Miamiexample.com, etc. ZERO of the additional domains linked to our main website. The domains that were bought basically had our exact same website in terms of content, links, etc., mirroring our main http://example.com. I added 301 redirects to help with problems that may result from this type of structure. Some of the additional domains were indexed and some were not, but all have 301s, and as far as traffic is concerned I'm not worried about losing short-term traffic.
My question: all the domains are set to expire in June, and I don't want to continue having them 301-redirected to my main domain (example.com). I'm not trying to avoid the additional cost of all the domains, but I don't see an advantage to keeping them. Can letting all these domains expire hurt me from a long-term SEO position if I don't renew them?
Intermediate & Advanced SEO | ballanrk