User-Created Subdomain Help
-
Have I searched FAQ: Yes
My issue is unique because of the way our website works, and I hope that someone can provide some guidance on this. Our website, http://breezi.com, is a website builder where users can build their own website. When users build their site, it is created on a subdomain of ours, for example: http://mike.breezi.com.

Now that I have explained how our site works, here is the problem: Google Webmaster Tools and Bing Webmaster Tools are indexing ALL the user-created websites under our TLD, and so our impression is that any content created on those subdomains can confuse the search engines into thinking that the user-created websites and content are relevant to OUR main site, http://breezi.com. So, what we would like to know is whether there is a way to let search engines know that the user-created sites and content are not related to our TLD site. Thanks for any help and advice.
-
Subdomains generally don't pass any authority, link juice, etc. to the TLD. Rand did a Whiteboard Friday that briefly covered this a while ago (see http://www.seomoz.org/blog/international-seo-where-to-host-and-how-to-target-whiteboard-friday).
I am curious: if you didn't want user-created sites to be associated with your TLD, why didn't you set up a different domain for user-created sites?
I personally think it is morally wrong to try to stop Google from indexing them. So, if you don't want these sites associated with you or your TLD, I would set up a new domain, e.g. yourbreezi.com, 301 any user sites that have already been set up over to the new domain, and make sure that any new user sites are created under the new domain.
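To illustrate the 301 part of that suggestion, here is a minimal sketch (nginx, and it assumes you control the web server config; yourbreezi.com is just the example name from above, not a real site):

```
# Sketch only: send every user subdomain of breezi.com to the same
# subdomain on a separate domain. The main site (breezi.com and
# www.breezi.com) keeps its own server block, which nginx matches
# before this regex block.
server {
    listen 80;
    server_name ~^(?<sub>.+)\.breezi\.com$;   # e.g. mike.breezi.com
    return 301 $scheme://$sub.yourbreezi.com$request_uri;
}
```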
In truth I'm not sure it is too much to worry about; after all, WordPress.com uses subdomains for most of its hosted blogs and it doesn't seem to have done them too much harm!
Hope that helps
-
Robert,
The suggestion you make is not an option. I don't want to remove any subdomain URLs, because these are user-generated sites that could earn their own respective rankings.
-
Navid,
Using robots.txt to block the subdomains might not be the best route.
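For reference, the robots.txt route being discussed would mean each user subdomain serving its own file at its root, along these lines (a sketch; note that Disallow stops crawling but doesn't guarantee already-known URLs drop out of the index):

```
# Served at http://mike.breezi.com/robots.txt - each subdomain needs its own copy
User-agent: *
Disallow: /
```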
The only way I can think of to do that is by telling GWT to remove the URL (in this case, your subdomains).
In Webmaster Tools, click on "Site Configuration", then "Crawler access", then "Remove URL". There, click on "New removal request" and you will see an option to remove the whole site. You can use this option to remove subdomain.domain.com from the SERPs.
-
Hmmm... that is a tricky one. One place to look for an answer might be to talk to SEO people who have worked with a similar service, such as Ning or wordpress.com.
I'll be curious to hear about your findings.
-
Related Questions
-
Blocking pages in robots.txt that are under a redirected subdomain
Hi Everyone, I have a lot of Marketo landing pages that I don't want to show in the SERPs. Adding the noindex meta tag to each page would be too much; I have thousands of pages. Blocking them in robots.txt could have been an option, BUT the subdomain's homepage is redirected to my main domain (with a 302), so I may confuse search engines (should they follow the redirect, or should they block?). marketo.mydomain.com is redirected to www.mydomain.com, so "Disallow: /" would, I think, be confusing alongside the redirect. I also don't have folders - all pages sit directly under the subdomain - so I can't block folders in robots.txt either. Has anyone had this scenario, or any suggestions? I appreciate your thoughts here. Thank you, Rachel
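One direction worth investigating (not from the question, and it assumes you can influence how the subdomain's pages are served): an X-Robots-Tag response header applies noindex host-wide without touching each page's markup. A minimal Apache sketch:

```
# Sketch only: noindex every response from the landing-page host via a
# response header instead of thousands of per-page meta tags.
# Assumes mod_headers and control over the server/CDN serving these pages.
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex"
</IfModule>
```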
-
How to create a sitemap for example.com and blog.example.com?
Hi, I am trying to create a sitemap for www.example.com; this website also has www.blog.example.com linked to it. After creating the sitemap using different tools, the sitemap does not include www.blog.example.com and its related files. How can I get both example.com and blog.example.com into one sitemap?
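Most sitemap generators only crawl one host, which is why the blog gets missed. One common pattern, sketched below with assumed file names, is a sitemap index that points to one sitemap per host; search engines generally only accept the cross-host entry once both hosts are verified in the same Webmaster Tools / Search Console account (or the sitemap location is listed in each host's robots.txt):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-main.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://blog.example.com/sitemap-blog.xml</loc>
  </sitemap>
</sitemapindex>
```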
-
Keyword research, creating copy, fixing on-page optimisation - what next?
Hello - I wondered if I could get people's thoughts. We/I have started working on a client's website to improve everything - a general overhaul across SEO, on-page optimisation etc. I'm relatively new to this, although I'm picking things up and learning on the job, which is great, and Moz is so helpful! So far we have conducted a review of the website, created a large list of keywords and analysed these, started overhauling the copy and adding the new keywords within it, and have plans to overhaul the other elements of the site (headings, tags etc.) and improve the design, functionality and customer journey through the website. My question is: where do I go from here in terms of keywords and SEO? Is it a case of plugging in the keywords we've researched, watching how they perform, and then switching things up with different keywords if they aren't performing as well as we expected? Is it really a lot of trial and error, or is there an exact science behind it that I'm missing? I just feel a little as though we've pulled these keywords out of thin air to a degree, and are adding them into our copy because the numbers on Moz show they should perform well and they are what we are trying to promote on the website. But I don't know if this is right?! Perhaps I'm over-thinking it...
-
Pagination Help
Hi Moz Community, I've recently started helping a new site with its overall health, and I have some pagination issues. It's an ecommerce site, and it currently doesn't have any pagination in place except for these links: Prev 1 2 3 ... 66 Next. I understand what these do (they lead visitors to the previous, next or last page), but do they do anything for search crawlers, or does the site need one of the following:
1. rel=next/rel=prev
2. A canonical leading to the view-all page (the view-all page takes a long time to load)
Thanks for your help. -Reed
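For what it's worth, option 1 above would look something like this sketch in the head of page 2 of a paginated category (the URLs are placeholders, not from the question):

```html
<link rel="prev" href="https://www.example-store.com/category?page=1">
<link rel="next" href="https://www.example-store.com/category?page=3">
```
-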
Magento Duplicate Content help!
How can I keep the duplicate page content in my Magento store from being read as duplicate? I added the Magento robots.txt file that I have used on many stores, and it keeps giving us errors. We have also enabled canonical links in the Magento admin. I am getting 3616 errors and can't seem to get around it... any suggestions?
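As a point of reference, when the canonical setting is doing its job, a product page that is reachable under several category paths should output a single canonical tag in its head, something like this sketch (the URL is a placeholder):

```html
<link rel="canonical" href="https://www.example-store.com/my-product.html">
```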
-
SEO credit for subdomain blogs?
One of my clients is currently running a webstore through Volusion. They would like to add a blog to their website, but since Volusion doesn't currently support blogs on the same domain, we would have to create a WordPress blog and link it to a subdomain (http://support.volusion.com/article/linking-your-blog-your-volusion-store). Using this method, will their primary website receive any SEO credit for the content being created on the blog, or will it only count towards the subdomain? Thanks!
-
Getting a link removed from brand search - please help!
Hello all you mozzers! I've just come in to work with an established company who have one major problem: when you Google "palicomp", the second link that comes up is to consumeractiongroup, with a thread that has been damaging the business for over 2 years. This thread is absolutely not representative of the business today. Strangely, stronger links in search have better authority, but Google has ranked this post as being highly relevant to the business. Does anybody know of any strategies we can use to get this removed? We have contacted consumeractiongroup directly, but they are not prepared to move it. Does anyone have any removal ideas, or know what else we can do? It's crippling our business, and we can't work out why it's ranking so well! Chris
-
Help with Webmaster Tools "Not Followed" Errors
I have been doing a bunch of 301 redirects on my site to address 404 pages, and in each case I check the redirect to make sure it works. I have also been using tools like Xenu to make sure that I'm not linking to 404 or 301 content from my site. However, on Friday I started getting "Not Followed" errors in GWT. When I check the URL that they tell me produced the error, it seems to redirect correctly. One example is this: http://www.mybinding.com/.sc/ms/dd/ee/48738/Astrobrights-Pulsar-Pink-10-x-13-65lb-Cover-50pk I tried a redirect tracer and it reports the redirect correctly. Fetch as Googlebot returns the correct page. Fetch as Bingbot in the new Bing Webmaster Tools shows that it redirects to the correct page, but there is a small note that says "Status: Redirection limit reached". I see this on all of the redirects that I check in the Bing Webmaster portal. Do I have something misconfigured? Can anyone give me a hint on how to troubleshoot this type of issue? Thanks, Jeff
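One way to troubleshoot this kind of report (a sketch, not from the thread) is to print every hop of the redirect chain yourself, since a "redirection limit reached" note usually points to a chain or loop rather than a single clean 301:

```python
# Sketch: follow a redirect chain hop by hop and print each status code,
# so long chains or loops (which crawlers give up on) are easy to spot.
import requests
from urllib.parse import urljoin

def trace_redirects(url, max_hops=10):
    for _ in range(max_hops):
        resp = requests.get(url, allow_redirects=False, timeout=10)
        print(resp.status_code, url)
        if resp.status_code not in (301, 302, 303, 307, 308):
            return
        url = urljoin(url, resp.headers["Location"])
    print("Stopped after", max_hops, "hops - possible redirect loop")

trace_redirects("http://www.mybinding.com/.sc/ms/dd/ee/48738/"
                "Astrobrights-Pulsar-Pink-10-x-13-65lb-Cover-50pk")
```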