User-Created Subdomain Help
-
Have I searched FAQ: Yes
My issue is unique because of the way our website works, and I hope that someone can provide some guidance on this. Our website, http://breezi.com, is a website builder where users can build their own website. When a user builds a site, it is created on a subdomain, for example: http://mike.breezi.com.

Now that I have explained how our site works, here is the problem: Google Webmaster Tools and Bing Webmaster Tools are indexing ALL the user-created websites under our main domain, and our impression is that any content created on those subdomains could lead the search engines to think that the user-created websites and content are relevant to OUR main site, http://breezi.com. So, what we would like to know is whether there is a way to let search engines know that the user-created sites and content are not related to our main site. Thanks for any help and advice.
-
Subdomains generally don't pass any authority, link juice, etc. to the root domain; Rand did a Whiteboard Friday that briefly covered this a while ago (see http://www.seomoz.org/blog/international-seo-where-to-host-and-how-to-target-whiteboard-friday).
I am curious: if you didn't want user-created sites to be associated with your root domain, why didn't you set up a different domain for user-created sites?
I personally think it is morally wrong to try to stop Google indexing them. So, if you don't want these associated with you or your root domain, I would set up a new domain, e.g. yourbreezi.com, 301 any sites that have already been set up to the new domain, and make sure that any new user sites are created under the new domain.
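To illustrate the kind of redirect being suggested, here is a minimal, hedged sketch of a server-level 301, assuming an nginx front end and assuming the user-site domain keeps the same subdomain names (yourbreezi.com is just the example name used above; the real setup would depend on how Breezi routes its user sites):

```
# Hedged sketch only: 301 existing user subdomains of breezi.com to the
# same subdomain on the separate user-site domain suggested above.
# Assumes breezi.com and www.breezi.com have their own exact server_name
# blocks; exact names take precedence over this regex in nginx, so the
# main site is not redirected.
server {
    listen 80;
    server_name ~^(?<user>.+)\.breezi\.com$;
    return 301 http://$user.yourbreezi.com$request_uri;
}
```

The permanent (301) status is what tells search engines the old subdomain URLs have moved for good, so any signals they have earned are consolidated on the new domain rather than simply dropped.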
In truth, I'm not sure it is too much to worry about; after all, WordPress.com uses subdomains for most of its hosted blogs and it doesn't seem to have done them too much harm!
Hope that helps
-
Robert,
The suggestion you make is not an option. I don't want to remove any subdomain URLs, because these are user-generated sites that could earn their own rankings.
-
Navid,
Using robots.txt to block the subdomains might not be the best route.
The only way I can think of to do that is by telling Google Webmaster Tools (GWT) to remove the URL (in this case, your subdomains).
In Webmaster Tools, click on "Site Configuration", then "Crawler access", then "Remove URL". Here, click on "New removal request". You will then see an option to remove the whole site. You can use this option to remove "subdomain.domain.com" from the SERPs.
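For reference, robots.txt is per-host, so blocking a user site this way would mean serving a file at that subdomain's own root rather than at breezi.com/robots.txt. A minimal sketch for one hypothetical user site might look like this:

```
# Served at http://mike.breezi.com/robots.txt (hypothetical user subdomain)
# Tells all compliant crawlers not to crawl anything on this user site.
User-agent: *
Disallow: /
```

Note that a Disallow rule only stops compliant crawlers from crawling; it does not by itself remove URLs that are already indexed, which is why the removal request in Webmaster Tools is suggested above.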
-
Hmmm... that is a tricky one. One place to look for an answer might be to talk to SEO people who have worked with a similar service, such as Ning or WordPress.com.
I'll be curious to hear of your findings.
Related Questions
-
How to use rel=alternate and hreflang=es to help with International SEO?
We have completed translating our important pages from English to Spanish on our website. I am confused about whether I should be adding attributes like rel="alternate" and hreflang="es" to links. On our homepage we have links to our solution pages, and the code looks like this:
<a href="https://www.membroz.com/es/club-management-software/">...</a>
<a href="https://www.membroz.com/es/salon-management-software/">...</a>
<a href="https://www.membroz.com/es/pre-school-management-software/">...</a>
Should I add the rel and hreflang attributes to them? It would look something like this:
<a rel="alternate" hreflang="es" href="https://www.membroz.com/es/salon-management-software/">...</a>
<a rel="alternate" hreflang="es" href="https://www.membroz.com/es/pre-school-management-software/">...</a>
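As a general note: hreflang annotations are normally declared as <link rel="alternate"> elements in the <head> of each language version (or in the XML sitemap or HTTP headers), rather than as attributes on the <a> navigation links themselves, and each page should list itself as well as its alternates. A minimal sketch, assuming the English salon page lives at the same path without /es/ (an assumption, since only the Spanish URLs are given above):

```html
<!-- Hedged sketch for the <head> of BOTH language versions of the salon page.
     The English URL is assumed; adjust it to the real English path. -->
<link rel="alternate" hreflang="en" href="https://www.membroz.com/salon-management-software/" />
<link rel="alternate" hreflang="es" href="https://www.membroz.com/es/salon-management-software/" />
```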
Technical SEO | Krtya
-
Country and language tags: running an SEO audit on a site that definitely has more than one language, but nothing is pulling up. I don't quite understand hreflang or how to go about it. HELP HELP!
I ran an SEO audit and I don't really understand country and language tags. For example, sony.com definitely has more than one language, but how do I check hreflang for SEO? Do I inspect the page, etc.?
Technical SEO | Mindgruver
-
Unknown Subdomains Ranking
In spot checking some pages that I recently launched, I found subdomains ranking in place of the domain. The strange thing is, we never set up these sub-domains and they don't exist on our server. The pages, though they're indexed with the proper title and meta-description, time out when clicked. We're operating on Drupal 7, pages launched at the beginning of the month. The other pages within the series of content are ranking properly. Any thoughts or tips to resolve this?
Technical SEO | JordanNCU
-
Hosted WordPress Blog Creating Duplicate Content
In my first report from SEOmoz, I see that there are a bunch of "duplicate content" errors that originate from our blog hosted on WordPress. For example, it's showing that the following URLs all have duplicate content:
http://blog.kultureshock.net/2012/11/20/the-secret-merger/ys/
http://blog.kultureshock.net/2012/11/16/vendome-prize-website/gallery-7701/
http://blog.kultureshock.net/2012/11/20/the-secret-merger/sm/
http://blog.kultureshock.net/2012/11/26/top-ten-tips-to-mastering-the-twitterverse/unknown/
http://blog.kultureshock.net/2012/11/20/the-secret-merger/bv/
They all lead to the various images that have been used in various blog posts. But I'm not sure why they are considered duplicate content, because they have unique URLs and the title meta tag is unique for each one, too. Even so, I don't want these extraneous URLs cluttering up our search results, so I'm removing all of the links that were automatically created when placing the images in the posts. But once I do that, will these URLs eventually disappear, or continue to be there? Because our blog is hosted by WordPress, I unfortunately can't add any of the SEO plugins I've read about, so I'm wondering how to fix this without special plugins. Thanks!
Tom
Technical SEO | TomHu
-
Ways of Helping to Reduce Duplicate Content
Hi, I am looking to know if there is any way of helping to reduce duplicate content on a website without breaking links or affecting Google rankings.
Technical SEO | Feily
-
How do ping services help your site?
Hi, I am trying to understand how services such as pingler.com help your site. I think I understand the Google ping service, which tells Google that you have updated a page, but how does Pingler work? Pingler claims that it sends traffic to your site, but I do not understand this. Any help would be great.
Technical SEO | ClaireH-184886
-
Omniture tracking code URLs creating duplicate content
My ecommerce company uses Omniture tracking codes for a variety of different tracking parameters, from promotional emails to third-party comparison shopping engines. All of these tracking codes create URLs that look like www.domain.com/?s_cid=(tracking parameter), which are identical to the original page, and these dynamic tracking pages are being indexed. The cached version is still the original page. For now, the duplicate versions do not appear to be affecting rankings, but as we ramp up with holiday sales, promotions, adding more CSEs, etc., there will be more and more tracking URLs that could potentially hurt us.

What is the best solution for this problem? If we use robots.txt to block the ?s_cid versions, it may affect our listings on CSEs, as the bots will try to crawl the link to find product info/pricing but will be denied. Is this correct? Or do CSEs generally use other methods for gathering and verifying product information?

So far the most comprehensive solution I can think of would be to add a rel=canonical tag to every unique static URL on our site, which should solve the duplicate content issues, but we have thousands of pages and this would take an eternity (unless someone knows a good way to do this automagically; I'm not a programmer, so maybe there's a way that I don't know). Any help/advice/suggestions will be appreciated. If you have any solutions, please explain why your solution would work, to help me understand on a deeper level in case something like this comes up again in the future. Thanks!
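For what it's worth, the rel=canonical approach described above is usually implemented as a single <link> element in the <head> of each page, pointing at the clean URL, so that the tracking-parameter variants consolidate to it. A minimal sketch using the placeholder domain from the question (the path and parameter value are illustrative only):

```html
<!-- Hedged sketch: placed in the <head> of the page, so that tracking
     variants like www.domain.com/?s_cid=email123 consolidate to the
     clean URL. "domain.com" is the question's own placeholder domain. -->
<link rel="canonical" href="http://www.domain.com/" />
```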
Technical SEO | BrianCC
-
Has anyone used paid services to help improve their site?
Hi, I am getting lots of spam in my mailbox about how companies can help you get more traffic, and I see lots of sites promoting tools that can bring you more traffic and help improve your site, so I am just wondering if anyone has tried any of these services or products to help promote their site. For example, I keep getting messages about submitting my site to over 200 directories or search engines, and I'm wondering if these are a waste of time.
Technical SEO | ClaireH-184886