What would you do with the subdomains: keep them or remove them?
-
A client of ours has this webpage.
It is not very well designed, but it has good products.
He is having a problem with SEO: he is on page 2 for the keyword "estores". His domain authority is 27, which is higher than that of other domains ranking on page one. The question is the following:
When the webpage first went live, the programmer who created it set up a subdomain per section:
http://estores.losestores.com/
http://cortinas.losestores.com/
http://venecianas.losestores.com/
and so on.
This is not a very big site.
Would you remove the subdomains and do the following?
What would be best?
Thanks
Victoria
-
If the content on the subdomains is directly related to the main domain's content, then yes, the folder approach would be best. You would consolidate all your links and domain authority into one strong domain; with the subdomain approach your DA is divided.
If you proceed with this change, be sure to properly 301 redirect the URLs and update all links within your site along with any other links you have control over in other locations.
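As a rough illustration of what those redirect rules need to do, here is a minimal sketch of the URL mapping, assuming each subdomain name becomes a folder of the same name on the root domain (the hostnames come from the question; the example path is made up):

```javascript
// Sketch of the subdomain-to-folder mapping the 301 redirects would
// implement. The hostnames follow the question above; the example
// path is hypothetical.
function subdomainToFolder(oldUrl, rootHost = 'www.losestores.com') {
  const url = new URL(oldUrl);
  const section = url.hostname.split('.')[0]; // "cortinas.losestores.com" -> "cortinas"
  return `${url.protocol}//${rootHost}/${section}${url.pathname}`;
}

console.log(subdomainToFolder('http://cortinas.losestores.com/estor-enrollable.html'));
// -> http://www.losestores.com/cortinas/estor-enrollable.html
```

On an Apache server this mapping would typically be expressed as a RewriteRule with an [R=301,L] flag in each subdomain's configuration; the point of the sketch is just that every old URL should map to exactly one folder-based equivalent before you flip the switch.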
-
Hi Victoria
By coincidence I was looking at a SEOmoz blog post about this earlier. It's dated 2009, but I think it's still valid and explains this issue pretty well.
http://www.seomoz.org/blog/understanding-root-domains-subdomains-vs-subfolders-microsites
In summary, it's better to go for subfolders rather than subdomains, i.e. www.losestores.com/cortinas.
Related Questions
-
Will Google Judge Duplicate Content on Responsive Pages to be Keyword Spamming?
I have a website for my small business, and hope to improve the search results position for five landing pages. I recently modified my website to make it responsive (mobile friendly). I was not able to use Bootstrap; the layout of the pages is a bit unusual and doesn't lend itself to the options Bootstrap provides. Each landing page has three main divs: one for desktop, one for tablet, one for phone.
The text content displayed in each div is the same, and only one of the three divs is visible; the user's screen width determines which one. When I wrote the HTML for the page, I didn't want each div to have identical text. I worried that when Google indexed the page it would see the same text three times and conclude that keyword spamming was occurring. So I put the text in just one div, and when the page loads, jQuery copies the text from the first div to the other two. But now I've learned that when Google indexes a page it looks at both the page that is served AND the page that is rendered. And in my case the rendered page, after it loads and the jQuery code is executed, contains duplicate text content in three divs. So perhaps my approach of having the served page contain just one div with text content fails to help, because Google examines the rendered page, which has duplicate text in three divs. As served by the server, the layout of one landing page is: the desktop div holds 1000 words of text, while the tablet and phone divs hold no text and jQuery copies the text from div id="desktop" into each of them.
My question is: will Google conclude that keyword spamming is occurring because of the duplicate content the rendered page contains, or will it realize that only one of the divs is visible at a time and the duplicate content is there only to achieve a responsive design? Thank you!
Web Design | CurtisB
-
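To make the served-versus-rendered distinction in the question above concrete, here is a DOM-free sketch (container names follow the question; the mechanics are simplified): the server sends text in one container only, and the client-side copy step fills the other two, so the rendered result holds the text three times.

```javascript
// Simplified model of the page described above: "served" is what the
// server sends (text in the desktop div only); renderPage applies the
// jQuery-style copy step, producing what Google sees after rendering.
function renderPage(served) {
  return {
    desktop: served.desktop,
    tablet: served.desktop, // copy step: tablet div gets the same text
    phone: served.desktop,  // copy step: phone div gets the same text
  };
}

const served = { desktop: '1000 words of text...', tablet: '', phone: '' };
const rendered = renderPage(served);
// The served object contains the text once; the rendered object
// contains it in all three divs.
```

The sketch only models the asker's description; whether Google treats the rendered duplication as spam is exactly the open question of the thread.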
New to SEO management; I just want to double-check that my idea will work.
I am new to SEO management. I had a 3-month SEO copywriting internship and a 5-month SEO temp job. In both I mostly wrote copy, but I've been teaching myself SEO on the side, and I became Google certified. I ended up getting a telemarketing job, somehow the conversation turned to SEO, and I wound up managing their SEO for 12 dollars an hour. They say that every lead generated from the website that turns into a sale will be worth 10 dollars, and if and when the sales exceed my paycheck I will start making commission, so long as it stays above my hourly rate. SEO is very fun and this is like my dream job. They are leaving the planning 100% up to me, and I want to make sure that what I am doing will work. My plan is as follows:
Part 1: Page authority via backlinks and social media. We are health care brokers, and my boss, the owner, has a lot of contacts. He is talking with large unions like the Teamsters and large company retirement groups like Blue Flame, which is apparently in some way connected to DTE or GE. Long story short, I am trying to get him to convince them to give us a backlink to our main page. He also has a ton of clients who own companies, which is good because they may be persuaded to give us backlinks too. In addition, the tech guy thinks he can implement something where we can get Google +1s, Facebook likes/shares, Twitter likes and shares, and Pinterest "Pin It"s as part of an email that we send to our list of 12,000 clients. From what I can see, given the client base and the people we are working with, we should be able to raise the page authority substantially despite the fact that the site is only a few months old and is not yet out of the sandbox. I have been slowly picking off each error with SEOmoz's website crawling.
Part 2: Making an insurance jargon dictionary, for the triple purpose of gathering traffic, proving our professionalism, and helping people understand semi-complex insurance jargon. I could build these pages so that 2-3 keywords would be addressed per page, defined in a way that helps people looking for terms understand them while simultaneously netting a strong keyword density and a strong page. As far as I can tell there are no issues here.
Part 3: The dictionary pages will pull in new traffic, and the home page will receive links and distribute link juice to the subpages. These subpages will guide traffic back to the main page with nofollow links, to direct people from the uniquely termed landing pages to the home page for insurance processing.
As far as I can tell my logic is solid, and on paper this should work. Am I missing anything (like key details or flaws in my plan)?
Web Design | Tediscool
-
When a Site:Domain Search Is Run on Google, an SSL Error Appears on One URL; Will This Harm Ranking?
Greetings Moz community: When a site:domain search is run on Google, a very strange URL appears in the search results: http://www.nyc-officespace-leader.com:2082/ The page displays a "the site's security certificate is not trusted" error. This only appears for one URL out of 400. Could this indicate a wider problem with the server's configuration? Is this something that needs to be corrected, and if so, how? Our ranking has dropped a lot in the last few months. Thanks,
Alan
Web Design | Kingalan1
-
Can anyone recommend a tool that will identify unused and duplicate CSS across an entire site?
Hi all, So far I have found this one: http://unused-css.com/ It looks like it identifies unused CSS, but perhaps not duplicates? It also has a 5,000-page limit, and our site is 8,000+ pages, so we really need something that can handle a site larger than their limit. I do have Screaming Frog. Is there a way to use Screaming Frog to locate unused and duplicate CSS? Any recommendations and/or tips would be great. I am also aware of the Firefox extensions, but to my knowledge they will only do one page at a time? Thanks!
Web Design | danatanseo
-
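Neither tool mentioned in the question above is confirmed to do exactly this, but the core "unused" check is simple enough to sketch. This is a naive illustration with made-up selectors and page snippets, and no real CSS parsing (combinators, attribute selectors, and dynamically added classes would all defeat it):

```javascript
// Naive unused-selector check: strip the leading "." or "#" from each
// simple class/id selector and flag those whose bare name never
// appears in any crawled page's HTML. Illustrative only; a real tool
// needs a proper CSS parser and DOM matching.
function findUnusedSelectors(selectors, pages) {
  return selectors.filter((selector) => {
    const name = selector.replace(/^[.#]/, '');
    return !pages.some((html) => html.includes(name));
  });
}

const selectors = ['.nav', '.promo-banner-2009', '#footer'];
const pages = ['<div class="nav"><div id="footer"></div></div>'];
console.log(findUnusedSelectors(selectors, pages));
// -> [ '.promo-banner-2009' ]
```

The same "extract selectors, check against every crawled page" loop is the idea behind most unused-CSS tools; the 5,000-page limit mentioned above is just a cap on how many pages the tool is willing to check.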
Will changing content management systems affect rankings?
We're considering changing our content management system. This would probably change our URL structure (we would keep the root domain name, but specific product pages and the like would have different full URLs). Will our rankings be affected if we use different URLs for current pages? I know we can do 301 redirects, but is there anything else I should consider? Thanks, Dan
Web Design | dcostigan
-
Is it possible to redirect the main www. domain - but keep a subdomain active?
Hi Mozzers, Quick question, which I hope one of you can answer. Let's say I have a website (i) www.example.com, and on that a subdomain exists, (ii) subdomain.example.com. Let's say I want to change my main domain from www.example.com to www.newwebsite.com. I'd 301 all content, use GWT to notify Google of a change of address, etc. Having done that, is it still possible to keep the original subdomain active? So, even though www.example.com has been redirected/transferred to www.newwebsite.com, subdomain.example.com would still exist. If that is possible, what is the implication for domain authority? On the one hand, I have transferred the main site (so DA from that will transfer to the new site); but part of that root domain is still active. Make sense? Any answers? Thanks, everyone.
Web Design | edlondon
-
Do you think it would be a good idea to delete old blog pages off the server?
I paid somebody to build my website using Dreamweaver. At one point I didn't know how to use the template that automatically updates every page in the menu section, so I stupidly broke the template on every new page when I made the website's blog and put the pages into a subfolder. I realised this was a silly thing to do, and now that I know how to use the template correctly I've copied every single page over from the subfolder and put it into the main template. Now I can update the template menu and every page changes automatically. The only problem is that I've now got two versions of every page of my blog on the website. For some reason when I generate a sitemap it comes up with links to the old blog pages; I don't know why, since I've removed the links from the blog page. It also includes the new copies, so I basically have two copies of every blog page. Do you think it would be a good idea to delete the old indexed blog pages off the server, so that when Google spiders the site it will pick up only the links to the new copies?
Web Design | whitbycottages
-
Will a DHTML Overlay or ThickBox Capturing Email Hurt SEO?
Some of our competitors use an overlay window on all their pages to sign customers up for an email list (usually offering a coupon). Our question concerns the impact the overlay might have on SEO. Does anyone have experience doing this, and is it a "safe" thing to do as far as SEO is concerned? Thanks...
Web Design | onlineinitiatives