Optimize root domain or a page in a subdirectory?
-
Hi,
My root domain is already optimized for keywords, I'd say branded keywords, which I don't really need, since the traffic from them doesn't generate any revenue (it mostly consists of our employees and returning visitors). I have now run on-page optimization on the root domain for a set of keywords I like, and got good grades (hurray!). But my website still doesn't show up in search engines for those keywords.
I have pretty good link building done to my root domain, but not for all keywords (only for the branded ones). It just happened; please don't ask why.
So I decided to optimize inside pages in a subdirectory for a new set of keywords I like, starting with link building: getting anchor-text links on various other websites pointing to these particular pages. These pages are not ranked in the top 50 in Google.
Is that a good practice?
Or, since I don't need those branded keywords, should I re-optimize my root domain to suit my new keywords, giving less preference to the branded ones?
Is this a good practice?
-
Hi there,
Without knowing more about your site, it's hard to say exactly what the best practice is for your situation, but obviously you want to build relevant links to topic-specific pages.
If your main pages are optimized in a way that isn't providing any benefit, you could try freshening the content on those pages to more accurately reflect the keywords you're trying to rank for.
Out of the two options you proposed (building new pages, or updating the old pages to take advantage of the existing link equity/authority), it's a toss-up. What will matter more is the quality of the links you build to those pages.
As long as you're building new links, my preference would be to create new content. This:
- preserves your old content (assuming it's relevant)
- gives your domain greater link diversity across a higher number of URLs (deep linking)
- gives you more options in the future if you ever have to "cut" spammy links
Either way, as long as you're building new content and earning new, good links, you should be fine.
-
Hi! I would optimize the subpages for the specific keywords and build links to them. Also, don't forget to link to them from your homepage and other powerful pages on your website.
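To make that concrete, here is a minimal sketch of what such internal links might look like on a homepage. The URLs and anchor text are hypothetical, just to illustrate the idea of descriptive, keyword-relevant anchors pointing at the subdirectory pages you want to rank:

```html
<!-- Hypothetical homepage navigation linking to optimized subdirectory pages. -->
<!-- Descriptive anchor text signals to both users and crawlers what each target page is about. -->
<nav>
  <ul>
    <li><a href="/widgets/blue-widgets/">Blue widgets</a></li>
    <li><a href="/widgets/custom-widgets/">Custom widgets</a></li>
  </ul>
</nav>
```

The same pattern applies to links from your other strong pages: a plain, relevant anchor beats generic text like "click here" for passing topical relevance to the target page.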