What are the best page titles for sub-domain pages?
-
Hi Moz community,
Let's say a website has multiple sub-domains with hundreds or thousands of pages. Generally, we mention the "primary keyword" and "brand name" on every page of the main website. Can we do the same on all pages of the sub-domains to increase the site's authority for this primary keyword in Google? Or will it have a negative impact if Google treats the same keyword and brand name appearing on every page - across the main website and all sub-domain pages - as duplicate content?
Thanks
-
It seems you didn't get my question. There will be no duplicate content on the pages. All I am asking is how to optimise sub-domain pages with the "keyword" and "brand" in the page titles. Generally, we have included "brand & keyword" in the titles of all pages on the main website. Can we do the same for the pages of the sub-domains? Does having so many pages with "brand & keyword" in the title help or hurt rankings?
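For reference, the title pattern being asked about would look something like this in the page head (the product names, keyword, and brand below are invented placeholders, not from the original question):

```html
<!-- Main-site page: "keyword | brand" appended to every title (hypothetical example) -->
<title>Red Running Shoes - Buy Running Shoes | ExampleBrand</title>

<!-- Sub-domain page (e.g. help.examplebrand.com) reusing the same pattern -->
<title>Shoe Size Guide - Buy Running Shoes | ExampleBrand</title>
```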
-
A site can have multiple sub-domains with as many pages as you like and Google can crawl them, but that doesn't mean it will index them. If you are deliberately producing duplicate content (and that includes slightly re-worked content to include a keyword variation) across your site, Google are going to penalise you for it:
"Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results"
Source: Google Search Console Help - Duplicate Content: https://support.google.com/webmasters/answer/66359
You'd benefit far more from 100 pages filled with exceptional content than from 1,000,000 pages of zero-value, duplicate content.
Don't just build content that focuses on one keyword, either; instead, build a page around a keyword "theme" and use synonyms that you'd expect to use naturally (i.e. in conversation with a human being), referencing your main keyword a couple of times at most - near the top of the page and in the title - rather than plastering it everywhere. Google is smart enough to deal with synonyms - and with the arrival of Google's RankBrain (https://en.wikipedia.org/wiki/RankBrain) even more so. Readers hate keyword-stuffed pages as much as Google does, and both will punish you for using them.
Less is more. Invest your energy in the user's experience, instead.
Related Questions
-
How Best to Handle Inherited 404s on Purchased Domain
We purchased a domain from another company and migrated our site over to it very successfully. However, we have one artifact of the original domain: there was a page that was exploited by other sites on the web. This page allowed you to pass any URL to it and be redirected to that URL (e.g. http://example.com/go/to/offsite_link.asp?GoURL=http://badactor.com/explicit_content). This page does not exist on our site, so these requests always end in a 404 on our site. However, we find that crawlers are still attempting to access these invalid pages. We have disavowed as many of the explicit sites as we can, but some crawlers still come looking for those links. We are considering blocking the redirect page in our robots.txt, but we are concerned that the links will remain indexed but uncrawlable. What's the best way to pull these pages from search engines and ensure they are never crawled again? UPDATE: Clarifying that what we're trying to do is get search engines to just never try to reach these pages. The fact that they're even wasting their time getting a 404 is what we're trying to avoid. Is there any reason we shouldn't just block these in our robots.txt?
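For illustration, a robots.txt rule blocking the old redirect script's path might look like the sketch below. The /go/to/ path is taken from the example URL in the question; adjust it to the real path. Note that robots.txt only stops crawling - it does not by itself remove already-indexed URLs, which is the trade-off the question raises:

```text
# Block crawling of the exploited redirect script (path taken from the
# example URL above - hypothetical, adjust to the actual site)
User-agent: *
Disallow: /go/to/offsite_link.asp
```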
Intermediate & Advanced SEO | russell_ms1
-
301 Redirect Only Home Page/Root Domain via Domain Registrar Only
Hi All, I am really concerned about doing a 301 redirect. This is my situation:
Both the current and the new domain are registered with a local domain registrar (similar to GoDaddy, but a local version).
Current domain: the nameservers point to Wix, and the website is built and hosted with Wix. 99% of my links point to the home page/root domain only, not to subdirectories.
New domain: I will register this with Wix on a new plan, but keep the exact sitemap and composition of the current website and launch it on the new domain.
Current domain: I want to change the nameservers from Wix to the local registrar's servers, then do a 301 redirect for only the home page/root domain, pointing to the new domain hosted with Wix. So the 301 is done via the local registrar and not via Wix.
Another point to mention: the site will also change from http to https, as well as the name change.
Your comments on the above would be greatly appreciated, along with whether there is any risk in doing a 301 redirect this way. Doing it as above is also cheaper: if I do the 301 via the Wix platform, I will need to register a full new premium plan and run it concurrently with the old plan, whereas doing it as mentioned above only incurs the additional annual domain fee.
Look forward to your comments. Mike
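For illustration only: if the old domain's DNS were pointed at a web server you control (rather than a registrar's forwarding tool, whose options vary), a root-only 301 could be sketched in nginx like this. The domain names are placeholders:

```nginx
server {
    listen 80;
    server_name olddomain.example;

    # 301 (permanent) redirect for the home page / root only;
    # all other paths on the old domain are left to 404
    location = / {
        return 301 https://newdomain.example/;
    }
}
```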
Intermediate & Advanced SEO | MikeBlue10
-
Which is the best option for these pages?
Hi Guys, We have product pages on our site which have duplicate content, and the search volume for people searching for these products is very, very small. Also, if we add unique content, we could face keyword cannibalisation issues with category/sub-category pages. Based on SEO best practice, we should add rel canonical tags from these product pages to the next most relevant page.
Pros: We can rank for product-oriented keywords, but search volume is very small. Any link equity passed via the rel canonical tag would be very small, as these pages barely get any links.
Cons: Time and effort involved in adding rel canonical tags. Even if we do add them, if Google doesn't deem them relevant it might ignore them, causing duplicate content issues. Time and effort involved in making all the content unique - not really worth it - again, very few searchers. Plus, if we do make it unique, then we face keyword cannibalisation issues.
What do you think would be the optimal solution to this? I'm thinking of just implementing a: across all these product-based pages. Keen to hear thoughts? Cheers.
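For reference, the rel canonical approach discussed above would put something like this in the head of each thin product page, pointing at the parent category page (the URLs are hypothetical):

```html
<!-- In the <head> of the duplicate product page (hypothetical URLs) -->
<link rel="canonical" href="https://www.example.com/category/widgets/" />
```

Google treats rel=canonical as a hint rather than a directive, which is the "Google might ignore it" risk the question mentions.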
Intermediate & Advanced SEO | seowork2140
-
How do I get the sub-domain traffic to count as sub-directory traffic without moving off of WordPress?
I want as much traffic as possible to my main site, but right now my blog lives on a blog.brand.com URL rather than brand.com/blog. What are some good solutions for getting that traffic to count as traffic to my main site if my blog is hosted on WordPress? Can I just create a sub-directory page and add a rel canonical to the blog post?
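A rel=canonical from a thin sub-directory page would not consolidate the traffic the way a true move does; the approach usually discussed for this is serving the blog under /blog/ via a reverse proxy and 301-ing the old sub-domain URLs. A minimal nginx sketch, with placeholder hostnames (the WordPress site URL would also need updating to match):

```nginx
# Serve the WordPress blog under the main domain's /blog/ path
server {
    listen 80;
    server_name brand.example;

    location /blog/ {
        # Proxy to the host where WordPress actually runs (placeholder)
        proxy_pass http://wordpress-internal.example/;
        proxy_set_header Host brand.example;
    }
}

# 301 old sub-domain URLs to the corresponding /blog/ paths
server {
    listen 80;
    server_name blog.brand.example;
    return 301 http://brand.example/blog$request_uri;
}
```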
Intermediate & Advanced SEO | johnnybgunn0
-
When migrating website platforms but keeping the domain name, how best do we add the new site to Google Webmaster Tools? Best redirect practices?
We are moving from BigCommerce to Shopify but maintaining our domain name, and we need to make sure that all links redirect to their corresponding new URLs. We understand the nature of 301s and are fine with that, but when it comes to adding the site to Google Webmaster Tools, not losing link juice, and the Change of Address tool, we are kind of lost. Any advice would be most welcome. Thank you so much in advance!
Intermediate & Advanced SEO | WNL0
-
K2 duplicate page content and title tags
I'm running a Joomla site and have just installed K2 as our blogging platform. Our crawl report with SEOmoz shows a good bit of duplicate content and duplicate title tags with our K2 blog. We've installed sh404SEF. Will I need to go into sh404SEF each time we publish a blog entry to point the titles to one URL? If there is something simpler, please advise. Thank you, Don
Intermediate & Advanced SEO | donaldmoore0
-
Best practices for robots.txt -- allow one page but not the others?
So, we have a page, like domain.com/searchhere, but search results pages are being crawled (and shouldn't be); results look like domain.com/searchhere?query1. If I block /searchhere?, will it block crawlers from the single page /searchhere (because I still want that page to be indexed)? What is the recommended best practice for this?
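Because robots.txt rules are prefix matches, disallowing the pattern with the "?" included does not affect the bare page, so no separate Allow rule is strictly needed. A sketch, using the paths from the question:

```text
User-agent: *
# Blocks /searchhere?query1 and any other /searchhere?... URL.
# /searchhere itself does not start with "/searchhere?", so it
# remains crawlable and indexable.
Disallow: /searchhere?
```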
Intermediate & Advanced SEO | nicole.healthline0
-
Sub-domains and different languages
Hi there! All our content is in two languages, English and Spanish, and the versions are basically equivalent (sometimes longer, sometimes shorter). We have the English content under one subdomain (en.mydomain.com) and the Spanish content under another (es.mydomain.com). First of all: is that correct? Is it better to have them under folders or under subdomains? But the most important question: when a user goes to mydomain.com, they are redirected through a 302 to the Spanish subdomain or to the English subdomain, depending on the language of their browser (microsoft.com works this way). We now have a lot of links pointing to mydomain.com, but... where is all this link flow going? Are we losing it? Should we have a landing page at mydomain.com linking to both subdomains? Or maybe redirect it through a 301 to just one of the subdomains, then redirect the user to their language if necessary? Thank you very much!
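Whichever structure is chosen, hreflang annotations help Google serve the right language version to each searcher. On each page pair, something like the following (the subdomains come from the question; the /page path is a placeholder):

```html
<!-- On both en.mydomain.com/page and es.mydomain.com/page -->
<link rel="alternate" hreflang="en" href="https://en.mydomain.com/page" />
<link rel="alternate" hreflang="es" href="https://es.mydomain.com/page" />
<!-- Fallback for users whose language matches neither version -->
<link rel="alternate" hreflang="x-default" href="https://www.mydomain.com/" />
```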
Intermediate & Advanced SEO | bodaclick0