What are the best page titles for sub-domain pages?
-
Hi Moz community,
Let's say a website has multiple sub-domains with hundreds or thousands of pages. Generally we mention the "primary keyword" and "brand name" on every page of the website. Can we do the same on all pages of the sub-domains to increase the authority of the website for this primary keyword in Google? Or will it end up having a negative impact if Google treats it as duplicate content, with the same keyword and brand name mentioned on every page of the website and on all pages of the sub-domains?
Thanks
-
It seems you didn't get my question. There will be no duplicate content on the pages. All I am asking is how to optimise sub-domain pages with "keyword" and "brand" in the page titles. Generally we have used "brand & keyword" on all pages of the website. Can we do the same for the pages of the sub-domains? Does having so many pages with "brand & keyword" help or hurt rankings?
-
A site can have multiple sub-domains with as many pages as you like and Google can crawl them, but that doesn't mean it will index them. If you are deliberately producing duplicate content (and that includes slightly re-worked content to include a keyword variation) across your site, Google are going to penalise you for it:
"Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results"
Source: Google Search Console Help - Duplicate Content: https://support.google.com/webmasters/answer/66359
You'd benefit far more from 100 pages filled with exceptional content than from 1,000,000 pages full of zero-value, duplicate content.
Don't just build content that focuses on one keyword, either; instead, build a page around a keyword "theme" and use synonyms that you'd expect to use naturally (i.e. in conversation with a human being), referencing your main keyword a couple of times at most — near the top of the page and in the title — rather than plastering it everywhere. Google is smart enough to deal with synonyms, and with the arrival of Google's RankBrain (https://en.wikipedia.org/wiki/RankBrain) even more so. Readers hate keyword-stuffed pages as much as Google does, and both will punish you for using them.
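To make that concrete, here is a minimal sketch of the title pattern described above: every page keeps the brand, but leads with a unique, page-specific descriptor, and a quick check flags titles that collide across the main site and sub-domains (a sign of over-templating). The brand name "Acme" and the example URLs are hypothetical, not from the question.

```python
from collections import Counter

BRAND = "Acme"  # hypothetical brand name

def build_title(page_topic: str, brand: str = BRAND, max_len: int = 60) -> str:
    """Unique descriptor first, brand last: 'Red Widgets | Acme'."""
    title = f"{page_topic} | {brand}"
    # If the title runs long, trim the descriptor, never the brand.
    if len(title) > max_len:
        keep = max_len - len(f" | {brand}")
        title = f"{page_topic[:keep].rstrip()} | {brand}"
    return title

def find_duplicate_titles(pages: dict[str, str]) -> list[str]:
    """Return any title shared by more than one URL."""
    counts = Counter(build_title(topic) for topic in pages.values())
    return [t for t, n in counts.items() if n > 1]

pages = {  # hypothetical URLs spanning the main site and a sub-domain
    "https://example.com/red-widgets": "Red Widgets",
    "https://shop.example.com/red-widgets": "Red Widgets",  # same topic twice
    "https://shop.example.com/blue-widgets": "Blue Widgets",
}

print(find_duplicate_titles(pages))
```

Running the sketch surfaces the shared "Red Widgets | Acme" title — exactly the kind of templated repetition across sub-domains the answer warns against.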
Less is more. Invest your energy in the user's experience, instead.