Why isn't there a browser tab title AND meta title?
-
Personal opinion: as a user, it makes sense for me to want both a full 50+ character meta title, displayed in search results, that helps me decide whether to click a link, AND a concise browser tab title that tells me which page and brand I have open.
As a search engine, I would (possibly wrongly) suppose that having one more piece of user-facing information would help it understand a page and that page's relation to the rest of the website.
Theoretical example
Meta title: A great title for the website I've been dreaming of! | OurBrand
Browser tab title: Home | OurBrand
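For what it's worth, both values currently come from the same <title> element, so a page can't natively ship two different titles. A minimal sketch of the closest workaround, assuming the hypothetical "Home | OurBrand" label from the example above: swap document.title client-side after load. Since search engines may render JavaScript, there's no guarantee they'd keep showing the longer original title.

```typescript
// Hedged sketch only: the HTML ships the long, search-facing title in
// <title>, then JavaScript swaps in a short tab label after load.
// "Home | OurBrand" is a hypothetical label, not a recommendation.
const SHORT_TAB_TITLE = "Home | OurBrand";

window.addEventListener("DOMContentLoaded", () => {
  // document.title reads and writes the text shown on the browser tab.
  document.title = SHORT_TAB_TITLE;
});
```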
-
As a user, there are certainly times it's helpful to have the browser tab text, especially if I have a few tabs open on a single site and want to distinguish between them quickly. From a search engine's standpoint, I don't see why this wouldn't be one more piece of possibly useful information.
However, as you pointed out, if the user happens to have one window open with dozens of tabs, each individual tab becomes so narrow that only the brand favicon/icon really stands out.
-
Interesting idea. I've never even considered this before.
But how often do people look at the browser tab to remember which page they are on?
And for people like me, who have 20-50 tabs open at almost all times (I know, it's an issue I have), it wouldn't be very helpful since all the tab widths are so small.
Related Questions
-
Migrating from Parameter-Driven URLs to SEO-Friendly URLs (Slugs)
Hi all, hope you're all good and having a wonderful Friday morning. At the moment we have over 20,000 live products on our ecommerce site; however, all of the products use non-SEO-friendly URLs (/product?p=1738 etc.) and we're looking at deploying SEO-friendly URLs such as /product/this-is-product-one. As you can imagine, making such a change on a big ecommerce site will be a difficult task and we will have to take on A LOT of content changes, hreflang changes, affiliate link tests and a big 301 task. I'm trying to get some analysis together to pitch the tech guys, but it's difficult. I do understand that this change has its benefits for SEO, usability and CTR, but I need some more info. Keywords in the slugs: what is their actual SEO weight? Has anyone here recently converted from parameter-based URLs to keyword-based slugs and seen results? Also, what are the best ways of deploying this? Add a canonical and 301? All comments greatly appreciated! Brett
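On the deployment question, a minimal sketch of the usual pattern, assuming a hypothetical Express app and an illustrative product-ID-to-slug lookup (neither is the poster's actual stack): 301 each legacy parameter URL to its slug URL, and have the slug page declare itself canonical.

```typescript
import express from "express";

const app = express();

// Hypothetical lookup from legacy product IDs to slugs; in practice this
// would come from the product database.
const slugById: Record<string, string> = {
  "1738": "this-is-product-one",
};

// 301 legacy parameter URLs (/product?p=1738) to the new slug URLs.
app.get("/product", (req, res, next) => {
  const id = String(req.query.p ?? "");
  const slug = slugById[id];
  if (slug) {
    res.redirect(301, `/product/${slug}`);
  } else {
    next(); // unknown ID: fall through to a 404 handler
  }
});

// The slug URL serves the product page; the key point for SEO is the
// rel="canonical" in its <head> pointing at itself.
app.get("/product/:slug", (req, res) => {
  res.send(
    `<link rel="canonical" href="https://www.example.com/product/${req.params.slug}" />`
  );
});

app.listen(3000);
```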
Intermediate & Advanced SEO | Brett-S
-
How can a recruitment company get 'credit' from Google when syndicating job posts?
I'm working on an SEO strategy for a recruitment agency. Like many recruitment agencies, they write tons of great unique content each month and, as agencies do, they post the job descriptions to job websites as well as their own. These job websites generally won't allow any linking back to the agency website from the post. What can we do to make Google realise that the originator of the post is the recruitment agency and that they deserve the 'credit' for the content? The recruitment agency has a low domain authority, so we're very much at the start of the process. It would be a damn shame if they produced so much great unique content but couldn't get Google to recognise it. Google's advice says: "Syndicate carefully: If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you'd prefer. However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. You can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content." But none of that can happen. Those big job websites just won't do it. A previous post here didn't get a sufficient answer. I'm starting to think there isn't an answer, other than having more authority than the websites we're syndicating to. Which isn't going to happen any time soon! Any thoughts?
Intermediate & Advanced SEO | Mark_Reynolds
-
Establishing if links are 'nofollow'
Wonder if any of you guys can tell me whether there is any other way to tell Google links are nofollow other than in the HTML (i.e. can you tell Google to nofollow every link in a subdomain or something)? I'm trying to establish whether a couple of links on a very high-ranking site are passing me PageRank or not, without asking them directly and looking silly! Within the source code for the page they are NOT tagged as nofollow at present. Hope that all makes sense 😉
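On the "other than in the HTML" part: nofollow can also be declared page-wide or server-wide rather than per link, via a robots meta tag or an X-Robots-Tag HTTP header, so a link can be nofollowed even when nothing shows on the anchor itself. A minimal sketch, assuming a hypothetical Express server applied to a whole subdomain:

```typescript
import express from "express";

const app = express();

// Apply nofollow to every response from this (hypothetical) subdomain
// via the X-Robots-Tag header, instead of rel="nofollow" on each link.
app.use((_req, res, next) => {
  res.set("X-Robots-Tag", "nofollow");
  next();
});

app.get("/", (_req, res) => {
  // A page-level alternative is <meta name="robots" content="nofollow">
  // in the <head>; either signal asks crawlers to treat all outgoing
  // links on the page as nofollow, with nothing visible per link.
  res.send("<html><head><title>Example</title></head><body>...</body></html>");
});

app.listen(3000);
```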
Intermediate & Advanced SEO | mat2015
-
Why isn't my site being indexed?
Hello... I have to say I feel like a newbie (I am, but now I really feel like it)... We worked with a client until January this year; they carried on on their own until September, when they contacted us again. Someone on the team that handled things while we were gone updated the robots.txt file to Disallow everything, for maybe 3 weeks before we were back in. Additionally, they were working on a different subdomain, the new version of the site, and of course they didn't block the robots on that one. So now the whole site has been duplicated, even its content; the exact same pages exist on the subdomain that was public while the other one was blocked. We came in, changed the robots.txt file on both servers, resubmitted all the sitemaps, shared our URL on Google+... everything the book says... but the site is not getting indexed. It's been 5 weeks now and no response whatsoever. We were highly positioned on several important keywords and now that's gone. I know you guys can help; any advice will be highly appreciated. Thanks, Dan
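A small verification sketch for this situation (hostnames are placeholders; assumes Node 18+ for the built-in fetch): confirm that neither host still serves a blanket Disallow before waiting any longer on reindexing.

```typescript
// Quick check that neither host still serves a blanket "Disallow: /".
// Hostnames are placeholders; Node 18+ provides fetch globally.
const hosts = ["https://www.example.com", "https://new.example.com"];

async function checkRobots(): Promise<void> {
  for (const host of hosts) {
    const res = await fetch(`${host}/robots.txt`);
    const body = await res.text();
    // Flag any bare "Disallow: /" line, which blocks all crawling.
    const blocksEverything = /^\s*Disallow:\s*\/\s*$/im.test(body);
    console.log(`${host}: ${blocksEverything ? "still blocking all crawling!" : "OK"}`);
  }
}

checkRobots().catch(console.error);
```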
Intermediate & Advanced SEO | daniel.alvarez
-
After reading about Google's so-called "over-optimization" penalty, is there a penalty for changing title tags too frequently?
In other words, does title tag change frequency hurt SEO? After changing my title tags, I noticed a steep decline in impressions, but an increase in CTR and rankings. I'd like to change the title tags once again to try to regain impressions. Is there any penalty for changing title tags too often? From SEO forums online, there seems to be a bit of confusion on this subject...
Intermediate & Advanced SEO | Felix_LLC
-
Page Titles of Blog
Hi, should all the page titles of our blog include a keyword (or keywords) and/or our website name?
Intermediate & Advanced SEO | Studio33
-
Canonical Meta Tag
Can someone explain how this works and how necessary it is? For example, I have a new client who is ranking WITHOUT the www in their domain, but they already have a good deal of backlinks that include the www. When I set up Google Webmaster Tools I created two profiles, one for the www version and one for the non-www version, and there are different numbers of backlinks for each. I have no idea what to do about this or whether I should even do anything. Thanks
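For background: a canonical tag tells search engines which URL is the preferred version when several URLs serve the same content. For the www/non-www split, a minimal sketch of the common fix, assuming a hypothetical Express server and the placeholder domain example.com: 301 the www host to the non-www host (the version that currently ranks) and point each page's canonical at the non-www URL.

```typescript
import express from "express";

const app = express();

// 301 any www request to the non-www host, so backlinks to either
// version consolidate on one URL. "example.com" is a placeholder.
app.use((req, res, next) => {
  const host = req.headers.host ?? "";
  if (host.startsWith("www.")) {
    res.redirect(301, `https://${host.slice(4)}${req.originalUrl}`);
  } else {
    next();
  }
});

app.get("/", (_req, res) => {
  // Each page also points its canonical at the non-www URL.
  res.send('<link rel="canonical" href="https://example.com/" />');
});

app.listen(3000);
```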
Intermediate & Advanced SEO | TheGrid
-
Best solution to get masses of URLs out of the search engines' index
Hi, I've got an issue where our web developers have made a mistake on our website by messing up some URLs. Because our site works dynamically (i.e. the URLs generated on a page are relative to the current URL), the problem URLs linked out to more problem URLs, effectively replicating an entire website directory under the problem URLs. This has put tens of thousands of URLs into the search engines' indexes which shouldn't be there. Say, for example, the problem URLs look like www.mysite.com/incorrect-directory/folder1/page1/. It seems I can correct this in one of the following ways:
1. Use robots.txt to disallow access to /incorrect-directory/*
2. 301 the URLs one-to-one, like this:
www.mysite.com/incorrect-directory/folder1/page1/
301 to:
www.mysite.com/correct-directory/folder1/page1/
3. 301 all the problem URLs to the root of the correct directory, like this:
www.mysite.com/incorrect-directory/folder1/page1/
www.mysite.com/incorrect-directory/folder1/page2/
www.mysite.com/incorrect-directory/folder2/
301 to:
www.mysite.com/correct-directory/
Which method do you think is the best solution? I doubt there is any link juice benefit from 301'ing the URLs, as there shouldn't be any external links pointing to the wrong URLs.
Intermediate & Advanced SEO | James77
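For option 2 in the question above, a minimal sketch of the one-to-one 301 (hypothetical Express server; the mysite.com paths are taken from the question). One caveat worth noting: a robots.txt disallow on /incorrect-directory/ would stop crawlers from ever seeing these redirects, so already-indexed URLs could linger in the index.

```typescript
import express from "express";

const app = express();

// Option 2 from the question: map each /incorrect-directory/ URL to its
// /correct-directory/ equivalent with a permanent redirect, so search
// engines can recrawl the bad URLs and swap them for the right ones.
app.get("/incorrect-directory/*", (req, res) => {
  const correctPath = req.path.replace(
    "/incorrect-directory/",
    "/correct-directory/"
  );
  res.redirect(301, correctPath);
});

app.listen(3000);
```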