Numerals vs. Spelled-Out Numbers for Blog Titles
-
For your blog post titles, is it "better" to use numerals or write them out? For example, "3 Things I Love About People Answering My Constant Questions" or "Three Things I Love About People Answering My Constant Questions"?
I could see this being like the attorney/lawyer or ecommerce/e-commerce question and therefore not a big deal. But I also thought you should avoid using numbers in your URLs.
Any thoughts?
Ruben
-
Thanks everyone.
Happy Friday,
Ruben
-
When it comes to people, '3' is easier to process than 'three'. You may even get a minor speed boost when the CMS queries the database, though I haven't tested that, and it may only apply to URL structures like example.com/2014/08/28/sample-post or example.com/?p=123. Basically, the numeric version is easier for machines to process as well.
However, Michael Gray at Gray Wolf SEO has one good reason why you might want to avoid the numeral, and he definitely has a point. So keep the number in the title and leave it out of the URL, unless the information in the post - or the post itself - will never change.
There are plenty of other effective ways to increase page load speed, and that wasn't your concern anyway.
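As a rough illustration of the "numeral in the title, not in the URL" idea, here's a minimal sketch (Python; the function name and example are mine, not from any particular CMS):

```python
import re

def slugify(title: str, strip_leading_number: bool = True) -> str:
    """Build a URL slug from a post title.

    When strip_leading_number is True, a leading count like "3 " is dropped
    so the URL stays valid even if the list later grows or shrinks.
    """
    slug = title.lower()
    if strip_leading_number:
        slug = re.sub(r"^\d+\s+", "", slug)   # "3 things i love..." -> "things i love..."
    slug = re.sub(r"[^a-z0-9]+", "-", slug)   # everything non-alphanumeric becomes a hyphen
    return slug.strip("-")

print(slugify("3 Things I Love About People Answering My Constant Questions"))
# -> things-i-love-about-people-answering-my-constant-questions
```

The title keeps the easy-to-scan '3' for readers and searchers, while the URL carries no number that could go stale if the post is ever expanded.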
-
I would try to use both: maybe use one form in the title and the other (or both) in the body of the post, so that you can try to rank for both. I've done that a few times.
-
I would use the number: '3'
When searching, most people will use the number instead of writing it out.
-
Related Questions
-
I'm Pulling Hairs! - Duplicate Content Issue on 3 Sites
Hi, I'm an SEO intern trying to solve a duplicate content issue on three wine retailer sites. I have read the Moz blog posts and other helpful articles full of information on how to fix duplicate content. However, I have tried using canonical tags for duplicates and redirects for expiring pages on these sites, and it hasn't fixed the duplicate content problem. My Moz report indicates that we have thousands of duplicate content pages. I understand that it's a common problem among e-commerce sites, and the way we create landing pages and apply dynamic search results pages conflicts with our SEO progress. Sometimes we'll create landing pages with the same URLs as an older landing page that expired. Unfortunately, I can't get around this problem, since this is how customer marketing and recruitment manage their offers and landing pages. Would it be best to nofollow these expired pages or redirect them? I also tried self-referencing canonical tags, and canonical tags that point to the higher-authority page, on search results pages; even though that worked for some pages on the site, it didn't work for a lot of the other search results pages. Is there something we can do to these search results pages that will let Google understand that they are original pages? There are a lot of factors I can't change, and I'm concerned that the three sites won't rank as well and will also drive traffic that won't convert. I understand that Google won't penalize sites with duplicate content unless it's spammy. So if I can't fix these errors -- since the company I work for conducts business in a way that means we'll never run out of duplicate content -- is it worth moving on to other SEO priorities like keyword research and on/off-page optimization? Or should we really concentrate on fixing these technical issues before doing anything else? I'm curious to know what you think. Thanks!
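As a side note for anyone auditing the same kind of setup, here is a minimal sketch (Python; the URLs are hypothetical and the requests/BeautifulSoup libraries are assumed) of checking which canonical and robots signals a page actually declares:

```python
import requests
from bs4 import BeautifulSoup

def inspect_indexing_signals(url: str) -> dict:
    """Report the canonical URL and robots meta directives a page declares."""
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")

    canonical = soup.find("link", rel="canonical")
    robots = soup.find("meta", attrs={"name": "robots"})

    return {
        "status": resp.status_code,
        "canonical": canonical.get("href") if canonical else None,
        "robots": robots.get("content") if robots else None,
    }

# Hypothetical near-duplicate landing pages on one of the wine sites
for page in ["https://example.com/offers/summer-reds",
             "https://example.com/offers/summer-reds?sort=price"]:
    print(page, inspect_indexing_signals(page))
```

Duplicates meant to consolidate should point their canonical at the same preferred URL, while pages that should drop out of the index entirely generally need a noindex rather than just a nofollow.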
Algorithm Updates | drewstorys0
-
How does Google's "Temporarily remove URLs" tool in Search Console work?
Hi, we have created a new sub-domain with new content that we want to highlight for users, but our old content on a different sub-domain is still ranking at the top of Google results because of its reputation. How can we highlight the new content and suppress the old sub-domain in results? Many pages have similar title tags and other related information. We are planning to hide the old URLs using Google Search Console, so the new pages will slowly pick up the traffic. How does it work?
Algorithm Updates | vtmoz0
-
Question: About Google's personalization of search results and its impact on monitoring ranking results
Given Google's personalization of search results for anyone who's logged into a Google property, how realistic and how meaningful or worthwhile is it to monitor one's ranking results for any keyword term these days?
Algorithm Updates | RandallScrubs0
-
Does it matter? 404 vs. 302 > Page Not Found
Hey Mozers, what are your thoughts on this situation I'm stuck in? All input is welcome 🙂 I am in the middle of a massive domain migration to a new server, and we are also moving to a very clean, SEO-friendly URL structure. While I was parsing and cleaning up some old URLs, I stumbled upon a strange situation on my website. I have a bunch of "dead pages" that are 302'd to a "page not found" page - probably an old mistake by one of the past developers. (To clarify, the HTTP status code is not 404.) Should I try to fight to get all these "dead pages" returning a 404 status code, or could I just leave the temporary 302 redirect to "page not found", even though I know for a fact these pages are never going to come back?
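For what it's worth, a quick sketch (Python; the URLs are hypothetical) of auditing what those dead pages really return, since a 302 to a "page not found" page still answers with a redirect rather than a 404:

```python
import requests

dead_urls = [
    "https://example.com/old-page-1",
    "https://example.com/old-page-2",
]

for url in dead_urls:
    # allow_redirects=False exposes the raw 302 instead of silently following it
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(url, resp.status_code, resp.headers.get("Location", ""))
    # Pages that are gone for good would ideally answer 404 (or 410), not 302
```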
Algorithm Updates | rpaiva0
-
Wistia vs. YouTube
Hello, Mozzers! Sorry if I've missed a thread on this, but I didn't find anything after searching for a while... I've used Wistia for years - LOVE the service and the company! I had great luck getting rich snippets and ranked well... until the recent Google change. Now all of my Wistia thumbnails have disappeared (though my rankings have stayed strong, thank goodness!). My question is, does it make sense to now embed YouTube videos on our site and create a video sitemap with those pages, in the hope that Google will rank the page better than it otherwise would have, knowing that there is valuable (video) content on the page? This is for new videos; I'm not thinking of replacing my Wistia videos at this time. I'll probably need to clarify as I see your responses, since this is a tricky set of interrelated decisions. Thanks for any thoughts that anyone may have! 🙂 ~ Scott
Algorithm Updates | measurableROI1
-
How big is the effect of having your site hosted in the country you're targeting?
Other than having a ccTLD domain and assigning your target country in Google Webmaster Tools' "geotargeting" feature, how big is the effect of having your site hosted in the country you're targeting? Is it really necessary, or is it just a small signal? Thanks in advance! 🙂
Algorithm Updates | esiow20131
-
WordPress SEO Title and Tagline
Hello, I am using WordPress, and I also have a plugin called All in One SEO. I was wondering how I should set up my site title and tagline. The main keyword I am going after is "Baking"; others are Cooking, Teaching, and World Food.
Algorithm Updates | Cinfo10
-
Does a large number of thin-content pages indexed affect overall site performance?
Hello Community, a question on the negative impact of many virtually identical calendar pages being indexed. We have a site for a b2b software product. There are about 150 product-related pages, and another 1,200 or so short articles on industry-related topics. In addition, we recently (~4 months ago) had Google index a large number of calendar pages used for webinar schedules. This boosted the indexed pages number shown in Webmaster Tools to about 54,000. Since then, we "no-followed" the links on the calendar pages that allow you to view future months, and added "no-index" meta tags to all future month pages (beyond 6 months out). Our number of indexed pages seems to be dropping, and is now down to 26,000. When you look at Google's report showing pages appearing in response to search queries, a more normal 890 pages appear, and very few calendar pages show up in that report. So, the question that has been raised is: does a large number of pages in the search index with very thin content (basically blank calendar months) hurt the overall site? One person at the company said that because Panda/Penguin targeted thin-content sites, these pages would cause the performance of this site to drop as well. Thanks for your feedback. Chris
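A rough sketch of the "noindex anything more than six months out" rule described above (Python; the cutoff and markup are illustrative, not the site's actual code):

```python
from datetime import date

def robots_meta_for_month(month_start: date, today: date, horizon_months: int = 6) -> str:
    """Return a robots meta tag for a calendar month page.

    Months further out than the horizon get noindex so thin, nearly empty
    calendar pages stop piling up in the index.
    """
    months_ahead = (month_start.year - today.year) * 12 + (month_start.month - today.month)
    if months_ahead > horizon_months:
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'

print(robots_meta_for_month(date(2015, 6, 1), today=date(2014, 8, 1)))
# 10 months out -> <meta name="robots" content="noindex, follow">
```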
Algorithm Updates | cogbox0