Does an apostrophe affect searches?
-
Does Google differentiate between keyphrase structures such as "Mens Sunglasses" and "Men's Sunglasses"? I.e., does the inclusion/exclusion of an apostrophe make any difference when optimising your main keyword/phrase for a page?
Keyword Explorer appears to give different results - i.e., no data for "Men's Sunglasses", but data appears for "Mens sunglasses". So if I optimise my page to include the apostrophe, will it hurt that page's potential success?
Thanks
Bob
-
Hi there!
Search engines have gotten smarter in the past few years and should be able to determine that the keyword (with AND without an apostrophe) means the same thing. I wouldn't worry about the keywords you're tracking and the keywords you're using within your content if the only difference is the apostrophe usage.
-
Thank you Rob, that really helps!
-
Hi Bob,
Search engines are pretty good about grammar and punctuation. As a general rule, rankings aren't impacted by your use (or lack thereof) of punctuation in your titles, content, etc.
The only time punctuation in your content might make a difference is if it alters the meaning of the sentence. An example I like to use involves the title of an article on Rachael Ray which reads, "Rachael Ray finds inspiration in cooking her family and her dog" which, with proper punctuation, would read, "Rachael Ray finds inspiration in cooking, her family, and her dog". Obviously, these two sentences mean completely different things, and Google will pick up on that based on your content.
The reason Keyword Explorer is kicking back no data for the search term with the apostrophe is likely that no one uses the apostrophe when conducting their search. As an example, if I am looking for a plumber in New York City, I'm going to search "plumber new york city" or "plumber nyc" rather than "I am looking for a plumber in New York City, New York".
To answer your question directly:
No - using apostrophes in your content will in no way impede your ability to rank for the keyword in your example. Your content gives Google context, and Google will interpret that context and rank your pages accordingly.
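If you also track keywords with your own scripts, you can normalize the variants yourself so that "Men's Sunglasses" and "Mens sunglasses" collapse into a single entry before you compare volumes. A minimal Python sketch (the function name and sample phrases are illustrative, not any particular tool's API):

```python
import re

def normalize_keyword(phrase: str) -> str:
    """Lowercase a phrase and strip apostrophes (straight, curly, backtick)
    so keyword variants compare as equal."""
    phrase = phrase.lower()
    phrase = re.sub(r"['\u2019`]", "", phrase)    # drop apostrophe-like characters
    phrase = re.sub(r"\s+", " ", phrase).strip()  # collapse stray whitespace
    return phrase

print(normalize_keyword("Men's Sunglasses"))   # mens sunglasses
print(normalize_keyword("Mens  sunglasses"))   # mens sunglasses
```

Dedupe a tracked-keyword list on the normalized form and the apostrophe variants stop showing up as separate rows.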
Hope this helps to explain what's going on. Let me know if you have any follow-up questions.
Cheers,
Rob
Related Questions
-
Australian search - ZERO visibility and stumped
Fair warning, this is going to be long, but necessary to explain the situation and what has been done. I will take ANY suggestions, even if I have tried them already. We have a sister site in Australia, targeting Australian traffic. I have inherited what seems to be an incredible rat's nest. I've fixed over two dozen issues, but still haven't seemed to address the root cause. NOTE: core landing pages have weak keyword targeting, so I don't expect much until I fix that. The main issues I'm trying to resolve first are the unusual US-based targeting and the homepage's inability to rank for anything. The site is www[dot]castleford[dot]com[dot]au. Here's the rundown on what's going on:
Problems:
- The site ranks for four times as many keywords in the US as it does in Australia.
- The site ranks on the first page for a grand total of 5 AU keywords.
- The homepage, while technically optimized on-page for "content marketing agency" and with content built through MarketMuse, has historically ranked between 60 and 100, despite a fairly strong DA and fairly weak competitors (based on Ahrefs and Moz keyword difficulty). Oddly, the ranking has jumped to 5-7 for three-day spurts over the past year.
- Infrequent indexing of the homepage (it used to be every 2-3 weeks; I've gotten that down to 1 week).
Sequence of events:
- November 2017: they changed some URLs, on the blog and on the top-nav landing pages. Redirects seem okay.
- November 2017: a substantial number of lost referring domains; not many seem to be quality.
- January 2018: the total number of AU ranking keywords more than halved.
- May/June 2018: added a sitewide followed inbound link to the homepage from an external site they created - 20k inbound links with the same anchor text, on a site with 24k inbound links in total.
- July-September 2018: the total number of US ranking keywords halved.
- November 10: I walked into this mess.
What's been done:
- Cut page load time substantially (it was around 20 seconds).
- Created a sitemap (100-entry batching) and submitted it to GSC.
- Improved the MarketMuse score for the homepage.
- Changed the language from "en-US" to "en-AU".
- Fetch and render: content is all crawlable and indexed properly.
- Changed the site architecture for top-nav core landing pages to establish a clear hierarchy.
- Created all property versions in GSC: non-www and www HTTP, plus non-www and www HTTPS.
- Site crawl: a normal amount of 404s; nothing stands out as substantial.
- HTTP-to-HTTPS redirect okay.
- Robots.txt updated and okay.
- Checked GSC international targeting; confirmed AU.
- No manual links penalty.
I'm clearly stumped and could use some insights. Thanks to everyone in advance, if you can find time.
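As a side note on the "en-AU" change above: country/language targeting is usually reinforced with hreflang annotations in each page's head, alongside the GSC international targeting setting. A minimal sketch with placeholder URLs (not the actual site's markup):

```html
<!-- Placed in the <head> of every page; each page lists its own alternates. -->
<link rel="alternate" hreflang="en-au" href="https://www.example.com.au/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```

Note that the annotations must be reciprocal: each alternate has to link back to the others, or Google ignores them.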
Technical SEO | Brafton-Marketing -
Google Search Console - Sitemap
Hi all, quick question. I'm trying to update my sitemap via Google Search Console using a sitemap.xml file that I've created with Screaming Frog. However, when trying to submit it, it seems that Google only allows sitemaps located at a path within your domain (i.e. www.example.com/sitemap.xml), as opposed to being able to directly upload a sitemap.xml file. Is there any way that I can easily upload my sitemap.xml file? Or is there an easy way to upload the file to a path on my domain so I can submit it via the URL? Any insight would be much appreciated! Best, Sung
Technical SEO | hdeg -
Removing site subdomains from Google search
Hi everyone, I hope you are having a good week? My website has several subdomains that I had shut down some time back and pages on these subdomains are still appearing in the Google search result pages. I want all the URLs from these subdomains to stop appearing in the Google search result pages and I was hoping to see if anyone can help me with this. The subdomains are no longer under my control as I don't have web hosting for these sites (so these subdomain sites just show a default hosting server page). Because of this, I cannot verify these in search console and submit a url/site removal request to Google. In total, there are about 70 pages from these subdomains showing up in Google at the moment and I'm concerned in case these pages have any negative impacts on my SEO. Thanks for taking the time to read my post.
Technical SEO | QuantumWeb62 -
Google Search Console Site Map Anomalies (HTTP vs HTTPS)
Hi
I've just done my usual Monday morning review of a client's Google Search Console (previously Webmaster Tools) dashboard and was disturbed to see that for one client the Sitemaps section is reporting 95 pages submitted yet only 2 indexed (last time I looked, last week, it was reporting an expected level of indexed pages). It says the sitemap was submitted on the 10th March and processed yesterday. However, 'Index Status' shows a graph of growing indexed pages up to and including yesterday, where they numbered 112 (so it looks like all pages are indexed after all). Also, the 'Crawl Stats' section shows 186 pages crawled on the 26th.
It then lists sub-sitemaps, all of which are non-HTTPS (http), which seems very strange since the site has been HTTPS for a few months now and the main sitemap index URL is HTTPS: https://www.domain.com/sitemap_index.xml
The sub-sitemaps are:
- http://www.domain.com/marketing-sitemap.xml
- http://www.domain.com/page-sitemap.xml
- http://www.domain.com/post-sitemap.xml
There are no 'Sitemap Errors' reported, but there are 'Index Error' warnings for the above post-sitemap, copied below:
"When we tested a sample of the URLs from your Sitemap, we found that some of the URLs were unreachable. Please check your webserver for possible misconfiguration, as these errors may be caused by a server error (such as a 5xx error) or a network error between Googlebot and your server. All reachable URLs will still be submitted."
Also, for the below sitemap URLs: "Some URLs listed in this Sitemap have a high response time. This may indicate a problem with your server or with the content of the page":
- http://domain.com/en/post-sitemap.xml
- https://www.domain.com/page-sitemap.xml
- https://www.domain.com/post-sitemap.xml
I take it from all the above that the HTTPS sitemap is mainly fine, that despite the reported 0 pages indexed in the GSC sitemap section the pages are in fact indexed (as per the main 'Index Status' graph), and that somehow some HTTP sitemap elements have been accidentally attached to the main HTTPS sitemap and are causing these problems.
What's the best way forward to clean up this mess? Resubmitting the HTTPS sitemap sounds like the right option, but seeing as the master URL indexed is an HTTPS URL, I can't see it making any difference until the HTTP aspects are deleted/removed. How do you do that, or even check that that's what's needed? Or should Google just sort this out eventually?
I also see the graph in 'Crawl > Sitemaps > Web Pages' shows a consistent blue line of submitted pages, but the red line of indexed pages drops to 0 for 3-5 days every 5 days or so. So fully indexed pages are reported for 5-day stretches, then zero for a few days, then indexed for another 5 days, and so on!?
Many thanks
Dan
Technical SEO | Dan-Lawrence -
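For the sitemap situation described above, one quick check is to parse the sitemap index and flag any entries still on plain HTTP. A Python sketch using only the standard library (the inline XML and domains are placeholders mimicking the poster's setup, not their real files):

```python
import xml.etree.ElementTree as ET

# Sample sitemap index with mixed http/https entries (placeholder domains).
SITEMAP_INDEX = """<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://www.example.com/marketing-sitemap.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/post-sitemap.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/page-sitemap.xml</loc></sitemap>
</sitemapindex>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def insecure_entries(xml_text: str) -> list:
    """Return every <loc> URL in a sitemap index that still uses plain http."""
    root = ET.fromstring(xml_text)
    locs = [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]
    return [url for url in locs if url.startswith("http://")]

print(insecure_entries(SITEMAP_INDEX))
# ['http://www.example.com/marketing-sitemap.xml', 'http://www.example.com/post-sitemap.xml']
```

Any URL this flags points at a sitemap file that should be regenerated (or redirected) to its HTTPS equivalent before resubmitting the index in GSC.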
Does HTTPS Affect Inbound Link Numbers?
Hi All, I'm dealing with an internal IT staff that is trying to change an entire site to run on HTTPS instead of HTTP. The way they want things configured, all links pointing to HTTP URLs would redirect to the HTTPS. I'm assuming this would adversely affect page rank/domain authority, etc... am I right there? Thanks, Ben
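For reference, the configuration the IT staff describes is normally a single permanent (301) redirect at the server level, which is the pattern generally recommended for HTTPS migrations because a 301 passes link equity. A minimal nginx sketch with placeholder hostnames (the actual server setup may differ):

```nginx
# Catch all plain-HTTP requests and permanently redirect them to HTTPS.
# $request_uri preserves the original path and query string.
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://www.example.com$request_uri;
}
```

The key point is one hop: chains of redirects (http -> https -> www) dilute signals more than a single 301 to the final URL.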
Technical SEO | Ben_Alvord -
Can changing a host provider impact search rankings?
I was wondering if changing my host provider would impact my search rankings on the major search engines?
Technical SEO | bronxpad -
Subdomain mozTrust - can parked domains affect it?
Hi, I have my domain www.mydomain.com, and it has domain authority 26, domain mozRank around 3, domain mozTrust 1.63, page authority 31, Google PR 2.0, etc. So I am not at the very bottom of the scores, but my SUBDOMAIN MOZTRUST is only 0.961, and other websites I've made some time ago have it at around 4.0. So it is quite bad. I have some domains parked within my hosting package. They have different names like www.mydomain2.co.uk, www.mydomain3.com, etc. I can also access those domains by typing mydomain2.mydomain.com or mydomain3.mydomain.com, and I have some testing subdomains there as well (in case I need to test something like Drupal, WordPress, a shopping cart, etc.). Can that affect my subdomain rank? I have those domains parked there, and I've made some subdomains that are not in use and that nobody links to, yet they are visible in Google.
Technical SEO | sever3d -
Why are old versions of images still showing for my site in Google Image Search?
I have a number of images on my website with a watermark. We changed the watermark on all of our images in May, but when I search for my site getmecooking in Google Image Search, it still shows the old watermark (the old one is grey, the new one is orange). Is Google not updating the images in its search results because they are cached? Or because it downloaded them once and is now ignoring them? Should we be giving our images a version number at the end of the file name? Our website cache is set to 7 days, so that's not the issue. Thanks.
Technical SEO | Techboy