Strange client request
-
I have a client who attends an internet marketing meetup. I have been once myself. Good group of people, but most seem lost when it comes to SEO and can't tell black hat from white hat!
Well, today my client emailed me, and in the email she mentioned a trick to use with title tags.
Client: "there is a trick to use with the title by putting keywords in quotes and parentheses. I'm sure you know how to do that little trick. If we do it in the title and in the first few lines of the verbiage it will soar us near the top and hopefully on the first page of Google."
a few sentences later
"We could use a tad more content on the first page (with parentheses and quotes) to boost us up in the ratings. At least it is an easy trick to do."
I have never heard of this. Has anyone else? Please share your thoughts. It sounds completely bogus to me, but I will be the first to admit that I don't know everything! However, I would like to have more than just my opinion when I talk to my client.
Let me know what you think.
-
Thank you all for your input. I couldn't agree more with everyone. Like I said, I needed more points of view to bring to the table.
-
Bad bad idea!
As others have said, I suspect the theory here is to rank higher when people use speech marks (exact-match quotes) in their Google query.
In my opinion, the idea is bad for 3 reasons:
-
Hardly anyone searches like that these days. I do sometimes, but only when results without quotation marks fail to return what I need, or when I'm doing specific research (intitle:"..." etc.). Not many 'normal' users search this way.
-
From a user perspective it doesn't make sense. In body content it would look very odd and unprofessional (unless you are actually citing a quote!). Using quotation marks in the title tag is an even worse idea: you only get a few characters for your title tag, so take FULL advantage of each one. I don't mean over-optimise with keywords here either, but as well as having your primary keyword in there, use the title tag to help turn would-be visitors into actual visitors. Quotation marks eat into the limited space you have, which makes them a bad idea.
-
It's a pretty blatant form of trying to manipulate results, something the big G would likely not approve of... Ask your client if they want to gamble their online presence on something designed to 'trick' Google. If they were promoting a crappy $7 affiliate product I could maybe understand them being that silly, but if they want a long-term online business... nah!
Kinda makes me wonder who suggested this to them! Did they enter a time-warp when they went into the meeting, going back to 2001?!
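The character-budget point in reason 2 can be sketched in code. Here is a minimal check, assuming a ~60-character display cutoff (a common rule of thumb; Google actually truncates titles by pixel width, so the exact number is an approximation):

```python
# Rough sketch: how much of a title tag survives the SERP display cutoff,
# and how many characters are wasted on quotes/parentheses instead of words.
# The 60-character limit is an assumption; Google measures pixels, not chars.

DISPLAY_LIMIT = 60

def wasted_chars(title: str) -> int:
    """Count characters spent on quotes and parentheses."""
    return sum(title.count(c) for c in '"()')

def check_title(title: str) -> str:
    truncated = len(title) > DISPLAY_LIMIT
    return (f"{len(title)} chars, {wasted_chars(title)} wasted on punctuation, "
            f"{'TRUNCATED' if truncated else 'fits'}")

print(check_title('Cheap Hotels in Paris | Example Travel Guide'))
print(check_title('"Cheap Hotels" (in Paris) | "Example" (Travel) "Guide" (Deals)'))
```

Run it on a quote-stuffed title and the waste is obvious: every quotation mark or parenthesis is a character that could have gone toward a keyword or a compelling call to action.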
-
-
Sounds bogus to me. Any time I hear something that is too good to be true, I typically ask the presenter for data/proof behind the statement. Nine times out of ten, they won't have it or will "email it" after the presentation. The tenth time it's a single outlier example that can't be replicated for some reason.
-
Keri just nailed it.
You will hear a lot of crap in places like that. I've been to events where speakers just talk nonsense: stuff they don't even test, just "heard" or made up.
-
If it worked, we'd all be seeing text littered with odd quotes and parentheses, correct?
-
Google does allow people to search exact phrases in that manner, but betting that enough people will type the identical quoted phrase is not a good idea. Stuffing "whatever" or (don't do it) into your copy won't help, because people just don't search that way nearly as often as they type something unique into Google.
From a readability standpoint, the copy should be as user-friendly as possible, and unnecessary quotes and parentheses are not user-friendly to me.
Thomas
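The exact-match point can be illustrated with a toy matcher. This is an assumption-level sketch of the difference between a quoted (verbatim) query and a broad query, not Google's actual retrieval logic:

```python
# Toy illustration: broad matching (terms anywhere, any order) vs
# exact-phrase matching (verbatim, in order). Not real search-engine logic.

def broad_match(query: str, doc: str) -> bool:
    """Every query term appears somewhere in the document."""
    doc_words = doc.lower().split()
    return all(term in doc_words for term in query.lower().split())

def exact_match(phrase: str, doc: str) -> bool:
    """The phrase appears verbatim, in order."""
    return phrase.lower() in doc.lower()

doc = "find cheap hotels near central paris"
print(broad_match("cheap paris hotels", doc))  # True: all terms present
print(exact_match("cheap paris hotels", doc))  # False: verbatim phrase absent
```

A page stuffed with quoted phrases only "wins" for the tiny minority of users who type that verbatim phrase in quotes; everyone else's broad query matches the terms anyway, so the punctuation buys nothing.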
-
To be honest, it sounds bogus. I've never heard of it, and just from a user standpoint, I'd imagine that would be annoying. Let's try that sentence again with what was suggested...
To be "honest" (it sounds bogus); I've never "heard" of it (and just from a user standpoint); I'd imagine that would be "annoying".
Not saying those are the keywords, but how annoying is that sentence to read? From a grammar standpoint, it's giving me chills. Anything in quotes is hinting at something other than what it is... what are we talking "about?" I hate reading through paragraphs where people use quotes out of context. Here's a great example of what I'm talking about: what does this sign mean to you, http://static.guim.co.uk/sys-images/Admin/BkFill/Default_image_group/2012/2/10/1328896276621/cheese-burgers-sign-on-sm-007.jpg? Is it cheese or not? Not sure, but I don't want that burger!
-
Unfortunately, the client's trick is aimed at attracting exact matches of the words in quotes, not your normal broad search terms that include the keywords. I think it's a very bad idea to implement.