Strange client request
-
I have a client who attends an internet marketing meetup. I have been once myself. Good group of people, but most seem lost when it comes to SEO and can't tell black hat from white!
Well, today my client emailed me, and in the email she mentioned a "trick" involving title tags.
Client: "there is a trick to use with the title by putting keywords in quotes and parenthasis. I'm sure you know how to do that little trick. If we do it in the title and in the first few lines of the verbage it will soar us near the top and hopefully on the first page of Google."
a few sentences later
"We could use a tad more content on the first page ( with parantesis and quotes) to boost us up in the ratings. At least it is an easy trick to do."
I have never heard of this. Has anyone else? Please share your thoughts. It sounds completely bogus to me, but I'll be the first to admit that I don't know everything! However, I'd like to have more than just my opinion when I talk to my client.
Let me know what you think.
-
Thank you all for your input. I couldn't agree more with everyone. Like I said, I needed more points of view to bring to the table.
-
Bad bad idea!
As others have said, I suspect the theory here is to rank higher when people include quotation marks in their Google query.
In my opinion, the idea is bad for 3 reasons:
-
Hardly anyone searches like that these days. I do sometimes, but only when the results without "" fail to return what I need, or when I'm doing specific research (intitle:" " etc.). Not many 'normal' users search this way.
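For anyone unfamiliar, these are the kinds of power-user queries being discussed (the phrase itself is just a placeholder):

```
"cheap paris hotels"           <- quotes force an exact-phrase match
intitle:"cheap paris hotels"   <- restricts results to pages with that phrase in the title
```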
-
From a user perspective it doesn't make sense. In body content it would look very odd and unprofessional (unless you are actually citing a quote!). Moreover, using " " marks in the title tag is a bad idea: you only get a few characters for your title tag, so take FULL advantage of each one! I don't mean over-optimise keywords here either, but as well as having your primary keyword in there, use the title tag to help turn would-be visitors into actual visitors. Quotation marks in your title tag eat up space you could be putting to work, making them a bad idea.
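To illustrate the wasted-characters point, here's a rough sketch in Python. The titles and the 60-character cutoff are made up for illustration (Google's actual truncation is pixel-based and varies), but the arithmetic makes the trade-off concrete:

```python
# Hypothetical titles: a clean one versus one using the "quotes and
# parentheses" trick. Every quote or parenthesis spends a character
# of the roughly 60 that typically get displayed in the SERP.
plain = 'Cheap Paris Hotels | Reviews & Booking Tips | ExampleTravel'
tricked = '"Cheap Paris Hotels" (Reviews & Booking Tips) | ExampleTravel'

LIMIT = 60  # rough display cutoff, for illustration only

for title in (plain, tricked):
    wasted = sum(title.count(c) for c in '"()')  # chars spent on punctuation
    over = 'OVER the cutoff' if len(title) > LIMIT else 'fits'
    print(f'{len(title)} chars ({over}), {wasted} spent on quotes/parens: {title}')
```

The "tricked" version is longer, pushes the title past the cutoff, and spends four characters on punctuation that persuades nobody to click.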
-
It's a pretty blatant attempt to manipulate results, something the big G would likely not approve of... Ask your client if they want to gamble their online presence on something designed to 'trick' Google. If they're promoting a crappy $7 affiliate product I could maybe understand them being that silly, but if they want a long-term online business... nah!
Kinda makes me wonder who suggested this to them! Did they enter a time warp at that meetup and go back to 2001?!
-
-
Sounds bogus to me. Any time I hear something that sounds too good to be true, I typically ask the presenter for data/proof behind the statement. Nine times out of ten, they won't have it or will "email it" after the presentation. The tenth time, it's a single outlier example that can't be replicated for some reason.
-
Keri just nailed it.
You will actually hear a lot of crap in places like that. I've been to events where speakers just talk crap: stuff they don't even test, just "heard" or made up.
-
If it worked, we'd all be seeing text littered with odd quotes and parentheses, correct?
-
Google does allow people to search for exact phrases that way, but betting that visitors will type your keywords exactly as written is not a good idea. Stuffing your copy with "whatever" or (don't do it) won't help, because people just don't search that way nearly as often as they type something unique into Google.
From a grammar standpoint, content should be as user-friendly as possible, and unnecessary quotes and parentheses are not user-friendly to me.
Thomas
-
To be honest, it sounds bogus. I've never heard of it, and just from a user standpoint, I'd imagine that would be annoying. Let's try that sentence again with what was suggested...
To be "honest" (it sounds bogus); I've never "heard" of it (and just from a user standpoint); I'd imagine that would be "annoying".
Not saying those are the keywords, but how annoying is that sentence to read? From a grammar standpoint, it's giving me chills. Anything in quotes hints at something other than what it literally says... what are we talking "about"? I hate reading through paragraphs where people use quotes out of context. Here's a great example of what I'm talking about: what does this sign mean to you? http://static.guim.co.uk/sys-images/Admin/BkFill/Default_image_group/2012/2/10/1328896276621/cheese-burgers-sign-on-sm-007.jpg Is it cheese or not? Not sure, but I don't want that burger!
-
Unfortunately, the client's "trick" is aimed at attracting exact matches of the words in quotes, not the normal broad search terms that include those keywords. I think it's a very bad idea to implement.