Strange client request
-
I have a client who attends an internet marketing meetup. I have been once myself. Good group of people, but most seem lost when it comes to SEO and can't tell black hat from white!
Well, today my client emailed me, and in the email she mentioned a "trick" involving title tags.
Client: "there is a trick to use with the title by putting keywords in quotes and parenthasis. I'm sure you know how to do that little trick. If we do it in the title and in the first few lines of the verbage it will soar us near the top and hopefully on the first page of Google."
A few sentences later:
"We could use a tad more content on the first page ( with parantesis and quotes) to boost us up in the ratings. At least it is an easy trick to do."
I have never heard of this. Has anyone else? Please share your thoughts. It sounds completely bogus to me, but I will be the first to admit that I don't know everything! However, I would like to have more than just my opinion when I talk to my client.
Let me know what you think.
-
Thank you all for your input. I couldn't agree more with everyone. Like I said, I needed more points of view to bring to the table.
-
Bad bad idea!
As others have said, I suspect the theory here is to rank higher when people use quotation marks in their Google query.
In my opinion, the idea is bad for three reasons:
-
Hardly anyone searches like that these days. I do sometimes, but only when the results without "" fail to return what I need, or when I'm doing specific research (intitle:" " etc.). Not many 'normal' users search this way.
-
From a user perspective it doesn't make sense. In the body of your content it would look very odd and unprofessional (unless you are citing an actual quote!). Using quotation marks in the title tag is an even worse idea: you only get a few characters for your title tag, so take FULL advantage of each one. I don't mean over-optimise keywords here either; as well as including your primary keyword, use the title tag to help turn would-be visitors into actual visitors. Quotation marks in your title tag only reduce the space you have to work with.
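A quick way to see the cost: titles get truncated in search results after roughly 60 characters (the real cutoff is pixel-based and changes over time, so treat that number as an approximation). Here is a hypothetical sketch of the character budget; both example titles and the function name are made up for illustration:

```javascript
// Rough sketch of the title-tag character budget. TITLE_BUDGET is an
// approximation of what Google typically displays, not an exact rule.
const TITLE_BUDGET = 60;

function titleReport(title) {
  // Count the characters spent on quotes and parentheses.
  const wasted = (title.match(/["()]/g) || []).length;
  return {
    length: title.length,
    wastedOnPunctuation: wasted,
    remaining: TITLE_BUDGET - title.length,
  };
}

const plain = 'Cheap Hotels in Paris | Expert Reviews & Booking Tips';
const tricked = '"Cheap Hotels" (in Paris) | "Expert Reviews" (& Tips)';

console.log(titleReport(plain));    // zero characters spent on the "trick"
console.log(titleReport(tricked));  // eight characters spent on quotes/parens
```

Every quote or parenthesis is a character that could have gone toward the keyword or a compelling call to action.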
-
It's a pretty blatant attempt to manipulate results, something the big G would likely not approve of... Ask your client if they want to gamble their online presence on something designed to 'trick' Google. If they were promoting a crappy $7 affiliate product, I could maybe understand them being that silly, but if they want a long-term online business... nah!
Kinda makes me wonder who suggested this to them! Did they enter a time-warp when they went into the meeting, going back to 2001?!
-
-
Sounds bogus to me. Any time I hear something that sounds too good to be true, I typically ask the presenter for data or proof behind the statement. Nine times out of ten, they won't have it or will "email it" after the presentation. The tenth time, it's a single outlier example that can't be replicated for some reason.
-
Keri just nailed it.
You will hear a lot of crap in places like that. I've been to events where speakers just talk crap: stuff they haven't even tested, just "heard" or made up.
-
If it worked, we'd all be seeing text full of odd quotes and parentheses, correct?
-
Google does let people search for exact phrases that way, but banking on visitors typing your keywords exactly as written is not a good idea. Using "whatever" or (don't do it) in your copy won't help, because people just don't search like that nearly as often as they type something unique into Google.
From a grammar standpoint, content should be as user-friendly as possible, and unnecessary quotes and parentheses are not user-friendly to me.
Thomas
-
To be honest, it sounds bogus. I've never heard of it, and just from a user standpoint, I'd imagine that would be annoying. Let's try that sentence again with what was suggested...
To be "honest" (it sounds bogus); I've never "heard" of it (and just from a user standpoint); I'd imagine that would be "annoying".
Not saying those are the keywords, but how annoying is that sentence to read? From a grammar standpoint, it's giving me chills. Anything in quotes hints at something other than what it is... what are we talking "about"? I hate reading through paragraphs where people use quotes out of context. Here's a great example of what I'm talking about. What does this sign mean to you? http://static.guim.co.uk/sys-images/Admin/BkFill/Default_image_group/2012/2/10/1328896276621/cheese-burgers-sign-on-sm-007.jpg Is it cheese or not? Not sure, but I don't want that burger!
-
Unfortunately, the client's trick is aimed at attracting exact matches of the words in quotes, not the normal broad search terms that include your keywords. I think it's a very bad idea to implement.
Related Questions
-
Should You Link Back from Client's Website?
We had a discussion in the office today about whether it can help or hurt you to link back to your site from one that you optimize, host, or manage. A few ideas that were mentioned:
HURT:
1. The website is not directly related to your niche, therefore Google will treat it as a link exchange or spammy link.
2. Links back to you are often not surrounded by related text about your services, and look out of place to users and search engines.
HELP:
1. On good (higher PR, reputable domain) domains, a link back can add authority, even if the site is not directly related to your services.
2. Allows high-ranking sites to show users who the provider is, potentially creating a new client, and a followed incoming link on anchor text you can choose.
So, what do you think? Test results would be appreciated, as we are trying to get real data. Benefits and cons if you have an opinion.
White Hat / Black Hat SEO | David-Kley
-
Negative SEO from Spammers Killing Client Rankings
Hi - I have identified a client website which was: a) hacked, with several fraudulent pages added (e.g. www.xxx.com/images/uggaustralia.html) containing 301 redirect links to other fraudulent websites; and b) hit with an auto-generated backlink campaign (over 12k backlinks at present) with anchor text targeted at "cheap ugg boots", "ugg sale", etc. I've removed the dodgy redirect web pages and also undertook a link audit using Google WMT, OSE and Majestic SEO, and have disavowed all the spammy links at domain level. Consequently my client has dropped from the top three for the key phrase to #9. Google WMT now sees "ugg boots uk", "ugg boots sale", etc. as some of the most popular anchor text for the site, even though it's blatantly obvious that the site has nothing to do with Ugg boots. No manual webspam penalties are in place; however, the auto-generated anchor text campaign is still ongoing and is generating more spammy links back to non-existent web pages, which Google still appears to be picking up. Question is: how long do you reckon it will take for the links to disappear, and is there anything I can do to speed Google along, as this issue is not of my making? P.S. For the record, I've found at least 500 sites that have been targeted by this same campaign as well.
White Hat / Black Hat SEO | Door4seo
-
Advanced Outside Perspective Requested to Combat Negative SEO
**Situation:** We are a digital marketing agency that has been doing SEO for 6 years. For many years we maintained exceptional rankings and online visibility. However, I suppose with great rankings comes great vulnerability. Last year we became the target of a pretty aggressive and malicious negative SEO campaign from another SEO (or SEOs) in our industry - I'm assuming they're competitors. Overnight, 10,000+ links were built on various spam domains using anchor text such as: negative marketing services, poor seo, butt crack, kickass... and more (see attached image). The issues we face are:
Time Investment - an enormous investment of time and energy to contact each web admin for link removal.
Hard to Keep Up - when we think we're getting somewhere, new links come out of the woodwork.
Disavow Doesn't Work - though we've generally tried to avoid the disavow tool, we've had to use it for a few domains. However, it's difficult to say how much effect, if any, it's had on the negative links.
As you can imagine, we've seen an enormous drop in organic traffic since this all started. It's unfortunate that SEO has come to this point, but I still see a lot of value in what we do and hope that spammers don't completely ruin it for us one day. Moz Community - I come to you seeking new insight, advice, similar experiences or anything else that may help! Are there any other agencies that have experienced the same issue? Any new ways to combat really aggressive negative SEO link building? Thanks everyone!
White Hat / Black Hat SEO | ByteLaunch
-
What is your SEO agency doing in terms of link building for clients?
What are you or your SEO agency doing for your clients' link building efforts? What are you (or the agency) doing yourself, out-sourcing, or having the client do for link building? If a new client needs some serious link building done, what do you prescribe and implement straight off the bat? What are your go-to link building tactics for clients? What are the link building challenges faced by your agency in 2013/2014? What's working for your agency and what's not? Does your agency work closely with the client's marketing department to gain link traction? If so, what are you collaborating on? What else might you be willing to share about your agency's link building practices? Thanks
White Hat / Black Hat SEO | Martin_S
-
Panda Recovery: Is a reconsideration request necessary?
Hi everyone, I run a 12-year-old travel site that primarily publishes hotel reviews and blog posts about ways to save when traveling in Europe. We have a domain authority of 65 and lots of high-quality links from major news websites (NYT, USA Today, NPR, etc.). We always ranked well for competitive searches like "cheap hotels in Paris," etc., for many, many years (like 10 years). Things started falling two years ago (April 2011). I thought it was just normal algorithmic change and that our pages were being devalued (and perhaps they were), so we continued to bulk up our reviews and other key pages, only to see things continue to slide. About a month ago I lined up all of our inbound search traffic from Google Analytics and compared it to SEOmoz's timeline of Google updates. It turns out that every time there was a Panda roll-out (from the second one in April 2011) our traffic tumbled. Other updates (Penguin, etc.) didn't seem to make a difference. But why should the content we invest so much in take a hit from Panda? It wasn't "thin." But thin content existed elsewhere on our site: we had a flights section with 40,000 pages cranked out of our database with virtually no unique content. We had launched that section in 2008 and it had never been an issue (it had mostly been ignored), but now, I believed, it was working against us. My understanding is that any thin content can work against the entire site's rankings. In summary: we had 40,000 thin flights pages, 2,500 blog posts (rich content), and about 2,500 hotel-related pages (rich, well-researched "expert" content). So, two weeks ago we dropped almost the entire flights section. We kept about 400 pages (of the 40,000) with researched, unique and well-written information, and we 410'd the rest. Following the advice of so many others on these boards, we put the "thin" flights pages in their own sitemap so we could watch their index number fall in Webmaster Tools.
And we watched (with some eagerness and trepidation) as the error count shot up. Google has found about half of them at this point. Last week I submitted a "reconsideration request" to Google's spam team. I wasn't sure if this was necessary (as the whole point of dropping the pages, 410'ing and so forth was to fix it on our end, which would hopefully filter down through the SERPs eventually). However, I thought it was worth sending them a note explaining the actions we had taken, just in case. Today I received a response from them. It includes: "We reviewed your site and found no manual actions by the webspam team that might affect your site's ranking in Google. There's no need to file a reconsideration request for your site, because any ranking issues you may be experiencing are not related to a manual action taken by the webspam team. Of course, there may be other issues with your site that affect your site's ranking. Google's computers determine the order of our search results using a series of formulas known as algorithms. We make hundreds of changes to our search algorithms each year, and we employ more than 200 different signals when ranking pages. As our algorithms change and as the web (including your site) changes, some fluctuation in ranking can happen as we make updates to present the best results to our users. If you've experienced a change in ranking which you suspect may be more than a simple algorithm change, there are other things you may want to investigate as possible causes, such as a major change to your site's content, content management system, or server architecture. For example, a site may not rank well if your server stops serving pages to Googlebot, or if you've changed the URLs for a large portion of your site's pages..." And thus, I'm a bit confused. If they say that there wasn't any manual action taken, is that a bad thing for my site? 
Or is it just saying that my site wasn't under a manual penalty, while Panda may still have penalized us (through a drop in rankings), since Panda isn't considered "manual"? Could the 410'ing of 40,000 thin pages actually raise some red flags? And finally, how long do these issues usually take to clear up? Pardon the very long question and thanks for any insights. I really appreciate the advice offered in these forums.
White Hat / Black Hat SEO | TomNYC
-
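For what it's worth, the 410 approach described above boils down to a simple routing rule. Here is a hypothetical sketch; the path prefix, function name, and kept-page list are all made up for illustration, not the poster's actual setup:

```javascript
// Hypothetical sketch of retiring thin pages with HTTP 410 (Gone):
// everything under the old thin-content prefix returns 410 unless it is
// one of the rewritten pages that was deliberately kept.
const RETIRED_PREFIX = '/flights/';

function statusFor(path, keptPages) {
  if (path.startsWith(RETIRED_PREFIX) && !keptPages.has(path)) {
    return 410; // the page is gone for good, not just temporarily missing
  }
  return 200;
}

const kept = new Set(['/flights/paris-to-rome.html']);

console.log(statusFor('/flights/anywhere-cheap.html', kept)); // 410
console.log(statusFor('/flights/paris-to-rome.html', kept));  // 200
console.log(statusFor('/hotels/paris.html', kept));           // 200
```

A 410 is a somewhat stronger signal than a 404 that the removal is intentional, which is presumably why the poster chose it; the separate sitemap then just makes the de-indexing visible in Webmaster Tools.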
Is my competitor up to no good? Strange site-explorer results.
I'm researching a competitor using Site Explorer and the SEOmoz toolbar and getting some strange results. When you search by the domain name in Site Explorer you get no results, but the toolbar shows 170K incoming links. http://www.opensiteexplorer.org/links?site=www.augustagreenlawns.com I noticed the top referring page was a strange internal URL, so I ran that through Site Explorer and discovered 19 links. When you put the strange link in a browser, it redirects to the home URL. At that URL the toolbar shows 220 links and SEOmoz shows 19. http://www.augustagreenlawns.com/?xid_78e7f=0f2a64344c8de6bdf2d8cdf8de93ea5c http://www.opensiteexplorer.org/links?site=www.augustagreenlawns.com%2F%3Fxid_78e7f%3D0f2a64344c8de6bdf2d8cdf8de93ea5c What is up with that URL? What are they doing? This is a site ranking #1 for my local search term even though he has about 50 pages of almost duplicate content. See link below. I'm really scratching my head here. http://www.augustagreenlawns.com/home.php?all=categories
White Hat / Black Hat SEO | dwallner
-
Dust.js Client-side JavaScript Templates & SEO
I work for a commerce company and our IT team is pushing to switch our JSP server-side templates over to client-side templates using a JavaScript library called Dust.js Dust.js is a JavaScript client-side templating solution that takes the presentation layer away from the data layer. The problem with front-end solutions like this is they are not SEO friendly because all the content is being served up with JavaScript. Dust.js has the ability to render your client-side content server-side if it detects Google bot or a browser with JavaScript turned off but I’m not sold on this as being “safe”. Read about Linkedin switching over to Dust.js http://engineering.linkedin.com/frontend/leaving-jsps-dust-moving-linkedin-dustjs-client-side-templates http://engineering.linkedin.com/frontend/client-side-templating-throwdown-mustache-handlebars-dustjs-and-more Explanation of this: “Dust.js server side support: if you have a client that can't execute JavaScript, such as a search engine crawler, a page must be rendered server side. Once written, the same dust.js template can be rendered not only in the browser, but also on the server using node.js or Rhino.” Basically what would be happening on the backend of our site, is we would be detecting the user-agent of all traffic and once we found a search bot, serve up our web pages server-side instead client-side to the bots so they can index our site. Server-side and client-side will be identical content and there will be NO black hat cloaking going on. The content will be identical. But, this technique is Cloaking right? From Wikipedia: “Cloaking is a SEO technique in which the content presented to the search engine spider is different from that presented to the user's browser. This is done by delivering content based on the IP addresses or the User-Agent HTTP header of the user requesting the page. 
When a user is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page, or that is present but not searchable." Matt Cutts on Cloaking http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355 Like I said, our content will be the same, but if you read the very last sentence from Wikipedia, it's the "present but not searchable" part that gets me. If our content is the same, are we cloaking? Should we be developing our site like this for ease of development and performance? Do you think client-side templates with server-side solutions are safe from getting us kicked out of search engines? Thank you in advance for ANY help with this!
White Hat / Black Hat SEO | Bodybuilding.com
-
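The detection step the question describes (same template, rendered server-side only for crawlers) might look something like the sketch below. This is hypothetical, not Bodybuilding.com's or LinkedIn's actual code; the bot pattern and function names are illustrative:

```javascript
// Hypothetical sketch of user-agent detection for server-side rendering.
// The bot pattern is illustrative and incomplete; real sites typically
// maintain a longer list or verify crawler IPs as well.
const BOT_PATTERN = /googlebot|bingbot|slurp|duckduckbot|baiduspider/i;

function isSearchBot(userAgent) {
  return BOT_PATTERN.test(userAgent || '');
}

// The same Dust-style template would be rendered in two places: on the
// server for bots, in the browser for everyone else. Crucially, both
// paths must produce identical content, or this becomes cloaking.
function chooseRenderPath(userAgent) {
  return isSearchBot(userAgent) ? 'server-side' : 'client-side';
}

console.log(chooseRenderPath('Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)')); // 'server-side'
console.log(chooseRenderPath('Mozilla/5.0 (Windows NT 10.0) Chrome/99.0'));                                // 'client-side'
```

On the Wikipedia definition quoted above, cloaking hinges on the content being different; if both render paths emit identical markup, the usual argument is that this is pre-rendering rather than cloaking, but that is exactly the claim the poster is asking the community to sanity-check.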
I'm worried my client is asking me to post duplicate content, am I just being paranoid?
Hi SEOMozzers, I'm building a website for a client that provides photo galleries for travel destinations. Right now the website is basically a collection of photo galleries. My client believes Google might like us a bit more if we had more "text" content, so my client has been sending me content that is provided free by tourism organizations (tourism organizations will often provide free "one-pagers" about their destination for the media). My concern is that if this content is free, it seems likely that other people have already posted it somewhere on the web, and I'm worried Google could penalize us for posting content that already exists elsewhere. I know that conventionally there are ways around this: you can tell crawlers that certain content shouldn't be crawled. But in my case, we are specifically trying to produce crawlable content. Do you think I should advise my client to hire some bloggers to produce the content, or am I just being paranoid? Thanks everyone. This is my first post to the Moz community 🙂
White Hat / Black Hat SEO | steve_benjamins