Crowdsearch.me - Is this a legit approach?
-
It seems like a less-than-white-hat approach, and in any case I don't know whether it would actually work.
Does anyone have any advice about it?
Thanks!
-
Thanks for the heads up. I was really unsure about this as well, but I'm glad I saved my money by not buying into it!
Matt
-
Thanks for this page. I received an email about this today for my webstore www.arbeidslys.no. I won't spend money on something like this.
Preben Want
Manager
Arbeidslys.no
-
Terry Kyle has a report on his results with CrowdSearch:
http://seotraffichacks.com/crowdsearching-work-seo-results-far/
-
A nod from the wizard :0 - I'm counting this week as a good friggen week!
-
Thanks, Rand. It's kind of an honor to have you speaking up on my little question here!
It's probably predictable that someone (or more than one) would try to monetize this sort of trick, because of the Google pronouncements that you mentioned and the other articles that have appeared about CTR and time-on-site behavior.
Too bad. I guess that we all have to actually earn all those visits and page views.
-
Thanks, Ray. What you said confirms what I speculated - too good to be true. And not entirely above-board, either.
-
Totally agree with Ray that this isn't a legitimate tactic, nor would I expect it to work. Google's got a lot of defenses and checks to prevent manipulation of this kind, so while it could have an impact briefly and in some SERPs, I'd expect it to be mostly a waste of time and money.
The only part I'll disagree with is the claim that Google hasn't disclosed that they do (or rather "might") use pogo-sticking. I believe this was mentioned at a conference last year or in 2013, though I can't find the reference now. There's also lots of test evidence, including the experiment I ran live at MozCon, this one from my blog: http://moz.com/rand/queries-clicks-influence-googles-results/ (which I did repeat with success), and some mixed results from Darren Shaw here: http://www.slideshare.net/darrenshaw1/darren-shaw-user-behavior-and-local-search-dallas-state-of-search-2014.
Queries and clicks are most certainly impacting rankings, though how directly and with what caveats/other influences we don't yet know (and may never).
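For anyone who wants to sanity-check this sort of test themselves, here's a rough sketch of how you might summarize a click experiment: record a keyword's rank for a few days before and after driving real user clicks, then compare the averages. The data below is entirely made up (it is not Rand's or Darren's actual methodology); in practice you'd pull daily rankings from your own rank-tracking tool.

```python
# Hypothetical sketch of summarizing a CTR/rank experiment.
# The rank numbers here are invented for illustration only.

def mean(values):
    return sum(values) / len(values)

def rank_shift(ranks_before, ranks_after):
    """Average rank change (positive = improvement, since
    lower rank numbers are better)."""
    return mean(ranks_before) - mean(ranks_after)

ranks_before = [7, 7, 8, 7, 7]   # daily rank before the click push
ranks_after = [5, 4, 5, 5, 4]    # daily rank after

shift = rank_shift(ranks_before, ranks_after)
print(f"Average rank improvement: {shift:.1f} positions")
```

Even with a positive shift, a single keyword over a few days proves little; as the mixed results Rand links to show, you'd want repeated runs and control keywords before concluding anything.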
-
Is this a legit approach?
No, not really. Google has never confirmed the use of CTR as a ranking signal in their search rankings. Services like this one highlight why: if Google did weight CTR heavily, it could easily be manipulated, and manipulating the search results is exactly what this service proposes to do.
Now, does CTR actually impact search rankings? It's only speculation at this time, though it does seem like a logical factor to influence rankings. Google wants to show the most relevant results to the user: the results that answer the user's search query most quickly and completely. However, I don't think it could ever be a heavily weighted ranking factor, because it can be so easily manipulated.
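To illustrate why bought clicks are easy to catch: one plausible defense (purely illustrative, not Google's actual method) is to compare a result's observed CTR against a baseline CTR for its position and flag large deviations. A toy sketch, with baseline numbers invented for the example:

```python
# Toy sketch of position-based CTR anomaly flagging.
# Baseline CTRs are invented; real baselines would come from
# large-scale query-log data.

BASELINE_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def is_ctr_suspicious(position, impressions, clicks, ratio=2.0):
    """Flag a result whose CTR exceeds the positional baseline
    by more than `ratio` times."""
    observed = clicks / impressions
    expected = BASELINE_CTR.get(position, 0.03)
    return observed > expected * ratio

# A page at position 4 normally gets ~7% CTR; 30% looks bought.
print(is_ctr_suspicious(4, impressions=1000, clicks=300))  # True
print(is_ctr_suspicious(4, impressions=1000, clicks=60))   # False
```

A crowdsourced click service would push observed CTR far above any plausible positional baseline, which is exactly the kind of pattern a simple check like this surfaces.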