Hackers are selling fake 'Likes' on FB, Instagram
-
An interesting article on how to get social media buzz:
http://www.huffingtonpost.com/2013/08/16/fake-instagram-likes_n_3769247.html
-
Checking the news today: the gentleman apparently did receive the $500.
-
Nice! Thank you, David, for the share.
-
Wow. Look at how much negative PR this created. Much more than $500. I'm sure they get plenty of emails on vulnerabilities, but each one should be looked at. If not, look at what happens...
-
Yeah. I believe it was Ian Lurie @ Portent who said "FB needs to hire this guy."
-
If I were the boss at FB... this guy would have been paid - more than $500 - and given a hotline to the chief of security.
-
Can you believe the head of security telling the guy he wouldn't get paid?
If anything, it's that security engineer who shouldn't be paid.
-
Here's a fun Facebook hacker story http://rt.com/news/facebook-post-exploit-hacker-zuckerberg-621/#.UhJPVHjA3Q8.twitter
-
Not quite hacking, but despicable all the same.
See this video clip from the UK investigative programme 'Dispatches' - 'Click farms: how some businesses manipulate social media' - a Channel 4 Dispatches video trailer. I'm not sure if you can watch the programme outside the UK, but you should get the general idea from this 'Guardian' posting.
People bent on fraud and the shortest route to quick gains will try anything, Christopher.
http://www.theguardian.com/media/video/2013/aug/02/click-farms-social-media-video
David
-
Not really sure of the question here. This has been around a while. Like all these schemes, it really doesn't add any long-term value. Talk to Newt Gingrich ;) http://www.theguardian.com/commentisfree/cifamerica/2011/aug/04/newt-gingrich-twitter
-
No idea.
Incentivizing social is easier than incentivizing backlinks, and there's quite a bit of gray area in acquiring backlinks.
-
Is there a way Google can detect hacked social buzz vs. buzz from people who pay FB to boost a post?
Best,
Christopher -
I hope Google is reading and adjusting social algo indicators accordingly.
Related Questions
-
Nuisance visitors to a non-active page. What's going on?
Hi Guys, for the past several months I've been getting a high volume of visits to a non-existent page, /h/9249823.html. These visits come from all over the world from different domains and have a zero session duration. They are automatically forwarded to my home page. The source, per Google Analytics, is 12-reasons-for-seo.com; the full referrer is 12.reasons-for-seo.com/seo2php. Any idea what is provoking this activity? Any chance it's screwing with my legitimate search results or rankings?
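This pattern usually turns out to be referrer spam from bots rather than real visitors. If the hits are actually reaching the server, one option is to answer them with a 410 instead of bouncing them to the home page, so they stop registering as sessions. A minimal sketch using Express; the middleware, the referrer list and the port are assumptions for illustration, not a confirmed fix for this specific case:

```typescript
import express from "express";

const app = express();

// Referrer strings reported in the question; adjust to whatever shows up
// in your own analytics as the spam source (assumed values for illustration).
const SPAM_REFERRERS = ["12-reasons-for-seo.com", "reasons-for-seo.com"];

// Drop spam hits before they reach the page (and before any analytics
// tag on the page can record a session).
app.use((req, res, next) => {
  const referrer = req.get("referer") ?? "";
  const isSpamReferrer = SPAM_REFERRERS.some((r) => referrer.includes(r));

  // The phantom URL from the question; return 410 so bots learn the
  // page is permanently gone instead of being bounced to the home page.
  if (isSpamReferrer || req.path === "/h/9249823.html") {
    res.status(410).send("Gone");
    return;
  }
  next();
});

app.get("/", (_req, res) => res.send("Home page"));

app.listen(3000, () => console.log("Listening on :3000"));
```

Note that 'ghost' referrals injected directly into Google Analytics never touch the server at all; those can only be excluded with a filter inside Analytics itself.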
White Hat / Black Hat SEO | | Lysarden0 -
Infinite Scrolling on Publisher Sites - is VentureBeat's implementation really SEO-friendly?
I've just begun a new project auditing the site of a news publisher. In order to increase pageviews and thus advertising revenue, at some point in the past they implemented something so that as many as 5 different articles load per article page. All articles are loaded at the same time, and from looking at Google's cache and the errors flagged up in Search Console, Google treats it as one big mass of content, not separate pages. Another thing to note is that when a user scrolls down, the URL does in fact change when you get to the next article.

My initial thought was to remove this functionality and just load one article per page. However, I happened to notice that VentureBeat.com uses something similar. They use infinite scrolling so that the other articles on the page (in a 'feed' style) only load when a user scrolls to the bottom of the first article. I checked Google's cached versions of the pages, and it seems that Google also only reads the first article, which seems like an ideal solution. This obviously has the additional benefit of speeding up page load time.

My question is: is VentureBeat's implementation actually that SEO-friendly or not? VentureBeat have 'sort of' followed Google's guidelines on how to implement infinite scrolling (https://webmasters.googleblog.com/2014/02/infinite-scroll-search-friendly.html) by using prev and next tags for pagination (https://support.google.com/webmasters/answer/1663744?hl=en). However, isn't the point of pagination to list multiple pages in a series (i.e. page 2, page 3, page 4, etc.) rather than just other related articles? Here's an example: http://venturebeat.com/2016/11/11/facebooks-cto-explains-social-networks-10-year-mission-global-connectivity-ai-vr/

Would be interesting to know if someone has dealt with this first-hand or just has an opinion. Thanks in advance! Daniel
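For what it's worth, the pattern Google's infinite-scroll post describes can be sketched roughly as below: each article keeps its own standalone, crawlable URL (with rel=prev/next between them server-side), the next article is only fetched when the reader approaches the bottom of the current one, and pushState keeps the address bar in sync. This is only an illustrative sketch - the element IDs, the ?fragment=1 endpoint and the JSON shape are assumptions, not how VentureBeat actually built it:

```typescript
// Client-side sketch: one article per URL, the next article is appended
// only when the reader scrolls near the end of the current one.

interface ArticleChunk {
  url: string;   // canonical URL of the article being appended
  html: string;  // server-rendered article markup
  nextUrl: string | null;
}

const container = document.querySelector<HTMLElement>("#article-stream")!;
const sentinel = document.querySelector<HTMLElement>("#load-more-sentinel")!;

let nextUrl: string | null = container.dataset.nextUrl ?? null;
let loading = false;

async function appendNextArticle(): Promise<void> {
  if (!nextUrl || loading) return;
  loading = true;

  const response = await fetch(`${nextUrl}?fragment=1`);
  const chunk: ArticleChunk = await response.json();

  const section = document.createElement("section");
  section.innerHTML = chunk.html;
  container.appendChild(section);

  // Reflect the article the reader is now on in the address bar, so the
  // URL is shareable and matches the crawlable standalone page.
  history.pushState({ url: chunk.url }, "", chunk.url);

  nextUrl = chunk.nextUrl;
  loading = false;
}

// Fire when the sentinel element at the bottom of the page comes into view.
new IntersectionObserver((entries) => {
  if (entries.some((e) => e.isIntersecting)) void appendNextArticle();
}).observe(sentinel);
```

A production version would usually also watch which article is currently in view and swap the URL as the reader crosses each article boundary, rather than as soon as the next chunk is appended.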
White Hat / Black Hat SEO | | Daniel_Morgan1 -
On the use of the Disavow tool / Have I done it correctly, or what's wrong with my understanding?
On one site I used GSA Search Engine Ranker. I got some good links out of it, but I also got 4,900 links from a single domain. According to Ahrefs, one link from a domain is worth about the same as 4,900 links from that same domain. So I downloaded those 4,900 links and added 4,899 of them to the disavow tool, to keep my site stable in the rankings and safe from any future penalty. Is that a correct way to use the disavow tool? The site's rankings are unchanged so far.
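For what it's worth, a single domain: rule in the disavow file covers every link from that host, so there's no need to list 4,899 URLs one by one. A rough sketch (Node/TypeScript) of turning an exported backlink list into a disavow file - the file names and the one-URL-per-line input format are assumptions for illustration:

```typescript
import { readFileSync, writeFileSync } from "node:fs";

// Input: a plain-text export with one backlink URL per line
// (file name and format are assumptions for illustration).
const backlinks = readFileSync("backlinks.txt", "utf8")
  .split("\n")
  .map((line) => line.trim())
  .filter(Boolean);

// One "domain:" directive covers every link from that host, so 4,900
// URLs from a single domain collapse into a single line.
const domainsToDisavow = new Set<string>();
for (const link of backlinks) {
  try {
    domainsToDisavow.add(new URL(link).hostname.replace(/^www\./, ""));
  } catch {
    // skip malformed lines
  }
}

const lines = [
  "# Links created by an automated tool; removal requested, no response.",
  ...[...domainsToDisavow].map((d) => `domain:${d}`),
];

writeFileSync("disavow.txt", lines.join("\n") + "\n", "utf8");
console.log(`Wrote ${domainsToDisavow.size} domain rules to disavow.txt`);
```

The flip side is that a domain: rule is all-or-nothing: to keep one 'good' link from that host while disavowing the rest, you'd have to list the individual URLs instead.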
White Hat / Black Hat SEO | | AMTrends0 -
Sudden influx of 404s affecting SERPs?
Hi Mozzers, We've recently updated a site of ours that really should be doing much better than it currently is. It has a good backlink profile (and some spammy links recently removed), has age on its side and has had a tremendous amount of SEO work (think deep-level schema.org markup, site speed and much, much more). Because of this, we assumed thin, spammy content was the issue, so we removed those pages and created new, content-rich pages in the meantime. For example, we removed a link-wheel page, https://www.google.co.uk/search?q=site%3Asuperted.com%2Fpopular-searches, which as you can see had a lot of results (circa 138,000), and added relevant pages for each of our entertainment 'categories'.

http://www.superted.com/category.php/bands-musicians - this page has some historical value, so the Mozbar shows some Page Authority here.
http://www.superted.com/profiles.php/wedding-bands - this is an example of a page linking from the above page.

These are brand-new URLs designed to provide relevant content. The old link-wheel pages contained pure links (usually 50+ on every page) and no textual content, yet were still driving small amounts of traffic to our site. The new pages contain quality, relevant content (i.e. our list of wedding bands - what else would a searcher be looking for?), but some haven't been indexed/ranked yet. So with this in mind I have a few questions:

How do we drive traffic to these new pages? We've started to create industry-relevant links through our own members to the top-level pages (http://www.superted.com/category.php/bands-musicians). The link profile here should flow to some degree to the lower-level pages, right? We've got almost 500 'sub-categories'; getting quality links to all of these is just unrealistic in the short term.

How long until we should be indexed? We've seen an 800% drop in organic search traffic since removing our spammy link-wheel page. This is to be expected to a degree, as these were the only real pages driving traffic. However, we saw this drop (and got rid of the pages) almost exactly a month ago - surely we should have been re-indexed and re-assessed by the algorithm by now?!

Are we still being algorithmically penalised? The old spammy pages are still indexed in Google (138,000 of them!) despite returning 404s for a month. When will these drop out of the rankings? If Google believes they still exist and we were indeed being punished for them, then it makes sense as to why we're still not ranking, but how do we get rid of them? I've tried submitting a manual removal of the URL via WMT, but to no avail. Should I 410 the page?

Have I been too hasty? I removed the spammy pages in case they were affecting us via a penalty. There would also have been some potential duplicate content between the old and the new pages: popular-searches.php/event-services/videographer may have clashed with profiles.php/videographer, for example. Should I have kept these pages whilst we waited for the new pages to be indexed?

Any help would be extremely appreciated - I'm pulling my hair out that, after following 'guidelines', we seem to have been punished in some way for it. I assumed we just needed to give Google time to re-index, but a month should surely be enough for a site with historical SEO value such as ours? If anyone has any clues about what might be happening here, I'd be more than happy to pay for a genuine expert to take a look. If anyone has any potential ideas, I'd love to reward you with a 'good answer'. Many, many thanks in advance. Ryan.
White Hat / Black Hat SEO | | ChimplyWebGroup0 -
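On the "Should I 410 the page?" point in the question above: a 410 signals that the old URLs are gone deliberately and permanently, and removed pages tend to drop out of the index a little faster than with a 404. A minimal sketch of answering the old /popular-searches.php paths with a 410 (Express; the path comes from the question, everything else is an assumption about how the site is served):

```typescript
import express from "express";

const app = express();

// The old link-wheel pages lived under /popular-searches.php/...;
// answer them with 410 Gone so crawlers treat the removal as permanent
// rather than retrying what might be a temporary 404.
app.use("/popular-searches.php", (_req, res) => {
  res.status(410).send("This page has been permanently removed.");
});

// ...the rest of the site (category and profile pages) is served as normal.
app.get("/category.php/:slug", (req, res) => {
  res.send(`Category page for ${req.params.slug}`);
});

app.listen(3000, () => console.log("Listening on :3000"));
```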
Will implementing 301s on an existing domain massively impact rankings?
Hi Guys, I have a new SEO client who only has the non-www domain set up in GWT, and I am wondering if implementing a 301 to www will have a massive negative impact on rankings. I know a percentage of link juice and PageRank will be affected. So my question is: if I implement the 301, should I brace myself for a fall in rankings? Should I use a 301 instead, to maintain link juice and PageRank? Is it good practice to forward to www? Or could I leave the non-www in place and have the www redirect to it to maintain the data? Dave
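A hedged sketch of the redirect itself is below; it assumes a Node/Express front end, with example.com standing in for the client's domain - on Apache or Nginx the same thing is a one-line rewrite/return rule:

```typescript
import express from "express";

const app = express();

// Permanently redirect the bare domain to the www host so only one
// version of each URL is ever served (a 301 passes most link equity).
app.use((req, res, next) => {
  const host = req.headers.host ?? "";
  if (host.startsWith("example.com")) {
    res.redirect(301, `https://www.${host}${req.originalUrl}`);
    return;
  }
  next();
});

app.get("/", (_req, res) => res.send("Canonical www home page"));

app.listen(3000, () => console.log("Listening on :3000"));
```

Whichever version you pick, it also helps to set the preferred domain in GWT and keep canonical tags pointing at that version, so the redirect and the rest of the signals agree.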
White Hat / Black Hat SEO | | icanseeu0 -
What's the right way to gain the benefits of an EMD but avoid cramming the title?
Hi Guys, Say I'm (completely hypothetically) building weddingvenuesnewyork.com, and right now I'm organizing the tags for each page. What's the best layout so that I can optimize for "wedding venues new york" as much as possible without it becoming spammy? Right now I'm looking at something like "Wedding Venues New York: Wedding Receptions and Ceremony Venues" for the title, to get other strong keywords in there too. Is there a better layout/structure? And is having the first words of the homepage title match the domain name going to strengthen the ranking for that term, or look spammy to Google and be a bad move? This is a new site being built.
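Purely as a sanity check on the title itself, a tiny sketch like the one below can flag the usual red flags: length creeping past the roughly 60 characters that tend to display in SERPs, and the same word repeating too often. The thresholds are rules of thumb for illustration, not anything Google publishes:

```typescript
// Rough title-tag sanity check: keep the main phrase up front, keep the
// whole thing around 60 characters, and avoid repeating the same word
// too many times. Thresholds are rules of thumb, not hard limits.
function checkTitle(title: string): string[] {
  const warnings: string[] = [];

  if (title.length > 60) {
    warnings.push(`Likely truncated in SERPs: ${title.length} characters`);
  }

  const counts = new Map<string, number>();
  for (const word of title.toLowerCase().match(/[a-z]+/g) ?? []) {
    counts.set(word, (counts.get(word) ?? 0) + 1);
  }
  for (const [word, count] of counts) {
    if (count > 2 && word.length > 3) {
      warnings.push(`"${word}" appears ${count} times - may read as stuffing`);
    }
  }
  return warnings;
}

const proposed = "Wedding Venues New York: Wedding Receptions and Ceremony Venues";
console.log(checkTitle(proposed));
// -> flags the title as just over the ~60-character display limit
```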
White Hat / Black Hat SEO | | xcyte0 -
Are the Majority of SEO Companies 'Spammers, Evildoers, & Opportunists'?
This may not be the most productive Q&A discussion, but I've had some really interesting experiences this last month that have made me even more distrusting of "SEO" companies. I can't help but think of this post (not much has changed since '09). Even though it takes a pretty extreme stance, I agree with the core of it: "The problem with SEO is that the good advice is obvious, the rest doesn't work, and it's poisoning the web."

I didn't start doing this type of work wanting to have such a negative opinion of SEO companies, but I just keep having the same experience: I'll get referred to someone who isn't happy with their SEO company. They send me their web address, I check out the site, and seriously can't believe what I find: MISSING PAGE TITLES, EVERY CANONICAL URL ISSUE IMAGINABLE, AND TENS OF THOUSANDS OF BOT SPAM EMAT LINKS FROM PAGES LIKE THIS... AND THIS. And just recently a company called one of my clients and conned him into paying for this piece of spam garbage, obviously scraped from the site that I made for him. What's worse, sometimes for whatever reason these companies will have all the client's FTP and CMS logins, and it can be hell trying to get them to hand them over. There's no Webmaster Tools set up, no analytics, nothing... These businesses are paying a good chunk of change every month. I just can't believe stuff like this is so common... well actually, it's what I've come to expect at this point. But I used to think most SEO companies actually had their clients' best interests at heart. Does every honest consultant out there run into this same type of stuff constantly? How common is this type of stuff really?

Now, on to the positive. This community rocks, and I feel like it represents real, ethical, solution-oriented, boundary-less SEO. So thank you Mozzers for all you do. And I love using the tools here to help businesses understand why they need an honest person helping them. If anyone has thoughts on the topic, I'd love to hear 'em...
White Hat / Black Hat SEO | | SVmedia3 -
Anchor text penalty doesn't work?!
How do you think the anchor text penalty actually works? Keyword domains obviously can't over-optimize for their main keyword (for example, notebook.com for the keyword 'notebook'), and a lot of non-keyword domains optimize heavily for their main keyword, especially in the beginning, to get a good ranking in Google (and it always works). Is there a particular point (a number of links) I can reach while optimizing for one keyword, after which I'm going to get a penalty?
White Hat / Black Hat SEO | | TheLastSeo0