Does Google+ make a huge difference?
-
I run a website that's been ranked well for good keywords related to our business for some time. It was founded back in 2007 and has been there a while.
Recently a new site has popped up that ranks brilliantly for everything. It's a new site, and the only redeeming factor I can see is that it has an AddThis box showing its Facebook Likes and Google +1s: around 400 Facebook Likes and 80 Google +1s on every page that ranks.
The other pages on their site, which don't have any Facebook Likes or Google +1s, don't rank.
Our site doesn't have any Likes or +1s. Is this making the difference? I stress that our sites are otherwise very similar, except that we've been around for over 5 years.
-
Just to follow up on this, today the competitor's site has disappeared from Google.
One up for decency again! Glad to see tactics like this being punished.
-
Yes, all I am saying is that the numbers are double. I have my own problems with Google, so I am the last one to say how it really works, but maybe their ugly links don't fit the spam profile that would get them squashed.
Remember, Google is not perfect. They can screw up just as any of us can. There must be hundreds of thousands, or even millions, of examples where people are looking at results and saying "why is that crap ahead of my site?"
-
I get most of what you say, except that they have put betting in their subheads. When looking at "sportsbet" as a google.com.au search term, I really don't understand why they should be ranked SO far ahead of us.
Regarding the linking domains, again, it's just spam. The links aren't real natural ones, and I don't want to go down that road.
I'm really losing faith (or interest, call it what you will) in this game when a site that is, to me, pretty clearly using dodgy tactics is having this much success.
-
Your competitor's site does have a lot of Likes and a lot of G+ hits.
The numbers of the G+ are very close for each page.
That may mean they just bought 70 G+ hits or maybe they made their visitors hit up all their pages for some benefit.
I've had G+ on my site all this year. We've done almost nothing to get people to like or G+ us - they just do it on their own.
We have just over 1,000 Likes and 40 +1s.
So for that site to have 70 +1s on most pages and 100 on the front page seems very suspicious to me.
It could be they have fooled Google. They have also done some great keyword stuffing in the text near the bottom of the front page. Many of our stories only get 5 to 8 G+ hits; I think maybe one page has 12. Google WMT says we don't have enough for them to show any stats. It also doesn't appear that we get much, if any, benefit from G+ hits. To begin with, G+ was a liability, as all it did was slow down our pages.
They have also bolded "Betting" and put it in their subheads.
They also have double the linking domains you have.
It looks close to over-optimization, but maybe it's not quite enough for the Google algorithm to flag it.
So, all of that said, I think they are beating you because of their on-page and off-page effort. You have done something similar to them; they just did it better.
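The "very close G+ numbers on every page" observation above can be turned into a rough sanity check: organic engagement usually varies widely from page to page, while bought packages tend to land every page in a narrow band. A minimal sketch of that heuristic, using hypothetical per-page counts that echo the numbers in this thread (the data and the 0.3 threshold are illustrative assumptions, not anything Google has published):

```python
from statistics import mean, stdev

def uniformity_score(counts):
    """Coefficient of variation of per-page share counts.
    Organic engagement usually varies a lot between pages;
    a very low score suggests the counts arrived in one uniform batch."""
    return stdev(counts) / mean(counts)

# Hypothetical counts echoing the thread: the competitor's pages all
# sit near 70-100 +1s, while an organic site's pages vary widely.
competitor = [70, 72, 68, 75, 100, 71, 69]
organic = [5, 8, 12, 6, 40, 7, 5]

for label, counts in [("competitor", competitor), ("organic", organic)]:
    score = uniformity_score(counts)
    verdict = "suspiciously uniform" if score < 0.3 else "natural spread"
    print(f"{label}: {score:.2f} ({verdict})")
```

This is only a smell test, of course; a site with ten near-identical landing pages could plausibly earn near-identical counts.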
-
I mean the corresponding links to that specific page, which I agree, are spammy.
-
When you say corresponding links, do you mean spam? External links?
-
My .02 on a quick glance:
It probably has more to do with the fact that they're specifically targeting "sportsbet" on that page and have built corresponding links. Unfortunately, doesn't look like Penguin has got to this one...yet...
Again, this is my opinion after a very brief look.
-
Ok, here are the websites in question.
- My site: http://bit.ly/MvT3gI
- Competitor: http://bit.ly/N5fS0N
Here's an example of a search term that we are nowhere for - "sportsbet", and they are ranked around #4, which is a very good ranking: http://bit.ly/Mjwe4v
The rankings are similarly good for all of his pages that refer to each bookmaker. All have lots of Facebook Likes and Google +1s.
The reason I think he's paid for the social likes is that the sites really aren't the sort to go viral and get links in a "real" way.
Appreciate any input into this!
-
Mark,
You can post URLs, however, you might consider using a URL shortener service. If you're willing to share the searches and sites, I'd be happy to offer my .02.
-
Social is an increasing factor in rankings and will only become bigger in the future, so I would strongly recommend getting into the social aspect. As for "paying" for likes and such, definitely stay away from that. Social is not all about trying to get higher in the rankings; it's about brand recognition and reputation, and communicating with your fan base, customers, and clients.
I know businesses that get around 50% of their customers from social networks such as Facebook and Twitter. It's definitely worth getting into, and from what I have seen in the past 2 years, it's no longer optional.
-
Thanks for that. I'm not sure what the rules are regarding putting URLs on here? That's why I haven't put the addresses up yet.
I know the site hasn't got good real social interactions, purely because of the type of site it is. I'm 99% sure that the owner has gotten these likes/pluses through paying people to like/plus the site, or something similar.
I don't want to go along the lines of fighting fire with fire, but if it works as well as it appears to with their site, then it's sure tempting.
-
It's hard to tell since I can't see and compare the two sites you're talking about, but that could very well be a contributing factor. It's no secret that Google is putting more and more weight on social signals such as likes, followers, and social interactions. It sounds like that site has good social interaction and is being rewarded in the rankings by Google, but I can't be 100% sure without comparing the two.