Climate of fear in the world of SEO
-
There certainly seems to be a climate of fear around backlinks at the moment, and not without reason.
I was wondering why Google moved from simply discounting links to punishing site owners for their backlink profiles, many of which were built up when the risks of punishment weren't there.
I mean, I could send them the names of at least 1,000 sites in link farms / blog rings - you name it. I'm sure most of us on here could do the same.
Responding to the whims of Google is such a waste of time and resources. Why doesn't Google simply choose a direction and stick with it? What is their strategy exactly?
-
Some great feedback here - firstly, thanks EGOL - I'm focusing 100% on content on a new site. Should be interesting - and that's a good point re: vandalism. I am concerned about the consequences of negative SEO / scrapers, clones, etc., though. It would be so good to be able to cut nasty incoming links in some way (I can but dream...). Love that saying too, Donnie!
Good points there, Marie - yes, I get plagued by that stuff too. I'm beginning to wonder, though, whether many of these comments are more about hoping some lunatic will click on the link than about manipulating SEO.
To be totally honest, I wouldn't mind if Google laid down specific rules for link building. We advise site owners to proactively build no more than 10 links per page, from relevant sites; the rest should be generated naturally. Something far more specific than what we have at the moment.
And thanks Arpeggio. A very good point indeed. I agree.
-
The more advanced technology, logistics, etc. become, the further away human accountability gets. I think that's a major challenge in the modern day in general.
-
I think the latest changes made by Google are accomplishing exactly what Google wants. They want website owners to stop "building links" and instead make the best possible site that gives the user the best possible information.
If they simply discounted links then many people would still go on building them "just in case" they helped. I mean, everyone knows that nofollowed comment spam is very unlikely to be helpful, but I get thousands of crap automated comments on my blog each month that are killed by Akismet, so people are still doing it.
But by building a culture of fear around links they've managed to get a lot of people in the SEO world saying, "Man! If I keep building links I could get a big penalty and my site could tank." The result? People stop building links.
Now, here's the scary thing: there are some links that are not a bad thing to build. People will be afraid to get ANY links to their site, and that's not right. I know of someone who got the Better Business Bureau to remove all links to their site because they thought it could look unnatural. That is a good link!
-
Thanks
-
"Give the people what they want and Google will give you to the people"
Thanks... that's a great saying!
-
I was wondering why Google moved from simply discounting links to punishing site owners for their backlink profiles, many of which were built up when the risks of punishment weren't there?
Google finally realized that merely "discounting" the links was resulting in continued vandalism of blogs and forums as link builders deposit their rubbish.
Why doesn't Google simply choose a direction and stick with it? What is their strategy exactly?
I think that they have "stuck" with their use of links for way too long.
Responding to the whims of Google is such a waste of time and resources.
A method to try would be to place 100% of your effort into building content and allow the links to slowly build on their own. This will start very slowly but will build to a rate that reflects the value of your content.
-
They want to give users the best results possible; by ensuring that their SERPs are not easily manipulated, they can deliver a better overall user experience.
My saying has always been:
"Give the people what they want and Google will give you to the people"
It's quite simple: they want sites that have a natural link profile and a great user experience (bookmarked, linked to, or shared).