How Google could quickly fix the whole Links problem...
-
A Thursday morning brainstorm that hopefully an important Google manager will see...
Google could quickly end all the problems of link buying, spammy links, and negative SEO with one easy step: Only count the 100 best follow links to any domain.
Ignore all the nofollows and everything beyond the 100 best. They can choose what "best" means.
Suddenly links would be all about quality. Quantity would not matter. Fiverr links, comment links, and all the other mass-produced spam links would simply be ignored. Unless that's all a domain had, in which case it would surely be stomped by any domain with 100 decent natural links.
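The mechanic being proposed can be sketched in a few lines of Python. The quality scores and the cutoff of 100 here are hypothetical placeholders standing in for whatever Google would decide "best" means; nothing about Google's actual scoring is implied:

```python
def domain_link_score(links, top_n=100):
    """Score a domain by its top-N followed links only.

    `links` is a list of (quality, is_follow) tuples, where `quality`
    is a made-up 0-to-1 score standing in for "best". Nofollow links
    and everything beyond the top N are ignored entirely.
    """
    followed = [quality for quality, is_follow in links if is_follow]
    best = sorted(followed, reverse=True)[:top_n]
    return sum(best)

# 10,000 mass-produced spam links lose to 100 decent natural links,
# and the 50 nofollow links contribute nothing either way.
spam_profile = [(0.01, True)] * 10_000
natural_profile = [(0.6, True)] * 100 + [(0.9, False)] * 50
assert domain_link_score(spam_profile) < domain_link_score(natural_profile)
```

Under this toy model, quantity past the cutoff is worthless by construction, which is exactly the deterrent effect the proposal is after.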
Would it be an improvement over today's situation?
-
It's a cool thought, and it may very well be an improvement over today's situation. The major reservation that jumped out at me was that it would hurt fresh content. I'm guessing 100 was a number you just threw out there, but whatever the number, once a site hits that many quality links, it wouldn't need to produce anything else, at least not for link building purposes.
Also, different-sized sites would need different numbers. For example, 100 really high-quality links (DA 90 and above) would take my site a long time to get, but Moz, for example, probably already has that. If you make the number really high, say 10,000 high-quality DA 90 links, then it might take Moz a while to get there, but since neither I nor any of my competitors will ever reach that number, it's not much of a deterrent against getting Fiverr and comment links.
- Ruben
-
Hi Gregory,
I like your thinking, but Google would then need to rely heavily on other ranking factors to differentiate between websites whose top 100 links are equally strong. There are plenty of websites with equally fantastic, relevant, followed links in their top 100. How would Google decide which one is most relevant and authoritative?
Thinking about it, it could become quite problematic. A new startup entering an established market with a strong product or USP might be able to acquire 100+ high-quality press links and suddenly draw level with a competitor who's been in the game for years, perhaps even a heritage brand, and has the extensive portfolio of good-quality links to prove it.
I like the idea, but I don't think Google uses its other ranking signals enough yet to take such drastic action.
Don't you think?