Difference between Google's link: operator and GWT's "links to your site"
-
I haven't used Google's link: operator for a while, and I noticed that there is a big disparity between the results of the "link:" operator and GWT's "links to your site" report.
I compared these results on a number of websites, my own and competitors', and the differences seem to be consistent across the board.
Has Google made a recent change to how they display link results via the operator?
Could this be an indication that they are cleaning out backlinks?
-
Thanks Jeepster, that video answered my question. It just seems like Google used to show a lot more links in the link: search than they currently do.
-
Hi bstone81,
Not sure I understand your question. Assuming you're asking why there's a disparity between the "link:" command and what you see in GWT, here's Mr Cutts himself. It's from 2009, but I suspect it's still relevant:
Related Questions
-
Google sidebar advertising dropped
Has anyone noticed that Google's sidebar advertising has completely disappeared? They now only display the top 4 AdWords ads, with the remaining ads at the bottom of each search page. I can't find any info on it or when it actually happened.
Algorithm Updates | | Purplesars110 -
Google AMP (accelerated mobile pages), can it be used for non-Google news and Ecommerce Websites?
Mozzers, I've been doing a lot of research on Google's new Accelerated Mobile Pages (AMP): https://moz.com/blog/accelerated-mobile-pages-whiteboard-friday. From what I'm seeing, AMP versions of pages are only for Google News-worthy websites such as the New York Times, Cosmopolitan, and the BuzzFeeds of the world. But what about ecommerce websites like eBay or Amazon? Will an AMP version of "scotch tape" via OfficeDepot work in the SERPs on non-Google News cards?
Algorithm Updates | | Shawn1240 -
Help Me Change My Client's Mind
My client wants to build a second site to provide targeted links for SEO to his main site. He's interested in buying a TLD with some near-topic authority/links and then building the second site's authority up from there. He is clear that this could get him in trouble for a link scheme, but thinks it can all be hidden from Google. Off the top of my head I was able to recall a few of the pain-in-the-neck things you'd have to do to not get caught, but he seemed unconvinced. I recall you'd have to have:
- A different registrar
- Different contact/WhoIs information
- A different site host
- Different G/A and GWT accounts
- Logins to the second site's G/A and GWT from a different IP address not used for the main domain
With the exception of the last one, he didn't seem to think it would be too hard. Aren't there more difficult maneuvers required for hiding this from Google? I want to be able to point out to him how ridiculous this low-integrity effort would be, without losing the client. Thanks! Best... Darcy
Algorithm Updates | | 945010 -
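The footprint list above can be sketched as a simple overlap check. This is a hypothetical illustration, not any real detection API: the field names and the two site records are invented, and the point is only that a single shared field (WHOIS contact, host IP, analytics account) is enough to tie two domains together.

```python
# Hypothetical sketch: why a "hidden" second site is hard to hide.
# Given metadata records for two domains, list every field on which
# they overlap -- any one overlap can connect the sites. All field
# names and values below are illustrative, not from a real API.

def shared_footprints(site_a: dict, site_b: dict) -> list[str]:
    """Return the metadata fields on which two sites overlap."""
    return [field for field in site_a
            if field in site_b and site_a[field] == site_b[field]]

main_site = {
    "registrar": "RegistrarOne",
    "whois_contact": "owner@example.com",
    "host_ip": "203.0.113.10",
    "analytics_id": "UA-111111",
}
second_site = {
    "registrar": "RegistrarTwo",           # different registrar...
    "whois_contact": "owner@example.com",  # ...but same WHOIS contact
    "host_ip": "198.51.100.7",
    "analytics_id": "UA-111111",           # ...and same analytics account
}

print(shared_footprints(main_site, second_site))
# → ['whois_contact', 'analytics_id']
```

Every field has to differ for the overlap list to come back empty, which is the client's burden here: one forgotten shared detail undoes all the rest.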
New site or subdomain
What are the pros and cons of launching a new product site as opposed to placing it under a subdomain of the company site? Will the new site be placed in the Google sandbox? The main goal is to provide credibility for the product by placing it under the company site, which has been live for over 10 years. It is not a consumer product - it's more for dealers. So people would be pushed to the site or would find it through the brochure.
Algorithm Updates | | bakergraphix_yahoo.com0 -
Same page but appearing in Google with different titles
I have a page ranking in position 1 for a key phrase, and the key phrase is also the title of the page. I'll use a mock key phrase to aid my question: "Teeth and Gums". So the page is ranking number 1 for "Teeth and Gums", and "Teeth and Gums" is the meta title. However, when I did a new search adding an additional keyword to the original search, Google did something weird. Let's say the search is "Dentistry - Teeth and Gums": Google has again ranked my page number 1 but changed the title. The title in the search result is now "Dentistry - Teeth and Gums". How and why? It's kind of like Google PPC's keyword insertion, but the title hasn't got anything weird like {KeyWord: Dentistry} - it's just "Teeth and Gums". Has this happened to you guys? Any ideas?
Algorithm Updates | | Bio-RadAbs0 -
How do blog comment/forum back links compare to editorial back links?
I know that Google prefers a varied backlink profile, and so it's ideal to get both - but I wanted to know: are followed backlinks from blog comments, forum posts, etc. (i.e. the low-hanging fruit) weighted significantly lower by Google than links appearing within the body of a page, for example? If so, is it possible to quantify by how much?
Algorithm Updates | | ZakGottlieb710 -
Any way to tell if a link has been devalued?
I have some listings in lawyer directories, some of which have very high PR, links, traffic, etc. - for example, www.nolo.com. I know that Google has more or less recently devalued a lot of directory links. I would assume that a monster site like Nolo would not be one of those, but does anyone know any way to tell? Paul
Algorithm Updates | | diogenes0 -
Removing secure subdomain from the Google index
We've noticed over the last few months that Google is not honoring our main website's robots.txt file. We have added rules to disallow secure pages, such as:
Disallow: /login.cgis
Disallow: /logout.cgis
Disallow: /password.cgis
Disallow: /customer/*
We have noticed that Google is crawling these secure pages and then duplicating our complete ecommerce website across our secure subdomain in the Google index (duplicate content): https://secure.domain.com/etc. Our webmaster recently implemented a specific disallow-all robots.txt file for the secure subdomain; however, these duplicated secure pages remain in the index:
User-agent: *
Disallow: /
My question is: should I request Google to remove these secure URLs through Google Webmaster Tools? If so, is there any potential risk to my main ecommerce website? We have 8,700 pages currently indexed in Google and would not want to risk any ill effects to our website. How would I submit this request in the URL Removal tool specifically? Would inputting https://secure.domain.com/ cover all of the URLs? We do not want any secure pages being indexed, and all secure pages are served on the secure.domain example. Please private message me for specific details if you'd like to see an example. Thank you.
Algorithm Updates | | marketing_zoovy.com0
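As a side note, you can sanity-check what a disallow-all robots.txt actually blocks with Python's standard-library urllib.robotparser. The sketch below feeds in the same two-line file described in the question; secure.domain.com and the .cgis paths are the placeholder examples from the question, not real URLs.

```python
# Verify which URLs a robots.txt file blocks, using Python's
# stdlib parser. The rules mirror the disallow-all file for the
# secure subdomain described above (secure.domain.com is a
# placeholder domain from the question).
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A disallow-all file blocks every path for every crawler,
# so all three URLs below report "blocked".
for url in ("https://secure.domain.com/",
            "https://secure.domain.com/login.cgis",
            "https://secure.domain.com/customer/orders"):
    status = "allowed" if parser.can_fetch("Googlebot", url) else "blocked"
    print(url, status)
```

Keep in mind that robots.txt only controls crawling, not indexing - pages that were already crawled can linger in the index even after the disallow-all file goes live, which is why a removal request (or a noindex directive served on the secure subdomain) is still needed on top of it.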