Google showing 10 million fewer links than in October
-
I've received no messages from Google about 'iffy' links whatsoever, and the links they're reporting in Webmaster Tools have declined by 10 MILLION since October.
We did go through a CMS upgrade in December, which I believe had some impact, and then I set a preferred domain at the end of last month, but we were bleeding links before then.
Any idea what could have happened? We don't engage in any link building schemes whatsoever, and like I mentioned, I've received no messages at all from Google regarding a penalty.
-
Thanks, Keri. Cue the song "Wind Beneath My Wings." Thanks for helping me avoid self-flagellation.
Cheers.
-
If it was just today that you noticed, it might not be just your site. In the past day, a bunch of people have reported far fewer links in GWT. See the post at Search Engine Roundtable http://www.seroundtable.com/google-webmaster-tools-link-count-16315.html. There have also been several threads here in Q&A about it.
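If you want to know whether the drop is only in Google's reporting or reflects real changes in your link profile, one thing you could do is diff an older "Links to your site" export against a current one. Below is a minimal sketch, assuming you have both exports saved as CSVs with one linking URL per row; the filenames and column layout are hypothetical, so adjust them to whatever GWT actually gives you.

```python
import csv
from collections import Counter
from urllib.parse import urlparse

def linking_domains(path):
    """Count linking URLs per domain in a GWT 'Links to your site' export."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.reader(f):
            if not row or not row[0].startswith("http"):
                continue  # skip header rows and blank lines
            counts[urlparse(row[0]).netloc] += 1
    return counts

old = linking_domains("links_october.csv")   # hypothetical filename
new = linking_domains("links_current.csv")   # hypothetical filename

# Domains whose reported link counts fell the most since the older export
drops = sorted(((old[d] - new.get(d, 0), d) for d in old), reverse=True)
for lost, domain in drops[:20]:
    if lost > 0:
        print(f"{domain}: {old[domain]} -> {new.get(domain, 0)} ({lost} fewer)")
```

If the big losses cluster around a handful of domains, that points to a reporting or crawl change for those sites rather than anything you did.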
Related Questions
-
HTTPS to HTTP Links
Hi Mozers, I have a question about the news that Google Chrome will start blocking mixed content starting in December 2019. Starting in December 2019, users who are presented with insecure content will see a toggle allowing them to unblock the insecure resources that Chrome is blocking. Then in January 2020, Google will remove that toggle and will simply start blocking mixed content on insecure web pages. Not sure what this means. What are the implications of this for an HTTPS page that has an HTTP link? Thanks, Yael
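For what it's worth, "mixed content" refers to subresources (scripts, images, stylesheets, iframes) loaded over http:// on an https:// page; a plain outgoing <a> link to an HTTP page isn't what Chrome blocks. Here is a minimal sketch (standard-library Python, placeholder URL) of how you might flag the former on one of your pages:

```python
from html.parser import HTMLParser
from urllib.request import urlopen

# Tags whose src/href make the browser LOAD a resource; a plain <a href="http://...">
# is just an outgoing link and is not "mixed content" in Chrome's sense.
RESOURCE_TAGS = {"script", "img", "iframe", "source", "audio", "video", "embed", "link"}

class MixedContentFinder(HTMLParser):
    """Collects http:// resource references embedded in an HTML document."""
    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        if tag not in RESOURCE_TAGS:
            return
        for name, value in attrs:
            if name in ("src", "href") and value and value.startswith("http://"):
                self.insecure.append((tag, value))

page_url = "https://www.example.com/"  # placeholder: one of your HTTPS pages
html = urlopen(page_url).read().decode("utf-8", errors="replace")

finder = MixedContentFinder()
finder.feed(html)

for tag, url in finder.insecure:
    print(f"mixed content in <{tag}>: {url}")
```

Anything this prints is a candidate for switching to https:// (or a protocol-relative/secure host) before the January 2020 change.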
Intermediate & Advanced SEO | yaelslater0
-
Large Number of Links appearing in Google Webmaster Tools
Hello, In the last week we have noticed an extremely large number of backlinks appearing in Google Webmaster Tools. One of the sites which links to us now has over 101,000 backlinks pointing to us, when in reality it should only have 300-600. We have checked that the websites have not been hacked (hidden links, etc.), but we cannot find anything. Has anyone else experienced problems with Google Webmaster Tools lately, displaying way too many links? Or could this be a negative SEO attack which is yet to emerge? Thanks Rob
Intermediate & Advanced SEO | tomfifteen0
-
Dropped from Google?
My website www.weddingphotojournalist.co.uk appears to have been penalised by Google. I ranked fairly well for a number of venue-related searches from my blog posts. Generally I'd find myself somewhere on page one or towards the top of page two. However, recently I found I am nowhere to be seen for these venue searches. I still appear if I search for my name, business name and keywords in my domain name. A quick check of Yahoo shows I am ranking very well there; it is only Google who seems to have dropped me. I looked at Google Webmaster Tools and there are no messages or clues as to what has happened. However, it does show my traffic dropping off a cliff edge on the 19th of July, from 850 impressions to around 60 to 70 per day. I haven't made any changes to my website recently and hadn't added any new content in July. I haven't added any new inbound links either; a search for inbound links does not show anything suspicious. Can anyone shed any light on why this might happen?
Intermediate & Advanced SEO | weddingphotojournalist0
-
Why are bit.ly links being indexed and ranked by Google?
I did a quick search for "site:bit.ly" and it returns more than 10 million results. Given that bit.ly links are 301 redirects, why are they being indexed in Google and ranked according to their destination? I'm working on a similar project to bit.ly and I want to make sure I don't run into the same problem.
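One approach people suggest for keeping your own short links out of the index is to serve the 301 with an X-Robots-Tag: noindex header; whether Google honors it on a redirecting URL is ultimately up to Google, but the mechanics are simple. Here is a minimal sketch using Flask, where the route, the in-memory mapping, and the destination URL are all hypothetical:

```python
from flask import Flask, abort, redirect

app = Flask(__name__)

# Hypothetical in-memory mapping of short codes to destination URLs
SHORT_LINKS = {
    "abc123": "https://www.example.com/some-long-article-url",
}

@app.route("/<code>")
def follow(code):
    target = SHORT_LINKS.get(code)
    if target is None:
        abort(404)
    resp = redirect(target, code=301)
    # Ask crawlers not to index the short URL itself
    resp.headers["X-Robots-Tag"] = "noindex"
    return resp

if __name__ == "__main__":
    app.run()
```

The 301 still passes users (and, generally, signals) to the destination; the extra header is just a hint that the short URL itself shouldn't appear in results.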
Intermediate & Advanced SEO | JDatSB1
-
Disavowing a sitewide link that has thousands of subdomains. What do we tell Google?
Hello, I have a hosting company that partnered up with a blogger template developer and allowed users to download blog templates that placed my footer links sitewide on their websites. Sitewide links, I know, are frowned upon, and that's why I went through a rigorous link audit months ago and emailed every webmaster who made a "WEBSITENAME.blogspot.com" site three times each to remove the links. I'm at a point where I have 1000 sub-users left that use the domain name "blogspot.com". I used to have 3,000! Question: When I disavow these links in Webmaster Tools for Google and Bing, should I upload all 1000 subdomains of "blogspot.com" individually and show Google proof that I emailed all of them individually, or is it wiser to just include one domain name (www.blogspot.com) so Google sees just ONE big mistake instead of 1000? This has been on my mind for a year now and I'm open to hearing your intelligent responses.
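On the mechanics: Google's disavow file takes one URL or one domain: entry per line, and lines starting with # are comments, so you can either list every subdomain or use a single much broader entry. A minimal sketch that builds the per-subdomain version from a plain-text list of the blogs (filenames are hypothetical):

```python
# Build a disavow.txt from a list of blogspot subdomains, one per line,
# e.g. "somename.blogspot.com". Filenames here are hypothetical.
with open("blogspot_subdomains.txt", encoding="utf-8") as f:
    subdomains = sorted({line.strip().lower() for line in f if line.strip()})

with open("disavow.txt", "w", encoding="utf-8") as out:
    out.write("# Sitewide footer links from blog templates; removal requested by email 3x each\n")
    for sub in subdomains:
        out.write(f"domain:{sub}\n")

# The alternative is a single, much broader line:
#   domain:blogspot.com
# which would disavow links from EVERY blogspot.com subdomain, not just these 1000.
```

The proof-of-outreach emails belong in your reconsideration request rather than in the file itself; the file is just the machine-readable list.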
Intermediate & Advanced SEO | Shawn1240
-
Google Manual Penalties: Different Types of Unnatural Link Penalties?
Hello Guys, I have a few questions regarding Google manual penalties for unnatural link building. They are "partial site" penalties, not site-wide. I have two sites to discuss. 1. This site used black hat tactics and bought thousands of unnatural backlinks. This site doesn't rank for the main focus keywords and traffic has dropped. 2. This site has the same penalty, but has been all white hat, never bought any links or hired any SEO company. It's all organic. This site's organic traffic doesn't seem to have taken any hit or been affected by any Google updates. Based on the research we've done, Matt Cutts has stated that sometimes they know the links are organic so they don't penalize a website, but they still show us a penalty in WMT. "Google doesn't want to put any trust in links that are artificial or unnatural. However, because we realize that some links may be outside of your control, we are not taking action on your site's overall ranking. Instead, we have applied a targeted action to the unnatural links pointing to your site." "If you don't control the links pointing to your site, no action is required on your part. From Google's perspective, the links already won't count in ranking. However, if possible, you may wish to remove any artificial links to your site and, if you're able to get the artificial links removed, submit a reconsideration request. If we determine that the links to your site are no longer in violation of our guidelines, we'll revoke the manual action." Check that info above at this link: https://support.google.com/webmasters/answer/2604772?ctx=MAC Recap: Does anyone have any experience like this with site #2? We are worried that this site has this penalty but we don't know if Google is stopping us from ranking or not, so we aren't sure what to do here. Since we know 100% the links are organic, do we need to remove them and submit a reconsideration request? Is it possible that this penalty can expire on its own? Are they just telling us we have an issue but not hurting our site b/c they know it's organic?
Intermediate & Advanced SEO | | WebServiceConsulting.com0 -
Does the Google WMT "download links" button give me all the links they count?
Hi, Different people are telling me different things. I think that if I download "all links" using the button in WMT to Excel, I am seeing all the links Google is 'counting' when evaluating my site. Is that right?
Intermediate & Advanced SEO | usedcarexpert0
-
Random Google?
In 2008 we performed an experiment which showed some seemingly random behaviour by Google (indexation, caching, PageRank distribution). Today I put the results together and analysed the data we had, and got some strange results which hint at the possibility that Google purposely throws in a deviation from normal behaviour here and there. Do you think Google randomises its algorithm to prevent reverse engineering and enable chance discoveries, or is it all a big load-balancing act which produces quasi-random behaviour?
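One way to make "seemingly random" concrete is to test whether the observed outcomes (say, indexed vs not indexed across otherwise identical test pages) deviate from what your assumed model of Google's behaviour predicts. A minimal sketch using a chi-square goodness-of-fit test, where all the counts below are made up for illustration:

```python
from scipy.stats import chisquare

# Hypothetical counts from an indexation experiment: out of 50 identical test
# pages, how many ended up indexed vs not indexed (numbers are made up).
observed = [37, 13]   # [indexed, not indexed]
expected = [40, 10]   # what your assumed model of Google's behaviour predicts

stat, p_value = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p_value:.3f}")
# A small p-value says the data is unlikely under the assumed model; it does not
# by itself distinguish deliberate randomisation from load-balancing side effects.
```

A test like this can tell you the behaviour deviates from a fixed rule, but not why, so the randomisation-vs-load-balancing question stays open either way.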
Intermediate & Advanced SEO | Dan-Petrovic0