Did Google just release another massive update in September?
-
Our number of external links has dropped by over 50% in mid-September!
So far our domain authority hasn't been impacted and traffic is only slightly down.
I did not hear of any major Google changes... did this happen to anyone else?
-
I don't think SEOmoz has that option, but you can check out Ahrefs and Majestic SEO - they both track backlinks gained and lost.
-
Our link building has always been legitimate, so I don't think that would explain it. It is very odd to see 20,000 links (half of our total count) disappear in a single week.
It has not seemed to impact much, though. At least, not yet...
-
How would I go about checking the links that have been dropped? From what I have been told, SEOmoz has no way of looking back at individual historic links...
-
Google rolled out 65 changes across August and September. Not all of them were related to search, mind you, but some were aimed at "helping find more high-quality content from trusted sources". So if half of your external links were coming from link farms or sites that Google deems untrustworthy, that would explain the drop.
Hope that helps.
Mike
-
Have you checked the links that dropped? Maybe your link, or the page it was on, simply no longer exists. Likewise, Google could be discrediting some negative links, and therefore you aren't seeing them. There were many updates last month (October) and a few in September.
If your traffic is only slightly down, I would just continue to build high-quality links to your site and not do anything drastic.
Related Questions
-
Blocking Google from telemetry requests
At Magnet.me we track the items people view in order to optimize our recommendations. To do this, we fire POST requests back to our backends every few seconds once enough user-initiated actions have happened (scrolling, for example). To keep bots from distorting the statistics, we ignore their values server-side. Based on some internal logging, we see that Googlebot is also performing these POST requests during its JavaScript crawling. Over a 7-day period, that amounts to around 800k POST requests. As we are ignoring that data anyhow, and it is quite a number, we considered suppressing these requests for bots. We had several questions about this:
1. Do these requests count towards crawl budget?
2. If they do, and we'd want to prevent this from happening, what would be the preferred option: preventing the request in the frontend code, or blocking the request with a robots.txt line (see the sketch below)? I ask because an in-app block for the request could lead to different behaviour for users and bots, and maybe Google could penalize that as cloaking. The frontend option is also slightly less convenient from a development perspective, as the request logic is spread throughout the application. I'm aware one should not cloak or make pages appear differently to search engine crawlers. However, these requests do not change anything in the page's behaviour; they purely send some anonymous data so we can improve future recommendations.
Technical SEO | rogier_slag
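A minimal robots.txt sketch for the second option; the /api/telemetry path is hypothetical, so substitute whatever endpoint the tracking POSTs actually hit:

User-agent: *
Disallow: /api/telemetry

Googlebot checks robots.txt before fetching sub-resources while rendering a page, so a disallowed endpoint simply stops receiving those POSTs without changing anything users see.
-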
404 Hurricane Update Page After?
Hi All, I am wondering if anyone could help me decide how I should go about handling a page I plan on removing but could possibly use later on. A perfect example: let's say a company in Florida posted a page about the store's hours and possible closing due to an incoming hurricane. Once the hurricane passes and the store is reopened, should I 404 that page, given that another hurricane could come along later? The URL for the company is www.company.com/hurricane, so this is a URL that we would want to use again. I guess we could just 410 the old pages and name each URL www.company.com/hurricane-irma and www.company.com/hurricane-jose for each new hurricane. I am just wondering what the best practice is for a situation like this. Thanks for the help!
Technical SEO | aua
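If the per-storm URLs are used, one way to retire each page afterwards is to answer with 410 Gone at the server level. A rough sketch for an Apache setup, reusing the hypothetical URLs from the question:

# Retired storm pages: tell crawlers they are gone for good
Redirect gone /hurricane-irma
Redirect gone /hurricane-jose

A 410 signals the removal is deliberate and permanent, which generally gets the URL dropped from the index a little faster than a 404.
-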
Fetch as Google issues
Hi all, A couple of months back I finally got around to switching our sites over to HTTPS. In terms of rankings etc., all looks fine and we have not moved about much, only the usual fluctuations of a place or two on a daily basis in a competitive niche. All links have been updated, redirects are in place, the usual HTTPS domain migration stuff. I am, however, troubled by one thing: I cannot for love nor money get Google to fetch my site in GSC. No matter what I have tried, it continues to display "Temporarily unreachable". I have checked the robots.txt and it is on a new https:// profile in GSC. Has anyone got a clue, as I am stumped! Have I simply become blinded by looking too much??? Site in question: caravanguard co uk. Cheers and looking forward to your comments.... Tim
Technical SEO | TimHolmes
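One sanity check that works outside GSC is to request the site with a Googlebot user agent and confirm the server answers 200 over HTTPS; www.example.com below is a placeholder for the real host:

curl -I -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" https://www.example.com/robots.txt
curl -I -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" https://www.example.com/

A firewall or CDN rule that blocks or challenges the Googlebot user agent is a common cause of "Temporarily unreachable", and it shows up here as a non-200 response or a timeout.
-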
What's going on with Google's index - JavaScript and Googlebot
Hi all, Weird issue with one of my websites. The website URL: http://www.athletictrainers.myindustrytracker.com/ Let's take 2 different article pages from this website:
1st: http://www.athletictrainers.myindustrytracker.com/en/article/71232/ As you can see, the page is indexed correctly on Google: http://webcache.googleusercontent.com/search?q=cache:dfbzhHkl5K4J:www.athletictrainers.myindustrytracker.com/en/article/71232/10-minute-core-and-cardio&hl=en&strip=1 (that's the "text only" version, indexed on May 19th)
2nd: http://www.athletictrainers.myindustrytracker.com/en/article/69811 As you can see, this page isn't indexed correctly on Google: http://webcache.googleusercontent.com/search?q=cache:KeU6-oViFkgJ:www.athletictrainers.myindustrytracker.com/en/article/69811&hl=en&strip=1 (that's the "text only" version, indexed on May 21st)
They both have the same code, and about the dates: there are pages that were indexed before the 19th and they are also problematic. Google can't read the content consistently; it reads it only when it wants to. Can you think of what the problem is? I know that Google can read JS and crawl our pages correctly, but it happens only with a few pages and not all of them (as you can see above).
Technical SEO | cobano
-
How Google sees my page
When looking for crawlability issues, what is the difference between using Webmaster Tools' Fetch as Google, looking at the cached pages in Google's index (site:mypage.com), and using spider simulator tools?
Technical SEO | shashivzw
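As a rough stand-in for a spider simulator (the URL below is a placeholder), you can fetch the raw, unrendered HTML the way a crawler first receives it and compare it with what your browser renders:

curl -s -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" https://www.example.com/page

By contrast, Fetch as Google shows how Googlebot retrieves the page right now, while the cached copy only shows what Google stored at its last crawl, so the three views can legitimately disagree.
-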
Paid Links - How does Google classify them?
Greetings All, I have a question regarding "paid links." My company creates custom websites for other small businesses across the country. We always have backlinks to our primary website from our "dealer sites." Would Google and other search engines consider links from our dealer sites to be paid links? Example: http://www.atlanticautoinc.com/ is the "dealer site." Would Google consider the links from Atlantic Auto to be paid links, and therefore have less of an impact on page rankings, due to them not being organic? Any insight on this matter would be greatly appreciated. Thank you!!!
Technical SEO | CFSSEO
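For context, Google's documented way to keep a link it might treat as paid from passing ranking credit is the nofollow attribute. A minimal sketch of a dealer-site footer credit, where the href and anchor text are hypothetical:

<a href="https://www.example.com/" rel="nofollow">Website by Example Web Design</a>

Marking the credit links nofollow removes the risk of Google counting them as a link scheme, whatever it would otherwise decide.
-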
Google Places, Google Plus, Oh my!
Ok - so I am in the position of trying to clean up the current Google Places nightmare for a company. Right now there are about 3 or 4 different Google Places listings for them that they have no control over. So here is what I did:
1. I took control of them all by verifying via phone and confirmed all of them.
2. I suspended all the listings but one.
3. I edited the one remaining listing to be accurate and complete.
Then I waited, and waited... A month later, the old listings are still up and none of the changes to the one listing have been made. Today it gets a bit more complicated: I created a Google+ page for the business, which seems like it may end up adding yet ANOTHER Google Places listing. Is that correct? They are sending a postcard to verify, but I have the page all set up and ready to go, and I plan on tying it to the website. I am not exactly sure what my specific question is, but I am looking for any advice anyone has on the best way to go about this situation. Thank you in advance!
Technical SEO | DylanPKI
-
Google Places - need to update
I've got a client who verified their Google Places listing years ago, and no one knows who did it, so I can't access the listing. The business is now moving and I need to update the address. What should I do? Thanks
Technical SEO | garymeld