Did a January 2013 Google update affect my projects?
-
I am running 400+ projects, and keyword rankings for almost all of them have been affected recently. Is there any new update from Google between 10 and 19 January 2013?
-
This seems more like a link-algorithm update mixed with changes to how keyword-rich URLs are treated.
We also track around 20,000 keywords on a weekly basis and can see a big flux. It looks like Google is devaluing some backlinks, or whole sites, in bulk, which is causing this.
There is no specific pattern that we can see, although small-business websites seem to be getting hit more than brands.
But again, it's too early to single out specific reasons.
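For readers unfamiliar with rank-tracking "flux", here is a minimal sketch of one way a weekly movement metric like this could be computed. The keywords and numbers are invented for illustration; this is not any tracking tool's actual formula.

```python
# Hypothetical sketch: compare two weekly snapshots of keyword rankings
# and measure average movement. All data here is made up.

def serp_flux(last_week, this_week, dropped_penalty=100):
    """Average absolute rank change across tracked keywords.

    Keywords missing from this week's snapshot are treated as having
    dropped to `dropped_penalty` (i.e. out of the tracked range).
    """
    total = 0
    for keyword, old_rank in last_week.items():
        new_rank = this_week.get(keyword, dropped_penalty)
        total += abs(new_rank - old_rank)
    return total / len(last_week)

last_week = {"blue widgets": 3, "red widgets": 7, "widget repair": 12}
this_week = {"blue widgets": 9, "red widgets": 6, "widget repair": 41}

print(serp_flux(last_week, this_week))  # (6 + 1 + 29) / 3 = 12.0
```

Comparing a normal week's value against an update week's value is what statements like "6x the usual movement" boil down to.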
-
I agree, Marie, that it's the wrong date, and from what I have seen it has impacted many more sites in the UK, 8% or more versus the 1.8% mentioned in the tweet. Nearly all the verticals and niches we track have had changes in rankings, and some very odd results. A search like "UK VPN" in google.co.uk has seen a number of service providers replaced with details of UK university VPN services, and even the University of Kentucky in the top 20 results. I can't for the life of me see how those results would match the broad intent of the search.
-
Google did refresh Panda today (the 22nd of January), but this would not have caused the traffic drop between Jan 10th and 19th that the OP saw.
-
Google has confirmed it as Panda. More details here: http://searchengineland.com/google-panda-update-version-24-1-2-of-search-queries-impacted-146149
-
Regardless of what Google officially says, there was an update around 17 Jan 2013. We track 50,000+ keywords on a weekly basis, and we saw six times the SERP movement we see in a normal week.
For us, this was bigger than Penguin or Panda. It will take some time for rankings to stabilise, after which there can be some consensus as to what happened.
Marie, thanks for sharing the link. I partly agree with you that Google will use the disavow tool data, but knowing Google, I don't think they will do manual checking for more than 1,000 sites. They could simply calculate the top 1,000 domains disavowed across all disavow requests and then use that data. But I think that update will come in the future. This looks like a version of Penguin.
-
There has been no official word from Google about an update. A lot of people have been grumbling in the forums, however, about something going on. When Barry from SERoundtable commented on this, Google stated that there was no major update, but that the algorithm is always changing.
There is also speculation from the team at Branded3 (see this post: http://www.branded3.com/blogs/google-moves-towards-continual-link-devaluation/) that Google may be changing how they detect bad links. If I understand it right, the idea is that instead of devaluing bad links in bunches every time Penguin refreshes, Google is devaluing bad links as they crawl.
I have another theory. I am wondering if Google is starting to put into use the information they are getting from the disavow tool. So, let's say that a whole pile of websites have included spammyarticles.com in their disavow.txt file. Google evaluates the site and decides that it only exists to provide spammy backlinks and as such devalues all links that are coming from this site. I have no proof for this, but it's a possibility.
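Purely as an illustration of the "aggregate the disavow data" idea floated in this thread, the counting step could be sketched roughly like this. The file contents and numbers are invented, and nothing here is Google's actual process; it only shows how widely-disavowed domains could be surfaced from many submitted disavow files (which contain `domain:` lines, single URLs, and `#` comments).

```python
# Hypothetical sketch: count how often each domain appears across many
# submitted disavow files, so the most widely disavowed domains could be
# flagged for closer review. All file contents below are invented.
from collections import Counter
from urllib.parse import urlparse

def domains_in_disavow(text):
    """Extract domains from one disavow file's contents."""
    domains = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        if line.startswith("domain:"):
            domains.append(line[len("domain:"):])
        else:
            domains.append(urlparse(line).netloc)  # a single disavowed URL
    return domains

def top_disavowed(files, n=1000):
    """Rank domains by how many distinct disavow files mention them."""
    counts = Counter()
    for text in files:
        counts.update(set(domains_in_disavow(text)))  # once per submitter
    return counts.most_common(n)

files = [
    "# links I never built\ndomain:spammyarticles.com\nhttp://linkfarm.example/page1",
    "domain:spammyarticles.com",
]
print(top_disavowed(files, n=2))
# [('spammyarticles.com', 2), ('linkfarm.example', 1)]
```

A domain disavowed by thousands of independent sites would float to the top of such a list, which is the intuition behind the theory above.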
-
Hi Deepak
Here is the post, "Updated: Stronger Reports Of A Google Update": http://www.seroundtable.com/google-update-january-16230.html
-
I noticed some changes on Monday, and the Google dance seems to have been going on ever since. Some keywords are changing positions by the hour. I agree with Ask Hopper that it will be a few weeks before it settles, but it looks like a link-based issue.
-
Have a watch of this video from Barry Schwartz and you will see that many have noticed this, but nothing has been announced.
https://www.youtube.com/watch?feature=player_embedded&v=xNplIqrs-Os
Andy
-
Hi. Yes, updates to the algorithm are in progress right now and ongoing. I suspect it is going to take a few weeks to settle down before you get any real information on the actual page positions for your keywords and phrases.
Related Questions
-
Star snippet not working for my article in Google
I added the script for star snippets to my website, but it does not work on my posts. You can see it at this URL: https://youtech.ooo/showbox-apk-download/. When I search Google for my keyword "Showbox", my competitor shows up with star snippets in the SERP, but my site's stars don't appear. Thank you!
Technical SEO | JackJasonn
-
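As a hedged illustration only (not taken from the site in question, and with invented values): review stars generally require valid structured data on the page itself, for example schema.org markup along these lines:

```json
{
  "@context": "https://schema.org",
  "@type": "SoftwareApplication",
  "name": "Showbox APK",
  "operatingSystem": "ANDROID",
  "applicationCategory": "EntertainmentApplication",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "ratingCount": "120"
  }
}
```

Even with valid markup, Google decides per query whether to show stars, so validating the page in Google's structured data testing tool is a sensible first step.
-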
Blocking Google from telemetry requests
At Magnet.me we track the items people are viewing in order to optimize our recommendations. As such, we fire POST requests back to our backends every few seconds, once enough user-initiated actions have happened (think about scrolling, for example). In order to stop bots from distorting statistics, we ignore their values server-side. Based on some internal logging, we see that Googlebot is also performing these POST requests in its JavaScript crawling; in a 7-day period, that amounts to around 800k POST requests. As we are ignoring that data anyhow, and it is quite a number, we considered reducing this for bots. We had several questions about this:
1. Do these requests count towards crawl budgets?
2. If they do, and we wanted to prevent this, what would be the preferred option: preventing the request in the frontend code, or blocking the request with a robots.txt line? The question arises because an in-app block for the request could lead to different behaviour for users and bots, and maybe Google could penalize that as cloaking. Blocking it in the frontend is also slightly less convenient from a development perspective, as that logic is spread throughout the application. I'm aware one should not cloak or make pages appear differently to search engine crawlers; however, these requests do not change anything in the page's behaviour, and purely send some anonymous data so we can improve future recommendations.
Technical SEO | rogier_slag
-
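Purely as an illustration of the robots.txt option raised in the question above, a block like the following is one hedged possibility; the endpoint path here is hypothetical, not Magnet.me's actual URL:

```
# robots.txt sketch -- hypothetical telemetry endpoint
User-agent: *
Disallow: /api/telemetry
```

Disallowed URLs are generally not requested by Googlebot at all, including fetches made while rendering JavaScript, so they should not consume crawl budget; the trade-off, as the question notes, is that users and Googlebot then behave slightly differently.
-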
Can a CMS affect SEO?
As the title says, really. I run www.specialistpaintsonline.co.uk. Six months ago, when I first took it on, it had bad links, and Google had applied a penalty against it, so it lost its value. However, the penalty was lifted in September; the site conforms to all guidelines, and SEO work has been done and is constantly monitored. The issue I have is that sales and visits have not gone up. We are failing fast, and running on 2 or 3 sales a month isn't enough to cover any sort of cost, let alone wages. Hence my question: can the CMS have anything to do with it? I'm at a loss and going grey; any help or advice would be great. Thanks in advance.
Technical SEO | TeamacPaints
-
Pages to be indexed in Google
Hi. We have 70K posts on our site, but Google has scanned 500K pages; the extra pages are category pages and user profile pages. Each category has a page and each user has a page, and since we have 90K users, Google has indexed 90K user pages alone. My question is: should we leave them as they are, or should we block them from being indexed? We get unwanted landings on these pages and a huge bounce rate. If we need to remove them, what needs to be done: a robots.txt block, or noindex/nofollow? Regards
Technical SEO | mtthompsons
-
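For illustration only, the two options named in the question above behave differently; the paths below are hypothetical:

```
# Option 1 -- robots.txt: stops crawling of the sections, but URLs that
# are already indexed can remain in the index for some time
User-agent: *
Disallow: /category/
Disallow: /user/

# Option 2 -- meta robots tag in each page's <head>: removes the pages
# from the index, but the pages must remain crawlable for it to be seen
<meta name="robots" content="noindex, follow">
```

Note that combining both defeats the noindex: a page blocked in robots.txt is never recrawled, so the tag is never seen.
-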
Google Places Page Changes
We had a client (a dentist) hire another marketing firm (without our knowledge), and due to some Google page changes the firm made, the website lost a #1 ranking, was disassociated from its Places page, and was placed at result #10, below all the local results. We quickly made some changes and were able to bring the site up to #2 within a few days, and restored the Google page after about a week. However, the tracking/forwarding phone number the marketing company was using still shows up on the page, despite attempts to contact Google by updating the business in Places management and by submitting the phone number as incorrect while providing the correct one. And because the client fired that marketing company, the phone number will no longer be active in a few days, which of course matters a great deal for a dental office. Has anyone else had problems with the speed of updating Google Places/Plus pages for businesses? What's the most efficient way to make changes like this?
Technical SEO | tvinson
-
My sitemap in Google is coming back with an error
I submitted my XML sitemap to Google Webmaster Tools, but it is giving a "not found" 404 error, and I can't figure out why my sitemap is returning a 404. Why? 😞
Technical SEO | cschwartzel
-
Adding Google + to SEOmoz
I wanted to add my Google+ signature to every post I make on SEOmoz, and I think every user should do the same. Two reasons why: Google helps our existence, so we should help theirs; and if someone likes what I wrote, or vice versa, we should be able to follow each other with a single click. In my opinion, all blog and forum posts should lead to a user, not a website; this will prevent spam and help people network. In other words, blog spammers and forum spammers will be SOL (which they already are, lol).
Technical SEO | SEODinosaur
-
Google Shopping Australia/Google Merchant Centre
So Google Shopping has finally landed in Australia, and we've got some work to do hooking it up to our clients' ecommerce sites. Right now we have a handful of clients who are set up; the feed is getting in there OK, but all products are sitting in "disapproved" status in the dashboard, and clicking into each individual product, the status says "awaiting review". I logged a support ticket with Google to get some more info on this, as it doesn't look right to me (i.e. the disapproved status in the dashboard), and got a useless templated answer. It seems that if I switch the country destination to US, the products are approved and live in google.com shopping search within the hour; switch back to Australia and they go back to disapproved status. Anyone having the same issue, or seen this before? I simply don't trust Google support and I'm wondering if there are other factors at play here.
Technical SEO | Brendo