Nov 19th & 20th Update?
-
Did anyone see any big changes around Nov 19th & 20th? MozCast had some high temps around there.
If you saw any big changes in organic search, any ideas WTH that was all about?
Any guesses?
One site I work with took about a 15% hit and has since sort of skidded sideways.
-
Evidence of something around 11/18-19 is pretty strong at this point. Glenn's article that Peter N. posted is worth checking out. I've heard rumors of a mobile connection, but that's been hard to verify - there does seem to be a "quality" aspect, but that's such a hard word to pin down. No confirmation from Google, but MozCast and similar data definitely saw spikes, and there was solid chatter.
EGOL is right, though - that time period before Black Friday is a hairy one, search-wise, and there are so many variables to disentangle. I think something algorithmic happened, but that doesn't mean that any particular problem or drop was due to a Google change, and it's going to be really tough to piece together any particular story, I'm afraid.
-
There have been a few reports of people experiencing drops. Nothing has been released from Google, as per usual, but at least other people are seeing an effect.
-
I'm still trying to understand a total crash in Google organic search traffic that started on 25 November and hit rock bottom on 27 November. It has not recovered. At first I thought it was because I accidentally had the "Discourage search engines from indexing this site" box ticked in my WordPress settings after a new site version was pushed live, but it's now been three days since I corrected that and there's no sign of even a small recovery.
I resubmitted my sitemap and forced a re-index using GWT, and my site is definitely indexed. But Google traffic has dropped from over 200 visits a day to around 10.
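For anyone debugging the same thing: it's worth confirming on the live pages that the noindex meta tag is actually gone, not just that the setting is unticked. A minimal sketch (the HTML snippets and function name here are illustrative, not from the actual site):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the content values of any <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.robots_values = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "robots":
                self.robots_values.append((attrs.get("content") or "").lower())

def has_noindex(html: str) -> bool:
    """Return True if the page HTML carries a robots noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in value for value in parser.robots_values)

# WordPress's "Discourage search engines" setting emits a tag like this:
blocked = '<head><meta name="robots" content="noindex,follow"></head>'
fixed = '<head><meta name="robots" content="index,follow"></head>'
print(has_noindex(blocked))  # True
print(has_noindex(fixed))    # False
```

You'd feed this the HTML fetched from a few live URLs (and also check the `X-Robots-Tag` response header, which this sketch doesn't cover) to be sure nothing is still telling Google to drop the pages.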
I have a lot of links to TripAdvisor on my site (travel blog), so I don't know if that's the issue? Or if it's something in the new premium theme?
I read something about affiliate links using keyword-rich anchors being punished by this update, but I don't use affiliate links as such, just direct links to the relevant TA listing, which are converted on click by a JS script provided by TA. They are all dofollow, which might be an issue (though it never has been in the past), and they are all anchored by the actual hotel name or a photo of the hotel. Not all my posts have them, either, but the whole site has suffered just as I was getting some traction in the SERPs.
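If the dofollow TA links are a worry, one way to rule them out is to nofollow them in bulk. A minimal regex-based sketch in Python (the domain and function name are just illustrative; on WordPress a plugin or theme filter would be the usual route):

```python
import re

# Regex-based, so it's fine for a quick offline audit of exported HTML
# but is not a robust HTML parser.
def nofollow_links(html: str, domain: str = "tripadvisor.com") -> str:
    """Add rel="nofollow" to anchor tags that point at the given domain."""
    def add_rel(match: re.Match) -> str:
        tag = match.group(0)
        # Leave the tag alone if it doesn't point at the domain,
        # or if it already declares a rel attribute.
        if domain not in tag or "rel=" in tag:
            return tag
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r"<a\b[^>]*>", add_rel, html)

sample = ('<p><a href="https://www.tripadvisor.com/Hotel_Review-x">Hotel Foo</a>'
          ' and <a href="/about">About</a></p>')
print(nofollow_links(sample))
```

Running this over a content export would show how many outbound TA links there are and let you test whether nofollowing them changes anything, without touching internal links.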
There is no manual penalty notice in GWT.
Site is http://www.asiantraveltips.com if you would be kind enough to offer any advice or opinions about what's happened.
-
Hi Egol,
Thanks for the message. For this particular site, I'm looking at a very noticeable dip in position tracking that is then partially obscured by the following week's Thanksgiving holiday. This week, the week after the holiday week, is down about 15% traffic-wise compared to two weeks prior. As a point of reference, last year this site was flat for the same period, so I don't really think it's just seasonality.
I might think it was just this site, but it happened to coincide with that increase in MozCast temps and other stories of flux.
Has anybody benefited or suffered with the same kind of timing?
-
According to Glenn Gabe, there was something: http://www.hmtweb.com/marketing-blog/november-19-google-algorithm-update/ http://www.hmtweb.com/marketing-blog/unconfirmed-google-algorithm-updates-2015/
Google says: https://www.seroundtable.com/google-update-no-21225.html
"Don't have anything more specific to announce, sorry!" -
Traffic on many non-retail sites in the United States started to dip on the 19th and 20th in advance of Thanksgiving Week. Many schools take a break that week and lots of adults plan trips for the holiday. Traffic on my informational sites went down then, but has been nothing less than volcanic this week. I was wondering if something happened yesterday to give me a traffic blast that continues today. Up over 20% on a site that is already busy.