Google Trends - what did you do?
-
So, is it just me, or did Google make some big changes?
The "trends" are no longer anchored to relevant articles, etc.
Why do you think they would remove something so useful to us?
http://www.google.com/trends/ - check it out for yourself.
-
Thanks Keri, I love ya guys!
-
Google wrote a post about this at http://googleblog.blogspot.com/2012/06/find-out-what-people-are-searching-for.html.
-
Because they can
Related Questions
-
Google Tag Manager
What are some of the best resources for learning GTM and conversion tracking, and for teaching them to others?
Industry News | | WebMarkets1 -
Very odd behavior. Google is changing “%20” to “+” in my URLS
I just realized many of the links on my site are BROKEN when entered from a Google SERP. It wasn't always this way. I have no idea what's going on, but I'm worried. It involves a folder of our site that has a space in its name. Google even displays the proper "%20" in the SERP, but when I click the link it replaces that "%20" with a +, which breaks the link! You can see this in action by typing "brown jordan sheffield furniture" (without quotes) into Google. You'll see our site come up first, displaying this link: sheffieldfurniture.com/Other%20Furniture/BrownJordan.html — but when you click it, the link is broken! This is happening on many of our pages! Anyone know what in the world is going on?
Industry News | | SheffieldMarketing0 -
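For context on the encoding issue above: "%20" and "+" are not interchangeable everywhere. A "+" only means a space in query strings (form encoding); in a URL *path*, a space must be encoded as "%20", which is why a server can 404 when a "+" shows up there. A quick illustration with Python's standard library (the path shown mirrors the example in the question; it's illustrative only):

```python
from urllib.parse import quote, quote_plus

# In a URL *path*, a space must be percent-encoded as %20.
path = quote("/Other Furniture/BrownJordan.html")
# path == "/Other%20Furniture/BrownJordan.html"

# quote_plus (form/query-string encoding) encodes spaces as "+";
# that convention does NOT apply to path segments.
query_value = quote_plus("brown jordan sheffield")
# query_value == "brown+jordan+sheffield"
```

So if anything between the SERP and the server rewrites "%20" to "+" in a path segment, a web server that only decodes "+" as a space in query strings will fail to find the file.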
How Google could quickly fix the whole Links problem...
A Thursday morning brainstorm that hopefully an important Google manager will see... Google could quickly end all the problems of link buying, spammy links, and negative SEO with one easy step: Only count the 100 best follow links to any domain. Ignore all the nofollows and everything beyond the 100 best. They can choose what "best" means. Suddenly links would be all about quality. Quantity would not matter. Fiverr links, comment links, and all the other mass-produced spam links would literally be ignored. Unless that's all a domain had, and then they would surely be stomped by any domain with 100 decent natural links. Would it be an improvement over today's situation?
Industry News | | GregB1230 -
Google update on Jan 17 2013 ?
Hi guys, today (Jan 17, 2013) I am observing a lot of changes in the Google SERPs for a variety of keywords. It feels like there was a Google update of some kind. There seem to be a few threads around the web claiming such an update (or a Panda refresh). Were you affected? Did anybody else notice a huge SERP fluctuation for their primary keywords? Thanks in advance for your answers 😄 Best regards, Yan
Industry News | | ydesjardins2001 -
Google Cached "Text Only" version
Is there a way to test what a page would look like in Google's cached "Text Only" version before the page is indexed by Google? Is there a tool out there to help with this?
Industry News | | activejunky10 -
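One way to approximate the cached "Text Only" view before indexing is to strip markup, scripts, and styles from your own HTML and look at what text remains. A rough sketch using only Python's standard library (the sample HTML is hypothetical, and a real text-only render differs in details like whitespace handling):

```python
from html.parser import HTMLParser

class TextOnly(HTMLParser):
    """Collects visible text, skipping <script> and <style> contents,
    roughly approximating a text-only rendering of a page."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip_depth = 0  # >0 while inside script/style

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth and data.strip():
            self.parts.append(data.strip())

parser = TextOnly()
parser.feed("<html><body><h1>Hello</h1>"
            "<script>var x=1;</script><p>World</p></body></html>")
text_only = " ".join(parser.parts)
print(text_only)  # → Hello World
```

This is only a local approximation; the authoritative check is still fetching the cached copy once the page is indexed.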
Chrome blocked sites used by Google's Panda update
Google said its Panda update used Chrome users' blocked-sites lists as a benchmark for what it now terms poor-quality content, and that the update effectively took about 85% of those sites out of the search results. This got me thinking: it would be very nice to discover exactly which sites they don't like. Does anyone know if there is an archive of what these sites might be? Or, if none exists, maybe if people shared their Chrome blocked sites here we might get an idea?
Industry News | | SpecialCase0 -
Google places rejected
Google has rejected a few listings I have for certain businesses. I have read the guidelines and I am well inside them. They do say that if the business name is changed you need to re-verify, but the system does not allow you to do so. I think Google has lost its way; they should stop building operating systems and electric cars and get their website sorted out.
Industry News | | AlanMosley0 -
What is the best method for getting pure JavaScript/AJAX pages indexed by Google for SEO?
I am in the process of researching this further, and wanted to share some of what I have found below. Anyone who can confirm or deny these assumptions or add some insight would be appreciated.
Option 1: If you're starting from scratch, a good approach is to build your site's structure and navigation using only HTML. Then, once you have the site's pages, links, and content in place, you can spice up the appearance and interface with AJAX. Googlebot will be happy looking at the HTML, while users with modern browsers can enjoy your AJAX bonuses. You can use Hijax to help AJAX and HTML links coexist. You can use Meta NoFollow tags etc. to prevent the crawlers from accessing the JavaScript versions of the page. Currently, webmasters create a "parallel universe" of content. Users of JavaScript-enabled browsers will see content that is created dynamically, whereas users of non-JavaScript-enabled browsers as well as crawlers will see content that is static and created offline. In current practice, "progressive enhancement" in the form of Hijax links is often used.
Industry News | | webbroi
Option 2: In order to make your AJAX application crawlable, your site needs to abide by a new agreement. This agreement rests on the following: The site adopts the AJAX crawling scheme. For each URL that has dynamically produced content, your server provides an HTML snapshot, which is the content a user (with a browser) sees. Often, such URLs will be AJAX URLs, that is, URLs containing a hash fragment, for example www.example.com/index.html#key=value, where #key=value is the hash fragment. An HTML snapshot is all the content that appears on the page after the JavaScript has been executed. The search engine indexes the HTML snapshot and serves your original AJAX URLs in search results. In order to make this work, the application must use a specific syntax in the AJAX URLs (let's call them "pretty URLs;" you'll see why in the following sections). The search engine crawler will temporarily modify these "pretty URLs" into "ugly URLs" and request those from your server. This request of an "ugly URL" indicates to the server that it should not return the regular web page it would give to a browser, but instead an HTML snapshot. When the crawler has obtained the content for the modified ugly URL, it indexes its content, then displays the original pretty URL in the search results. In other words, end users will always see the pretty URL containing a hash fragment. The following diagram summarizes the agreement:
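The "pretty URL" to "ugly URL" mapping in the scheme above can be sketched in a few lines. This assumes the `#!` token that Google's AJAX crawling proposal used to mark crawlable fragments (note the scheme has since been deprecated by Google), and the example URL is illustrative only:

```python
from urllib.parse import quote

def pretty_to_ugly(url):
    """Map an AJAX 'pretty URL' (with a #! hash fragment) to the 'ugly URL'
    a crawler following the AJAX crawling scheme would request instead."""
    if "#!" not in url:
        return url  # no crawlable fragment; URL passes through unchanged
    base, fragment = url.split("#!", 1)
    separator = "&" if "?" in base else "?"
    # Percent-escape special characters in the fragment; '=' stays literal,
    # matching the key=value style used in the scheme's examples.
    return base + separator + "_escaped_fragment_=" + quote(fragment, safe="=")

print(pretty_to_ugly("http://www.example.com/index.html#!key=value"))
# → http://www.example.com/index.html?_escaped_fragment_=key=value
```

The server, on seeing a `_escaped_fragment_` parameter, would return the HTML snapshot rather than the normal page, exactly as the agreement described above requires.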
See more in the Getting Started Guide. Make sure you avoid this:
http://www.google.com/support/webmasters/bin/answer.py?answer=66355
Here are a few example pages that are mostly JavaScript/AJAX: http://catchfree.com/listen-to-music#&tab=top-free-apps-tab https://www.pivotaltracker.com/public_projects This is what the spiders see: view-source:http://catchfree.com/listen-to-music#&tab=top-free-apps-tab These are the best resources I have found regarding Google and JavaScript: http://code.google.com/web/ajaxcrawling/ - step-by-step instructions.
http://www.google.com/support/webmasters/bin/answer.py?answer=81766
http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content
Some additional Resources: http://googlewebmastercentral.blogspot.com/2009/10/proposal-for-making-ajax-crawlable.html
http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content
http://www.google.com/support/webmasters/bin/answer.py?answer=357690