Google fluctuates its results in Chrome's private browsing
-
I saw some interesting Google behaviour this morning.
As usual, I opened Chrome's private browsing to check how a keyword was ranking.
This is what I saw...
I typed in "sell my car" and saw the Auto Trader page in 3rd place. (Ref: Sell My Car 1st result img)
I Googled something else, then re-Googled "sell my car" and saw that our page had moved up to 2nd!
I repeated the same process and saw that we went from 3rd to 2nd again. Have Google's results gone mental?
-
It just sounds like a strange anomaly. "Sell my car" is a very broad search term. With the constantly changing landscape of the Google SERPs, it could well have been something like a new video result about selling cars that pushed you off the page. Private browsing also doesn't carry over your existing cookies, so it's not a consistent way to monitor results.
Related Questions
-
How to get a large number of urls out of Google's Index when there are no pages to noindex tag?
Hi, I'm working with a site that has created a large group of URLs (150,000) that have crept into Google's index. If these URLs actually existed as pages, which they don't, I'd just noindex-tag them and over time the number would drift down. The thing is, they created them through a complicated internal linking arrangement that adds affiliate code to the links and forwards them to the affiliate. Googlebot would crawl a link that looks like it's to the client's own domain and wind up on Amazon or somewhere else with some affiliate code. Googlebot would then grab the original link on the client's domain and index it... even though the page served is on Amazon or somewhere else. Ergo, I don't have a page to noindex-tag. I have to get this 150K block of cruft out of Google's index, but without actual pages to noindex-tag, it's a bit of a puzzler. Any ideas? Thanks! Best... Michael P.S. All 150K URLs seem to share the same URL pattern... exampledomain.com/item/... so /item/ is common to all of them, if that helps.
Intermediate & Advanced SEO | | 945010 -
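One option sometimes suggested for cases like the /item/ pattern above, where there is no HTML page to carry a meta robots tag, is to send an `X-Robots-Tag: noindex` HTTP header on those responses instead. Below is a minimal sketch of that idea as WSGI middleware, assuming (hypothetically) that the affiliate-forwarding URLs are served by a Python WSGI app; the same header can normally also be added at the web-server level.

```python
# Hypothetical sketch: add "X-Robots-Tag: noindex" to every response whose
# path falls under /item/, so crawlers are told not to index those URLs
# even though no real page (and hence no meta robots tag) exists for them.

class NoindexItemUrls:
    """WSGI middleware that appends X-Robots-Tag: noindex for /item/ URLs."""

    def __init__(self, app, prefix="/item/"):
        self.app = app
        self.prefix = prefix

    def __call__(self, environ, start_response):
        def patched_start_response(status, headers, exc_info=None):
            if environ.get("PATH_INFO", "").startswith(self.prefix):
                headers = list(headers) + [("X-Robots-Tag", "noindex")]
            return start_response(status, headers, exc_info)

        return self.app(environ, patched_start_response)


if __name__ == "__main__":
    # Tiny demo app: every /item/ URL 302-redirects to an (assumed) affiliate
    # destination; the middleware tags that redirect response with noindex.
    from wsgiref.simple_server import make_server

    def demo_app(environ, start_response):
        if environ["PATH_INFO"].startswith("/item/"):
            start_response("302 Found", [("Location", "https://www.example-affiliate.com/")])
            return [b""]
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [b"Normal page"]

    make_server("localhost", 8000, NoindexItemUrls(demo_app)).serve_forever()
```

Whether Google applies a noindex header served on a redirect response isn't guaranteed, so robots.txt plus the URL removal tool (or returning a 410 for those paths) are other routes worth weighing.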
Is Chamber of Commerce membership a "paid" link, breaking Google's rules?
Hi guys, this drives me nuts. I hear all the time that any time value is exchanged for a link, it technically violates Google's guidelines. What about real organizations, chambers of commerce, trade groups, etc. that you are a part of that have online directories with DO-follow links? On one hand people will say these are great links with real value outside of search and great for local SEO... and on the other hand some hardliners are saying that these technically should be no-follow. Thoughts???
Intermediate & Advanced SEO | | RickyShockley0 -
Is there a downside of an image coming from the site's dotted quad and can it be seen as a duplicate?
Ok, the question title doesn't fully explain the issue, so I just want some opinions on this. Here is the backstory. I have a client with a domain that has been around for a while and was doing well, but with no backlinks (fairly low competition). For some reason they created mirrors of their site on different URLs. Then their web designer built them a test site that was a copy of their site on the web designer's URL and didn't bother to noindex it. The client's site dived, and the web designer's site started ranking for their keywords. So we helped clean that up, and they hired a brand-new web designer and redesigned the site. For some reason the dotted-quad version of the site started showing up as a referrer in GA, and one image on the site comes from that address and not the site's URL. So I ran a Copyscape and site: search and discovered that the dotted-quad version, like 69.64.153.116 (not the actual address), was also being indexed by the search engine. To us this seems like a cut-and-dried duplicate content issue, but I'm having trouble finding much written on the subject. I raised the issue with the dev, and he reluctantly 301'd the site to the official URL. The second part of this is that the web designer still has that one image on the site coming from the numerical version of the site and not the written URL. Any thoughts on whether that has any negative SEO impact? My thought is that it isn't ideal, but it just looks like an external referral for pulling that one image. I'd love any thoughts or experience on a situation like this.
Intermediate & Advanced SEO | | BCutrer0 -
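The usual remedy for an IP-address ("dotted quad") copy of a site being crawled is a blanket 301 from any non-canonical host to the official hostname, which is what the poster's dev eventually did. A rough sketch of that host check, assuming a hypothetical Flask app (in practice the same rule is more often written in the web server's configuration):

```python
# Hypothetical sketch: 301 any request that arrives on a non-canonical host
# (e.g. the server's bare IP address) to the same path on the official domain.
from flask import Flask, redirect, request

CANONICAL_HOST = "www.example-client-site.com"  # assumed hostname

app = Flask(__name__)

@app.before_request
def force_canonical_host():
    # request.host includes the port if one was given; compare the bare name.
    host = request.host.split(":")[0]
    if host != CANONICAL_HOST:
        url = f"https://{CANONICAL_HOST}{request.full_path.rstrip('?')}"
        return redirect(url, code=301)

@app.route("/")
def index():
    return "Canonical host"

if __name__ == "__main__":
    app.run()
```

With a rule like this in place, even the stray image reference to the numeric address should come back as an ordinary 301 to the named host rather than a second indexable copy.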
301's, Mixed-Case URLs, and Site Migration Disaster
Hello Moz Community, After placing trust in a developer to build & migrate our site, the site launched 9 weeks ago and has been one disaster after another. Sadly, after 16 months of development, we are building again; this time we have leveled up and are doing it in-house with our own people. I have one topic I need advice on, and that is 301s. Here's the deal. The newbie developer used a mixed-case version for our URL structure, so what should have been /example-url became /Example-Url on all URLs. Awesome, right? It was a duplicate-content nightmare upon launch (among other things). We are rebuilding now. My question is this: do we bite the bullet for all URLs and 301 them to a proper lower-case URL structure? We've already lost a lot of link equity from 301ing the site the first time around. We were a PR 4 on our homepage for the last 5 years; now we are a PR 3. That is a substantial loss. For our primary keywords, we were on the first page for the big ones for the last decade. Now we are just barely cleaving to the second page, and many are on the 3rd page. I am afraid that if we 301 all the URLs again, a 15% reduction in link equity per page is really going to hurt us, again. However, keeping the mixed-case URL structure is also a whammy. Building a brand new site, again, it seems like we should do it correctly and right all the previous wrongs. But on the other hand, another PR demotion and we'll be in line at the soup kitchen. What would you do?
Intermediate & Advanced SEO | | yogitrout10 -
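For the mixed-case problem above, the common pattern is a single rule that 301s any URL containing uppercase characters to its lowercase form, so every /Example-Url variant collapses onto /example-url. A minimal sketch, again assuming a hypothetical Flask front end (Apache can do the same with a `RewriteMap` lowercasing map):

```python
# Hypothetical sketch: 301 any request whose path contains uppercase letters
# to the all-lowercase version, collapsing /Example-Url onto /example-url.
from flask import Flask, redirect, request

app = Flask(__name__)

@app.before_request
def lowercase_urls():
    path = request.path
    if path != path.lower():
        query = request.query_string.decode()
        return redirect(path.lower() + (f"?{query}" if query else ""), code=301)

# Catch-all demo route so the sketch has something to serve.
@app.route("/<path:page>")
def page(page):
    return f"Serving /{page}"

if __name__ == "__main__":
    app.run()
```

Because the rule is host-wide, it only has to be written once, rather than maintaining one 301 per mixed-case URL.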
What's your daily SEO checklist?
First thing every morning I log in to Google Webmaster Tools looking for any errors, review data, sites linking to us, etc. I then log in to Google Analytics and SEOmoz to check traffic for our terms and see if there have been any changes that need to be addressed. What's your daily checklist?
Intermediate & Advanced SEO | | Prospector-Plastics1 -
What's the best way to manage content that is shared on two sites and keep both sites in search results?
I manage two sites that share some content. Currently we do not use a cross-domain canonical URL and allow both sites to be fully indexed. For business reasons, we want both sites to appear in results and need both to accumulate PR and other SEO/Social metrics. How can I manage the threat of duplicate content and still make sure business needs are met?
Intermediate & Advanced SEO | | BostonWright0 -
Starting Over with a new site - Do's and Don'ts?
After six months, we've decided to start over with a new website. Here's what I'm thinking. Please offer any constructive do's or don'ts if you see that I'm about to make a mistake. Our original site (call it mysite.com), we have come to the conclusion, is never going to make a comeback on Google. It seems to us a better investment to start over than to simply keep hoping. Quite honestly, we're freakin' tired of trying to fix this. We don't want to screw with it any more. We are creative people, and would much rather be building a new race car than trying to overhaul the engine in the old one. We have the matching .net domain, mysite.net, which has been aged about 6 years with some fairly general content on a single page. There are zero links to mysite.net, and it was really only used by us for FTP traffic -- nothing in the SERPs for mysite.net. Mysite.NET will be a complete redesign. All content and images will be totally redone. Content will be new, excellent writing, unique, and targeted. Although the subject matter will be similar to mysite.COM, the content, descriptions, keywords, images -- all will be brand spankin' new. We will have a clean slate to begin the long, painful link-building process. We will put in the time, and bite the bullet until mysite.NET rules Google once again. We'll change the URL in all of our AdWords campaigns to mysite.net. My questions are: 1. Mysite.com still gets some OK traffic from Bing. Can I leave mysite.com substantially intact, or does it need to go? 2. If I have "bad links" pointing to mysite.com/123.html, what would happen if I 301 that page to mysite.NET/abc.html? Does the "bad link juice" get passed on to the clean site? It would be a better experience for users who know our URL if they could be redirected to the new site. 3. Should we put mysite.net on a different server in a different, clean IP block? Or does it not matter? We're willing to spend for the new server if it would help. 4. What have I forgotten? Cheers, all
Intermediate & Advanced SEO | | DarrenX0 -
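On question 2 above, one way to stay selective about what carries over is an explicit redirect map: only the old .com URLs you actually want to pass equity from get a 301 to their .net counterparts, while pages you would rather leave behind (for instance ones that mostly attracted spammy links) are simply omitted. A hedged sketch of that idea in Python, using made-up paths, that emits standard Apache `Redirect 301` lines:

```python
# Hypothetical sketch: a hand-picked 301 map from old mysite.com paths to
# their new mysite.net URLs. Paths deliberately left out of the map are not
# redirected, so any link equity (good or bad) pointing at them stays put.
REDIRECT_MAP = {
    "/123.html": "https://www.mysite.net/abc.html",       # made-up example paths
    "/services.html": "https://www.mysite.net/services/",
}

def as_apache_rules(mapping: dict[str, str]) -> str:
    """Render the map as 'Redirect 301' lines for an .htaccess file."""
    return "\n".join(f"Redirect 301 {old} {new}" for old, new in mapping.items())

if __name__ == "__main__":
    print(as_apache_rules(REDIRECT_MAP))
```

Whether a 301 from a page with bad links passes those bad signals along is debated; a per-URL map at least keeps that decision in your hands rather than redirecting everything wholesale.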
What's the best method for segmenting HTML sitemap?
Hello all, I was wondering if anyone can help me. Currently I'm trying to set up an HTML sitemap for our website and am having trouble with the 500+ pages of content under each category. How do you segment your HTML sitemap in a case like this, keeping in mind the less-than-100-links-per-page rule? For example, http://www.careerbliss.com/salary/ allows our users to search salaries by company, job title, and location. You can imagine how many thousands of pages we need to represent. Any help will be greatly appreciated! Cheers! Reyna
Intermediate & Advanced SEO | | CareerBliss0
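A straightforward way to handle the segmentation question above is to group URLs by category and then paginate each group into chunks of at most 100 links, writing one HTML sitemap page per chunk and linking the chunks from a top-level sitemap index page. A rough sketch, with hypothetical data and file names:

```python
# Hypothetical sketch: split each category's URLs into HTML sitemap pages of
# at most MAX_LINKS links each, e.g. salary-sitemap-1.html, salary-sitemap-2.html, ...
from pathlib import Path

MAX_LINKS = 100  # keep each sitemap page under the ~100-links-per-page guideline

def chunk(items, size):
    """Yield successive slices of `items` no longer than `size`."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

def write_sitemap_pages(category: str, urls: list[str], out_dir: str = ".") -> list[str]:
    """Write one HTML page per chunk of URLs and return the file names created."""
    pages = []
    for page_no, batch in enumerate(chunk(urls, MAX_LINKS), start=1):
        name = f"{category}-sitemap-{page_no}.html"
        links = "\n".join(f'    <li><a href="{u}">{u}</a></li>' for u in batch)
        Path(out_dir, name).write_text(
            f"<html><body>\n  <h1>{category} sitemap, page {page_no}</h1>\n"
            f"  <ul>\n{links}\n  </ul>\n</body></html>\n",
            encoding="utf-8",
        )
        pages.append(name)
    return pages

if __name__ == "__main__":
    # Made-up example: 250 salary URLs become 3 sitemap pages (100 + 100 + 50 links).
    demo_urls = [f"https://www.careerbliss.com/salary/job-{i}/" for i in range(250)]
    print(write_sitemap_pages("salary", demo_urls))
```

The same chunking works per facet (company, job title, location), so each facet gets its own small set of sitemap pages instead of one enormous page.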