Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Search results vary in Chrome vs. other browsers, even in Incognito mode: what's Google's stance?
-
Hi all,
We use Incognito mode or private browsing to check the actual results, which are not affected by previous history, location (sometimes), etc. Even when we browse this way, we still see different search results. Why does this happen? What is Google's stance on this? What is the right way to browse to get unbiased results for a given search query? I have noticed that Chrome ranks our own websites a bit higher than other browsers do, even in Incognito mode.
Thanks
-
I agree with seoman10 that it is often difficult to get accurate results, especially when a client is out of state and you are trying to replicate what they are seeing. One saving grace is that some tools, like SEMrush, actually allow you to set where you want to search from. For example, if your client is in Little Rock, Arkansas and you are located in NYC, you can set your rank tracking to Little Rock, and it is actually pretty accurate. Nothing is 100%, but we have found this to be pretty reliable.
Cheers,
G
-
Google tries to personalize the results as much as possible; it uses cookies and other identification data to track users and serve what it considers the most relevant results.
Once you close Chrome, there is technically no cookie-based way for Google to identify who you are, hence you may see different results.
Getting accurate search position data for tracking marketing progress seems to be a bit of a buzzword at the moment. It comes down to this: as marketers/SEOs we need to track and monitor our efforts, while on the flip side Google is trying to serve the most relevant result to each user, often by means of personalisation. That is where you have an immediate conflict: for maximum user engagement (which all of us want) you sometimes have to sacrifice trackability.
Even if you did manage to track personalised results, you would need a lot of data about each person and their browsing habits to understand how your page best matched their search query. Most of us would quickly get lost in that data. Does that make sense?
There are plenty of SERP checker tools around, but I sometimes have my doubts about how accurate they are, and Google definitely doesn't like them.
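For anyone who wants a rough, repeatable way to spot-check positions without a browser profile in the mix at all, one option is Google's Custom Search JSON API, which queries a Programmable Search Engine rather than the personalised web results. The sketch below is a minimal Python example; the API key, engine ID, domain and query are placeholders, and because a Programmable Search Engine has its own ranking behaviour, treat the output as a consistency check rather than a true SERP position.

```python
import requests

API_KEY = "YOUR_API_KEY"      # placeholder: Custom Search JSON API key
ENGINE_ID = "YOUR_ENGINE_CX"  # placeholder: Programmable Search Engine ID

def find_position(query, domain, country="us", language="en"):
    """Return the 1-based position of `domain` in the first 10 results, or None."""
    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={
            "key": API_KEY,
            "cx": ENGINE_ID,
            "q": query,
            "num": 10,       # this API returns at most 10 results per request
            "gl": country,   # country-level geo bias
            "hl": language,  # interface language
        },
        timeout=10,
    )
    resp.raise_for_status()
    for rank, item in enumerate(resp.json().get("items", []), start=1):
        if domain in item.get("link", ""):
            return rank
    return None

if __name__ == "__main__":
    print(find_position("rank tracking tools", "example.com"))
```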
Related Questions
-
Getting indexed in Google Scholar
Hi all! We have a client, a highly regarded non-profit, that publishes scholarly research. Their publications aren't being indexed in Google Scholar 50% of the time, and when they are, Google Scholar pulls random text from the PDF rather than from the HTML page. Any advice on best practices is enormously appreciated.
SERP Trends | | SimpleSearch1
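On the Google Scholar question above: when Scholar pulls random text from the PDF instead of the landing page, it is often because the HTML page lacks the bibliographic meta tags that Scholar's inclusion guidelines describe (Highwire-style citation_* tags). Below is a minimal Python sketch that renders those tags; the title, authors, and URLs are made-up examples, and the exact tag set should be checked against the current inclusion guidelines.

```python
from html import escape

def scholar_meta_tags(title, authors, pub_date, pdf_url, journal=None):
    """Render Highwire-style citation_* meta tags that Google Scholar's
    inclusion guidelines describe for HTML abstract/landing pages."""
    tags = [("citation_title", title)]
    tags += [("citation_author", a) for a in authors]      # one tag per author
    tags.append(("citation_publication_date", pub_date))   # e.g. "2024/5/1"
    if journal:
        tags.append(("citation_journal_title", journal))
    tags.append(("citation_pdf_url", pdf_url))
    return "\n".join(
        f'<meta name="{name}" content="{escape(value)}">' for name, value in tags
    )

print(scholar_meta_tags(
    title="An Example Working Paper",
    authors=["Jane Doe", "John Smith"],
    pub_date="2024/5/1",
    pdf_url="https://example.org/papers/example.pdf",
    journal="Example Policy Review",
))
```
-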
Getting indexed by Google Scholar
Often my Google Scholar alerts result in exactly what I expect: scholarly articles published in academic journals. However, today I got this completely non-scholarly article https://www.t-nation.com/training/the-exact-reps-that-make-you-grow and I have no idea why Google Scholar is indexing this site. I've read up on how to get indexed by Google Scholar, and this website doesn't seem to meet the requirements. I'm curious: for anyone whose clients or industry need to get indexed by Google Scholar, what has worked for you?
SERP Trends | | newwhy2
-
URL Parameter for Limiting Results
We have a category page that lists products. We use parameters, and the default is to limit the page to displaying 9 products. If the user wishes, they can view 15 or 30 products on the same page. The parameter is ?limit=9, ?limit=15, and so on. Google is flagging this as duplicate meta tags and meta descriptions via HTML Suggestions. I have a couple of questions.
1. What should my goal be? Is it to have Google crawl the page with 9 items, or the page with all items in the category? In Search Console, the first step of setting up a URL parameter asks, "Does this parameter change page content seen by the user?" In my opinion, the answer is yes. Then, when I select how the parameter affects page content, I assume I'd choose Narrows, because it is either narrowing or expanding the number of items displayed on the page.
2. When setting up my URL parameters in Search Console, do I want to select Every URL or just let Googlebot decide? I'm torn, because the description of Every URL says the setting could result in Googlebot unnecessarily crawling duplicate content on your site (it's already doing that). Reading further, I begin to second-guess the Narrows option. Now I'm at a loss about what to do. Any advice or suggestions would be helpful! Thanks.
SERP Trends | | dkeipper0
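On the ?limit question above: whichever parameter setting you choose, a common complementary approach is to point every limit variant at a single canonical URL, which makes the duplicate title and description reports go away. Below is a minimal, hypothetical sketch in Python (standard library only) of deriving that canonical URL by stripping the limit parameter; the URLs and parameter name simply mirror the example in the question.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonical_url(url, ignored_params=("limit",)):
    """Return the URL with display-only parameters (e.g. ?limit=) removed,
    suitable for use in a rel="canonical" link element."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in ignored_params]
    # Rebuild the URL without the ignored parameters (and without any fragment).
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

# Example: all three variants collapse to the same canonical target.
for u in ("https://example.com/widgets?limit=9",
          "https://example.com/widgets?limit=15",
          "https://example.com/widgets?limit=30"):
    print(canonical_url(u))
```
-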
Ways to fetch search analytics - historical search query data from Google Search Console
Is there any way to fetch all historical search query data from Google Search Console? Google allows us to view only 90 days of data at most. Does integrating Google Search Console with Google Analytics solve this problem?
SERP Trends | | NortonSupportSEO0
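On the Search Console question above: the web UI's report window is limited, but the Search Analytics part of the Search Console API lets you pull the query-level rows yourself on a schedule and archive them in your own storage, which is the usual way around the retention window. A minimal sketch in Python is below, assuming a service account that has been granted access to the verified property; the site URL and key file are placeholders.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE_URL = "https://www.example.com/"    # placeholder: a verified property
CREDS_FILE = "service-account.json"      # placeholder: service account key file

credentials = service_account.Credentials.from_service_account_file(
    CREDS_FILE,
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=credentials)

# Pull query-level data for a date range, paging through the rows.
rows, start_row = [], 0
while True:
    response = service.searchanalytics().query(
        siteUrl=SITE_URL,
        body={
            "startDate": "2024-01-01",
            "endDate": "2024-03-31",
            "dimensions": ["query", "page"],
            "rowLimit": 25000,   # API maximum per request
            "startRow": start_row,
        },
    ).execute()
    batch = response.get("rows", [])
    rows.extend(batch)
    if len(batch) < 25000:
        break
    start_row += 25000

print(f"Fetched {len(rows)} rows")
```
-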
Local SEO citations: Do business description text variations matter? If yes, how important is it to vary them?
I would like to see what the consensus here is, since virtually any automated service, be it Yext or Yahoo, will use a single description text for every available listing in its ecosystem. In general, what is your take on varying this business description text? Of course, I would personally put the safer bet on avoiding any duplicate text between the listings and the business's own domain in any case, but my question is more about listing vs. listing. Is this bad, and if so, why? Or does it have no importance whatsoever, and why not? In conclusion and with hindsight, would one need to watch out for duplicate content across non-domain assets used by the business, so long as none of the content is duplicated from the business domain? I tried my best googling this, but did not find a straight answer anywhere. I would really appreciate some experienced and insightful comments on this one 🙂
SERP Trends | | Raydon0
-
How to get Google Results for Did You Mean | Showing results for
If someone misspells our company name in Google, how do I get Google to display "Did you mean: xyz"? Our company name is difficult to spell and could be spelled multiple ways. What is the trick to this?
SERP Trends | | hfranz0
-
Search Volume Data for iTunes App Store?
OK, so maybe I'm looking in all the wrong places, but where can I get good search volume data for searches done in the iTunes App Store? Or is this something that Apple just doesn't share?
SERP Trends | | TheSEOWiz0
-
Searching Google without bias
I read a tip once and can't find the reference to it. It had to do with searching Google without letting your search history bias the results. You were supposed to put a code at the end of the URL, like &pw or something like that. Has anyone heard of this?
SERP Trends | | StreetwiseReports0
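For what it's worth, the parameter usually cited in that old tip is pws=0 (personalised web search off), often combined with gl and hl to pin the country and interface language. Google has never guaranteed that pws=0 is still honoured, so treat the sketch below as a hedged illustration of how such a URL is built rather than a reliable route to truly neutral results; the query and locale values are just examples.

```python
from urllib.parse import urlencode

def depersonalized_search_url(query, country="us", language="en"):
    """Build a Google search URL with parameters historically used to
    reduce personalisation and pin the locale. Whether Google still
    honours pws=0 is not guaranteed."""
    params = {
        "q": query,
        "pws": "0",      # historically: disable personalised web search
        "gl": country,   # country to geo-target results to
        "hl": language,  # interface language
        "num": "20",     # number of results to request
    }
    return "https://www.google.com/search?" + urlencode(params)

print(depersonalized_search_url("best seo rank tracker"))
# https://www.google.com/search?q=best+seo+rank+tracker&pws=0&gl=us&hl=en&num=20
```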