Why do different browsers return different search results?
-
Hi everyone,
I don't understand the reason why if I delete cookies, chronology, set anonymous way surfing in Chorme and Safari, I have different results on Google. I tried it from the same pc and at the same time.
Searching Google for the query "vangogh", the site www.vangogh-creative.it is shown on the first page in Chrome but not in Safari.
I asked in the Google Webmasters forum, but nobody seems to know the reason for this behavior.
Can anyone help me?
Thanks in advance.
Massimiliano
-
I just searched Google.it for "vangogh" and you're in the no. 2 spot, right after the Italian Wikipedia article. I strongly recommend you create an account in Search Console and verify your site. Under Search Traffic you can see exactly what position you get in the SERPs.
Now, to answer the browser question: modern search engines return SERPs optimized for specific browsers, which is why browser X may see some results while browser Y gets others. There is no way to escape this unless you change the browser's user-agent string with a plugin. Some engines go even further and return personalized results based on IP address, previous searches from that IP, previous searches from the account, previous clicks on the SERP, and so on.
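If you want to see the user-agent effect for yourself, one hedged way is to issue the same query with two different user-agent strings and diff the HTML that comes back. The sketch below only builds the requests; the UA strings are abbreviated placeholders, `build_serp_request` is a name I made up, and note that Google may throttle or block automated queries.

```python
# Illustrative sketch: build the same Google query with two different
# user-agent strings. Sending both and diffing the responses would show
# any browser-specific SERP differences. UA strings are abbreviated
# placeholders; Google may block automated requests.
from urllib.parse import quote
from urllib.request import Request, urlopen

CHROME_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/120.0 Safari/537.36"
SAFARI_UA = "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) Version/17.0 Safari/605.1.15"

def build_serp_request(query: str, user_agent: str) -> Request:
    """Return a ready-to-send request for the given query and browser identity."""
    url = "https://www.google.com/search?q=" + quote(query)
    return Request(url, headers={"User-Agent": user_agent})

# To compare (not run here):
#   html_a = urlopen(build_serp_request("vangogh", CHROME_UA)).read()
#   html_b = urlopen(build_serp_request("vangogh", SAFARI_UA)).read()
```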
The only way to get your true position is with Search Console or a dedicated rank-tracking tool such as Moz Rank Tracker, where you can see your "average position".
Related Questions
-
A short description of our search results drop, plus a question about moving our forum to a subdomain.
Hello, here is our story. Our niche is mental health (psychology, psychotherapy, etc.). Our portal has thousands of genuine articles, a mental health news section, research coverage, job listings for specialists, a specialized bookstore carrying only psychology books, and the best forum in the country, with thousands of active members, self-help topics, and so on. In our (non-English) country, our portal has been established since 2003. For more than 15 years we were no. 1 in our country, meaning we had the best brand name, hundreds of external authors writing unique content for our portal, and hundreds of no. 1 keywords in Google search results. According to Webmaster Tools, we had more than 1,000 keywords in positions 1 and 2 (we ranked no. 1 for all the best keywords). Two years ago, we purchased the best domain in our niche. I'll use the example below (of course, the domains are not the real ones):
Intermediate & Advanced SEO | dodoni
We had: e-pizza.com and now we have: pizza.com
We did the appropriate redirects, but from day one we had around a 20-30% drop in search engines. After 6 months (the period Google officially mentions) we lost all credit from the old domain, and at that point we had another 20-30% drop in search results. Furthermore, with every Google core update we kept dropping; in last May's core update especially (the coronavirus-era update), we had another huge drop. We do follow SEO guides, and we have a dedicated server, good load speed, well-structured data, AMP, and a strong social media presence with more than 130,000 followers. Our investigation led us to one conclusion: our forum is killing our SEO (of course, nobody on our team can guarantee that this is the actual reason for the huge drop in the May core update). We believe the forum hurts our SEO because it produces low-quality posts by members. For example, psychopharmacology is a very active section, and we believe Google is very "sensitive" about this kind of post and information. So here is the question: although the forum is very, very active, with thousands of new topics and posts every month, we are thinking of moving it from its current subfolder to a subdomain.
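For reference on the redirect side of such a domain move, here is a minimal sketch using the question's placeholder domains and assuming an nginx front end (an assumption; adjust for your actual server). The key point is that every old URL should 301 to the identical path on the new domain, not to the new homepage.

```nginx
# Hypothetical nginx config for the old placeholder domain:
# permanently redirect every request to the same path on the new domain.
server {
    listen 80;
    server_name e-pizza.com www.e-pizza.com;
    return 301 https://pizza.com$request_uri;
}
```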
This would help our domain authority increase from 38, where it has been stuck for 2 years now. We believe that although the forum gave the portal a great boost over the past 10-15 years, it somehow has a negative impact now. To be more specific: in all the SEO tools we run, the best keywords bringing us visitors are no longer top keywords like psychology, psychotherapy, and mental health, but mostly forum phrases such as "I want to proceed with a suicide", "I'm taking Efexor or Xanax and they have side effects", "why do I gain weight with the antidepressants I take", and so on. Moving our forum to a subdomain will be something of a pain, since it is a large community with thousands of backlinks that we must somehow handle properly, plus a mobile application that will have to change, which will probably have some negative impact. Based on your knowledge, would that be a correct move from which our E-A-T will benefit with Google? Or, since Google will know the subdomain is still part of the same website/portal, will it handle it much the same way it does now? I have read hundreds of articles about forums on subdomains versus subfolders, but none of them covers a case study like ours; most discuss brand-new forums and where best to create them (subfolder or subdomain) from scratch. Looking forward to your answers.
-
Search function rendering cached pages incorrectly
On a category page the products are listed via the site's search function. The page source and front end match as they should. However, when viewing a browser-rendered version of a Google-cached page, the product URL changes from, for example, https://www.example.com/products/some-product to https://www.example.com/search/products/some-product. The source is a relative URL in the correct format, so the /search/ prefix is added at browser render time. The developer insists this is OK: the query string in the Google cache page's URL triggers the behaviour by confusing the search function, all locally. I can see this, but I wanted feedback: will Google internally only ever see the true source, or could its internal rendering mechanism trigger similar behaviour?
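The behaviour described is standard relative-URL resolution: a document-relative href (no leading slash) inherits the path of the page it appears on, so the same href resolves differently when the page is rendered under /search/. A small Python sketch of the resolution rules, using the question's example.com URLs (the /search/ base path is assumed from the cached-page URL described):

```python
# How browsers resolve relative URLs (RFC 3986 rules, which urljoin mirrors):
# a document-relative href inherits the base page's directory path,
# while a root-relative href (leading slash) does not.
from urllib.parse import urljoin

category_page = "https://www.example.com/"          # normal page
cached_page = "https://www.example.com/search/"     # cached copy rendered under /search/

doc_relative = "products/some-product"    # no leading slash
root_relative = "/products/some-product"  # leading slash

print(urljoin(category_page, doc_relative))   # https://www.example.com/products/some-product
print(urljoin(cached_page, doc_relative))     # https://www.example.com/search/products/some-product
print(urljoin(cached_page, root_relative))    # https://www.example.com/products/some-product
```

So a root-relative (or absolute) href would make the link immune to the base path under which the page happens to be rendered.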
Intermediate & Advanced SEO | MickEdwards1
-
Getting too many links in Google search results: how do I fix it?
I'm a total newbie, so I apologize for what I'm sure is a dumb question. I recently followed Moz suggestions for increasing my site's visibility for a specific keyword by including that keyword in more verbose page descriptions across multiple pages. This worked TOO well: that keyword now brings up too many results in Google for different pages on my site. Is there a way to compile them into one result with subpages, like the attached image shows for a search on Apple? Do I need to change something in my robots.txt file to direct these to my main page? Basically, I am a photographer, and a search for my name now brings up each of my photo gallery pages as separate results; it's a little over the top. Thanks for any and all help!
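On the robots.txt part specifically: robots.txt controls crawling, not how results are grouped, and the Apple-style grouped result (sitelinks) is generated automatically by Google rather than configured by the site. If you did want to keep gallery pages out of search entirely, a rule would look like the hypothetical fragment below (the /galleries/ path is made up for illustration), with the caveat that already-indexed URLs blocked this way can linger in results because Google can no longer crawl them.

```txt
# Hypothetical robots.txt fragment; /galleries/ is an illustrative path.
# Note: Disallow stops crawling but does not by itself deindex pages.
User-agent: *
Disallow: /galleries/
```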
Intermediate & Advanced SEO | jason54540
-
Country-specific results
Our country-specific pages reside in subfolders under the main domain: for example, /us/ for the US and /ca/ for Canada. We've noticed that Google Canada is showing US pages in some of the search results. Does anyone have experience with how to direct Google to display country-specific pages in its results?
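The standard mechanism for telling Google which subfolder targets which country is hreflang annotations, placed in the head of each page (or in the sitemap or HTTP headers). A hedged sketch for the /us/ and /ca/ setup described, assuming English-language pages and using example.com as a stand-in domain; each page must carry the full set of alternates, including a reference to itself:

```html
<!-- On both https://www.example.com/us/ and https://www.example.com/ca/
     (illustrative domain); every alternate lists every version. -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/" />
<link rel="alternate" hreflang="en-ca" href="https://www.example.com/ca/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```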
Intermediate & Advanced SEO | kxu0
-
Noindex search pages?
Is it best to noindex search results pages, exclude them using robots.txt, or both?
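One mechanic worth knowing when weighing these options: a meta noindex only works if Googlebot is allowed to crawl the page, so if the same URLs are disallowed in robots.txt, the tag is never seen and "both" can be self-defeating. A sketch of the noindex approach (the /search path in the comment is illustrative):

```html
<!-- In the <head> of each internal search results page (e.g. /search?q=...).
     Keep the page crawlable in robots.txt so this directive can be read. -->
<meta name="robots" content="noindex, follow">
```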
Intermediate & Advanced SEO | YairSpolter0
-
What can you do when Google can't decide which of two pages is the better search result?
For one of our primary keywords, Google alternates (about every other week) between returning our home page, which is more transactional, and a deeper, more informational page. If you look at the analysis in Moz, you get an almost double-helix-like graph of those pages repeatedly swapping places. So there seems to be some cannibalization happening that I don't know how to correct. I think part of the problem is that the deeper page should ideally target longer-tail searches that contain the problem keyword as part of a longer phrase. What can be done to prevent this? Can internal links help? I tried adding a link on that term from the deeper page to our homepage, and in a knee-jerk reaction I was asked to pull it before there was really any evidence that the one new link had a positive or negative effect. There are some crazy theories floating around at the moment, but I'm curious what others think: could adding a link from an informational page to a transactional page actually have a negative effect, and what else could be tried to help clarify the difference between the two pages for the search engines?
Intermediate & Advanced SEO | plumvoice0
-
Homepage shuffling in the SERP results continuously: any thoughts?
We've had some shuffling for a keyword in the SERP results over the last few days! Anyone else seeing their rankings bounce all over? It's only affecting one keyword that was previously a stable performer, and it has been occurring for the last few weeks (with no major changes to the page). Keen to hear your thoughts!
Intermediate & Advanced SEO | Creode1
-
Serving different content based on IP location
I have a city-centric website. For simplicity's sake, say I have only two cities: City A and City B. Depending on a user's IP address, they get either City A or City B. Users can change their location through JavaScript on the pages, but there is no cross-linking between cities. By this I mean that unless you can execute JavaScript, there is no way to get from City A to City B. My concern is this: Googlebot comes to my site and we serve it City A. How does City B get discovered if Googlebot doesn't execute the JavaScript? We have an XML sitemap plus plenty of backlinks to City B. Is this sufficient? Should I provide a static link to City B (and vice versa) on the homepage for crawling purposes?
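On the discovery question: an XML sitemap and external backlinks can get City B crawled, but Google has long recommended plain HTML links for discovery, so a static crosslink is the safer addition. A minimal sketch (the /city-a/ and /city-b/ paths are illustrative):

```html
<!-- Hypothetical static crosslinks on the homepage; paths are illustrative.
     Plain <a href> links are crawlable without any JavaScript execution. -->
<nav>
  <a href="/city-a/">City A</a>
  <a href="/city-b/">City B</a>
</nav>
```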
Intermediate & Advanced SEO | ChatterBlock0