Switching prices for Google Base
-
We would like to be able to submit lower prices to Google than we do to other sources. The way I see it working is that at the end of each URL we submit to Google Base there is a tracking code (source=googlebase).
When a user visits the site via one of these URLs, we would knock 10% off the price of that item and store the item in a cookie to ensure that the price of that item, for that user, would stay at the low price for 24 hours.
My question is whether Google would have a problem with us doing this. The second part of my question is whether they check the full URL, including the query string. If they just checked the canonical URL, they would see a price that's 10% higher than the one we submitted to Base, which, of course, would be bad.
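For concreteness, the flow being proposed can be sketched in a few lines. This is only an illustrative Python sketch, with a hypothetical `price_for_request` helper and a simplified cookie jar (a dict of item path to grant time); the `source=googlebase` parameter, the 10% discount, and the 24-hour window are taken from the question, everything else is made up:

```python
import time
from urllib.parse import urlparse, parse_qs

DISCOUNT_RATE = 0.10          # 10% off for visitors arriving from Google Base
DISCOUNT_WINDOW = 24 * 3600   # honour the low price for 24 hours

def price_for_request(url: str, base_price: float, cookies: dict) -> tuple:
    """Return the price to show and the (possibly updated) cookie jar.

    `cookies` maps an item's URL path to the UNIX time its discount was granted.
    """
    parsed = urlparse(url)
    params = parse_qs(parsed.query)
    item_key = parsed.path  # one discount timestamp per item page
    now = time.time()

    # Arriving via a Google Base tracking URL (re)starts the 24-hour window.
    if params.get("source") == ["googlebase"]:
        cookies = {**cookies, item_key: now}

    granted = cookies.get(item_key)
    if granted is not None and now - granted < DISCOUNT_WINDOW:
        return round(base_price * (1 - DISCOUNT_RATE), 2), cookies
    return base_price, cookies
```

A first visit to `/item/42?source=googlebase` would show 90.00 for a 100.00 item, and direct visits to the same item within the next 24 hours would see the same discounted price.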
-
Hi Martin,
I have less of an SEO concern about this than a user-experience one. It doesn't sound like a good idea to show different prices to users who come from Google Product Search. What happens if they return more than 24 hours later to purchase the product? Unless there is some kind of clear message on the page indicating that the price is only valid for 24 hours, it will cause a lot of confusion, and even then it's not the right approach from a business perspective. I've talked to a few individuals who had prior experience with mismatched prices in Google, and overall, customers are never happy when that happens.
Also, it's okay for a few products to have different prices. For the most part, though, when prices don't match between product pages and feed data, you risk Google suspending your feed.
Although I can understand the reasoning behind this, in general I think it is far too risky to implement.
Best,
Stephanie
Related Questions
-
Duplicate content management across a subdirectory-based multisite where subsites are projects of the main site and naturally adopt some ideas and goals from it
Hi, I have the following problem and would like to know the best solution for it. I have a site, codex21.gal, that is actually part of a subdirectory-based multisite (galike.net). It has a domain mapping setup, but it is hosted in a folder of the galike.net multisite (galike.net/codex21). My main site (galike.net) works as a frame brand for a series of projects aimed at promoting the cultural and natural heritage of a region in NW Spain through creative projects focused on the entertainment, tourism and educational areas. The projects themselves put the general views of the brand into practice, with the brand acting more like a company brand. CodeX21 is one of those projects: it has its own logo, etc., and is actually like a child brand, just more focused on a particular theme. I don't want to hide that it is part of the GALIKE brand (in fact, I am planning to add the Galike logo to it, and a link to the main site in the menu). I will be making other projects, each with their own brand, hosted in subsites (subfolders) of the galike.net multisite. Not all of them will have their own TLD mapped; some could simply be www.galike.net/projectname. The codex21.gal subsite might become galike.net/codex21 if that would be better for SEO. Now, the problem is that my subsite codex21.gal re-states some principles, concepts and goals that have been defined (in other words) on the main site. Thus, there are some ideas (such as my particular vision of the possibilities of sustainable exploitation of that heritage, and concepts I have developed myself, like "narrative tourism" and "the geographical map as a non-linear story") that need to be present here and there on the subsite, since they are also the philosophy of the project.
BUT it seems that Google can penalise overlapping content in subdirectory-based multisites, since they can look like a collection of doorways to the same product (*). I have considered substituting those overlapping ideas with links to the main page of the site, though it seems unnatural from the user's point of view to be taken off the page to read a piece of info that is actually part of the project description (every other child project of Galike might have the same problem). I have also considered taking the codex21 subsite out of the network and hosting it as a standalone site on another server, but the duplicated-content problem might persist, and in any case I should link it to my brand Galike somewhere, because that's kind of the "production house" behind it. So which would be the best (white hat) strategy, from an SEO point of view, to deal with this brand/project philosophy overlap? (*) “All the same IP address — that’s really not a problem for us. It’s really common for sites to be on the same IP address. That’s kind of the way the internet works. A lot of CDNs (content delivery networks) use the same IP address as well for different sites, and that’s also perfectly fine. I think the bigger issue that he might be running into is that all these sites are very similar. So, from our point of view, our algorithms might look at that and say “this is kind of a collection of doorway sites” — in that essentially they’re being funnelled toward the same product. The content on the sites is probably very similar. Then, from our point of view, what might happen is we will say we’ll pick one of these pages and index that and show that in the search results. That might be one variation that we could look at. In practice that wouldn’t be so problematic because one of these sites would be showing up in the search results.
On the other hand, our algorithm might also be looking at this and saying this is clearly someone trying to overdo things with a collection of doorway sites and we’ll demote all of them. So what I recommend doing here is really trying to take a step back and focus on fewer sites and making those really strong, and really good and unique. So that they have unique content, unique products that they’re selling. So then you don’t have this collection of a lot of different sites that are essentially doing the same thing.” (John Mueller, Senior Webmaster Trend Analyst at Google. https://www.youtube.com/watch?time_continue=1&v=kQIyk-2-wRg&feature=emb_logo)
White Hat / Black Hat SEO | PabloCulebras
Significant "Average Position" dips in Search Console each time I post on Google My Business
Hi everyone, Several weeks ago I noticed that each Wednesday my site's Average Position in Search Console dipped significantly. I immediately identified that this was the day my colleague published and back-linked a blog post, so we spent the next few weeks testing and monitoring everything we did. We discovered that the Average Position dipped ONLY when we created a Google My Business post, and on the 1st of July we tested it one more time. The results were the same (please see the attached image). I am 100% confident that Google My Business is the cause of the issue, but I can't identify why. The image I upload belongs to me, the text isn't spammy or stuffed with keywords, the Learn More link points to my own website, and I never receive any warnings from Google about the content. I would love to hear the community's thoughts on this and how I can stop the issue from continuing. I should note that my Google My Business insights are generally positive, i.e. no dips in search results etc. My URL is https://www.photographybymatthewjames.com/ Thanks in advance, Matthew
White Hat / Black Hat SEO | PhotoMattJames
How does Google know if rich snippet reviews are fake?
According to https://developers.google.com/structured-data/rich-snippets/reviews, all someone has to do is add some HTML code and write the review. How does Google do any validation on whether these reviews are legitimate or not?
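As context for the question, the markup involved is purely declarative; nothing in it is verified at parse time, which is exactly the concern raised here. A minimal, illustrative sketch of a schema.org Review in JSON-LD, built with Python's json module (all values are made up):

```python
import json

# Illustrative values only: any site can declare any rating it likes,
# since the markup itself carries no proof the review is real.
review = {
    "@context": "https://schema.org",
    "@type": "Review",
    "itemReviewed": {"@type": "Product", "name": "Example Widget"},
    "reviewRating": {"@type": "Rating", "ratingValue": "5", "bestRating": "5"},
    "author": {"@type": "Person", "name": "Jane Doe"},
    "reviewBody": "Works exactly as described.",
}

# Wrap the JSON-LD in the script tag that would be embedded in the page.
snippet = '<script type="application/ld+json">\n%s\n</script>' % json.dumps(review, indent=2)
print(snippet)
```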
White Hat / Black Hat SEO | wlingke
Google Webmaster Tools, Majestic and Ahrefs in a simple case study (bad links and good links)
Hey guys, This case study started from here. A quick summary: I discovered through Google Webmaster Tools that I got +1000 backlinks from Blogspot after making a connection with the owners of these blogs, which point to my new blog. Before starting, I proudly invite Thomas Zickell and Gary Lee into this discussion. I hope you accept my invitation. Let's get to the main point. I've used Google Webmaster Tools, so I will start with it, then Ahrefs, which was used by **Thomas**, and then Majestic, which was used by Gary. Take a look at the "001" screenshot: you will see that Google Webmaster Tools discovered 1291 links pointing to my site. Take another look at the "002" screenshot: you will find that there are 22 domains pointing to my site. Most of them are good links, since they come from websites such as Google.com, Wikipedia.org, Reddit, Shoutmeload, WordPress.org, etc., beside the +1000 backlinks that came from Blogspot.com blogs. There are also some bad links, such as this one from tacasino.com. I should mention that I've got some competitors who nicely asked me to stop competing for some keywords, and I ignored their request, so I'm not surprised to see these bad links. In the "002" screenshot, we can see that Google didn't discover the bad links the way it discovered the good links, and that it discovered a lot of backlinks not found by any other tool. **Let's move to Ahrefs.** I will use the screenshots provided by Thomas. In the "003" screenshot, you can see Ahrefs report 457 links from 10 domains. (By the way, the social engagement data is wrong; I got more than zero engagements... really.) In the "004" screenshot, you can see the domains pointing to my site and the links with anchor text. Take a look at the second link: you will find it's a spammy link coming from a PR2 home page, since it is over-optimized. The third link is also spammy, since it comes from an irrelevant website, beside other bad links that need to be removed. So Ahrefs didn't discover all of my good links;
instead, it discovered a few good links and a lot of bad links. In a case like this, a question needs to be answered, since some people are trying so hard to hurt my site: do I have to remove all these bad links, or just the links discovered by Google? Or does Google understand the case? **Let's move to Majestic.** Gary Lee provided data from Majestic which says "10 Unique Referring Domains, with 363 links, 2 domains make up a majority." Since Gary didn't take any screenshots, I will provide mine. In the "005" screenshot, you can see some of the bad links discovered by Majestic; not all of them were discovered by Ahrefs or Google. On the other hand, Majestic didn't discover all of my good links. Also, there's a misunderstanding I would like to clear up here. When I published the discussion about the +1000 links, some people may have thought I was trying to cheat you by providing fake info, and that is totally wrong. I said before, and I'm saying it again: you are elite and I respect you. I'm also preparing an advanced case study about this; if any expert would like to join me, that would be great. Thank you for reading, and please feel free to share your thoughts, knowledge and experience in this discussion.
White Hat / Black Hat SEO | Eslam-yosef
Do some sites get preference over others by Google just because? Grandfathered theory
So I have a theory that Google "grandfathers" in a handful of old websites from every niche and that no matter what the site does, it will always get the authority to rank high for the relevant keywords in the niche. I have a website in the crafts/cards/printables niche. One of my competitors is http://printable-cards.gotfreecards.com/ This site ranks for everything... http://www.semrush.com/info/gotfreecards.com+(by+organic) Yet, when I go to visit their site, I notice duplicate content all over the place (extremely thin content, if anything at all for some pages that rank for highly searched keywords), I see paginated pages that should be getting noindexed, bad URL structure and I see an overall unfriendly user experience. Also, the backlink profile isn't very impressive, as most of the good links are coming from their other site, www.got-free-ecards.com. Can someone tell me why this site is ranking for what it is other than the fact that it's around 5 years old and potentially has some type of preference from Google?
White Hat / Black Hat SEO | WebServiceConsulting.com
Google Disavow and Penalty lifted please help?
We disavowed 80% of our backlink profile, due to our last SEO building cheap, nasty links, and filed a reconsideration request (we had had the Google Webmaster Tools "notice of detected unnatural links to http://www.xxx.co.uk" penalty for a year, from the 24th March 2012, but thought it best to clean up before round 2, even though we had no real penalty and we did some decent link building that moved us up). We then received a successful penalty-lifted note (on the 22nd of May 2013), but our rankings dropped (due to the bad links that had been propping us up). Since then we have built a fair few high-quality links, but our rankings do not seem to be moving much, if at all (7 weeks clear now). Has anyone had any experience with the above (are we in a sandbox-type situation)? Thank you for your time. Thanks, Bob
White Hat / Black Hat SEO | BobAnderson
Does Google penalize for managing multiple Google Places listings from the same IP address? Can you manage them from the same Google account or separate ones? Or does it matter, since they're created from the same IP?
I manage a number of clients' Google Places listings from the same IP and have heard this is not a good thing. Are there dos and don'ts when managing multiple Google Places listings? Should I create a separate Google account for each, or can I use the same account?
White Hat / Black Hat SEO | Souk
Has anyone seen this kind of Google cache spam before?
Has anyone seen this kind of 'hack'? When looking at a site recently, I found the Google cache version (from 28 Oct) strewn with mentions of all sorts of dodgy-looking pharma products, but the site itself looked fine. The site is www.istc.org.uk. Looking at the source of the pages, you can see the home page contains: Browsing as Googlebot showed me an empty page (though msnbot etc. returned a 'normal', non-pharma page). As a mildly amusing aside: when I tried to tell the ISTC about this, the person answering the phone clearly didn't believe me and couldn't get me off the line fast enough! Needless to say, they hadn't fixed it a week after being told.
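A quick way to check for this sort of thing yourself, sketched below in Python, is to fetch the same URL with a normal browser User-Agent and with Googlebot's, then compare the responses. This is only an illustrative sketch: the 20% length-difference threshold is an arbitrary assumption, and it only catches cloaking keyed off the User-Agent header, not IP-based cloaking.

```python
from urllib.request import Request, urlopen

BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0) AppleWebKit/537.36"
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch(url: str, ua: str) -> bytes:
    """Fetch `url`, presenting the given User-Agent string."""
    with urlopen(Request(url, headers={"User-Agent": ua}), timeout=10) as resp:
        return resp.read()

def looks_cloaked(browser_body: bytes, googlebot_body: bytes) -> bool:
    """A crude signal: the two variants differ in length by more than 20%."""
    return abs(len(browser_body) - len(googlebot_body)) > 0.2 * max(len(browser_body), 1)

# Usage (makes two network requests):
#   looks_cloaked(fetch(url, BROWSER_UA), fetch(url, GOOGLEBOT_UA))
```

An empty page served only to the Googlebot User-Agent, as described above, would trip this check immediately.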
White Hat / Black Hat SEO | JaspalX