Google Products / Google Shopping
-
My client's site has many products so similar in function that, for usability reasons, we have combined some of them on the same pages.
We want to get into Google Shopping, but on the face of it the Google feed seems to require a unique URL per product.
I guess we could keep the combined pages and create single product pages as well, though that could generate duplicate content.
We could also try pointing several products to one URL; does anyone know if this would work?
Or can anyone suggest any workarounds?
Justin
-
Thanks for taking the time to answer, guys...
"take the time out to highlight the slightly different features of each product and put each of them on a different page."
I would if I could, but the differences are minor; sometimes it's only the product size.
I am more concerned about user experience: putting them on separate pages would overwhelm the site with very similar products.
Each page typically has four product variations. I know it's not ideal, but is there any reason why I can't pick one of the variations and leave the other three out?
-
I'd give each product a unique URL and its own content, but for displaying the products, some shopping carts offer a Bundle or Grouped Product option.
Also, you could group products under a Category page to show to non-Google Shopping traffic, and use Related Items links to promote similar products on each individual Product Page.
-
I am not keen on using the canonical tag as an upfront solution. I would say to take the time out to highlight the slightly different features of each product and put each of them on a different page. If you find that difficult to do, then you can noindex the pages that are duplicates and nofollow all internal links to those pages. Finally, I would highlight the difference of each product in the product's title, or Google Shopping will have a very hard time differentiating these products.
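For what it's worth, the noindex/nofollow combination described above would look roughly like this in the markup (a minimal sketch; the page path and link text are placeholders):

```html
<!-- In the <head> of each duplicate variant page -->
<meta name="robots" content="noindex">

<!-- On internal links pointing at those duplicate pages -->
<a href="/widget-large" rel="nofollow">Widget (large size)</a>
```

The noindex keeps the variant out of the search index while still leaving a crawlable URL for the shopping feed.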
-
Sounds to me like you could give each product its own URL for that purpose, then, to avoid the duplicate content issue, just rel=canonical those pages back to the page where they actually live.
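That would look something like this in the head of each variant page (a sketch; the URLs are placeholders):

```html
<!-- On /product-a-large, one of several variant URLs -->
<link rel="canonical" href="https://www.example.com/product-a">
```

Each variant URL can then go in the shopping feed, while Google's organic index consolidates them onto the combined page.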
I haven't had to do this yet myself, though, so I'm sure someone will jump in and say for sure.
Brian