Site Search Tracking of Non-Existing Products
-
I am working on optimizing the site search box of an ecommerce website, and I want to track the keywords users are searching for that yield no results. Please see the image for reference.
I want to collect this data so that I can add the products users are searching for but the site doesn't carry. My problem is that I don't know how to obtain this data in analytics, because all searches are served from the same page, searchresults.php.
I know that analyzing search refinements and exit percentages in Google Analytics is an option, but I want a simpler, more compact solution where I can see all of the data in one place. Does anyone have suggestions on how this can be done? Thanks in advance.
-
I understand that optimizing the site search box for the right pages is crucial, but my question is how to track the searches that return 0 results. For example, is there code I could implement on the page that returns no results?
Again, this would be a dynamic page, so I won't be able to track it in analytics. I want a system that gives me compact data so that I can add products my site doesn't carry, provided the search volume for those products is large.
-
I've seen this done on a project I worked on before.
First, we logged every query that returned 0 results with a little bit of PHP code. The code wrote to a .txt file that was easily readable. After a month we had a significant amount of data.
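If it helps, here is a rough sketch of that kind of logging (the searchresults.php hook, the search_products() call and the log path are all hypothetical placeholders; wire it into whatever your results page already does):

```php
<?php
// Sketch only: assumes this runs inside your results page (e.g. searchresults.php)
// after the product lookup. search_products() stands in for your existing search.
$query = isset($_GET['q']) ? trim($_GET['q']) : '';

$results = search_products($query); // placeholder for your real search call

if ($query !== '' && count($results) === 0) {
    // Append one line per zero-result search: timestamp, tab, query.
    $line = date('Y-m-d H:i:s') . "\t" . $query . PHP_EOL;

    // Assumes a writable logs/ directory; keep it out of the web root if you can.
    file_put_contents(
        __DIR__ . '/logs/zero-result-searches.txt',
        $line,
        FILE_APPEND | LOCK_EX
    );
}
```

A plain tab-separated file like that is easy to open in a spreadsheet once a month and sort by the most frequent queries.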
We used that data to see which products people were searching for without getting results, and then to decide whether we should add those products or whether our search simply wasn't returning the right results. What we found was that many people searched for products we did carry, but the search didn't return them. For instance, the search might return nothing if the user searched by the BRAND of the item, even though it should have matched. Likewise, a user might search for "widgets" while all the products were named "$brand $size widget"; because of the "s" at the end of the word, our search returned 0 results even though we carried hundreds of widgets.
First we improved the search to pick up results that were one character off. Then we included manufacturer/brand names in the searchable fields, and we also had the search return results from the titles of articles on our site. This significantly improved the user experience and sales.
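To make that looser matching concrete, here is a sketch of the idea (the $products array and its 'name'/'brand' keys are hypothetical, and a real implementation would push this into the SQL query rather than filter in PHP): every search term has to match a word from the product name or brand exactly, with the trailing "s" dropped, or within one character of difference.

```php
<?php
// Sketch of looser matching: exact word, singular form, or one character off.
function loose_search(array $products, $query)
{
    $terms = preg_split('/\s+/', strtolower(trim($query)), -1, PREG_SPLIT_NO_EMPTY);
    $hits  = array();

    foreach ($products as $product) {
        // Search across the product name and the brand/manufacturer.
        $haystack = strtolower($product['name'] . ' ' . $product['brand']);
        $words    = preg_split('/\s+/', $haystack, -1, PREG_SPLIT_NO_EMPTY);

        $allTermsMatch = true;
        foreach ($terms as $term) {
            $singular = preg_replace('/s$/', '', $term); // "widgets" -> "widget"
            $found    = false;

            foreach ($words as $word) {
                if ($word === $term
                    || $word === $singular
                    || levenshtein($word, $term) <= 1) { // one character off
                    $found = true;
                    break;
                }
            }

            if (!$found) {
                $allTermsMatch = false;
                break;
            }
        }

        if ($allTermsMatch) {
            $hits[] = $product;
        }
    }

    return $hits;
}
```

With a filter like that, a search for "widgets" still surfaces products named "$brand $size widget", and brand-only searches hit the brand field as well.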
We then took the remaining data, worked out what users were actually searching for, and decided how to use that to our advantage. Many users won't bother using a site's navigation to find what they want; if they search for "blue widgets" and see 0 results, they often assume we don't carry them, even though we do. So this is a very effective tactic for increasing conversions.
It worked very well for us. I hope this helps.
-
Related Questions
-
How much does it matter nowadays if URLs with CAPS and URLs without CAPS both exist on an IIS site?
I work on a couple of ecommerce sites that run on IIS. Both sites return a 200 header status for the CAPS and non-CAPS versions of the URLs. While I suppose it would be OK if the canonicals pointed to the same version of the page, in some cases they don't (i.e., /Home-Office canonicalizes to itself and /home-office canonicalizes to itself). I came across this article (http://www.searchdiscovery.com/blog/case-sensitive-urls-and-seo-case-matters/) that is a few years old, and I'm wondering how much of an issue this is and how I would determine whether it is or isn't?
Intermediate & Advanced SEO | OfficeFurn
-
If UGC on my site also exists elsewhere, is that bad? How should I properly handle it?
I work for a reviews site, and some of the reviews that get published on our website also get published on other reviews websites. It's exact duplicate content, all user generated. The reviews themselves are all noindexed but followed, and the pages where they live are only manually indexed if the reviews aren't duplicates. We leave all pages with reviews that live elsewhere on the web nofollowed. Is this how we should properly handle it? Or would it be OK to follow these pages regardless of the fact that, technically, there's exact duplicate UGC elsewhere?
Intermediate & Advanced SEO | dunklea
-
SEO Site Analysis
I am looking for a company to do an SEO analysis of our website www.interelectronix.com and write an optimization proposal, including a budgetary quote for performing those optimizations.
Intermediate & Advanced SEO | interelectronix
-
An affiliate website uses datafeeds, and around 65,000 products are deleted in the new feeds. What are the best practices for the product pages? 404 ALL pages, or 301 redirect to the parent category?
Note: all product pages are set to INDEX, FOLLOW. Right now this is what happens with the deleted product pages:
1. When a product is removed from the new datafeed, the page stays online and shows similar products for 3 months. The product page is removed from the category pages but not from the sitemap!
2. Pages receiving more than 3 hits after the first 3 months keep existing and stay in the sitemaps. These pages are not shown in the categories.
3. Pages from deleted datafeeds that receive 2 hits or fewer get a 301 redirect to the parent category for another 3 months.
4. After the last 3 months, all 301 redirects get a customized 404 page with similar products.
Any suggestions or comments about this structure? 🙂 Issues to think about:
- The number of 404 pages Google is warning about in GWT
- Right now all product pages are indexed
- Getting as much value as possible, in the right way, from all pages
- Usability for the visitor
Extra info about the near future: because of the duplicate content issue with datafeeds, we are going to set all product pages to NOINDEX, FOLLOW and focus only on category and subcategory pages.
Intermediate & Advanced SEO | Zanox
-
Site duplication issue....
Hi All, I have a client who duplicated an entire section of their site onto another domain about a year ago. The new domain was ranking well but was hit heavily by Panda back in March. I have to say the setup isn't great and the solution I'm proposing isn't ideal; however, as an agency we have only been tasked with "performing SEO" on the new domain. Here is an illustration of the problem: http://i.imgur.com/Mfh8SLN.jpg My solution to the issue is to 301 redirect the duplicated area of the original site (around 150 pages) out to the new domain, but I'm worried this could cause a problem, as I know you have to be careful with redirecting internal pages to external ones when it comes to SEO. The other issue is that the client would like to retain the menu structure on the main site, but I do not want to put an external link in the main navigation, so my proposed solution is as follows:
1. Implement 301 redirects for URLs from the original domain to the new domain.
2. Remove the link to this section from the main navigation of the original site and add a boilerplate link elsewhere in the template, along the lines of "Visit xxx for our xxx products", pointing to the other site.
An illustration of this can be found here: http://i.imgur.com/CY0ZfHS.jpg I'm sure the best solution would be to redirect URLs from the new domain into the original site, keep all sections within the one domain, and optimise the one site. My hands are somewhat tied on this one, but I just wanted clarification or advice on the solution I've proposed, and reassurance that it won't dramatically affect the standing of the current sites.
Intermediate & Advanced SEO | MiroAsh
-
After receiving a "Googlebot can't access your site" message, would this stop your site from being crawled?
Hi Everyone,
A few weeks ago I received a "Googlebot can't access your site..... connection failure rate is 7.8%" message from Webmaster Tools. I have since fixed the majority of these issues, but I've noticed that all pages except the main home page now have a PageRank of N/A, while the home page still has a PageRank of 5. Have these connectivity issues reduced the PageRanks to N/A, or is it something else I'm missing? Thanks in advance.
Intermediate & Advanced SEO | AMA-DataSet
-
How Does This Site Rank So Well?!
So this website -> http://bailbondsripoffreport.com/ ranks on the first page for the term "Bail Bonds". It's the spammiest, crappiest piece of junk website ever! lol - how does this site rank so well? It's not even a year old and its link structure is crap. Can I, like, report them and have them removed? lol. Any ideas would be appreciated. Thanks!
Intermediate & Advanced SEO | utahseopros