Transactional vs Informative Search
-
I have a page (Page 1) that is ranking quite well for the plural of a keyword, but it is only ranking on page 3 for the singular keyword. For more than a year I have been working on on-page and off-page optimization to improve the ranking for the singular term, without success.
Google seems to treat the two terms almost the same: when you search for term one, term two is also highlighted in bold, and the results are very similar.
The big difference between the two terms, in my opinion, is that one is more informational while the other is more transactional.
Now I would be curious to know: which factors could Google use to understand whether a search query, and a website, is more transactional or informational?
Apart from obvious on-page signals like "Buy now", "Shop", "Special offer", etc.
Any Ideas?
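For what it's worth, here is a toy sketch of the lexical side of this question. The signal words, the scoring, and the `classify_intent` function are entirely my own assumptions for illustration, not anything Google has published:

```python
# Toy query-intent classifier: counts transactional vs. informational
# signal words in a query. The word lists and scoring below are
# illustrative assumptions only -- a real search engine uses far richer
# signals (click behavior, SERP engagement, page templates, etc.).

TRANSACTIONAL = {"buy", "shop", "price", "cheap", "discount", "order", "deal"}
INFORMATIONAL = {"what", "how", "why", "guide", "tutorial", "definition", "tips"}

def classify_intent(query: str) -> str:
    """Classify a query as transactional, informational, or ambiguous."""
    words = set(query.lower().split())
    t = len(words & TRANSACTIONAL)
    i = len(words & INFORMATIONAL)
    if t > i:
        return "transactional"
    if i > t:
        return "informational"
    return "ambiguous"

print(classify_intent("buy running shoes"))   # transactional
print(classify_intent("how to clean shoes"))  # informational
print(classify_intent("running shoes"))       # ambiguous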
-
Hi,
Yes, I agree with you that there is a big difference between transactional and informational search. Can you share the two keywords that are giving you such similar results?
Related Questions
-
Load Balancer issues on Search Console
The top linked domains in Search Console are coming from our load balancer setup. Does anyone know how to remove these as unique sites pointing back to our primary domain? I was told Google is smart enough to ignore these as duplicate domains, but if that were the case, why would they be listed as the top linked domains in Search Console? Most concerned....
Intermediate & Advanced SEO | DonFerrari2169
-
Submitting URLs After New Search Console
Hi everyone, I wanted to see how people submit their URLs to Google and make sure they are all being indexed. I currently have an ecommerce site with 18,000 products. I have sitemaps set up, but I've noticed that the various product pages haven't started ranking yet. If I submit an individual URL through the new Google Search Console, I see the page ranking within a matter of minutes. Before the new Google Search Console you could just ask Google to Fetch/Render an XML sitemap and have it crawl all the links. I don't see the same functionality in Google Search Console today and was wondering if there are any new techniques people could share. Thanks, Anthony
Intermediate & Advanced SEO | abiondo
-
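One stopgap for bulk submission (purely a sketch; the URLs and the `build_sitemap` helper below are my own assumptions) is to generate the sitemap XML yourself and resubmit it, keeping each file under the sitemap protocol's 50,000-URL limit:

```python
# Minimal sitemap generator using only the Python standard library.
# The example product URLs are placeholders.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a <urlset> XML string for up to 50,000 URLs."""
    assert len(urls) <= 50_000, "sitemap protocol caps each file at 50k URLs"
    urlset = ET.Element("urlset", xmlns=NS)
    for u in urls:
        # Each URL gets its own <url><loc>...</loc></url> entry.
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = u
    return ET.tostring(urlset, encoding="unicode")

xml_doc = build_sitemap([
    "https://www.example.com/product/1",
    "https://www.example.com/product/2",
])
print(xml_doc)
```

For 18,000 products a single file is within the limit; split into multiple sitemaps plus a sitemap index only if the catalog grows past 50,000 URLs.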
Informational query
Hello, for an informational query, can the answer people are looking for have multiple intents, or will it always have a single intent? For example, for "New York" the intent is probably "where?". For a longer query such as "Provence bike tour", what is the intent: where, what, why, how to, or when? Thank you,
Intermediate & Advanced SEO | seoanalytics
-
Duplicate Contact Information
My client has had a website for many years, and his business for decades. He has always had a second website domain which is basically a shopping module for obtaining information, comparisons, and quotes for tires. This tire module had no informational pages or contact info. Until recently, we pulled this information in through iframes. Now, however, the tire module is too complex and we no longer bring this info in through iframes, and because of the way this module is configured (or its website framework), we are told we cannot place it as a sub-directory. So now this tire module resides on another domain name (although similar to the client's "main site" domain name) with some duplicate informational pages (I am working through this with the client), but mainly I am concerned about the duplicate contact info: address and phone. Should I worry that this other tire website duplicates the client's phone and address, the same as their main website? And would having a subdomain (tires.example.com) work better for Google and SEO, considering the duplicate contact info? Any help is much appreciated. (And, too, the client is directing AdWords campaigns to this other website for tires while directing other campaigns to their main site under the same AdWords account; I have advised an entirely separate AdWords account for links to the tire domain. BTW the client does NOT have separate social media accounts for each site -- all social media efforts and links are for the main site.)
Intermediate & Advanced SEO | cceebar
-
Sub-domain vs Root domain
I have recently taken over a website (website A) that has a domain authority of 33/100 and is linked to from 39 root domains. I have not yet selected any keywords to target so am currently unsure of ranking positions. However, website A is for a division of a company that has its own separate website (website B) which has a domain authority of 58/100 and over 1000 legitimate linking root domains. I have the option of moving website A to a sub-domain of website B. I also have the option of having website B provide a followed link to website A. So, my question is, for SEO purposes, is my website better off remaining on its own existing domain or is it likely to rank higher as a sub-domain of website B? I am sure there are pros and cons for both options but some opinions would be much appreciated.
Intermediate & Advanced SEO | BallyhooLtd
-
Google Sitelinks Search Box
For some reason, a search for our company name (“hometalk”) does not produce the search box in the results (even though we do have sitelinks). We are adding schema markup as outlined here, but we're not sure about: Will adding the code make the search bar appear (or at least increase the chances), or is it only going to change the functionality of the search box (to on-site search) for results that are already showing a search bar?
Intermediate & Advanced SEO | YairSpolter
-
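For reference, the markup being discussed is the `WebSite` + `SearchAction` JSON-LD that Google documented for the sitelinks search box; the domain and search path below are placeholders:

```json
{
  "@context": "https://schema.org",
  "@type": "WebSite",
  "url": "https://www.example.com/",
  "potentialAction": {
    "@type": "SearchAction",
    "target": "https://www.example.com/search?q={search_term_string}",
    "query-input": "required name=search_term_string"
  }
}
```

Per Google's documentation, this markup does not force the box to appear; when Google already shows a search box for the brand query, the markup redirects it to the site's own search results instead of a Google `site:` search.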
Linking to Short vs Long URL
Suppose I have a long URL on an established site and create a shorter version of it so it is easier for people to enter directly and click. I 301 the short version to the long one. I don't think there is much concern about people linking to the long-version pages, but will there be a tendency for people to link to the short URL instead of the long one? Will I not benefit as much from links to the short URL vs. the long one? Thanks.
Intermediate & Advanced SEO | AWCthreads
-
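As a concrete sketch of the setup described (the paths and the choice of nginx are assumptions, not details from the question), the short-to-long 301 might look like this:

```nginx
# Permanent redirect from the short vanity URL to the canonical long URL.
# Both paths are hypothetical examples.
location = /sale {
    return 301 /category/seasonal-promotions/annual-summer-sale;
}
```

Because a 301 is a permanent redirect, links pointing at the short URL are generally consolidated onto the long target, so the main practical cost is one extra redirect hop for visitors.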
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place in our search results to prevent Google from crawling through pagination links and other parameter based variants of our results (sort order, etc). The idea was to 'preserve crawl budget' in order to speed the rate at which Google could get our millions of pages back in the index by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOMoz reading this morning, I came to wonder whether that approach may now be harming us... http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo
http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions
Specifically, I'm concerned that a) we're blocking the flow of link juice, and that b) by preventing Google from crawling the full depth of our search results (i.e. pages > 1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low-quality pages, etc., but we have yet to find 'the fix'... Thoughts? Kurus
Intermediate & Advanced SEO | kurus
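For context, the kind of restriction described might look like this in robots.txt (the path and parameter names are assumptions, not the actual rules from the question):

```
User-agent: *
# Keep crawlers out of parameter-based variants of the result pages
# (sort order, pagination). Path and parameters are illustrative.
Disallow: /*?*sort=
Disallow: /*?*page=
```

Note that robots.txt only controls crawling: links on blocked pages are never discovered, so any equity that would flow through paginated results is cut off, which is exactly the trade-off raised above.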