Google Site Search
-
Hi,
I was just wondering if anyone had used Google Site Search before and what they thought of it?
http://www.google.com/sitesearch/
It seems quite expensive for just returning your own pages, but I'd be interested to find out more.
Thanks
-
I would totally recommend Google Site Search.
With our latest client it increased conversion rates, and thus their profits... coincidence? Nope.
I found that the display of results enticed users to stay longer and made the site more engaging. It's also faster than an open-source search engine.
Hope this helps. I would really love to give a demo and stats from my client's work, but I can't due to a confidentiality clause.
Two thumbs up!
-
Ditto - just seeing what people think; you're in my camp by the looks of it.
-
OK, got you. I don't know if it's worth it; I would just click on the link myself.
-
Nope, it's not free. It's actually something that appears in SERPs - like the House of Fraser result you see here - it's not actually on-site as such.
http://www.google.co.uk/search?gcx=c&sourceid=chrome&ie=UTF-8&q=house+of+fraser
A
-
I thought that was free.
Bing has a free site search you can use.
Related Questions
-
How to create a dynamic visual sitemap using Google Sheets?
Does anyone have a solution where you can use a listing of page names with tiers in a Google spreadsheet and have it dynamically appear in a visual sitemap architecture format within a Google document? Thanks in advance, Moz community!
Content Development | peteboyd
-
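Not the dynamic Google Docs solution the poster asks for, but a minimal sketch of the core step: export the sheet as CSV and render the tiered page list as an indented outline. The column names (`tier`, `page`) and sample pages are assumptions, not from the question.

```python
import csv
import io

# Hypothetical CSV export of the Google Sheet: one row per page,
# "tier" = depth in the site hierarchy (1 = top level).
sheet_csv = """tier,page
1,Home
2,Services
3,Service A
3,Service B
2,About
"""

def to_outline(csv_text: str) -> str:
    """Render the tiered page list as an indented text sitemap."""
    rows = csv.DictReader(io.StringIO(csv_text))
    lines = []
    for row in rows:
        depth = int(row["tier"]) - 1  # tier 1 gets no indent
        lines.append("    " * depth + row["page"])
    return "\n".join(lines)

print(to_outline(sheet_csv))
```

The same loop could run inside Google Apps Script to write the outline into a Doc, which would make it update dynamically.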
B2B services site: deep long-tail services, or flat and broad?
Hi all, I have a small internet agency with services in several areas. Currently I structure my services in two different areas: general services, and specialized products with services underneath, like so:

domain.com/services
domain.com/services/service1
domain.com/services/service2

and so on, plus:

domain.com/product
domain.com/product/service1
domain.com/product/service2
domain.com/product2
domain.com/product2/service1
domain.com/product2/service2

But now I'm thinking of creating pages for lots of the different services I provide. There is some overlapping content, of course, but each service is still individual, and I can write a lot of unique content about every one. Is it smarter to make just a few big pages with the services bundled, or to go really long tail? I currently have around 15 services pages; if I go deeper I'm talking about 40 pages. My complete website is around 35 pages at the moment.
Content Development | mdkay
-
SEO for a news-based site: outdated content
How do we maintain the SEO of a five-year-old news content site? How should we deal with 3-4 year old content that is outdated or no longer searched for? Some of it is still useful as an archive/history of a topic, but it isn't searched. Should we no-index those pages, or keep them as they are?
Content Development | Wpfreesetup
-
SEO advice needed regarding bookmark sites
Hi all. We have recently employed an SEO company. They have written some blogs and promoted them on up to 20 bookmark sites. On each bookmark site the text is the same. Will Google class this as duplicate content? Is this a good idea? Any advice would be appreciated, thanks.
Content Development | Palmbourne
-
How to make new content indexed faster by Google
I would like to know what I can do. Normally it takes Google around three days to index my content. I have a sitemap and switched the crawl rate to the fastest setting in Webmaster Tools. I also tried fetching my homepage as Googlebot and submitting it to the index with all linked pages, but even so my content takes around three days, if not more, to get indexed. I publish around 20 posts a week. My SEOmoz page authority is 48. Some of my competitors seem to get their content indexed the same day. What else can be done?
Content Development | sebastiankoch
-
Will our two retail sites get hit with duplicate content?
Our retail site just rolled out a second online store. The URL is new and it shows some of the same products from the same vendors (probably about 40% of the first store is in the second store). Down the road we will remove the products from the first site; however, we are keeping it for now. The products show up on both sites with the same images, the same descriptions, and almost the same URL query string. Are we going to get hit with any penalties due to duplicate content?
Content Development | klmarketing
-
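If both stores must stay live, one common mitigation (a sketch; the domain and path below are hypothetical) is a cross-domain rel=canonical on the duplicated product pages of the second store, pointing search engines at the first store's version:

```html
<!-- In the <head> of the second store's duplicated product page;
     href points at the equivalent page on the first store. -->
<link rel="canonical" href="https://www.first-store.example/product/widget" />
```

Google treats a cross-domain canonical as a hint rather than a directive, but it usually consolidates the duplicate signals onto the canonical URL.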
How to best implement a "metered model" on a site
Hi, I'm scratching my head over how to best implement the "metered model" on a site without users being able to game it too easily. Has anybody in this Q&A forum implemented one before, and is willing to share best practices and findings? Currently I think raising the bar by forcing everybody to log in is a bad idea; we would still need to open the site for Google and other engines, and it can be gamed that way. This might also lead to a penalty (cloaking)? Using cookies might not be enough, as I think almost every internet user these days knows that this is the #1 place to look, and they are deleted in a second. Counting based on a user's IP address is also a bit critical, as it is not accurate enough. Should we just use cookies and hope for the best?
Content Development | jmueller
-
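A minimal sketch of one way to make the cookie approach harder to game (assumptions: Python on the server, and the secret name here is hypothetical): sign the view count with an HMAC, so editing the cookie value resets the meter to zero instead of raising the limit. Deleting the cookie still resets it, which is exactly the trade-off the poster describes.

```python
import hashlib
import hmac

SECRET = b"server-side-secret"  # assumption: kept private on the server

def sign(count: int) -> str:
    """Serialize the meter count with an HMAC so the client can't forge it."""
    payload = str(count)
    mac = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}.{mac}"

def read(cookie, limit: int = 5):
    """Return (count, allowed); tampered or missing cookies reset to 0."""
    try:
        payload, mac = cookie.rsplit(".", 1)
        expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
        if not hmac.compare_digest(mac, expected):
            raise ValueError("tampered cookie")
        count = int(payload)
    except (ValueError, AttributeError):
        count = 0
    return count, count < limit

# Per request: read the cookie, serve or paywall, then set sign(count + 1).
count, allowed = read(sign(3))
```

This does nothing against users who clear cookies, so in practice it is usually combined with a soft server-side signal (IP plus user agent) rather than relied on alone.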
Please help me stop Google indexing https pages on my WordPress site
I added SSL to my WordPress blog because that was the only way to get a dedicated IP address for my site at my host. Now I am noticing Google has started indexing posts as both http and https. Can someone please help me force Google not to index https, as I am sure it's like having duplicate content. All help is appreciated. So far I have added this to the top of my .htaccess file:

RewriteEngine on
Options +FollowSymlinks
RewriteCond %{SERVER_PORT} ^443$
RewriteRule ^robots.txt$ robots_ssl.txt

And added robots_ssl.txt with the following:

User-agent: Googlebot
Disallow: /
User-agent: *
Disallow: /

But https pages are still being indexed. Please help.
Content Development | rookie123
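One reason the robots.txt approach stalls: blocking Googlebot stops crawling, but it does not remove URLs that are already in the index. A sketch of an alternative (untested, and it assumes the whole blog is meant to be served over http) is to 301-redirect https requests back to http, so the duplicate signals consolidate onto one set of URLs:

```apache
# Sketch: permanently redirect any https request to its http equivalent.
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]
```

A rel=canonical pointing at the http version would achieve a similar consolidation without the redirect.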