
List:       lucene-user
Subject:    Re: Why does the StandardTokenizer split hyphenated words?
From:       Daniel Naber <daniel.naber@t-online.de>
Date:       2004-12-16 19:03:27
Message-ID: 200412162003.28028@danielnaber.de

On Thursday 16 December 2004 13:46, Mike Snare wrote:

> > Maybe for "a-b", but what about English words like "half-baked"?
>
> Perhaps that's the difference in thinking, then. I would imagine that
> you would want to search on "half-baked" and not "half AND baked".

A search for half-baked will find both half-baked and "half baked" (the 
phrase). The only thing you won't find is halfbaked.
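You can check this yourself by running the analyzer over the text. Here's 
a rough sketch against the 1.4-era analysis API (the field name "body" 
and the class name are just placeholders):

    import java.io.StringReader;
    import org.apache.lucene.analysis.Token;
    import org.apache.lucene.analysis.TokenStream;
    import org.apache.lucene.analysis.standard.StandardAnalyzer;

    public class HyphenDemo {
        public static void main(String[] args) throws Exception {
            // StandardAnalyzer uses StandardTokenizer internally.
            StandardAnalyzer analyzer = new StandardAnalyzer();
            TokenStream ts = analyzer.tokenStream("body",
                new StringReader("half-baked"));
            // Old-style iteration: next() returns null at end of stream.
            for (Token t = ts.next(); t != null; t = ts.next()) {
                System.out.println(t.termText());
            }
            ts.close();
            // Prints two tokens, "half" and "baked". Since QueryParser runs
            // the same analyzer over the query text, half-baked turns into
            // the phrase query "half baked", which matches both spellings.
        }
    }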

Regards
 Daniel

-- 
http://www.danielnaber.de

---------------------------------------------------------------------
To unsubscribe, e-mail: lucene-user-unsubscribe@jakarta.apache.org
For additional commands, e-mail: lucene-user-help@jakarta.apache.org


