Open-source rule-based pattern matching / information extraction

Posted 2020-05-29 05:51

I'm shopping for an open-source framework for writing natural language grammar rules for pattern matching over annotations. You could think of it like regexps but matching at the token rather than character level. Such a framework should enable the match criteria to reference other attributes attached to the input tokens or spans, as well as modify such attributes in an action.
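
To make the kind of rule I have in mind concrete, here is a rough sketch using spaCy's rule-based Matcher purely as an illustration (the pattern, the custom attribute, the callback, and the example sentence are all made up): the match criteria refer to token attributes rather than characters, and the action attaches an attribute to the matched span.

import spacy
from spacy.matcher import Matcher
from spacy.tokens import Span

nlp = spacy.load("en_core_web_sm")   # assumes this model is installed
matcher = Matcher(nlp.vocab)

# Match criteria refer to token attributes (part of speech, lemma), not characters.
pattern = [{"POS": "ADJ"}, {"LEMMA": "framework"}]

# The "action": when the rule fires, attach an attribute to the matched span.
Span.set_extension("rule_label", default=None)

def tag_match(matcher, doc, i, matches):
    _, start, end = matches[i]
    doc[start:end]._.rule_label = "FRAMEWORK_MENTION"

matcher.add("FRAMEWORK_MENTION", [pattern], on_match=tag_match)

doc = nlp("This is a flexible framework for pattern matching.")
for _, start, end in matcher(doc):
    print(doc[start:end].text, doc[start:end]._.rule_label)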

There are three options I know of which fit this description:

Are there any other options like these available at this time?

Related Tools

  • While I know that general parser generators like ANTLR can also serve this purpose, I'm looking for something that is more specifically tailored to natural language processing or information extraction.
  • UIMA includes a Regex Annotator plugin for declaring rules in XML, but it appears to operate at the character level rather than over higher-level annotation objects.
  • I know that this kind of task is often performed with statistical models, but for narrow, structured domains there's benefit in hand-crafting rules.

* With GExp, 'rules' are actually implemented in code, but since there are so few options I chose to include it.

2 Answers
乱世女痞 · 2020-05-29 06:07

You may also check HTQL. It supports regular-expression-style search over tokens. An example that searches for the state and ZIP code in a US address:

import htql

a = htql.RegEx()
a.setNameSet('states', states)
a.reSearchList(address.split(), r"&[ws:states]<,>?<\d{5}>", case=False)
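
If I read the pattern correctly, it matches a token from the 'states' name set, an optional comma, and a five-digit ZIP code. In this sketch, 'states' is assumed to be a list of US state names or abbreviations and 'address' the input address string; both are supplied by the surrounding program.
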
可以哭但决不认输i · 2020-05-29 06:22

The French academic software Unitex, from Université Paris-Est, also matches your description (http://www-igm.univ-mlv.fr/~unitex/).

It's C++-based and includes many optional preprocessing rules and lexicons for 20+ languages.

The GUI is graph-based (you design automata, i.e. 'grammars').
