
I'm trying to find a way to automatically convert arbitrary natural language sentences into first-order logic predicates. Although complex, this seems feasible to me via inverse lambda calculus; one of the biggest downsides of this technique is that it relies heavily on a combinatory categorial grammar (CCG), which must be trained to yield appropriate results.
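
To make the setup concrete, here is a minimal sketch of the composition step I have in mind, using NLTK's nltk.sem.logic module and a hand-written toy lexicon (the lambda terms below are invented for illustration; in the approach above they would come from the trained CCG rather than being written by hand):

```python
# Minimal sketch of the semantic-composition step a CCG parse drives:
# lexical items carry lambda terms, and function application plus beta
# reduction yields a first-order formula. The toy lexicon is hand-written
# for illustration only.
from nltk.sem.logic import Expression

read = Expression.fromstring

# Hand-written lexical semantics (hypothetical entries), with their CCG categories.
john  = read(r"john")                        # NP
walks = read(r"\x.walk(x)")                  # S\NP
every = read(r"\P Q.all x.(P(x) -> Q(x))")   # (S/(S\NP))/N
dog   = read(r"\x.dog(x)")                   # N

# "John walks": apply the verb's lambda term to the subject and beta-reduce.
print(walks(john).simplify())                # walk(john)

# "Every dog walks": quantifier composes with the noun, then with the verb.
print(every(dog)(walks).simplify())          # all x.(dog(x) -> walk(x))
```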

Are there other known approaches to such a conversion at all, or is it a bad idea?

izilotti

2 Answers


... it heavily relies on a combinatory categorial grammar (CCG), which must be trained...

There are two broad-coverage CCG parsers for English, which you could use:

  1. The Curran & Clark (C&C) parser by itself only generates CCG derivations with categories, not with semantics; Boxer runs on top of its parses to generate the semantics. (A toy illustration of a bare CCG derivation follows this list.)

  2. OpenCCG, when used with the broad-coverage MOLOKO grammar, can produce semantics. But I haven't used this one yet and can't read its output.
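
Not the C&C or MOLOKO grammars themselves, but if you want to see what a bare CCG derivation (categories only, no semantics) looks like before committing to one of these parsers, NLTK ships a small CCG module; the three-word lexicon below is invented purely for illustration:

```python
# Toy CCG derivation with NLTK's ccg module: categories only, no semantics,
# which is roughly what the C&C parser gives you before Boxer is applied.
# The lexicon is made up for this example.
from nltk.ccg import chart, lexicon

lex = lexicon.fromstring(r"""
    :- S, NP, N
    every => NP/N
    dog => N
    walks => S\NP
""")

parser = chart.CCGChartParser(lex, chart.DefaultRuleSet)
for derivation in parser.parse("every dog walks".split()):
    chart.printCCGDerivation(derivation)
    break  # one derivation is enough for the example
```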

Are there other approaches...

CCG is not the only logic-based grammar. There are others that are closer to mathematics, in that their derivation trees for sentences look exactly like natural deduction proofs. I think the ACG (abstract categorial grammar) people have a working parser. I am not sure about the others, such as TLG, CVG, etc.

prash

It is also possible to convert natural language into first-order logic using Discourse Representation Theory. For example, the ACE reasoner is an automated theorem prover that can convert English text into first-order logic predicates.
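
Not the ACE toolchain itself, but as a rough illustration of the DRT-to-FOL step, NLTK's DRT module can translate a hand-written discourse representation structure (invented here for the example) into a first-order formula:

```python
# Illustration of the Discourse Representation Theory route: DRSs built by
# hand (a tool like the Attempto/ACE pipeline would construct something
# similar from controlled English) are merged and translated into a
# first-order formula with NLTK.
from nltk.sem.drt import DrtExpression

read_drs = DrtExpression.fromstring

# "A man walks. He whistles." -- the pronoun is resolved to x by hand here.
drs1 = read_drs(r"([x], [man(x), walks(x)])")
drs2 = read_drs(r"([], [whistles(x)])")

merged = (drs1 + drs2).simplify()
print(merged)        # merged DRS: ([x],[man(x), walks(x), whistles(x)])
print(merged.fol())  # exists x.(man(x) & walks(x) & whistles(x))
```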

Anderson Green
    But if I understand correctly, we need to convert English to the ACE language first, and only then can we get a first-order logic representation. – user1700890 Mar 21 '19 at 15:12