Merge the forward and backward chainer into a mixed chainer #23

Open
ngeiswei opened this issue Mar 26, 2018 · 0 comments
Here are some rough guidelines for implementing the mixed chainer. They should be refined as we progress through the task.

  1. Forward and backward chaining often need to occur over the same inference tree in order to construct the desired inference tree.
  2. Providing a target does not always mean that the best chaining strategy is backward; likewise, providing sources does not always mean that the best strategy is forward. These decisions should not be hardwired, but instead delegated to control policies (possibly handwritten by default).
  3. The mixed chainer should be able to take intermediary sources/targets as input, not just end sources/targets.
  4. It is also expected that different parts of an inference tree could be evolved independently and then stuck together (using unification, just as a rule is stuck to an inference tree).
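To make point 2 concrete, here is a minimal sketch of a mixed-chainer control loop over a toy propositional knowledge base. All names (`forward_step`, `backward_step`, `control_policy`, `mixed_chain`) are hypothetical illustrations, not the URE API, and the "expand the smaller frontier" policy is just a naive stand-in for a learned or handwritten control policy:

```python
# Toy mixed chainer sketch. The knowledge base `kb` is a set of
# (premise, conclusion) implication rules over atomic propositions.

def forward_step(sources, kb):
    # Forward chaining: derive conclusions whose premises are known.
    derived = {b for (a, b) in kb if a in sources}
    return sources | derived

def backward_step(targets, kb):
    # Backward chaining: expand targets into subgoals that would entail them.
    subgoals = {a for (a, b) in kb if b in targets}
    return targets | subgoals

def control_policy(sources, targets):
    # Point 2: the direction is chosen by a policy, not hardwired.
    # Naive heuristic for illustration: grow the smaller frontier first.
    return "forward" if len(sources) <= len(targets) else "backward"

def mixed_chain(sources, targets, kb, max_steps=10):
    # Alternate forward and backward steps over the same search,
    # stopping when the two frontiers meet (point 1).
    for _ in range(max_steps):
        if sources & targets:
            return True
        if control_policy(sources, targets) == "forward":
            sources = forward_step(sources, kb)
        else:
            targets = backward_step(targets, kb)
    return bool(sources & targets)

kb = {("A", "B"), ("B", "C"), ("C", "D")}
print(mixed_chain({"A"}, {"D"}, kb))  # True: frontiers meet at C
```

Because intermediary propositions are passed in the same way as end points, seeding `sources` or `targets` with intermediate results (point 3) needs no special handling in this sketch.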
@linas linas transferred this issue from opencog/atomspace Jul 26, 2019
ngeiswei added a commit to ngeiswei/ure that referenced this issue Jul 9, 2020