Every contribution is welcome and needed to make Aika better. If you would like to get involved, please contact me.

On integrating symbolic inference into deep neural networks

Workshop: Introduction to the Aika Algorithm

A data science blog article about the Aika algorithm written in German.

A JAXenter article about the Aika algorithm.

Jumble: A job search engine whose tagging is powered by Aika.

KDnuggets: A list of Text Analysis, Text Mining and Information Retrieval software.

- Bugfixes

- Bugfixes

- Refactoring of the relation architecture.
- Bugfixes

- The Range class has now been replaced by a slots concept. Previously, an activation had a range with a begin and an end position; now activations can possess an arbitrary number of positions.
- The synapse bias is now used to decide whether a synapse is conjunctive or disjunctive.
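
The slots concept can be illustrated with a minimal sketch. The class and method names below are hypothetical, not Aika's actual API: the point is simply that an activation holds a map from slot ids to positions instead of a fixed (begin, end) pair.

```java
import java.util.Map;
import java.util.TreeMap;

// Illustrative sketch only; Aika's real classes differ.
class Activation {
    // An arbitrary number of positions, keyed by slot id,
    // instead of a fixed (begin, end) range.
    private final Map<Integer, Integer> slots = new TreeMap<>();

    static final int BEGIN = 0; // conventional slot ids
    static final int END = 1;

    void setSlot(int slot, int position) { slots.put(slot, position); }
    Integer getSlot(int slot) { return slots.get(slot); }

    public static void main(String[] args) {
        Activation act = new Activation();
        act.setSlot(BEGIN, 3);
        act.setSlot(END, 7);
        act.setSlot(2, 5); // an additional, non-range position
        System.out.println(act.getSlot(END)); // prints 7
    }
}
```

Because the slot ids are just keys, a begin/end range is merely the special case of two slots, which is what makes an arbitrary number of positions possible.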

- API Refactoring: Synapse relations are now established through a separate builder class.
- The range positions are now optionally variable. This feature is required for text generation. In this use case the positions are not known in advance and need to be computed during processing.
- Introduced passive neurons. Passive neurons are evaluated only if a connected output neuron requires it; they essentially act like callback functions.
- Optimization of the interpretation search.
- Lots of bug fixes.
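
The callback-like behaviour of passive neurons can be sketched generically (the `PassiveNeuron` class and its methods are hypothetical, not Aika's API): the value is computed lazily, only when the output side asks for it.

```java
import java.util.function.Supplier;

// Illustrative sketch of the "passive neuron" idea: its value is
// computed only when a connected output neuron actually requests it,
// much like a callback. Names are hypothetical, not Aika's API.
class PassiveNeuron {
    private final Supplier<Double> callback;
    private boolean evaluated = false;

    PassiveNeuron(Supplier<Double> callback) { this.callback = callback; }

    double value() { // invoked on demand by the output side
        evaluated = true;
        return callback.get();
    }

    boolean wasEvaluated() { return evaluated; }

    public static void main(String[] args) {
        PassiveNeuron n = new PassiveNeuron(() -> 0.5);
        // Nothing is computed until an output neuron needs the value.
        System.out.println(n.wasEvaluated()); // prints false
        System.out.println(n.value());        // prints 0.5
    }
}
```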

- Introduction of synapse relations. Previously, the relation between synapses was implicitly modeled through word positions (RIDs). Now it is possible to explicitly model relations such as: the end position of input activation 1 equals the begin position of input activation 2. Two types of relations are currently supported: range relations and instance relations. Range relations compare the input activation range of a given synapse with that of the linked synapse. Instance relations also compare the input activations of two synapses, but instead of the ranges, the dependency relations of these activations are compared.
- Removed the norm term from the interpretation objective function.
- Introduced an optional distance function for synapses. It allows modeling a signal that weakens with the distance between the activation ranges.
- Example implementation of a context free grammar.
- Example implementation for co-reference resolution.
- Work on a syllable identification experiment based on the meta network implementation.
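
A range relation of the kind described above, "the end position of input activation 1 equals the begin position of input activation 2", can be sketched as a simple predicate over ranges. The `Range` record and `RangeRelation` enum here are illustrative, not Aika's actual types.

```java
import java.util.List;

// Hypothetical stand-in for an activation's text range.
record Range(int begin, int end) {}

// Illustrative sketch of range relations between two synapses'
// input activations; not Aika's actual relation classes.
enum RangeRelation {
    END_TO_BEGIN_EQUALS {
        boolean test(Range a, Range b) { return a.end() == b.begin(); }
    },
    BEGIN_TO_BEGIN_EQUALS {
        boolean test(Range a, Range b) { return a.begin() == b.begin(); }
    };

    abstract boolean test(Range a, Range b);
}

class RelationDemo {
    public static void main(String[] args) {
        Range the = new Range(0, 3); // e.g. the word "the"
        Range cat = new Range(3, 6); // the directly following word
        // "the" ends exactly where "cat" begins:
        System.out.println(RangeRelation.END_TO_BEGIN_EQUALS.test(the, cat)); // prints true
    }
}
```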

- Simplified interpretation handling by removing the InterpretationNode class and moving the remaining logic to the Activation class.
- Moved the activation linking and activation selection code to separate classes.
- Ongoing work on the training algorithms.

- Caching of partially computed states in the neural network during the interpretation search.
- Refactoring of the interpretation search. The search is now iterative to prevent stack overflows, and the debugging output is now much more detailed.
- Ongoing work on the training algorithms.
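
Turning a recursive search iterative, as the refactoring above describes, follows a standard pattern: drive the traversal with an explicit stack so that deep search trees cannot overflow the call stack. The sketch below shows the generic technique on a toy tree; it is not Aika's actual SearchNode code.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;

// Generic sketch: a depth-first search driven by an explicit stack
// instead of recursion, so deep search trees cannot overflow the
// call stack. The "search tree" here is a simple nested structure.
class IterativeSearch {
    record Node(int weight, List<Node> children) {}

    // Returns the maximum node weight found anywhere in the tree.
    static int maxWeight(Node root) {
        Deque<Node> stack = new ArrayDeque<>();
        stack.push(root);
        int best = Integer.MIN_VALUE;
        while (!stack.isEmpty()) {
            Node n = stack.pop();
            best = Math.max(best, n.weight());
            n.children().forEach(stack::push);
        }
        return best;
    }

    public static void main(String[] args) {
        Node leaf = new Node(7, List.of());
        Node root = new Node(1, List.of(new Node(3, List.of(leaf))));
        System.out.println(maxWeight(root)); // prints 7
    }
}
```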

- API cleanups: Input -> Synapse.Builder, Activation.Builder
- Optimization and simplification of the interpretation search.
- Refactoring of the range matching within synapses.
- Ongoing work on the training algorithms.

- Memory optimization: Disjunctive synapses are now stored on the input neuron side.
- The bias delta value in a neuron input is now an absolute value.
- Bug fixes, code cleanups, code readability improvements, lambda expression usage, convenience functions.

- Optimization of the search for the best interpretation.
- Optimization of the checkSelfReferencing function.
- Fixes for the training and pattern discovery functions.

- Simplification: Activations are now only added during processing, never removed. However, they may be suppressed if they conflict with other activations.
- Removed some old experimental training code and provided two APIs for training and pattern discovery. These APIs make it possible to implement heuristics for deciding which synapses should be created or which patterns should be selected.
- Experimental support for text generation.

- Rewrite of the conversion of synapse weights to logic nodes.
- Optimization of the interpretation search.
- Fixes for a few deadlocks.

- Optimization of the interpretation search using an upper bound on the interpretation weights.
- Support for very large models with millions of neurons by suspending rarely used neurons to disk.
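
Keeping only the actively used neurons in memory, as described above, is in spirit an LRU eviction scheme. A minimal generic sketch using the standard `LinkedHashMap` eviction hook (this is illustrative; Aika's actual suspension mechanism writes neurons to disk rather than dropping them):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Generic sketch of "suspending rarely used neurons": keep only the
// most recently used entries in memory and evict the rest. A real
// implementation would serialize evicted neurons to disk; this
// sketch just drops them. Not Aika's actual suspension code.
class NeuronCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    NeuronCache(int capacity) {
        super(16, 0.75f, true); // access-order = true → LRU behaviour
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity; // here a neuron would be suspended to disk
    }
}

class CacheDemo {
    public static void main(String[] args) {
        NeuronCache<Integer, String> cache = new NeuronCache<>(2);
        cache.put(1, "n1");
        cache.put(2, "n2");
        cache.get(1);       // touch n1 so it stays in memory
        cache.put(3, "n3"); // evicts the least recently used: n2
        System.out.println(cache.keySet()); // prints [1, 3]
    }
}
```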

- Refactoring of the range model. Now the range begin and the range end can be treated independently of each other. Synapses now have three properties: range match, range output and range mapping.
- The Iteration class has been merged into the document class.
- Performance optimizations for the interpretation search in the SearchNode class.
- Test case fixes
- Class renaming: Option -> InterprNode, ExpandNode -> SearchNode
- Lots of javadoc

- Mainly optimizations

- Simplification of the algorithm!
- Lots of optimizations!
- New unified model for weights and neuron activation values.
- Options are now generated for all neuron activations.
- Disjunctions of options are now supported.

- Annotation of text ranges instead of individual characters.
- Use of the relational id for pattern matching instead of character positions.
- Use of a lattice to represent text ranges.
- Allow a single neuron to suppress multiple inputs.
- Lots of optimizations!

- Implementation of a symmetric lattice structure to represent the different interpretations that arise from the usage of non-monotonic logic.
- Implementation of an algorithm to determine the interpretation with the highest associated weight sum. This algorithm efficiently searches for non-conflicting combinations of options by limiting the search space with an upper bound.
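
The branch-and-bound idea behind that search can be sketched generically: explore combinations of options, and prune any branch whose optimistic upper bound (current weight sum plus all remaining weights) cannot beat the best solution found so far. The names and representation below are illustrative, not Aika's implementation.

```java
// Generic branch-and-bound sketch: find the non-conflicting
// combination of options with the highest weight sum, pruning
// branches whose optimistic upper bound cannot beat the best
// solution found so far. Not Aika's actual implementation.
class InterpretationSearch {
    private static double best;

    // weights[i]: weight of option i; conflicts[i][j]: options i and j
    // may not both be selected.
    static double search(double[] weights, boolean[][] conflicts) {
        best = 0;
        expand(weights, conflicts, 0, 0, new boolean[weights.length]);
        return best;
    }

    private static void expand(double[] w, boolean[][] conflicts, int i,
                               double sum, boolean[] selected) {
        if (i == w.length) { best = Math.max(best, sum); return; }
        double remaining = 0;
        for (int j = i; j < w.length; j++) remaining += w[j];
        if (sum + remaining <= best) return; // upper-bound pruning

        boolean ok = true; // can option i join the current selection?
        for (int j = 0; j < i; j++)
            if (selected[j] && conflicts[i][j]) { ok = false; break; }
        if (ok) {
            selected[i] = true;
            expand(w, conflicts, i + 1, sum + w[i], selected);
            selected[i] = false;
        }
        expand(w, conflicts, i + 1, sum, selected); // skip option i
    }

    public static void main(String[] args) {
        double[] w = {3.0, 2.0, 2.5};
        boolean[][] c = new boolean[3][3];
        c[0][1] = c[1][0] = true; // options 0 and 1 conflict
        System.out.println(search(w, c)); // prints 5.5 (options 0 and 2)
    }
}
```

The bound is safe because the sum of all remaining weights is an upper limit on what any extension of the current selection can still gain.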

YourKit supports open source projects with its full-featured Java Profiler. YourKit, LLC is the creator of YourKit Java Profiler and YourKit .NET Profiler, innovative and intelligent tools for profiling Java and .NET applications.

Lukas Molzberger

Weiherstrasse 16

53111 Bonn

contact@aika-software.org