November 12, 2025

About the AIKA Neural Network

AIKA (Artificial Intelligence for Knowledge Acquisition) is an experimental AI framework that combines ideas from modern transformer-based deep learning, biologically inspired spiking neural networks, and hybrid neuro-symbolic approaches.

Current Version: Beta (Active Development) | Overall Completion: ~72%

By leveraging sparse representations and event-driven processing, AIKA explores how neural networks can operate more efficiently and interpretably than traditional dense matrix-based architectures. Originally implemented in Java, AIKA has been migrated to a C++ core with Python bindings, combining high performance with Python's ease of use.

Unlike conventional frameworks like PyTorch, AIKA focuses on dynamic object instantiation and type-hierarchy-based graph representation, allowing network structures to adapt at runtime rather than being fixed upfront. This approach draws inspiration from both neuroscience and symbolic AI, aiming to create systems that are more transparent, flexible, and computationally efficient.
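The contrast with fixed-upfront architectures can be made concrete with a minimal sketch. The class and method names below are hypothetical, chosen for illustration only, and are not AIKA's actual API; the point is that nodes and edges are plain objects instantiated on demand, so the topology emerges at runtime rather than being allocated as fixed weight matrices.

```python
# Illustrative sketch of runtime graph instantiation (hypothetical
# names, not AIKA's API): network elements are created lazily as they
# are referenced, so the topology adapts instead of being fixed upfront.

class Node:
    def __init__(self, label):
        self.label = label
        self.edges = []          # outgoing connections, grown at runtime

    def connect(self, target, weight):
        self.edges.append((target, weight))

class DynamicGraph:
    def __init__(self):
        self.nodes = {}          # label -> Node, populated lazily

    def node(self, label):
        # Instantiate the node only when it is first referenced.
        if label not in self.nodes:
            self.nodes[label] = Node(label)
        return self.nodes[label]

graph = DynamicGraph()
graph.node("input").connect(graph.node("hidden"), 0.5)
graph.node("hidden").connect(graph.node("output"), 1.2)
print(len(graph.nodes))          # prints 3: the graph grew from usage
```

A static tensor framework would fix the layer sizes before any data is seen; here, unused nodes simply never exist.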

Module Status:

  • ✅ Fields Module (100%): Fully implemented and production-ready
  • ⚠️ Network Module (85%): Core functionality complete, advanced features in progress
  • ⚠️ Transformer (60%): Type structure complete, math incomplete

Key Features:

  • Event-Driven Processing: AIKA processes activations asynchronously using an event queue, enabling dynamic and sparse computation.
  • Dynamic Object Graph: Unlike static vector/matrix-based architectures, AIKA instantiates network elements at runtime, making the topology adaptable.
  • Type-Based Model Representation: AIKA's functional graph is linked to dynamically created objects, allowing more flexible neural network architectures.
  • Python & C++ Hybrid: Define your models in Python while C++ handles the heavy computation, similar to PyTorch's approach.
  • Builder Pattern: Modern API for constructing neural network types with ease.
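The event-driven, sparse style of computation described above can be sketched in a few lines. This is a toy model, not AIKA's actual event queue: activations are pushed onto a priority queue as timed events, and only above-threshold values propagate, so inactive neurons incur no cost. The `THRESHOLD` value and the graph structure are illustrative assumptions.

```python
import heapq

# Toy event-driven propagation (illustrative sketch, not AIKA's actual
# event queue): events are processed in time order, and only activations
# above a threshold propagate further, keeping computation sparse.

THRESHOLD = 0.1

def propagate(edges, initial_events):
    """edges: {node: [(target, weight), ...]};
    initial_events: list of (time, node, value) tuples."""
    queue = list(initial_events)
    heapq.heapify(queue)
    fired = {}
    while queue:
        t, node, value = heapq.heappop(queue)
        if value < THRESHOLD or node in fired:
            continue                         # sparse: skip weak/duplicate events
        fired[node] = value
        for target, weight in edges.get(node, []):
            heapq.heappush(queue, (t + 1, target, value * weight))
    return fired

edges = {"a": [("b", 0.9), ("c", 0.01)],     # a->c is too weak to fire c
         "b": [("d", 1.0)]}
result = propagate(edges, [(0, "a", 1.0)])
print(sorted(result))                        # prints ['a', 'b', 'd']
```

Note that node `c` never fires: a dense matrix multiply would still compute its (near-zero) activation, whereas the event queue simply drops the sub-threshold event.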

What Works Today: Field-based computations, basic neural networks, event-driven propagation, and builder-based type definitions.

In Development: Full transformer attention mechanism (latent linking and corrected softmax formula).


New to AIKA? Start with the Installation Guide to set up your environment.