Project-Description.md (173 additions, 4 deletions)
### General Coding Guidelines

**Setup**: C++ library using CMake as the build system and pybind11 for Python integration (similar to PyTorch).

The C++ code is intended to be high performance with as little overhead as possible. The code base should avoid smart pointers and manage its memory manually.

For performance-critical portions of the code, dynamic data structures should be avoided and array-based data structures should be used instead.
#### Overall Description of the Fields Module

- **Relation**: Adjusts signal magnitude in the field graph.

13. **SoftmaxFields.java**
- **Purpose**: Implements a softmax operation across multiple inputs and input objects.
- **Key Features**:
  - Composes `ExponentialFunction`, `SumField`, and `Division` to normalize inputs.
  - Uses relations to connect input, normalization, and output fields.
- Single input; supports comparisons (e.g., above, below); optional final state.
- **Relation**: Implements sparse activation by thresholding field values.

###### `network.aika.fields.defs`

1. **FieldDefinition.java**
- **Purpose**: Defines a field in the field graph, serving as a blueprint.
- Manages fields, relations, and queue access. Initializes and stores fields based on type; provides field access methods.
- **Relation**: Concrete representation of network objects, integrating with fields.

Below is a structured textual description of the neural network module from the AIKA project's Java codebase. This description is designed to be high-level yet detailed enough to serve as a blueprint for implementing equivalent C++ classes. It captures the purpose and functionality of each class, relates them to the neural network module's goals as outlined in the AIKA project description, and fills in missing details where applicable. The description is organized at multiple granularities: an overall summary, per-package overviews, and detailed per-class descriptions for key components.

#### Neural Network Module

##### Overall Description of the Neural Network Module

The neural network module in AIKA is a core component of an innovative neural network framework that emphasizes flexibility, sparsity, and dynamic inference over traditional layered architectures. It separates the static structure of the neural network—composed of neurons and synapses—from the dynamic activation network, which consists of activations and links tied to specific input data (e.g., tokenized text). This separation enables efficient processing of large-scale networks by activating only relevant subsections based on input and thresholds, a concept known as sparse activation.

Key features include:

- **Neurons**: Static computational units defined by a type hierarchy, managing synapses and capable of suspension to optimize memory usage.
- **Synapses**: Connections between neurons, with types (e.g., conjunctive, disjunctive) determining signal propagation and binding signal transitions.
- **Activations**: Dynamic instances of neurons, created for specific inputs, handling binding signals and linking logic.
- **Links**: Connections between activations, mirroring synapses and facilitating the flow of binding signals during inference.
- **Binding Signals (BS)**: Relational references that propagate through the network, ensuring coherent activation patterns by defining valid connections.
- **Event-Driven Processing**: Managed via a time-ordered queue, processing events like neuron firings and link instantiations asynchronously.
- **Linker**: A distributed mechanism (not a single class) that transfers the neural network structure to the activation network by creating activations and links based on firing events and binding signal propagation.

The module builds on the fields module, which provides the mathematical foundation through graph-based computations. The type hierarchy, implemented in the `typedefs` package, defines the properties and behaviors of network elements, supporting the flexible topology required for dynamic responses. This design aligns with AIKA's goal of handling large, sparse networks efficiently, leveraging selective activation and relational coherence via binding signals.

##### Per-Package Descriptions

###### `network.aika`

**Purpose**: Provides core framework classes for configuration, document management, and model oversight.

- **Config**: Stores settings like learning rate and timeouts, influencing training and processing behavior.
- **Document**: Represents an input instance (e.g., a document), managing activations, binding signals, and the processing queue.
- **Element**: Interface for activation graph elements (activations and links), tracking creation and firing timestamps.
- **Model**: Oversees the neural network, managing neurons, documents, and suspension logic.
- **ModelProvider**: Interface for accessing the model instance.

**Relation to Project**: These classes establish the framework's foundation, coordinating the static neural network and dynamic inference processes.

###### `network.aika.activations`

**Purpose**: Manages dynamic inference through activations and links.

- **Activation**: Abstract base for activations, with subtypes (`ConjunctiveActivation`, `DisjunctiveActivation`, `InhibitoryActivation`) handling specific linking behaviors.
- **ActivationKey**: Record for uniquely identifying activations.
- **Link**: Connects activations, carrying binding signals and reflecting synapse relationships.

**Relation to Project**: Implements the activation network, enabling sparse and dynamic responses to input data.

###### `network.aika.bindingsignal`

**Purpose**: Defines and manages binding signals for relational coherence.

- **BSType**: Interface for binding signal types.
- **BindingSignal**: Represents a binding signal tied to a token, tracking associated activations.
- **Transition**: Defines binding signal transitions across synapses.

**Relation to Project**: Ensures valid connections in the activation graph, a key feature of AIKA's sparse activation mechanism.

###### `network.aika.misc.direction`

**Purpose**: Specifies directionality for synapses and links.

- **Direction**: Interface with implementations `Input` and `Output`.

**Relation to Project**: Supports the graph structure by defining data flow directions.

###### `network.aika.misc.exceptions`

**Purpose**: Custom exceptions for error handling (e.g., `LockException`, `MissingNeuronException`, `NeuronSerializationException`).

**Relation to Project**: Enhances robustness, though not directly tied to neural functionality.

###### `network.aika.misc.suspension`

**Purpose**: Manages neuron suspension to reduce memory usage.

- **SuspensionCallback**: Interface for suspension logic, with implementations `FSSuspensionCallback` (file-based) and `InMemorySuspensionCallback` (memory-based).

**Relation to Project**: Optimizes resource use for large networks, supporting scalability.

###### `network.aika.misc.utils`

**Purpose**: Utility classes for concurrency and general operations.

###### `network.aika.typedefs`

- **Purpose**: Define types for activations, neurons, synapses, and links.
- **Key Features**:
  - `ActivationDefinition`, `NeuronDefinition`, `SynapseDefinition`, etc., use registries (`TypeRegistry`) and relations (`Relation`) to specify properties.
- **Relation**: Enables a flexible, hierarchical structure for network elements, supporting AIKA's non-layered design.

##### Filling in Missing Parts

The project description mentions the **Linker** as a distinct component transferring structure from the neural network to the activation network. In the codebase, this functionality is distributed:

- **Activation.linkOutgoing()**: Triggers link creation based on output synapses and binding signals.
- **Synapse.createLink()**: Instantiates links between activations, respecting transitions.
- **Neuron.createActivation()**: Generates activations with binding signals.

**Sparse Activation** is implemented through:

- **Activation.updateFiredStep()**: Checks thresholds to trigger firing.
- **Linker Logic**: Selective linking via `collectLinkingTargets` and binding signal checks ensures only relevant connections are made.

**Event-Driven Processing** is fully realized in the `Queue` class and steps like `Fired`, aligning with the asynchronous update requirement.