The KARL Core (For Contributors)
This section provides technical insights for developers interested in understanding the internal architecture of Project KARL and contributing to its core library and implementation modules. Familiarity with Kotlin, Kotlin Multiplatform (KMP), and Gradle is assumed.
Architecture Overview
Project KARL is a multi-module Kotlin Multiplatform project designed for flexibility and extensibility. The architecture centers around a platform-agnostic core (`:karl-core`) that defines interfaces and orchestrates interactions, with platform-specific or library-specific implementations provided in separate modules.
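This module layout is wired together through Gradle. As rough orientation, a settings file for this structure might look like the sketch below; the repository's actual `settings.gradle.kts` may declare additional modules, plugin management, or version catalogs, so treat the names and comments as illustrative.

```kotlin
// settings.gradle.kts: illustrative sketch only. The repository's actual
// settings file may declare additional modules, plugin management, or
// version catalogs.
rootProject.name = "project-karl"

include(
    ":karl-core",            // platform-agnostic interfaces and orchestration
    ":karl-kldl",            // LearningEngine implementation backed by KotlinDL
    ":karl-room",            // DataStorage implementation backed by Room
    ":karl-compose-ui",      // optional Compose UI components
    ":karl-example-desktop", // runnable reference application
)
```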
Key Modules & Components
- `:karl-core` (KMP - commonMain, jvmMain, etc.):
  - Purpose: The heart of KARL. Defines all primary public interfaces (`KarlContainer`, `LearningEngine`, `DataStorage`, `DataSource`, `InstructionParser`) and core data models (`InteractionData`, `KarlContainerState`, `Prediction`, `KarlInstruction`). A condensed sketch of these contracts appears after this list.
  - Logic: Contains the platform-agnostic implementation of `KarlContainer` (e.g., `KarlContainerImpl`), which orchestrates the lifecycle and data flow between the engine, storage, and data source. It also includes the public `KarlAPI` for building container instances.
  - Dependencies: Minimal; primarily the Kotlin standard library and kotlinx.coroutines for its API.
- `:karl-kldl` (KMP - jvmMain):
  - Purpose: Provides a concrete implementation of the `LearningEngine` interface using the KotlinDL library.
  - Logic: Includes classes like `KLDLLearningEngine` and model definition files (e.g., `SimpleMLPModel.kt`). Handles model creation, training (`fit`), prediction (`predict`), and state serialization/deserialization specific to KotlinDL models.
  - Dependencies: `:karl-core`, `kotlin-deeplearning-api`, `kotlin-deeplearning-dataset`, etc.
- `:karl-room` (KMP - jvmMain, potentially commonMain for entities/DAOs):
  - Purpose: Provides a concrete implementation of the `DataStorage` interface using the AndroidX Room persistence library (with KMP support).
  - Logic: Includes Room `@Entity` classes (e.g., `InteractionDataEntity`), `@Dao` interfaces, the `@Database` class (e.g., `KarlRoomDatabase`), and the `RoomDataStorage` implementation class, which maps between core models and Room entities.
  - Dependencies: `:karl-core`, Room runtime, Room KTX, KSP for the Room compiler, the SQLite framework, and a JVM SQLite driver (e.g., `org.xerial:sqlite-jdbc`).
- `:karl-compose-ui` (KMP - commonMain, jvmMain for previews):
  - Purpose: Contains optional, reusable Jetpack Compose UI components for visualizing KARL's status, learning progress, and suggestions (e.g., `KarlContainerUI`, `KarlLearningProgressIndicator`).
  - Dependencies: `:karl-core` (to understand models like `Prediction` when displaying them), Jetpack Compose libraries.
- `:karl-example-desktop` (KMP - jvmMain):
  - Purpose: A runnable Jetpack Compose Desktop application demonstrating a full integration of the KARL modules. Serves as a reference implementation and testing ground.
  - Dependencies: All other KARL modules, Jetpack Compose Desktop application libraries.
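To make these contracts concrete, here is a condensed, illustrative sketch of the kinds of interfaces and models `:karl-core` defines. The member signatures, parameter names, and placeholder model fields are assumptions for orientation only; consult the source in `:karl-core` for the real API.

```kotlin
// Illustrative sketch only; the real definitions live in :karl-core and may
// differ in naming, parameters, and coroutine usage.

// Core model types (fields shown here are placeholders, not the real definitions).
data class InteractionData(val type: String, val details: Map<String, String>, val timestamp: Long)
data class Prediction(val content: String, val confidence: Float)
class KarlContainerState(val data: ByteArray, val version: Int)
sealed interface KarlInstruction

interface DataSource {
    // The host application captures events and forwards them to the container.
    fun observeInteractionData(onNewData: (InteractionData) -> Unit)
}

interface DataStorage {
    suspend fun saveInteractionData(data: InteractionData)
    suspend fun loadRecentInteractions(limit: Int): List<InteractionData>
    suspend fun saveContainerState(state: KarlContainerState)
    suspend fun loadContainerState(): KarlContainerState?
}

interface LearningEngine {
    // Incremental, on-device training on a single new interaction.
    suspend fun trainStep(data: InteractionData)
    // Produce a suggestion from recent context, honoring user instructions.
    suspend fun predict(context: List<InteractionData>, instructions: List<KarlInstruction>): Prediction?
    // Snapshot of the learned state, so DataStorage can persist it.
    suspend fun getCurrentState(): KarlContainerState
}

interface KarlContainer {
    suspend fun initialize(engine: LearningEngine, storage: DataStorage, source: DataSource)
    suspend fun getPrediction(): Prediction?
    suspend fun saveState()
    suspend fun reset()
}
```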
Data Flow
1. Application Event: The user interacts with the host application.
2. `DataSource` (app-side): Captures the interaction and converts it into an `InteractionData` object.
3. `onNewData` Callback: The `DataSource` sends the `InteractionData` to the `KarlContainer`.
4. `KarlContainer`:
   - May persist the `InteractionData` via `DataStorage` (if configured).
   - Passes the `InteractionData` to the `LearningEngine`'s `trainStep()` method.
5. `LearningEngine` (`trainStep`):
   - Performs feature engineering.
   - Updates its internal ML model parameters.
6. Application Request for Prediction: The app calls `karlContainer.getPrediction()`.
7. `KarlContainer`:
   - May retrieve context data from `DataStorage`.
   - Calls `LearningEngine.predict(context, instructions)`.
8. `LearningEngine` (`predict`): Uses its model to generate a `Prediction` object.
9. Application Receives the `Prediction`: Uses it to update UI or behavior.
For a visual representation, please see the architecture diagram.
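From the host application's side, the same flow can be traced in code. The following is a hypothetical glue-code sketch written against the simplified interface sketch shown earlier on this page, so the types and call signatures are assumptions rather than the library's actual API.

```kotlin
// Hypothetical glue code tracing the numbered data flow above. Written against
// the simplified interface sketch from "Key Modules & Components"; not the
// library's actual API.

// Steps 1-5: the DataSource captures an event and hands it to the container,
// which may persist it and then feeds it to the learning engine.
suspend fun handleNewInteraction(
    data: InteractionData,
    storage: DataStorage,
    engine: LearningEngine,
    persist: Boolean = true,
) {
    if (persist) storage.saveInteractionData(data)
    engine.trainStep(data)
}

// Steps 6-9: the host application requests a suggestion; the container pulls
// recent context from storage and asks the engine for a Prediction.
suspend fun requestPrediction(
    storage: DataStorage,
    engine: LearningEngine,
    instructions: List<KarlInstruction> = emptyList(),
): Prediction? {
    val context = storage.loadRecentInteractions(limit = 50)
    return engine.predict(context, instructions)
}
```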
Setting Up the Development Environment
To contribute to Project KARL, you'll need:
- Git: For version control.
- JDK (Java Development Kit): Version 11 or 17 (or as specified by the project's current Gradle configuration).
- IntelliJ IDEA: The latest stable version with the Kotlin plugin is highly recommended for the best KMP development experience.
Setup Steps:
1. Fork the Repository: Create a fork of the main Project KARL repository on GitHub.
2. Clone Your Fork: `git clone https://github.com/<your-username>/project-karl.git`
3. Navigate to the Project Directory: `cd project-karl`
4. Add the Upstream Remote: `git remote add upstream https://github.com/theaniketraj/project-karl.git`
5. Open in IntelliJ IDEA: Use "Open..." and select the cloned `project-karl` directory. IntelliJ should automatically detect it as a Gradle project.
6. Gradle Sync: Allow Gradle to sync the project and download dependencies. This might take a few minutes on the first import.
Building and Testing KARL
The project uses Gradle as its build system.
- Build All Modules: From the project root, run `./gradlew build`
- Clean Build Output: `./gradlew clean`
- Run Tests for All Modules: `./gradlew check` or `./gradlew test`
- Build/Test a Specific Module: `./gradlew :module-name:build` (e.g., `./gradlew :karl-core:build`) or `./gradlew :module-name:check`
- Run the Example Desktop App: `./gradlew :karl-example-desktop:run`
If you encounter build issues, ensure your environment (JDK path, etc.) is correctly configured.
Contribution Guidelines (Code style, submitting PRs, issue tracking)
We adhere to a set of guidelines to ensure a smooth and effective contribution process. Please familiarize yourself with them before submitting contributions.
For detailed information on coding conventions, commit message formats, the pull request process, and how we manage issues, please refer to our main CONTRIBUTION GUIDELINE. Please also review and adhere to our Code of Conduct.
How to Contribute
We welcome contributions in various forms. Here are some areas where you could help:
- New `LearningEngine` Implementations:
  - Integrate other ML libraries suitable for on-device Kotlin/JVM (e.g., ONNX Runtime with a suitable Kotlin wrapper, or other lightweight Java ML libraries).
  - Implement different model architectures within existing engines (e.g., RNNs, LSTMs, or attention-based models in `:karl-kldl`).
- New `DataStorage` Implementations (see the sketch after this list):
  - Integrate other local persistence solutions (e.g., Realm KMP, or a file-based solution with kotlinx.serialization).
  - Enhance existing implementations with more robust encryption or data migration strategies.
- Model Optimizations:
  - Investigate model quantization or pruning techniques compatible with KotlinDL or other chosen engines to reduce model size and improve inference speed on device.
  - Optimize feature engineering processes for efficiency.
- Enhanced Security: Improve encryption methods in storage implementations or explore platform-specific secure key management.
- Instruction System (`KarlInstruction`/`InstructionParser`):
  - Design and implement a more expressive DSL for user instructions.
  - Add new types of `KarlInstruction` to allow finer-grained control over the AI's behavior.
- Platform Expansion (KMP): Extend core functionalities and implementations to other KMP targets (Android, iOS, JS) where feasible.
- Testing: Increase unit test coverage, add integration tests between modules, and develop more comprehensive E2E tests for the example application.
- Documentation: Improve existing documentation, add more tutorials, or provide clearer API explanations.
- Bug Fixes & Performance Enhancements: Address existing issues or identify and fix performance bottlenecks.
If you have an idea, it's often best to open an issue first to discuss it with the maintainers.
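To give a feel for the shape of such a contribution, here is a minimal, hypothetical skeleton of a file-based `DataStorage` using kotlinx.serialization, written against the simplified interface sketch from earlier on this page. The real `DataStorage` contract in `:karl-core` may differ, and a production implementation would also need encryption, atomic writes, and proper error handling.

```kotlin
import java.io.File
import kotlinx.serialization.Serializable
import kotlinx.serialization.decodeFromString
import kotlinx.serialization.encodeToString
import kotlinx.serialization.json.Json

// Hypothetical sketch of a file-backed DataStorage. The DataStorage interface
// implemented here is the simplified one sketched earlier on this page, not
// necessarily the real contract in :karl-core. Requires the
// kotlinx.serialization Gradle plugin.
class FileDataStorage(private val directory: File) : DataStorage {

    @Serializable
    private data class StoredInteraction(val type: String, val timestamp: Long)

    private val json = Json { ignoreUnknownKeys = true }
    private val interactionsFile = File(directory, "interactions.json")
    private val stateFile = File(directory, "state.bin")

    override suspend fun saveInteractionData(data: InteractionData) {
        val updated: List<StoredInteraction> =
            loadStored() + StoredInteraction(data.type, data.timestamp)
        interactionsFile.writeText(json.encodeToString(updated))
    }

    override suspend fun loadRecentInteractions(limit: Int): List<InteractionData> =
        loadStored().takeLast(limit).map { InteractionData(it.type, emptyMap(), it.timestamp) }

    override suspend fun saveContainerState(state: KarlContainerState) {
        stateFile.writeBytes(state.data)
    }

    override suspend fun loadContainerState(): KarlContainerState? =
        if (stateFile.exists()) KarlContainerState(stateFile.readBytes(), version = 1) else null

    private fun loadStored(): List<StoredInteraction> =
        if (interactionsFile.exists()) {
            json.decodeFromString<List<StoredInteraction>>(interactionsFile.readText())
        } else {
            emptyList()
        }
}
```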
Roadmap & Future Development
Project KARL is an evolving library. Our current focus and potential future directions include:
- Stabilizing Core APIs: Refining the interfaces for `v1.0` based on initial feedback and implementation experiences.
- More Sophisticated Default Models: Exploring RNNs or other sequence-aware models for the default `LearningEngine` implementations to better handle common interaction patterns.
- Enhanced State Serialization: Developing more robust and flexible mechanisms for serializing and deserializing `KarlContainerState`, especially for complex models.
- Improved Feature Engineering Utilities: Providing helpers or guidance within engine implementations for common feature engineering tasks.
- Tooling for Debugging/Introspection: Investigating ways to allow developers (and potentially end-users in a controlled manner) to understand what a KARL container has learned, without compromising privacy.
- Broader KMP Target Support: Gradually extending support and providing reference implementations for Android, and potentially iOS/JS where on-device learning is viable.
- Community-Driven Features: Incorporating valuable features and improvements suggested and contributed by the community.
(Note: This roadmap is indicative and subject to change based on development progress and community input.)
We encourage you to check our GitHub repository's Project Board or Milestones
for more up-to-date information on active development and future plans.