Installation
Setting up Nevaarize on your system
Requirements
Before you begin, ensure you have the following installed:
- C++23-compatible compiler — GCC 13+ or Clang 16+
- Make — For building the project
- Git — For cloning the repository
- Linux x86-64 — Currently the primary supported platform
Nevaarize uses AVX2 SIMD instructions for maximum performance. Most x86-64 CPUs released since 2013 (Intel Haswell and later; AMD from Excavator onward) support AVX2.
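You can verify your toolchain and CPU support before building. These commands use standard Linux tools only (nothing Nevaarize-specific):

```shell
# Confirm the compiler version (GCC 13+ or Clang 16+ is required)
g++ --version | head -n 1

# Confirm the CPU advertises AVX2 (Linux lists CPU flags in /proc/cpuinfo)
if grep -qm1 avx2 /proc/cpuinfo; then
    echo "AVX2: supported"
else
    echo "AVX2: not detected"
fi
```

If AVX2 is not detected, the binary may crash with an illegal-instruction error at runtime, so check this before building.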
Clone the Repository
First, clone the Nevaarize repository:
git clone https://github.com/gtkrshnaaa/nevaarize.git
cd nevaarize
Build from Source
Nevaarize uses a simple Makefile for building. Run:
make
This will compile all source files and create the nevaarize binary
in the bin/ directory.
For a clean rebuild, run make clean && make.
Verify Installation
Test that the build was successful:
./bin/nevaarize --version
You should see output like:
Nevaarize v0.1.0
Native JIT Compiler for the Nevaarize Programming Language
Built with C++23, Zero External Dependencies
Run Your First Program
Try running the hello world example:
./bin/nevaarize examples/basics/helloWorld.nva
Output:
Hello, World!
Welcome to Nevaarize - Native JIT Performance
Optional: Add to PATH
For convenience, add Nevaarize to your system PATH:
# Add to ~/.bashrc or ~/.zshrc
export PATH="$PATH:/path/to/nevaarize/bin"
# Reload your shell
source ~/.bashrc
Now you can run nevaarize from anywhere:
nevaarize script.nva
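To confirm the PATH change took effect, check that your shell can resolve the binary (`command -v` is standard POSIX shell):

```shell
# Check whether the nevaarize binary is visible on PATH
if command -v nevaarize >/dev/null 2>&1; then
    echo "nevaarize found at: $(command -v nevaarize)"
else
    echo "nevaarize not on PATH; check the export line in your shell config"
fi
```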
Project Structure
Here's an overview of the Nevaarize directory structure:
nevaarize/
├── bin/              # Compiled binary
├── build/            # Object files
├── core/             # Core compiler implementation
│   ├── include/      # Header files
│   └── src/          # Source files
├── stdlib/           # Standard library
│   ├── include/      # Library headers
│   └── src/          # Library implementations
├── examples/         # Example programs
│   ├── basics/       # Basic examples
│   ├── async/        # Async examples
│   ├── benchmarks/   # Performance benchmarks
│   └── ...
├── docs/             # Documentation (you are here)
└── Makefile          # Build configuration
Model Commands
Nevaarize includes CLI commands for AI model training and inference:
Train a Model
Run a training script and save the resulting model:
./bin/nevaarize model train examples/ai/trainModel.nva to demo.nmod
The output path is resolved relative to the training script's location.
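As a plain-shell sketch of that resolution rule (an illustration, not Nevaarize's own code): the output path is joined onto the directory containing the training script, so the command above writes the model next to trainModel.nva.

```shell
# Hypothetical sketch of the documented rule: the output path is
# joined onto the training script's directory.
script="examples/ai/trainModel.nva"
out="demo.nmod"
resolved="$(dirname "$script")/$out"
echo "$resolved"   # examples/ai/demo.nmod
```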
Run Model Inference
Load a trained model and run inference:
# View model info
./bin/nevaarize model run examples/ai/demo.nmod
# Run with input data
./bin/nevaarize model run examples/ai/demo.nmod --input "[1.0, 0.0, 1.0, 0.0]"
See the AI Library documentation for the complete model training API and examples.