# OMEGA: One-Model Efficient Generalized ANNS

Adaptive ANN search that learns when to stop searching: train the termination model once with K = 1, then reuse it for any K.

Built on HNSW and LightGBM. OMEGA adapts search effort per query instead of applying one fixed search parameter to every query.
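To make the idea concrete, here is a minimal sketch of adaptive early termination. Everything in it is illustrative, not the repository's actual API: `ToyIndex` is a brute-force stand-in for an HNSW index, and `predict_done` stands in for the trained LightGBM stopping model.

```python
import math

class ToyIndex:
    """Brute-force stand-in for an HNSW index: searching with a larger
    ef scans more of the dataset, mimicking growing search effort."""
    def __init__(self, points):
        self.points = points

    def search(self, query, ef):
        scanned = self.points[:ef]  # effort ~ how much of the data we touch
        return min(scanned, key=lambda p: math.dist(p, query))

def search_adaptive(query, index, predict_done, ef_step=16, ef_max=256):
    """Grow ef until the predictor says the current top-1 result is
    unlikely to improve; return the result and the effort spent."""
    ef = ef_step
    best = index.search(query, ef)
    while ef < ef_max and not predict_done(query, best, ef):
        ef += ef_step
        best = index.search(query, ef)
    return best, ef
```

Easy queries stop at the first `ef_step`, hard ones pay up to `ef_max`; the real system replaces the hand-written predicate with a model trained on query features.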

## Installation

Requirements: Linux, Conda, and GCC with C++17 support.

```bash
# Set up the environment
conda env create -f environment.yml
conda activate OMEGA

# Build
bash ./scripts/build_hnswlib.sh
bash ./scripts/build_diskann.sh  # for computing ground truth

# Download a dataset
python create_dataset.py --dataset bigann-100M
```
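The DiskANN build above is only used to compute exact ground-truth neighbors, against which the recall metric in the evaluation is defined. As a hedged illustration (the function name is ours, not the repository's), recall@K is the fraction of the true K nearest neighbors that the ANN search returned:

```python
def recall_at_k(returned_ids, ground_truth_ids, k):
    """Fraction of the true top-k neighbors present in the returned
    top-k list (order within the top k does not matter)."""
    return len(set(returned_ids[:k]) & set(ground_truth_ids[:k])) / k
```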

## Running Experiments

```bash
# Run the full evaluation (OMEGA vs. LAET/DARTH/fixed-ef baselines)
bash ./scripts/eval_end2end.sh

# Plot results
python ./plot/eval_end2end.py
```

Results are saved to `./plot/eval_end2end_results/` as JSON files with recall, latency, and throughput metrics.
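A minimal sketch of consuming those result files, e.g. for custom analysis. The exact JSON schema is an assumption; the README only states that the files contain recall, latency, and throughput metrics.

```python
import json
import pathlib

def load_results(result_dir):
    """Map each JSON file's stem (e.g. 'omega') to its parsed metrics."""
    runs = {}
    for path in sorted(pathlib.Path(result_dir).glob("*.json")):
        with open(path) as f:
            runs[path.stem] = json.load(f)
    return runs
```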

## References

Built on big-ann-benchmarks, hnswlib, DiskANN (for ground truth computation), and LightGBM.

Baselines implemented: LAET (SIGMOD'20), DARTH (SIGMOD'26).