
rnaseq-local-chat

Chat with your RNA-seq results using a fully local LLM — no API keys required.

Upload DESeq2 tables, GSEA pathway results, sample metadata, and figures (volcano plots, UMAPs) and ask questions about them in a NotebookLM-style interface.

Requirements

  • Ollama installed and running
  • llama3.2 model: ollama pull llama3.2
  • llava model (for image understanding): ollama pull llava

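Before launching the app, you can confirm Ollama is running and both models are pulled by querying Ollama's local REST endpoint (`GET /api/tags` lists installed models). A minimal sketch; the endpoint and default port are Ollama's, but this helper script is not part of the repo:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local endpoint
REQUIRED = ("llama3.2", "llava")

def missing_models(tags_response: dict, required=REQUIRED) -> list:
    """Return the required model names absent from an /api/tags response.

    Installed names look like 'llama3.2:latest', so compare the part
    before the ':' tag separator.
    """
    installed = {m["name"].split(":")[0] for m in tags_response.get("models", [])}
    return [r for r in required if r not in installed]

if __name__ == "__main__":
    with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
        tags = json.load(resp)
    for name in missing_models(tags):
        print(f"missing model, run: ollama pull {name}")
```

If the request itself fails, the Ollama server is not running (`ollama serve`).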
Setup

pip install -r requirements.txt

Usage

streamlit run app.py

Then open http://localhost:8501 in your browser.

  1. Upload files in the sidebar
  2. Click Build / Update DB
  3. Ask questions in the chat

Or use the CLI ingestion tool:

python ingest.py --files data/deseq2_results.csv data/volcano.png --db ./chroma_db
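To illustrate what ingestion stores, here is a sketch of turning one DESeq2 row into a plain-text chunk suitable for embedding. The stored fields mirror the table below (log2FC, padj, baseMean, direction); the column names and exact formatting are assumptions, not the actual code in ingest.py:

```python
import csv
import io

def deseq2_row_to_chunk(row: dict) -> str:
    """Render one DESeq2 result row as a text chunk for the vector DB.

    Direction is derived from the sign of log2FoldChange. Column names
    (gene, log2FoldChange, padj, baseMean) are standard DESeq2 output
    but are an assumption about the input file here.
    """
    log2fc = float(row["log2FoldChange"])
    direction = "up" if log2fc > 0 else "down"
    return (
        f"gene={row['gene']} log2FC={log2fc:.2f} "
        f"padj={float(row['padj']):.3g} baseMean={float(row['baseMean']):.1f} "
        f"direction={direction}"
    )

# In-memory CSV standing in for data/deseq2_results.csv
sample = "gene,log2FoldChange,padj,baseMean\nCXCL9,2.41,0.0004,152.3\n"
for row in csv.DictReader(io.StringIO(sample)):
    print(deseq2_row_to_chunk(row))
```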

Supported File Types

| File | Naming convention | What is stored |
|------|-------------------|----------------|
| DESeq2 results | filename contains deseq, deg, or differential | Per-gene: log2FC, padj, baseMean, direction |
| GSEA results | filename contains gsea, pathway, kegg, or hallmark | Per-pathway: NES, padj, direction, size |
| Sample metadata | filename contains meta, sample, or coldata | Per-sample conditions + summary |
| Images (PNG/JPG) | any filename | LLaVA-generated text description |

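The naming conventions above amount to a small filename router. A sketch of that logic (the keyword lists come straight from the table; the actual routing code in ingest.py may differ):

```python
from pathlib import Path

# Keyword -> file-type mapping, taken from the naming conventions above
KEYWORDS = {
    "deseq2": ("deseq", "deg", "differential"),
    "gsea": ("gsea", "pathway", "kegg", "hallmark"),
    "metadata": ("meta", "sample", "coldata"),
}
IMAGE_EXTS = {".png", ".jpg", ".jpeg"}

def classify_file(filename: str) -> str:
    """Route an uploaded file to a parser based on its name."""
    path = Path(filename)
    if path.suffix.lower() in IMAGE_EXTS:
        return "image"  # sent to LLaVA for a text description
    name = path.stem.lower()
    for kind, keys in KEYWORDS.items():
        if any(k in name for k in keys):
            return kind
    return "unknown"

print(classify_file("data/deseq2_results.csv"))  # deseq2
print(classify_file("hallmark_gsea.csv"))        # gsea
print(classify_file("volcano.png"))              # image
```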
Example Questions

  • "Which genes are significantly upregulated (padj < 0.05, log2FC > 1)?"
  • "What immune-related pathways are activated?"
  • "What does the volcano plot show?"
  • "How many samples are in each condition?"
  • "Which cancer hallmark pathways are enriched?"

Project Structure

rnaseq-local-chat/
├── app.py          # Streamlit UI
├── ingest.py       # CSV + image → ChromaDB (importable + CLI)
├── data/           # Put your RNA-seq files here
├── requirements.txt
├── .env.example
└── LICENSE

Stack

  • Streamlit — chat UI
  • Ollama — local LLM runtime (llama3.2 for chat, llava for image descriptions)
  • ChromaDB — vector store for ingested documents
