
LLM Hackathon Sandbox Setup for Linux

The instructions are very simple, in case you are setting up a dedicated server VM instance to run LLM tools and Aider.

Hardware Requirements

  • Everything we do is on Linux; if you are a Windows user, you can install VirtualBox or similar software.
  • Personally, I recommend not running VirtualBox-style VMs on your daily machine, since they have to be shut down or will drain the battery.
  • Running a small server on an old, used laptop/desktop is the best way, so you can always SSH into it.
  • Get a machine with plenty of memory, which is important: 16GB minimum, 32GB better, 64GB awesome.
  • Invest in a 2TB SSD; hard drives are too slow, and SSDs are cheap on Amazon. A quick check for an existing machine is shown after this list.
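If you are repurposing an old machine, you can quickly confirm its memory, CPU, and disk type with standard Linux tools:

free -h                          # total memory, aim for 16GB or more
nproc                            # number of CPU cores
lsblk -d -o NAME,ROTA,SIZE       # ROTA=0 means the disk is an SSD, 1 means a spinning drive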

Example: a Dell workstation with 256GB RAM that can support a GPU card later.

Virtual Machine Specifications for LLM

Memory: 4GB
CPU: 2 cores
Storage: 100GB
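If you end up using VirtualBox, the VM can also be created from the command line with VBoxManage. The sketch below is only an outline; the VM name llm-sandbox and the disk file name are assumptions you should adjust to your setup:

VBoxManage createvm --name llm-sandbox --ostype Ubuntu_64 --register
VBoxManage modifyvm llm-sandbox --memory 4096 --cpus 2                   # 4GB RAM, 2 cores
VBoxManage createmedium disk --filename llm-sandbox.vdi --size 102400    # 100GB disk
VBoxManage storagectl llm-sandbox --name SATA --add sata
VBoxManage storageattach llm-sandbox --storagectl SATA --port 0 --device 0 --type hdd --medium llm-sandbox.vdi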

Operating System

Install Ubuntu Server 24.04 LTS. Follow these instructions.

Snapshot the Base Image

Once a good install is up and running, make sure you take a snapshot of the working configuration. If you are using VirtualBox, use these instructions.

Name the image: 2404-lts-golden0. Always do this before you start running the installation script.

This saves a lot of time if you have to do a fresh install, which is very common for security projects.
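If you prefer the command line to the VirtualBox GUI, the same snapshot can be taken with VBoxManage (the VM name llm-sandbox below is an assumption; use your VM's actual name):

VBoxManage snapshot llm-sandbox take 2404-lts-golden0 --description "clean Ubuntu 24.04 LTS install"
VBoxManage snapshot llm-sandbox restore 2404-lts-golden0      # roll back later, with the VM powered off, if an experiment breaks it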

Installation Script

sudo bash                                               # open a root shell (the last line needs bash process substitution)
apt-get -y -qq install curl
clear
bash <(curl -s https://unovie.ai/repo/sandbox-llm.sh)   # download and run the sandbox setup script
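Since the script runs in a root shell, it is worth reading it before executing it, for example:

curl -s https://unovie.ai/repo/sandbox-llm.sh | less    # review what the script does before running it as root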

Register a Google Gemini Flash 2.0 API Key

Register for a FREE Google Gemini API key using your personal Gmail address at https://ai.google.dev/gemini-api/docs/api-key. Store a copy of the API key in your notebook for further LLM usage.
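If you also want a copy of the key on the VM itself, one option is a file only your user can read (the path below is just a suggestion):

mkdir -p ~/.secrets
echo "your_api_key_here" > ~/.secrets/gemini_api_key
chmod 600 ~/.secrets/gemini_api_key         # readable only by your user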

Always set up your Python environment before working:

cd /opt
source /opt/conda/etc/profile.d/conda.sh    # make sure you activate conda
conda activate llm                          # conda llm profile
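If you want every new shell on the VM to start in this environment automatically, you can append the same two commands to your shell startup file:

echo 'source /opt/conda/etc/profile.d/conda.sh' >> ~/.bashrc
echo 'conda activate llm' >> ~/.bashrc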

Verify the llm install:

llm keys set gemini                         # use google_gemini_api_key
llm models list                             # should show you all gemini models
llm models default gemini-2.0-flash-exp     # set default model
llm "why is sky blue"                       # you should see llm response

Aider

Verify that aider is already installed:

aider --version

Create a .env file in your project directory:

GEMINI_API_KEY=your_api_key_here

Then start aider with the Gemini model:

aider --model gemini/gemini-2.0-flash-exp
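Once aider is running you can add files and chat with it; a short sketch of a session (the file names here are placeholders):

aider --model gemini/gemini-2.0-flash-exp app.py       # start aider with a file already in the chat
# inside the aider chat:
#   /add tests.py                 add another file to the chat
#   /ask how does app.py handle errors?
#   /commit                       commit the changes aider made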

TXTAI

TxtAI is an advanced platform with embeddings and agent capabilities: https://neuml.github.io/txtai/
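If txtai is not already installed by the sandbox script, you can add it to the same conda environment:

conda activate llm
pip install txtai                           # installs txtai and its embedding dependencies
python -c "import txtai"                    # quick check that the import works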

Next Steps