Wednesday, July 31, 2024

Install Perplexica with SearXNG, Ollama, and Llama 3.1 for a Free Local AI Search Engine

This video shows how to locally install Perplexica with SearXNG and the Ollama Llama 3.1 model and run AI-powered search.



Code:
conda create -n px python=3.11 -y && conda activate px

pip install torch transformers accelerate huggingface_hub sentencepiece
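
A quick optional sanity check confirms the Python environment is usable before moving on:

# verify the key packages import cleanly inside the px environment
python -c "import torch, transformers; print(torch.__version__, transformers.__version__)"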

SearXNG:

git clone https://github.com/searxng/searxng && cd searxng

In the settings.yml file under the searx directory, change the following:

search:
  formats:
    - html
    - json
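
To confirm the edit took effect, you can print the surrounding lines of the file (the path below assumes the repository layout at the time of writing):

# show the formats block inside searx/settings.yml
grep -n -A 3 "formats:" searx/settings.yml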

sudo chmod 666 /var/run/docker.sock
make docker.build  

docker run --rm -d -p 32768:8080 -v "${PWD}/searxng:/etc/searxng" -e "BASE_URL=http://localhost:32768/" -e "INSTANCE_NAME=my-instance" searxng/searxng

http://localhost:32768
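
Once the container is up, a quick check that the JSON format is actually enabled (SearXNG refuses requests for formats that are not listed in settings.yml):

# query SearXNG and ask for JSON output; relies on the formats change above
curl "http://localhost:32768/search?q=test&format=json"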

Ollama:

curl -fsSL https://ollama.com/install.sh | sh

ollama pull llama3.1
ollama pull bgesmall
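
To make sure Ollama is serving and the models were pulled:

# list models known to the local Ollama server
ollama list

# the HTTP API should answer on its default port
curl http://localhost:11434/api/tags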
             
             
Perplexica:

git clone https://github.com/ItzCrazyKns/Perplexica.git && cd Perplexica

cp sample.config.toml config.toml
vi config.toml and change the following:

[API_ENDPOINTS]
SEARXNG = "http://localhost:32768"
OLLAMA = "http://localhost:11434"
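
Note: these endpoints are read from inside the Perplexica container, where localhost points at the container itself rather than your machine. If the values above don't connect, a common workaround (assuming Docker Desktop, or a Linux setup where host.docker.internal is mapped to the host gateway) is a variant like this:

[API_ENDPOINTS]
# hypothetical variant: address the host as seen from inside the container
SEARXNG = "http://host.docker.internal:32768"
OLLAMA = "http://host.docker.internal:11434"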

sudo chmod 666 /var/run/docker.sock
docker compose up -d

http://localhost:3000
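
If the page doesn't load, checking the containers and the frontend port usually narrows it down:

# confirm the Perplexica containers are running and see recent logs
docker compose ps
docker compose logs --tail 50

# the UI should respond on port 3000
curl -I http://localhost:3000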
