The current design of this bot provides two command functions on Discord.

  1. `/askllm <query(required): the question to ask the LLM, similar to ChatGPT> <model(optional): selectable model available>`: Submit the question directly to the LLM backend (see the sketch after this list).
  2. `/reviewresume <url(required): link to the chat>`: Fetch the PDF attached to the linked chat message and feed it to the LLM backend for review. This requires a PDF parser. This command could also be a context command.
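
As a rough illustration of the first command, here is a minimal sketch of how `/askllm` could be wired up with discord.py and Ollama's `/api/generate` endpoint. The Ollama URL, the default model name, the `httpx` dependency, and the token placeholder are assumptions for the sketch, not part of the current design.

```python
import discord
from discord import app_commands
import httpx

OLLAMA_URL = "http://localhost:11434"  # assumed; Ollama's default listen port


intents = discord.Intents.default()
client = discord.Client(intents=intents)
tree = app_commands.CommandTree(client)


@tree.command(name="askllm", description="Ask the LLM a question")
@app_commands.describe(query="The question to ask the LLM", model="Model to use")
async def askllm(interaction: discord.Interaction, query: str, model: str = "llama2") -> None:
    # Defer first: LLM generation usually exceeds Discord's 3-second reply window.
    await interaction.response.defer()
    async with httpx.AsyncClient(timeout=120) as http:
        resp = await http.post(
            f"{OLLAMA_URL}/api/generate",
            json={"model": model, "prompt": query, "stream": False},
        )
    answer = resp.json().get("response", "The model returned no response.")
    await interaction.followup.send(answer[:2000])  # Discord's message length limit


@client.event
async def on_ready() -> None:
    await tree.sync()


client.run("DISCORD_BOT_TOKEN")  # placeholder token
```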

## Component Overview

### Option 1

```mermaid
flowchart LR
  subgraph docker
    Ollama <-- API --> monolithbot
    Ollama <--> B[NVIDIA Container Toolkit]
  end
  monolithbot -. websocket .-> Discord
```
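
In this layout the bot reaches Ollama over the docker network by service name rather than via localhost. A minimal sketch of how the monolith bot might resolve the backend URL; the `OLLAMA_HOST` variable name and the `ollama` service name are assumptions, and 11434 is Ollama's default port.

```python
import os

# Resolve the Ollama backend URL. Inside the docker network the bot would
# address Ollama by its compose service name; "ollama" is an assumed name.
OLLAMA_URL = os.environ.get("OLLAMA_HOST", "http://ollama:11434")
```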

### Option 2

```mermaid
flowchart LR
  subgraph docker
    Ollama <-- API --> A[askllm bot] & R[resumereview bot]
    Ollama <--> B[NVIDIA Container Toolkit]
  end
  A & R  -. websocket .-> Discord
```
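
Under this option the resumereview bot runs as its own process with its own Discord client. A hedged sketch of how `/reviewresume` could work: parse the Discord message link, fetch the first PDF attachment, extract its text, and send it to Ollama for review. The Ollama URL, model name, review prompt, `httpx`/`pypdf` dependencies, and token placeholder are assumptions, not a confirmed implementation.

```python
import io
import re

import discord
import httpx
from discord import app_commands
from pypdf import PdfReader

OLLAMA_URL = "http://ollama:11434"  # assumed service name on the docker network
REVIEW_PROMPT = "Review the following resume and suggest improvements:\n\n"  # assumed prompt


intents = discord.Intents.default()
client = discord.Client(intents=intents)
tree = app_commands.CommandTree(client)


@tree.command(name="reviewresume", description="Review the resume PDF attached to a linked message")
@app_commands.describe(url="Link to the chat message containing the PDF")
async def reviewresume(interaction: discord.Interaction, url: str) -> None:
    await interaction.response.defer()

    # Discord message links look like .../channels/<guild>/<channel>/<message>.
    match = re.search(r"/channels/\d+/(\d+)/(\d+)", url)
    if not match:
        await interaction.followup.send("That does not look like a Discord message link.")
        return
    channel_id, message_id = int(match.group(1)), int(match.group(2))

    channel = await client.fetch_channel(channel_id)
    message = await channel.fetch_message(message_id)
    pdfs = [a for a in message.attachments if a.filename.lower().endswith(".pdf")]
    if not pdfs:
        await interaction.followup.send("No PDF attachment found in the linked message.")
        return

    # Parse the PDF in memory and concatenate the page text for the prompt.
    reader = PdfReader(io.BytesIO(await pdfs[0].read()))
    resume_text = "\n".join(page.extract_text() or "" for page in reader.pages)

    async with httpx.AsyncClient(timeout=300) as http:
        resp = await http.post(
            f"{OLLAMA_URL}/api/generate",
            json={"model": "llama2", "prompt": REVIEW_PROMPT + resume_text, "stream": False},
        )
    review = resp.json().get("response", "The model returned no review.")
    await interaction.followup.send(review[:2000])  # Discord's message length limit


@client.event
async def on_ready() -> None:
    await tree.sync()


client.run("DISCORD_BOT_TOKEN")  # placeholder token
```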

## Infrastructure Overview

```mermaid
---
title: Hyperstack
---
classDiagram
  VM <--> ModelVolume1
  VM <--> ModelVolume2
  VM: +vCPUs 16
  VM: +RAM 59.5GB
  VM: +Disk 425GB
  VM: +OS Ubuntu Server 22.04 LTS
  VM: +GPU(RTX-A6000)
  VM: +GPUMEM(48GB GDDR6)
  VM: +GPUIO(768 GB/s)

  ModelVolume1: +Disk 1GB-1TB
  ModelVolume2: +Disk 1GB-1TB
```