AI/Ollama

From Chorke Wiki
{|class='wikitable'
|valign='top' style='width:50%'|
<syntaxhighlight lang='bash'>
# Install Ollama, pull the model, and verify the install
curl -fsSL https://ollama.com/install.sh | sh
ollama pull gpt-oss:20b
ollama --version
ollama ls

# Install Claude Code, then point it at the local model
curl -fsSL https://claude.ai/install.sh | bash
claude --model gpt-oss:20b
</syntaxhighlight>

{|class='wikitable'
|-
!scope='col'| Name
!scope='col'| CIDR
!scope='col'| MEMO
!scope='col'| Status
|-
!scope='row' style='text-align:left'| Cudy TR3000
| <code>192.168.10.1/24</code>   || WiFi » Router      ||style='text-align:center'| 🟢
|-
!scope='row' style='text-align:left'| Pi02W
| <code>192.168.10.2/24</code>   || WiFi » Pi-Hole     ||style='text-align:center'| 🟢
|-
!scope='row' style='text-align:left'| Laptop
| <code>192.168.10.3/24</code>   || WiFi » Operator    ||style='text-align:center'| 🟢
|-
!scope='row' style='text-align:left'| Herelink Ground Unit
| <code>192.168.10.4/24</code>   || WiFi » Ground Unit ||style='text-align:center'| 🟢
|-
!scope='row' style='text-align:left'| Herelink Ground Unit
| <code>192.168.144.11/24</code> || Here » Ground Unit ||style='text-align:center'| 🟡
|-
!scope='row' style='text-align:left'| Herelink Air Unit
| <code>192.168.144.10/24</code> || Here » Air Unit    ||style='text-align:center'| 🟡
|-
!scope='row' style='text-align:left'| Raspberry Pi4B
| <code>192.168.144.9/24</code>  || Here » Bridge      ||style='text-align:center'| 🟡
|}
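Once the install and pull complete, the setup can be smoke-tested without Claude Code in the loop. A minimal sketch, assuming the Ollama daemon is already running on its default port 11434 (the <code>OLLAMA_HOST</code> variable here is only a convenience for the check, mirroring the daemon's own default):

<syntaxhighlight lang='bash'>
# Resolve the endpoint, falling back to the local daemon when unset.
OLLAMA_HOST="${OLLAMA_HOST:-http://localhost:11434}"

# List pulled models as JSON; harmless no-op if nothing is listening yet.
curl -s --max-time 3 "${OLLAMA_HOST}/api/tags" || true

# One-shot, non-interactive prompt against the pulled model.
command -v ollama >/dev/null \
  && ollama run gpt-oss:20b "Reply with the single word: ready" \
  || echo "ollama not on PATH yet"
</syntaxhighlight>

<code>ollama run</code> drops into an interactive REPL when no prompt argument is given; passing the prompt inline keeps it scriptable.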
|valign='top' style='width:50%'|
<syntaxhighlight lang='bash'>
# Redirect Claude Code's Anthropic API traffic to the local Ollama daemon
export ANTHROPIC_BASE_URL=http://localhost:11434
export ANTHROPIC_AUTH_TOKEN=ollama
export ANTHROPIC_API_KEY=""
claude --model gpt-oss:20b
</syntaxhighlight>
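Before launching <code>claude</code>, the redirected endpoint can be probed by hand. A hedged sketch, assuming the installed Ollama build exposes the Anthropic-compatible <code>/v1/messages</code> route on the default port; the headers follow the standard Anthropic Messages API:

<syntaxhighlight lang='bash'>
# Use the same base URL and token the exports configure for Claude Code.
BASE_URL="${ANTHROPIC_BASE_URL:-http://localhost:11434}"

# Minimal Messages API call; harmless no-op if the daemon is down.
curl -s --max-time 3 "${BASE_URL}/v1/messages" \
  -H "x-api-key: ${ANTHROPIC_AUTH_TOKEN:-ollama}" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{"model": "gpt-oss:20b", "max_tokens": 64,
       "messages": [{"role": "user", "content": "ping"}]}' || true
</syntaxhighlight>

A JSON reply here confirms the endpoint is live; an empty response means the daemon is not serving yet.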
|-
|valign='top' colspan='2'|

Revision as of 23:02, 28 February 2026


==Diagram==

@startuml
skinparam actorStyle awesome
autonumber

skinparam backgroundColor    transparent
skinparam DefaultFontName    Helvetica
skinparam ParticipantPadding 20
skinparam BoxPadding         10

title Claude ↔ Local FS ↔ Ollama ↔ GPT-OSS:20b

actor "Developer"                  as dev

box "Local Development PC" #LightBlue
    participant "Claude Code CLI"  as claude
    participant "Local Filesystem" as fs
end box

box "Kubernetes Cluster (K3s)" #Yellow
    participant "Ollama Service"   as ollama
    participant "GPT-OSS:20b"      as model
end box

dev      -> claude : Runs "claude --model gpt-oss:20b"
claude   -> fs     : Scans repository context
fs      --> claude : File contents / Git history

claude   -> ollama : POST /v1/messages (Anthropic API)
note right: Payload includes system prompt \nand local code context

ollama   -> model  : Load weights into GPU VRAM
model   --> ollama : Inference processing...

ollama -->> claude : Streamed Response (Tokens)
claude   -> dev    : Displays suggested code changes
@enduml
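The streamed-response step in the diagram can be observed directly: with <code>"stream": true</code> the endpoint emits server-sent events, one chunk per <code>data:</code> line. A sketch assuming a local daemon on port 11434 with the Anthropic-compatible route shown in the diagram:

<syntaxhighlight lang='bash'>
# Request body with streaming enabled, as in the diagram's final step.
PAYLOAD='{"model": "gpt-oss:20b", "max_tokens": 32, "stream": true,
  "messages": [{"role": "user", "content": "ping"}]}'

# -N disables output buffering so token chunks print as they arrive;
# harmless no-op if the daemon is down.
curl -sN --max-time 3 "http://localhost:11434/v1/messages" \
  -H "x-api-key: ollama" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d "${PAYLOAD}" || true
</syntaxhighlight>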

|}