Commit bf82145: Update archives.
poneding committed Jun 19, 2024
1 parent 3be2e88
Showing 4 changed files with 62 additions and 1 deletion.
4 changes: 4 additions & 0 deletions content/_index.md
@@ -1,5 +1,9 @@
# 秋河落叶

## ai

- [llama](ai/llama.md)

## Data Structures and Algorithms

- [Heap Sort](algo/堆排序.md)
5 changes: 5 additions & 0 deletions content/ai/_index.md
@@ -0,0 +1,5 @@
[🏠 Home](../_index.md) / ai

# ai

[llama](llama.md)
52 changes: 52 additions & 0 deletions content/ai/llama.md
@@ -0,0 +1,52 @@
[🏠 Home](../_index.md) / [ai](_index.md) / llama

# llama

## Installation

macOS:

```bash
brew install ollama
```

Linux:

```bash
curl -fsSL https://ollama.com/install.sh | sh
```

After the installation completes, check the installed version:

```bash
ollama -v
```

## Starting the Server

```bash
ollama start

# Set the listen address by injecting the OLLAMA_HOST environment variable
# OLLAMA_HOST=0.0.0.0 ollama start
```

## Downloading and Running a Large Model

Llama 3 is currently available in two downloadable sizes: 8B and 70B. With limited local capacity, the 8B version is the one to pick; it is roughly 5 GB.

```bash
ollama run llama3
```

When the command finishes, it drops straight into an interactive session where you can start chatting.
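Inside the session, plain input is sent to the model as a prompt, while a few slash commands control the REPL itself (a sketch based on ollama's built-in help; `/bye` exits):

```
>>> /?      # list the available REPL commands
>>> /bye    # exit the interactive session
```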

## Running a Generate Request

```bash
curl http://localhost:11434/api/generate -d '{
"model": "llama3",
"prompt": "Why is the sky blue?",
"stream": false
}'
```
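With `"stream": false`, the server returns the whole completion as a single JSON object whose `response` field holds the generated text. A minimal sketch of extracting that field (the sample payload below is made up; only the field names follow the API, and a real response also carries timing metadata):

```shell
# Made-up sample of a generate-API response body
resp='{"model":"llama3","response":"Rayleigh scattering.","done":true}'

# Extract just the generated text with python3's stdlib json module
echo "$resp" | python3 -c 'import json, sys; print(json.load(sys.stdin)["response"])'
```

The same one-liner works on the live output by piping the `curl` command above into it.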
2 changes: 1 addition & 1 deletion content/dev/Apifox 导入 Kubernetes API.md
@@ -11,7 +11,7 @@ Apifox imports directly via URL; the import address is as follows:
Step 1: start a local proxy:

```bash
kubectl proxy --address 0.0.0.0 --accept-hosts '^*$
kubectl proxy --address 0.0.0.0 --accept-hosts '^*$'
```

Step 2: extract the API JSON file to local disk:
