Installing a Local AI Model with Ollama

After downloading and installing Ollama, open a terminal and run ollama run llama3.2-vision. The first run downloads the model before starting an interactive chat session.

• Start Ollama: enter ollama to launch it.

• Exit: type /bye in the session, or press Ctrl+D.

• Remove a model: run ollama rm llama3.2-vision.

Python Library

To use Llama 3.2 Vision with the Ollama Python library:

import ollama

# Send a chat request with an image attached to the user message;
# 'images' takes a list of image file paths.
response = ollama.chat(
    model='llama3.2-vision',
    messages=[{
        'role': 'user',
        'content': 'What is in this image?',
        'images': ['image.jpg']
    }]
)

# Prints the full response object, including the model's reply.
print(response)
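If you want tokens as they are generated rather than one final object, the Python library also supports streaming. A minimal sketch, assuming the same model and the same local image.jpg as above:

import ollama

# stream=True yields partial responses as the model generates them.
stream = ollama.chat(
    model='llama3.2-vision',
    messages=[{
        'role': 'user',
        'content': 'What is in this image?',
        'images': ['image.jpg']
    }],
    stream=True
)

# Each chunk carries an incremental piece of the reply text.
for chunk in stream:
    print(chunk['message']['content'], end='', flush=True)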

JavaScript Library

To use Llama 3.2 Vision with the Ollama JavaScript library:

import ollama from 'ollama'

// Send a chat request with an image attached to the user message;
// 'images' takes a list of image file paths.
const response = await ollama.chat({
  model: 'llama3.2-vision',
  messages: [{
    role: 'user',
    content: 'What is in this image?',
    images: ['image.jpg']
  }]
})

// Logs the full response object, including the model's reply.
console.log(response)

cURL

To use Llama 3.2 Vision with the Ollama REST API:

curl http://localhost:11434/api/chat -d '{
  "model": "llama3.2-vision",
  "messages": [
    {
      "role": "user",
      "content": "what is in this image?",
      "images": ["<base64-encoded image data>"]
    }
  ]
}'
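Unlike the Python and JavaScript libraries, which accept image file paths, the REST API expects the image bytes as a base64-encoded string. A minimal Python sketch of building the same request, assuming a local image.jpg, the default Ollama port, and the third-party requests package:

import base64

import requests

# Read the image and base64-encode it, as /api/chat expects.
with open('image.jpg', 'rb') as f:
    image_b64 = base64.b64encode(f.read()).decode('ascii')

payload = {
    'model': 'llama3.2-vision',
    'messages': [{
        'role': 'user',
        'content': 'What is in this image?',
        'images': [image_b64],
    }],
    'stream': False,  # return one JSON object instead of a stream of chunks
}

response = requests.post('http://localhost:11434/api/chat', json=payload)
print(response.json()['message']['content'])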

