
Ollama's OpenAI-compatible API

AI | 2024/4/21 15:56:23


Ollama exposes an OpenAI-compatible API that can also be used from AutoGen. At the time of writing, only the chat completions endpoint is implemented; function calling and the other endpoints are not available.

 
The interface can be called as follows:
 
 
1. Calling the API with curl
 
curl http://localhost:11434/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
        "model": "llama2",
        "messages": [
            {
                "role": "system",
                "content": "You are a helpful assistant."
            },
            {
                "role": "user",
                "content": "Hello!"
            }
        ]
    }'
2. Using the OpenAI Python library
 
from openai import OpenAI
 
client = OpenAI(
    base_url = 'http://localhost:11434/v1',
    api_key='ollama', # required, but unused
)
 
response = client.chat.completions.create(
  model="llama2",
  messages=[
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Who won the world series in 2020?"},
    {"role": "assistant", "content": "The LA Dodgers won in 2020."},
    {"role": "user", "content": "Where was it played?"}
  ]
)
print(response.choices[0].message.content)
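
Since the endpoint speaks the standard OpenAI wire format, streaming also works through the same client. The following is a minimal sketch, assuming the client object from the example above and the standard stream=True flag of the OpenAI Python library:

stream = client.chat.completions.create(
    model="llama2",
    messages=[{"role": "user", "content": "Write one sentence about llamas."}],
    stream=True,  # ask Ollama to stream the completion chunk by chunk
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:  # the final chunk carries no content
        print(delta, end="", flush=True)
print()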
 
 
3. Using AutoGen
from autogen import AssistantAgent, UserProxyAgent
 
config_list = [
  {
    "model": "codellama",
    "base_url": "http://localhost:11434/v1",
    "api_key": "ollama",
  }
]
 
assistant = AssistantAgent("assistant", llm_config={"config_list": config_list})
 
user_proxy = UserProxyAgent("user_proxy", code_execution_config={"work_dir": "coding", "use_docker": False})
user_proxy.initiate_chat(assistant, message="Plot a chart of NVDA and TESLA stock price change YTD.")
 
 
 
 
That covers Ollama's OpenAI-compatible API. The Embeddings API, function calling, and vision support have not yet been adapted.
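
If embeddings are needed right away, Ollama's native REST API can be called directly instead of the OpenAI-compatible path. The snippet below is only a sketch, assuming the native /api/embeddings endpoint and the requests library:

import requests

# Sketch: call Ollama's native embeddings endpoint directly, since the
# OpenAI-compatible /v1 path does not expose embeddings yet.
resp = requests.post(
    "http://localhost:11434/api/embeddings",
    json={"model": "llama2", "prompt": "Hello!"},
)
print(len(resp.json()["embedding"]))  # dimensionality of the returned vector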

