
Example LlamaIndex code using a local large model served by Ollama. The script below enables verbose logging, loads documents from a local `data` directory, embeds them with a locally stored BGE model, builds a vector index (persisting it to `./storage` and reloading it on later runs), and answers a query with the `llama2-chinese` model running in Ollama.

import logging
import os.path
import sys

from llama_index.core import (
    Settings,
    SimpleDirectoryReader,
    StorageContext,
    VectorStoreIndex,
    load_index_from_storage,
)
from llama_index.core.embeddings import resolve_embed_model
from llama_index.llms.ollama import Ollama

# Verbose logging to stdout so each retrieval and LLM call is visible.
logging.basicConfig(stream=sys.stdout, level=logging.DEBUG)
logging.getLogger().addHandler(logging.StreamHandler(stream=sys.stdout))

# Local BGE embedding model (BAAI bge-base-zh-v1.5 stored on disk).
Settings.embed_model = resolve_embed_model("local:/Users/leicq/ai_pychat/BAAI_bge-base-zh-v1.5")

# Local LLM served by Ollama.
Settings.llm = Ollama(model="llama2-chinese", request_timeout=30.0)

# Check if a persisted index already exists.
PERSIST_DIR = "./storage"
if not os.path.exists(PERSIST_DIR):
    # Load the documents and create the index.
    documents = SimpleDirectoryReader("data").load_data()
    index = VectorStoreIndex.from_documents(documents)
    # Store it for later.
    index.storage_context.persist(persist_dir=PERSIST_DIR)
else:
    # Load the existing index.
    storage_context = StorageContext.from_defaults(persist_dir=PERSIST_DIR)
    index = load_index_from_storage(storage_context)

# Either way, we can now query the index.
query_engine = index.as_query_engine()
response = query_engine.query("What did the author do growing up?")
print(response)
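
Since the example pairs a Chinese BGE embedding model with the llama2-chinese model served by Ollama, the same index can also be queried in Chinese. A minimal follow-up sketch, assuming the script above has already built query_engine over the same data directory (the question text is purely illustrative):

# Follow-up query in Chinese, reusing the query_engine created above.
# The question is illustrative and assumes the indexed documents can answer it.
response = query_engine.query("作者成長過程中做了什麼?")
print(response)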

柚子快報(bào)邀請(qǐng)碼778899分享:前端 Llama

http://yzkb.51969.com/

相關(guān)文章

評(píng)論可見(jiàn),查看隱藏內(nèi)容

本文內(nèi)容根據(jù)網(wǎng)絡(luò)資料整理,出于傳遞更多信息之目的,不代表金鑰匙跨境贊同其觀點(diǎn)和立場(chǎng)。

轉(zhuǎn)載請(qǐng)注明,如有侵權(quán),聯(lián)系刪除。

本文鏈接:http://m.gantiao.com.cn/post/19306722.html

發(fā)布評(píng)論

您暫未設(shè)置收款碼

請(qǐng)?jiān)谥黝}配置——文章設(shè)置里上傳

掃描二維碼手機(jī)訪問(wèn)

文章目錄