Deep Lake Vector Store in LangChain
!pip3 install langchain deeplake openai tiktoken

Downloading and Preprocessing the Data
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.vectorstores import DeepLake
from langchain.document_loaders import TextLoader
from langchain.text_splitter import CharacterTextSplitter
from langchain.chat_models import ChatOpenAI
from langchain.chains import RetrievalQA, ConversationalRetrievalChain
import os

!git clone https://github.com/twitter/the-algorithm

repo_path = './the-algorithm'
docs = []
for dirpath, dirnames, filenames in os.walk(repo_path):
    for file in filenames:
        try:
            loader = TextLoader(os.path.join(dirpath, file), encoding='utf-8')
            docs.extend(loader.load_and_split())
        except Exception as e:
            print(e)
            pass

A note on chunking text files:
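The loop above relies on load_and_split, which applies a default text splitter to each file so that no single chunk exceeds the embedding model's limits. As a plain-Python illustration of what fixed-size chunking with overlap does (the chunk_text helper and its parameters are illustrative, not a LangChain API):

```python
def chunk_text(text, chunk_size=1000, chunk_overlap=100):
    """Split text into fixed-size character chunks that overlap,
    so context straddling a chunk boundary is not lost."""
    step = chunk_size - chunk_overlap
    return [text[i:i + chunk_size] for i in range(0, len(text), step)]

chunks = chunk_text("x" * 2500)
# 3 chunks: 0-1000, 900-1900, 1800-2500; adjacent chunks share 100 characters
```

In practice you would use the CharacterTextSplitter imported above rather than rolling your own, but the overlap idea is the same.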
Creating the Deep Lake Vector Store
Use the Vector Store in a Q&A App
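The RetrievalQA and ChatOpenAI imports at the top suggest the Q&A wiring; a sketch under those assumptions, where db is the Deep Lake store created earlier and the model name and k value are illustrative:

```python
from langchain.chat_models import ChatOpenAI
from langchain.chains import RetrievalQA

def build_qa_chain(db):
    """Wrap the Deep Lake store in a retrieval question-answering chain."""
    retriever = db.as_retriever()
    retriever.search_kwargs['k'] = 10  # number of chunks fed to the LLM
    llm = ChatOpenAI(model_name='gpt-3.5-turbo')  # requires OPENAI_API_KEY
    return RetrievalQA.from_chain_type(llm=llm, chain_type='stuff',
                                       retriever=retriever)

# qa = build_qa_chain(db)
# qa.run("What does the heavy ranker do?")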
Adding Data to an Existing Vector Store
Adding Hybrid Search to the Vector Store
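Hybrid search here means combining vector similarity with a metadata filter on the stored chunks. A sketch of one way to configure the retriever; the exact shape of the filter value depends on your deeplake version, so treat the lambda below as an assumption to adapt:

```python
def make_filtered_retriever(db):
    """Retriever that ranks by cosine similarity but only considers
    chunks whose source path contains 'com.twitter' (illustrative filter)."""
    retriever = db.as_retriever()
    retriever.search_kwargs['distance_metric'] = 'cos'
    retriever.search_kwargs['k'] = 10
    # Filter signature is version-dependent; here it receives a document's
    # metadata dict and keeps only matching sources.
    retriever.search_kwargs['filter'] = (
        lambda metadata: 'com.twitter' in metadata.get('source', '')
    )
    return retriever
```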
Using Deep Lake in Applications that Require Concurrency
Concurrency Using Zookeeper Locks

Accessing the Low Level Deep Lake API (Advanced)
SelfQueryRetriever with Deep Lake