Pinecone Database (Vector Database) AI example
Author: KanuITAcademy
Uploaded: 2025-08-04
Views: 83
______________Code_______________
Step 1: Install required libraries (run in terminal or notebook if not already installed)
pip install --upgrade openai pinecone
Step 2: Import libraries
from pinecone import Pinecone, ServerlessSpec
import openai
Step 3: Set your API keys
OPENAI_API_KEY = "your-openai-api-key"
PINECONE_API_KEY = "your-pinecone-api-key"
Step 4: Create Pinecone client
pc = Pinecone(api_key=PINECONE_API_KEY)
Step 5: Create index if it doesn't exist
index_name = "persondetails-search"
if index_name not in pc.list_indexes().names():
    pc.create_index(
        name=index_name,
        dimension=1536,  # required for OpenAI embeddings (text-embedding-ada-002)
        metric="cosine",
        spec=ServerlessSpec(cloud="aws", region="us-east-1")
    )
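The metric="cosine" choice above means Pinecone ranks stored vectors by cosine similarity to the query vector. As a minimal illustration of that formula (Pinecone computes this server-side; this sketch is only to show what the metric measures):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between vectors a and b, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))  # identical direction -> 1.0
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # orthogonal -> 0.0
```

A score near 1.0 means the stored text is semantically close to the query; that is why the highest-scoring match in Step 10 is the most relevant document.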
Step 6: Connect to Pinecone index
index = pc.Index(index_name)
Step 7: Define get_embedding function using OpenAI's new SDK
client = openai.OpenAI(api_key=OPENAI_API_KEY)
def get_embedding(text):
    response = client.embeddings.create(
        input=[text],
        model="text-embedding-ada-002"
    )
    return response.data[0].embedding
Step 8: Embed and upsert a document
doc_text = "My name is Kanu Butani. I am a software IT engineer and I also write poems. I sing and I like music. Milind is Kanu's buddy; they both go to the office together."
doc_vector = get_embedding(doc_text)
index.upsert(vectors=[
    {
        "id": "doc1",
        "values": doc_vector,
        "metadata": {"text": doc_text}
    }
])
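This example upserts a single document. For larger collections, upserts are typically sent in batches, since the Pinecone client accepts a list of vectors per call. A minimal chunking helper along these lines could split the work (the helper name and batch size here are illustrative assumptions, not part of the Pinecone API):

```python
def chunked(items, size=100):
    """Yield successive batches of at most `size` items from a list."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

# Illustration with plain numbers standing in for vector records:
print(list(chunked([1, 2, 3, 4, 5], size=2)))  # [[1, 2], [3, 4], [5]]
```

In practice each batch would then be sent with something like `index.upsert(vectors=batch)` inside a loop over `chunked(all_records)`.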
Step 9: Query the index with a user question
query_text = "Kanu's Friend?"
query_vector = get_embedding(query_text)
#print(query_text)
#print(query_vector)
results = index.query(
    vector=query_vector,
    top_k=3,
    include_metadata=True
)
#print(results)
Step 10: Print matches
for match in results["matches"]:
    print(f"\nScore: {match['score']}")
    print(f"Text: {match['metadata']['text']}")
Step 11: Build the context for the chat model from the retrieved matches
context = "\n".join(match["metadata"]["text"] for match in results["matches"])
print(context)
Step 12: Generate an answer with GPT-4, reusing the OpenAI client created in Step 7
response = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {
            "role": "system",
            "content": "You are a helpful assistant."
        },
        {
            "role": "user",
            "content": f"""Use the following context to answer the question.
Context:
{context}
Question: {query_text}
Answer:"""
        }
    ],
    temperature=0.2
)
Step 13: Print the generated answer
print("Answer:", response.choices[0].message.content)