LLM Module API Reference
The empire_chain.llms module provides interfaces to various Language Model providers.
OpenAILLM
Class for interacting with OpenAI's language models.
Constructor
Parameters:
- model_name (str): The OpenAI model to use. Options include:
- "gpt-4"
- "gpt-3.5-turbo"
- "gpt-4-turbo"
Methods
generate()
Generate text based on a prompt.
Parameters:
- prompt (str): The input prompt for text generation
Returns:
- str: The generated text response
Example:
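A minimal usage sketch based only on the constructor and generate() signature documented above; the prompt string is illustrative:

from empire_chain.llms import OpenAILLM

# Requires OPENAI_API_KEY in the environment (see Environment Variables below)
llm = OpenAILLM(model_name="gpt-4")

# generate() takes a prompt string and returns the generated text as a str
response = llm.generate("Explain retrieval-augmented generation in two sentences.")
print(response)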
AnthropicLLM
Class for interacting with Anthropic's Claude models.
Constructor
Parameters:
- model_name (str): The Anthropic model to use. Options include:
- "claude-3-sonnet"
- "claude-3-opus"
- "claude-3-haiku"
Methods
generate()
Generate text using Claude.
Parameters:
- prompt (str): The input prompt for text generation
Returns:
- str: The generated text response
Example:
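A minimal sketch following the same interface; the prompt is illustrative and ANTHROPIC_API_KEY must be set (see Environment Variables below):

from empire_chain.llms import AnthropicLLM

# Requires ANTHROPIC_API_KEY in the environment
llm = AnthropicLLM(model_name="claude-3-sonnet")
response = llm.generate("Summarize the key ideas behind unit testing.")
print(response)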
GroqLLM
Class for interacting with Groq's language models.
Constructor
Parameters:
- model_name (str): The Groq model to use. Options include:
- "mixtral-8x7b"
- "llama2-70b"
Methods
generate()
Generate text using Groq.
Parameters:
- prompt (str): The input prompt for text generation
Returns:
- str: The generated text response
Example:
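A minimal sketch following the same interface; the prompt is illustrative and GROQ_API_KEY must be set:

from empire_chain.llms import GroqLLM

# Requires GROQ_API_KEY in the environment
llm = GroqLLM(model_name="mixtral-8x7b")
response = llm.generate("List three practical uses of vector databases.")
print(response)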
GeminiLLM
Class for interacting with Google's Gemini language models.
Constructor
Parameters:
- model_name (str): The Gemini model to use. Options include:
- "gemini-1.5-flash"
- "gemini-1.5-pro"
Methods
generate()
Generate text using Gemini.
Parameters:
- prompt (str): The input prompt for text generation
Returns:
- str: The generated text response
Example:
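A minimal sketch following the same interface; the prompt is illustrative and GEMINI_API_KEY must be set:

from empire_chain.llms import GeminiLLM

# Requires GEMINI_API_KEY in the environment
llm = GeminiLLM(model_name="gemini-1.5-flash")
response = llm.generate("Write a one-sentence summary of what an embedding is.")
print(response)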
Common Features
All LLM classes share these common features:
Error Handling
from empire_chain.llms import OpenAILLM

try:
    llm = OpenAILLM()
    response = llm.generate("prompt")
except Exception as e:
    print(f"Error: {e}")
Environment Variables
Required environment variables:
- OpenAI: OPENAI_API_KEY
- Anthropic: ANTHROPIC_API_KEY
- Groq: GROQ_API_KEY
- Gemini: GEMINI_API_KEY
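One way to provide these keys is to set them in the process environment before constructing an LLM class; the snippet below is a generic sketch, not something this module mandates (exporting the variables in your shell works equally well):

import os

# Placeholder values; substitute your real provider keys
os.environ["OPENAI_API_KEY"] = "your-openai-key"
os.environ["ANTHROPIC_API_KEY"] = "your-anthropic-key"
os.environ["GROQ_API_KEY"] = "your-groq-key"
os.environ["GEMINI_API_KEY"] = "your-gemini-key"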
Best Practices
- Model Selection: choose the model listed above that best fits your quality, speed, and cost requirements.
- Error Handling: wrap generate() calls in try/except blocks, as shown above.
- Environment Setup: set the provider's API key environment variable before instantiating a class.