Get Started
Quickstart
Start building awesome AI projects with LlamaAPI
Llama API Client
LlamaAPI is a Python SDK for interacting with the Llama API. It abstracts away the handling of aiohttp sessions and headers, simplifying interaction with the API.
Installation
You can install the LlamaAPI SDK with pip (Python) or npm (JavaScript):
Python
pip install llamaapi
JavaScript
npm install llamaai
Usage
Python
After installing the SDK, you can use it in your Python projects like so:
import json
from llamaapi import LlamaAPI
# Initialize the client with your API token
llama = LlamaAPI("<your_api_token>")
# Define your API request
api_request_json = {
    "messages": [
        {"role": "user", "content": "What is the weather like in Boston?"},
    ],
    "functions": [
        {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "days": {
                        "type": "number",
                        "description": "For how many days ahead you want the forecast",
                    },
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location", "days"],
            },
        }
    ],
    "stream": False,
    "function_call": "get_current_weather",
}
# Make your request and handle the response
response = llama.run(api_request_json)
print(json.dumps(response.json(), indent=2))
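When the model decides to call a function, its reply contains the chosen function name and arguments. Below is a minimal sketch of reading them, assuming an OpenAI-style chat-completion payload with a "choices" list and a "function_call" field; inspect the actual response you receive to confirm the layout.
# Sketch: extract the function call from the response
# (assumes an OpenAI-style payload; verify against your actual response).
output = response.json()
message = output["choices"][0]["message"]
if "function_call" in message:
    function_name = message["function_call"]["name"]
    arguments = message["function_call"]["arguments"]
    if isinstance(arguments, str):  # arguments may arrive as a JSON string
        arguments = json.loads(arguments)
    print(function_name, arguments)
else:
    print(message["content"])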
Other parameters that you can pass in the request JSON are:
{
    ...
    "max_length": 500,
    "temperature": 0.1,
    "top_p": 1.0,
    "frequency_penalty": 1.0,
    ...
}
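For example, a request that sets these sampling controls alongside the fields shown earlier might look like the sketch below (values are illustrative; llama is the client created above).
api_request_json = {
    "messages": [
        {"role": "user", "content": "Write a haiku about the ocean."},
    ],
    "stream": False,
    "max_length": 500,
    "temperature": 0.1,
    "top_p": 1.0,
    "frequency_penalty": 1.0,
}
response = llama.run(api_request_json)
print(json.dumps(response.json(), indent=2))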
JavaScript
To install the library via npm for Node.js:
npm install llamaai
Bring the library into your project:
import LlamaAI from 'llamaai';
Instantiate the LlamaAI class, providing your API token:
const apiToken = 'INSERT_YOUR_API_TOKEN_HERE';
const llamaAPI = new LlamaAI(apiToken);
Execute API requests using the run method:
const apiRequestJson = {
    "messages": [
        {"role": "user", "content": "What is the weather like in Boston?"},
    ],
    "functions": [
        {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "days": {
                        "type": "number",
                        "description": "For how many days ahead you want the forecast",
                    },
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location", "days"],
            },
        }
    ],
    "stream": false,
    "function_call": "get_current_weather",
};

llamaAPI.run(apiRequestJson)
    .then(response => {
        // Process the API response here
    })
    .catch(error => {
        // Handle any errors here
    });
List of available models (default: llama-13b-chat); see the selection example after the list:
llama-7b-chat
llama-13b-chat
llama-70b-chat
mistral-7b
mistral-7b-instruct
NousResearch/Nous-Hermes-Llama2-13b
falcon-7b-instruct
falcon-40b-instruct
alpaca-7b
codellama-7b-instruct
codellama-13b-instruct
codellama-34b-instruct
openassistant-llama2-70b
vicuna-7b
vicuna-13b
vicuna-13b-16k
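To request a specific model from this list, include it in the request JSON. A minimal sketch, assuming the request accepts a "model" field (the exact key is not shown above, so check the API reference):
api_request_json = {
    "model": "llama-70b-chat",  # assumed field name; any model from the list above
    "messages": [
        {"role": "user", "content": "Summarize the plot of Hamlet in two sentences."},
    ],
    "stream": False,
}
response = llama.run(api_request_json)
print(json.dumps(response.json(), indent=2))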
Change Log
Version 0.1: Initial release
Contributing
We welcome contributions to this project. Please see the Contributing Guidelines for more details.
License
The LlamaAPI SDK is licensed under the MIT License. Please see the License File for more details.