Quickstart

In this guide you will find the essential commands for interacting with LlamaAPI, but don’t forget to check the rest of our documentation to get the full power of our API.

Available Models

The following models are currently available through LlamaAPI. You will use their names when building a request further on in this Quickstart Guide.

Llama-3

  • llama3-70b (instruct model)
  • llama3-8b (instruct model)

Llama-2

All calls with the prefix llama or llama2 were migrated to Llama 3 on May 5, 2024.

  • llama-7b-chat is mapped to llama3-8b
  • llama-13b-chat and llama-70b-chat are mapped to llama3-70b
  • codellama-7b-instruct
  • codellama-13b-instruct
  • codellama-34b-instruct

Mistral

  • mixtral-8x22b-instruct
  • mixtral-8x7b-instruct
  • mistral-7b-instruct
  • mistral-7b (not a chat model)
  • mixtral-8x22b (not a chat model)

Gemma

  • gemma-7b
  • gemma-2b

Other

  • alpaca-7b
  • vicuna-7b
  • vicuna-13b
  • vicuna-13b-16k
  • falcon-7b-instruct
  • falcon-40b-instruct
  • openassistant-llama2-70b
  • Nous-Hermes-Llama2-13b
  • Nous-Hermes-llama-2-7b
  • Nous-Hermes-2-Mistral-7B-DPO
  • Nous-Hermes-2-Mixtral-8x7B-SFT
  • Nous-Hermes-2-Mixtral-8x7B-DPO
  • Nous-Hermes-2-Yi-34B
  • Nous-Capybara-7B-V1p9
  • OpenHermes-2p5-Mistral-7B
  • OpenHermes-2-Mistral-7B
  • Qwen1.5-72B-Chat (replace 72B with 32B / 14B / 7B / 4B / 1.8B / 0.5B)
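
Whichever model you choose, you reference it by name in the request body. The sketch below assumes the request JSON accepts a model field, which matches the SDK's request format; check the API reference for the exact field name.

```python
# Minimal sketch: selecting a model by name in the request body.
# The "model" field name is an assumption about the request schema.
api_request_json = {
    "model": "llama3-70b",  # any model name from the lists above
    "messages": [
        {"role": "user", "content": "Hello!"},
    ],
}
print(api_request_json["model"])
```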

Installing the SDK

Our SDK lets your application interact with LlamaAPI seamlessly by abstracting away the handling of aiohttp sessions and headers.

Python

pip install llamaapi

Javascript

npm install llamaai

Usage

Once you have installed our library, you can follow the examples in this section to build powerful applications, interacting with different models and having them invoke custom functions to enhance the user experience.

Python

import json
from llamaapi import LlamaAPI

# Initialize the SDK
llama = LlamaAPI("<your_api_token>")

# Build the API request
api_request_json = {
    "messages": [
        {"role": "user", "content": "What is the weather like in Boston?"},
    ],
    "functions": [
        {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "days": {
                        "type": "number",
                        "description": "for how many days ahead you wants the forecast",
                    },
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location", "days"],
            },
        }
    ],
    "stream": False,
    "function_call": "get_current_weather",
}

# Execute the Request
response = llama.run(api_request_json)
print(json.dumps(response.json(), indent=2))
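
When the model decides to call your function, the function name and arguments come back in the response payload. The snippet below walks a hypothetical response assuming an OpenAI-style shape ("choices", "message", "function_call"); inspect a real response to confirm the exact structure before relying on it.

```python
import json

# Hypothetical response payload, assuming an OpenAI-style shape.
sample_response = {
    "choices": [
        {
            "message": {
                "role": "assistant",
                "function_call": {
                    "name": "get_current_weather",
                    "arguments": '{"location": "Boston, MA", "days": 1, "unit": "celsius"}',
                },
            }
        }
    ]
}

message = sample_response["choices"][0]["message"]
call = message.get("function_call")
if call:
    # Arguments typically arrive as a JSON-encoded string, not a dict
    args = json.loads(call["arguments"])
    print(call["name"], args["location"])
```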

Other parameters that you can pass in the request JSON are:

{
  ...
  "max_length": 500,
  "temperature": 0.1,
  "top_p": 1.0,
  "frequency_penalty": 1.0
  ...
}
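
These generation parameters go into the same request dictionary as the messages. A minimal sketch (the parameter values here are illustrative, not recommendations):

```python
# Generation parameters to tune the model's output (example values)
generation_params = {
    "max_length": 500,
    "temperature": 0.1,
    "top_p": 1.0,
    "frequency_penalty": 1.0,
}

# Merge the tuning knobs into the same request payload
api_request_json = {
    "messages": [{"role": "user", "content": "Summarize the news."}],
    "stream": False,
    **generation_params,
}
print(sorted(api_request_json.keys()))
```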

Javascript

  1. Import the Library:

    import LlamaAI from 'llamaai';
    
  2. Initialize the Library:

    const apiToken = 'INSERT_YOUR_API_TOKEN_HERE';
    const llamaAPI = new LlamaAI(apiToken);
    
  3. Make a Request

    // Build the Request
    const apiRequestJson = {
       "messages": [
           {"role": "user", "content": "What is the weather like in Boston?"},
       ],
       "functions": [
           {
               "name": "get_current_weather",
               "description": "Get the current weather in a given location",
               "parameters": {
                   "type": "object",
                   "properties": {
                       "location": {
                           "type": "string",
                           "description": "The city and state, e.g. San Francisco, CA",
                       },
                       "days": {
                           "type": "number",
                           "description": "for how many days ahead you wants the forecast",
                       },
                       "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                    },
                    "required": ["location", "days"],
                },
            }
       ],
       "stream": false,
       "function_call": "get_current_weather",
      };
    
      // Execute the Request
       llamaAPI.run(apiRequestJson)
         .then(response => {
           // Process response
         })
         .catch(error => {
           // Handle errors
         });
    

Change Log

Version 0.1: Initial release

Contributing

We welcome contributions to this project. Please see the Contributing Guidelines for more details.

License

The LlamaAPI SDK is licensed under the MIT License. Please see the License File for more details.