🎒 Bring Your Own Model

Introduction

DecentAI's "Bring Your Own Model" (BYOM) feature gives you the power to connect directly to your own AI models. Whether your model is running on your personal computer, a local device, or deployed to a private cloud service, BYOM puts you in the driver's seat. This guide will walk you through the process of setting up and using your own model with DecentAI.

Why Use Your Own Model?

Before we dive into the how-to, let's talk about why you might want to use your own model:

  1. It's free: When you use your own hardware, you're not paying for additional compute resources. You're just using what you already have.

  2. More control: You can choose exactly which model to run. Want a quantized version for faster performance? A fine-tuned model for a specific task? With BYOM, the choice is yours.

  3. Total privacy: If you run the model locally, no one but you can see the inputs or outputs. This is perfect for handling sensitive or confidential information.

  4. Customization: You can tweak and adjust your model as much as you want, optimizing it for your specific needs.

Connecting to Your Model in DecentAI

Once your model is running and reachable over the internet (for example, through an ngrok tunnel), let's connect it to the DecentAI app:

  1. Open the DecentAI app and go to the Model Selector.

  2. Tap on "Add your own model".

  3. In the "Base URL" field, enter your ngrok URL followed by /v1/. For example:

     https://8eff-184-152-68-224.ngrok-free.app/v1/

  4. If your endpoint requires authentication, enter the bearer token. If not (like in our example), you can leave this field empty.

  5. In the "Model" field, enter the name of the model you're running locally. In this case, it's llama3.1.

That's it! Tap "Test" to confirm the endpoint responds, then "Save" to get started. You've successfully connected your own model to DecentAI, and you can now enjoy the benefits of using your personal AI model while still having access to DecentAI's user-friendly interface.
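If you'd like to sanity-check the endpoint outside the app, the sketch below builds the kind of OpenAI-compatible chat-completions request a client would send to your base URL. The URL, model name, and prompt are the example values from this guide (substitute your own); this is an illustration of the request shape, not DecentAI's actual client code.

```python
# Sketch: build an OpenAI-compatible chat request for a BYOM endpoint.
# All values below are examples -- replace them with your own.
import json
from urllib.parse import urljoin
from urllib.request import Request

def build_chat_request(base_url, model, prompt, token=None):
    """Assemble the chat-completions request a client would POST."""
    # base_url should end in /v1/ so urljoin appends the route correctly
    url = urljoin(base_url, "chat/completions")
    headers = {"Content-Type": "application/json"}
    if token:  # bearer token is optional, as in the guide
        headers["Authorization"] = f"Bearer {token}"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return Request(url, data=json.dumps(payload).encode(),
                   headers=headers, method="POST")

req = build_chat_request(
    "https://8eff-184-152-68-224.ngrok-free.app/v1/", "llama3.1", "Hello!"
)
print(req.full_url)
# https://8eff-184-152-68-224.ngrok-free.app/v1/chat/completions
```

If a request shaped like this succeeds against your server (e.g. sent with `urllib.request.urlopen(req)`), the "Test" button in the app should succeed as well.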

Remember, when using your own model, you're responsible for its performance and availability. But don't let that scare you – it also means you have full control over your AI experience.
