Minnie

[Private Alpha] The continuous finetuning platform for AI engineers!

The OpenAI-superset way to build AI Apps with smaller models.

From static to continuous

AI Engineers reinvent the wheel because the large AGI labs stop at serving LLMs via API. Quickly swap in our OpenAI-superset SDK for immediate developer experience improvements, then easily switch to smaller models for even more benefits.
1. Install the SDK
2. Plug in your model
3. Continuously improve
smol maintains official open-source, OpenAI-superset SDKs for Node.js and Python, and welcomes community-maintained SDKs in other languages.
$ npm i smolai
Change one line
Any OpenAI-compatible API will work!
// before
import OpenAI from 'openai';
// after
import SmolAI from 'smolai';
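To make the one-line swap concrete, here is a minimal sketch of two interchangeable setups. It assumes the smolai client mirrors the openai constructor, including reading an API key from the environment; the base URL and SMOL_API_KEY variable below are illustrative placeholders, not documented values.

import OpenAI from 'openai';
import SmolAI from 'smolai';

// Option 1: swap in the superset SDK. Assumes it mirrors the OpenAI client,
// including picking up its API key from the environment.
const smolai = new SmolAI();

// Option 2: keep the openai package and point it at any OpenAI-compatible
// endpoint. The baseURL and env var here are placeholders, not real values.
const viaBaseUrl = new OpenAI({
  apiKey: process.env.SMOL_API_KEY,
  baseURL: 'https://api.example.com/v1',
});

// Both clients expose the same chat.completions.create(...) call shape.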
smol is fast
Finetuned GPT-3.5 models are 4-10x faster than GPT-4. Finetuned CodeLlama-34B models can beat GPT-4. Fewer parameters, fewer matmuls, better performance for a specific use case.
smol is continuous
Why must AI have a knowledge cutoff? Why aren’t more AI Engineers constantly finetuning their models? Because it’s hard? Not anymore; a rough sketch of such a loop follows below.
smol is safe
No SuperCrazyUltraMax AGI here. Domain-specific language models offer fewer hallucinations.
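As a purely illustrative sketch of what a continuous loop could look like: the code below assumes the superset SDK mirrors OpenAI's files and fine-tuning endpoints (files.create, fineTuning.jobs.create), and the feedback store, file path, and nightly cadence are made-up placeholders rather than documented smol features.

import fs from 'node:fs';
import SmolAI from 'smolai';

const smolai = new SmolAI();

// Hypothetical feedback store: replace with however you log chats users approved.
async function collectApprovedChats(): Promise<{ messages: { role: string; content: string }[] }[]> {
  return []; // placeholder
}

// Sketch of a nightly job: serialize approved chats to JSONL, upload the file,
// and start a finetune of the current base model.
async function nightlyFinetune(baseModel: string) {
  const examples = await collectApprovedChats();
  const jsonl = examples.map((e) => JSON.stringify({ messages: e.messages })).join('\n');
  fs.writeFileSync('training.jsonl', jsonl);

  // Assumes OpenAI-style files and fine-tuning endpoints are mirrored by the SDK.
  const file = await smolai.files.create({
    file: fs.createReadStream('training.jsonl'),
    purpose: 'fine-tune',
  });
  const job = await smolai.fineTuning.jobs.create({
    training_file: file.id,
    model: baseModel,
  });
  return job.id; // once the job succeeds, swap the finetuned model into the handler below
}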
import SmolAI from 'smolai';

// smolRateLimit is a smol-specific extension on top of the standard client options.
const smolai = new SmolAI({
  smolRateLimit: { count: 10, per: '10s' }
});

// Edge-style handler: parse the JSON body, call the chat completions API,
// and return the model's reply as the response body.
export default async (req) => {
  const { text } = await req.json();
  const chatCompletion = await smolai.chat.completions.create({
    messages: [{ role: 'user', content: text }],
    model: 'gpt-3.5-turbo',
  });
  return new Response(chatCompletion.choices[0].message.content);
};
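Calling the deployed handler is then just a POST with a JSON body; the /api/chat route below is a placeholder for wherever you mount the function.

// '/api/chat' is a placeholder route for the handler above.
const res = await fetch('/api/chat', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ text: 'Summarize our latest release notes.' }),
});
console.log(await res.text()); // the model's reply, returned as plain text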

How does it work?

Small model enjoyers delight!

For we are waitlist disrespecters. Join today for immediate access as a founding supporter.

A word from swyx

I am delighted by your interest and have been actively researching the AI Engineer landscape to build the platform that I think best serves the community. smol.ai is under active development and we hope you will forgive our humble beginnings.
Shawn Wang
CEO of Smol AI
@swyx

Keep up with smol

Sign up for the Latent Space newsletter, which brings news and updates on the entire AI Engineering space through our lens.