@monteloai/fizz

Fizz - the LLM function wizard.

Fizz is a tool to help you write LLM functions. It works by organizing your functions into files and attaching a schema, a description, and a handler to each one.

Fizz then generates a type-safe client that you can use in your LLM calls. It derives the JSON schema to pass into the LLM call from your Zod schema, leaving you to focus on writing the function itself.

Installation

npm install @monteloai/fizz@latest

Then run the init command to create fizz.config.json:

npx fizz init

This will create a fizz.config.json file at the root of your project, add a functions directory, and create an example function inside it.

{
  "functionsDirectory": "src/functions"
}

Usage

Example directory structure:

your-project/
├── functions/
│   ├── getCurrentWeather.ts
│   └── getFutureWeather.ts
├── package.json
└── tsconfig.json

Each file should look something like this. Notice that the wrapped FizzFunction is the default export.

import { FizzFunction } from "@monteloai/fizz";
import { z } from "zod";

const FunctionInput = z.object({
  location: z.string().describe("The city, e.g. San Francisco."),
  unit: z.enum(["Celsius", "Fahrenheit"]).describe("The unit of temperature."),
});
type TFunctionInput = z.infer<typeof FunctionInput>;

const getCurrentWeather = async (params: TFunctionInput): Promise<string> => {
  return `The weather in ${params.location} is currently 22 degrees ${params.unit}.`;
};

export default FizzFunction({
  function: getCurrentWeather,
  description: "Get the current weather in a given location.",
  schema: FunctionInput,
});
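
For reference, the tool definition Fizz derives from this file should look roughly like the following (the exact JSON Schema output may vary slightly):

{
  "type": "function",
  "function": {
    "name": "getCurrentWeather",
    "description": "Get the current weather in a given location.",
    "parameters": {
      "type": "object",
      "properties": {
        "location": { "type": "string", "description": "The city, e.g. San Francisco." },
        "unit": {
          "type": "string",
          "enum": ["Celsius", "Fahrenheit"],
          "description": "The unit of temperature."
        }
      },
      "required": ["location", "unit"]
    }
  }
}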

You've now created a function. You can create as many functions as you want, and they can be organized into subdirectories.
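
For example, a hypothetical layout that groups related functions into subdirectories (the directory and file names here are purely illustrative):

functions/
├── weather/
│   ├── getCurrentWeather.ts
│   └── getFutureWeather.ts
└── calendar/
    └── createEvent.ts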

Now you can generate a client for your function.

Simply run:

npx fizz generate
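
You may want to run this as part of your build. A minimal sketch of a package.json script (the script name is just an example):

{
  "scripts": {
    "fizz:generate": "fizz generate"
  }
}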

A type-safe client will be generated for you, which you can import like so:

import { functions, getAllSchemas } from "@monteloai/fizz";

This client will have the schema, description, and handler attached to it, so you can use it like this:

import OpenAI from "openai";
import { functions, getAllSchemas } from "@monteloai/fizz";

const openai = new OpenAI();

await openai.chat.completions.create({
  model: "gpt-3.5-turbo",
  messages: [{ role: "user" as const, content: "What's the weather in New York right now?" }],
  // pass every function as a tool
  tools: getAllSchemas(),
  // or, via the legacy functions parameter:
  // functions: getAllSchemas().map((schema) => schema.function),
  // or pass a specific tool only - notice your IDE will autocomplete here!
  // tools: [functions.getCurrentWeather.schema],
});
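
When the model responds with a tool call, you can route it back to the matching Fizz function. Below is a minimal sketch that assumes each generated entry also exposes its handler as function (mirroring the fields passed to FizzFunction); check the generated client for the exact property names:

import { functions } from "@monteloai/fizz";

// toolCall is one entry of completion.choices[0].message.tool_calls
const handleToolCall = async (toolCall: {
  function: { name: string; arguments: string };
}): Promise<string> => {
  if (toolCall.function.name === "getCurrentWeather") {
    // Tool-call arguments arrive as a JSON string; parse them before calling the handler.
    const args = JSON.parse(toolCall.function.arguments);
    return functions.getCurrentWeather.function(args);
  }
  throw new Error(`Unhandled tool: ${toolCall.function.name}`);
};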

Roadmap

  • ✅ Introduce a config for fizz to change the default directory
  • ✅ Find nicer naming and APIs (FizzFunction instead of createFunction?)
  • ✅ Add a get all api functions.getAllDefinitions()
  • Add a check to make sure that every input field has a .describe() on it (see the sketch after this list)
  • Support Joi/Yup/TypeBox and other schema libraries, and refactor the code base to make adding them straightforward
  • Add decorator support instead of createFunction
  • Add tests
  • Browser support
  • Remove zodToJsonSchema as a dependency and write our own
  • Look into supporting other models
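
As an illustration of the planned describe-check above, a hypothetical sketch (not part of the current API) that walks a Zod object schema and lists top-level fields missing a .describe():

import { z } from "zod";

// Hypothetical helper: return the names of top-level fields that have no
// description attached via .describe(). For the FunctionInput schema above,
// this would return an empty array.
const findUndescribedFields = (schema: z.ZodObject<z.ZodRawShape>): string[] =>
  Object.entries(schema.shape)
    .filter(([, field]) => !field.description)
    .map(([name]) => name);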
