Search results

10 packages found

A quick spin-up HTTP server for dev and testing using @nlbridge/express

published 0.3.10 3 hours ago

A client for the lollms server

published 1.0.5 a month ago

🪨 Bedrock Wrapper is an npm package that simplifies the integration of existing OpenAI-compatible API objects with AWS Bedrock's serverless inference LLMs.

published 1.0.15 16 days ago

This package is used to communicate with the Fullmetal server

published 1.0.19 16 days ago

A simple server-sent events (SSE) request library for the browser, for streaming LLM APIs

published 0.0.8 a month ago

Local large language models: a toolkit to run and host multiple large language models on any machine.

published 1.0.0-beta.2 6 days ago

This package is used to help the Fullmetal API server with agents (nodes)

published 1.0.17 16 days ago

Altiplano base inference server

published 0.0.7 a year ago

Use `npm i --save llama.native.js` to run llama.cpp models on your local machine. Features a socket.io server and client that can do inference with the host of the model.

published 1.1.0 a year ago
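As a rough illustration of the socket.io-based flow this package describes, a minimal client sketch is shown below. The server URL, the event names (`prompt`, `token`), and the payload shape are assumptions made for illustration, not taken from the package's documentation.

```typescript
// Minimal sketch of a socket.io client talking to a local inference host,
// in the spirit of llama.native.js. Event names and payload shapes are
// hypothetical; consult the package's own docs for the real API.
import { io } from "socket.io-client";

// Assumed local server address for the model host.
const socket = io("http://localhost:3000");

socket.on("connect", () => {
  // Hypothetical event: send a prompt to the model host.
  socket.emit("prompt", { text: "Explain server-sent events in one sentence." });
});

// Hypothetical event: receive streamed tokens from the model.
socket.on("token", (token: string) => {
  process.stdout.write(token);
});

socket.on("disconnect", () => {
  console.log("\nInference server disconnected.");
});
```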

Altiplano tasks server

published 0.0.1 a year ago