
Getting Started

Quick Start

Get up and running with the NAPH API in minutes. Our endpoints are compatible with the OpenAI SDK, so your existing code works with minimal changes.

1. Get Your API Key

Contact our team to receive API credentials. Enterprise customers receive dedicated API keys with custom rate limits and priority access.

2. Install the SDK

Use our official Python SDK or the OpenAI SDK with a custom base URL. Both approaches work identically.

3. Make Your First Request

Send a completion request using the model of your choice. The response format matches OpenAI's exactly, so migration is seamless.

OpenAI Compatible

Works with any library or framework built for the OpenAI API.

Streaming

Server-sent events for real-time token delivery, with sub-50 ms time to first token (TTFT).

Function Calling

Structured tool use with JSON schema validation for agents.

Python quickstart.py
from naph import NAPH

# Initialize client
client = NAPH(
    api_key="your-api-key"
)

# Create a completion
response = client.chat.completions.create(
    model="naph-70b",
    messages=[
        {
            "role": "user",
            "content": "Explain quantum computing in simple terms."
        }
    ],
    max_tokens=1024,
    temperature=0.7
)

print(response.choices[0].message.content)

Libraries

Official SDKs

Native libraries for major programming languages with full type support, automatic retries, and streaming helpers.

Py

Python

pip install naph

TS

TypeScript

npm install @naph/sdk

Go

Go

go get github.com/naph/go-sdk

Rs

Rust

cargo add naph

Resources

Documentation & Examples

Everything you need to build production applications with NAPH models.

API Reference

Complete documentation of all endpoints, parameters, and response formats with interactive examples.

View API Reference

Authentication

Guide to API key management, token rotation, and security best practices for production deployments.

Read Guide
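For production deployments, API keys are best read from the environment rather than hard-coded. A minimal sketch, assuming a NAPH_API_KEY environment variable; the variable name and the load_api_key helper are illustrative, not part of the SDK:

```python
import os

def load_api_key(env_var: str = "NAPH_API_KEY") -> str:
    """Fetch the API key from the environment, failing loudly if absent."""
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(f"Set the {env_var} environment variable")
    return key

# client = NAPH(api_key=load_api_key())  # as in the quickstart
```

Failing at startup when the key is missing is usually preferable to a confusing 401 deep inside request handling.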

Streaming

Implement real-time token streaming with Server-Sent Events for responsive user interfaces.

Streaming Guide
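With an OpenAI-compatible SDK, streaming is typically just a matter of passing stream=True and iterating the returned chunks. For lower-level clients, the sketch below parses OpenAI-style SSE lines (`data: {json}` events terminated by `data: [DONE]`, with content under `choices[0].delta`); the iter_tokens helper and the fabricated chunks are illustrative:

```python
import json

def iter_tokens(sse_lines):
    """Yield content tokens from OpenAI-style server-sent event lines."""
    for line in sse_lines:
        if not line.startswith("data: "):
            continue  # skip comments and keep-alive lines
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break  # end-of-stream sentinel
        delta = json.loads(payload)["choices"][0]["delta"]
        if "content" in delta:
            yield delta["content"]

# Two fabricated chunks standing in for a real stream:
lines = [
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    "data: [DONE]",
]
print("".join(iter_tokens(lines)))  # prints "Hello"
```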

Function Calling

Build AI agents with structured tool use. Define functions, validate inputs, and handle responses.

Function Calling Guide

Embeddings

Generate text embeddings for semantic search, clustering, and retrieval-augmented generation.

Embeddings Guide
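Once embedding vectors come back from the API, semantic search reduces to nearest-neighbor ranking. A self-contained sketch using cosine similarity over toy 3-d vectors (real vectors would come from the embeddings endpoint and be much higher-dimensional):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def rank(query_vec, doc_vecs):
    """Return document indices sorted by similarity to the query, best first."""
    scores = [cosine_similarity(query_vec, v) for v in doc_vecs]
    return sorted(range(len(doc_vecs)), key=lambda i: scores[i], reverse=True)

# Toy vectors standing in for real embeddings:
docs = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.9, 0.1, 0.0]]
print(rank([1.0, 0.0, 0.0], docs))  # prints [0, 2, 1]
```

At scale you would use a vector index rather than a linear scan, but the ranking criterion is the same.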

Rate Limits

Understand rate limiting, implement backoff strategies, and optimize for high-throughput applications.

Rate Limits Guide
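A common backoff strategy is exponential delay with full jitter, retrying on rate-limit errors such as HTTP 429. A minimal sketch; the helper names and defaults here are illustrative, not part of the SDK:

```python
import random
import time

def backoff_delays(retries: int, base: float = 0.5, cap: float = 30.0):
    """Yield one delay per attempt: uniform in [0, min(cap, base * 2**n)]."""
    for attempt in range(retries):
        yield random.uniform(0.0, min(cap, base * 2 ** attempt))

def with_retries(call, retries=5, base=0.5, cap=30.0, retryable=(Exception,)):
    """Invoke `call`, sleeping with jittered backoff between retryable errors."""
    last = None
    for delay in backoff_delays(retries, base, cap):
        try:
            return call()
        except retryable as exc:
            last = exc
            time.sleep(delay)
    raise last
```

Full jitter spreads retries out so that many clients hitting the same limit do not retry in lockstep.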

Python function_calling.py
# Define available tools (client is the NAPH instance from the quickstart)
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get current weather",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string"
                    }
                },
                "required": ["location"]
            }
        }
    }
]

response = client.chat.completions.create(
    model="naph-70b",
    messages=[{
        "role": "user",
        "content": "What's the weather in Tokyo?"
    }],
    tools=tools,
    tool_choice="auto"
)
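When the model responds with tool calls, each call's JSON arguments are parsed, the matching function is executed, and the result is sent back in a role="tool" message. A sketch of the dispatch step, with a stubbed get_weather implementation (all names here are illustrative):

```python
import json

def get_weather(location: str) -> dict:
    """Stubbed local implementation behind the get_weather tool."""
    return {"location": location, "forecast": "sunny"}

TOOLS = {"get_weather": get_weather}

def run_tool_call(name: str, arguments_json: str) -> str:
    """Execute one tool call and serialize the result for a 'tool' message."""
    args = json.loads(arguments_json)
    result = TOOLS[name](**args)
    return json.dumps(result)

# In a real loop you would read the name and arguments from
# response.choices[0].message.tool_calls and append a message with
# role="tool" and the matching tool_call_id before calling the model again:
print(run_tool_call("get_weather", '{"location": "Tokyo"}'))
```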

Advanced Features

Build Intelligent Agents

NAPH models support structured function calling for building AI agents that interact with external systems, APIs, and databases.

JSON Schema Validation

Define tool parameters with JSON Schema. The model generates validated, structured outputs.

Parallel Tool Calls

The model can invoke multiple tools in a single response for efficient multi-step workflows.

Reliable Outputs

Consistent function call formatting with proper error handling and retry support.

Integration

Framework Support

NAPH works with every major AI framework and orchestration library out of the box.

LangChain

Full compatibility with LangChain's chat models, agents, and chains. Use NAPH as a drop-in replacement for any OpenAI-based workflow.

  • ChatNAPH model class
  • Agent support
  • Streaming callbacks

LlamaIndex

Build RAG applications with LlamaIndex and NAPH models. Native integration for query engines, chat engines, and data agents.

  • Query engine support
  • Chat engine support
  • Custom embeddings

Vercel AI SDK

Build AI-powered React applications with the Vercel AI SDK. Streaming UI components work seamlessly with NAPH endpoints.

  • useChat hook
  • useCompletion hook
  • Streaming responses

OpenAI SDK

Use the official OpenAI Python and Node.js SDKs by pointing them at NAPH's base URL. No other code changes are required.

  • Python SDK
  • Node.js SDK
  • Full API parity
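Because the wire format is OpenAI-compatible, the same request can even be built with nothing but the standard library. A sketch using urllib; BASE_URL is a placeholder, not NAPH's actual endpoint, so substitute the URL from your credentials:

```python
import json
import urllib.request

# Placeholder; substitute the base URL from your NAPH credentials.
BASE_URL = "https://api.naph.ai/v1"

def build_chat_request(api_key, model, messages):
    """Build an OpenAI-compatible chat completion request with stdlib only."""
    body = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("your-api-key", "naph-70b",
                         [{"role": "user", "content": "Hello"}])
# urllib.request.urlopen(req) would send it; the OpenAI SDK produces the
# same request shape when constructed with base_url=BASE_URL.
```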

Ready to Start Building?

Get API access and start integrating NAPH models into your applications today.