
Introduction to NymAI

NymAI is a developer-first privacy firewall designed to secure your AI workflows. By redacting sensitive data (PII, API keys, secrets) locally before requests ever leave your infrastructure, NymAI ensures compliance with SOC2, GDPR, and HIPAA without sacrificing the utility of LLMs.


100% Local Processing

Your data never touches our servers. The redaction engine runs entirely within your perimeter or VPC.


Minimal Latency Overhead

Written in Rust, the core engine adds less than 5 ms of latency to a typical API request pipeline.


How it works

NymAI acts as a transparent proxy or SDK wrapper. When you send a prompt to an LLM provider (like OpenAI or Anthropic), NymAI intercepts the payload, identifies sensitive patterns using configurable regex and NER models, and replaces them with reversible tokens.

quickstart.js
import { NymAI } from '@nymai/core';

const nym = new NymAI({
  apiKey: process.env.NYM_API_KEY,
  strategies: ['email', 'phone', 'credit_card']
});

// 1. Redact sensitive input automatically
const userInput = "My email is [email protected], call me at 555-0199";
const { redacted, mappings } = await nym.redact(userInput);
// redacted -> "My email is <EMAIL_1>, call me at <PHONE_1>"

// 2. Send the safe prompt to the LLM
// (assumes an OpenAI client has already been initialized elsewhere)
const llmResponse = await openai.chat.completions.create({
  model: "gpt-4o", // any chat-capable model works here
  messages: [{ role: "user", content: redacted }]
});

// 3. (Optional) Deanonymize the response
const finalResponse = await nym.deanonymize(
  llmResponse.choices[0].message.content,
  mappings
);
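In proxy mode you don't call the SDK at all: you point your existing OpenAI client at the NymAI proxy, which redacts payloads in flight. A minimal configuration sketch, assuming a proxy listening at `http://localhost:8080/v1` (the address and path here are placeholders, not documented defaults):

```javascript
import OpenAI from "openai";

// Point the stock OpenAI client at the local NymAI proxy instead of
// api.openai.com; the proxy redacts outbound prompts before forwarding.
const openai = new OpenAI({
  baseURL: "http://localhost:8080/v1", // placeholder proxy address
  apiKey: process.env.OPENAI_API_KEY,
});
```

The rest of your application code is unchanged, since the proxy is transparent to the client.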

Installation

You can install the core SDK via npm or run the CLI tool directly.

npm install @nymai/core
Note: You will need an API key to initialize the SDK. Get your API key here.