# NSFW Content Detection
The `/nsfw-filter` endpoint analyzes an image for potentially inappropriate or sensitive content and classifies it as Safe or Not Safe for Work (NSFW), helping developers build safer applications for all audiences.
## Endpoint: POST /nsfw-filter

Base URL: `https://api.visionary.ai/v1/nsfw-filter`
## Headers

| Key | Value | Required |
|---|---|---|
| Authorization | Bearer YOUR_API_KEY | Yes |
| Content-Type | `application/json` or `multipart/form-data` | Yes |
## Request Body

### Option 1: JSON (Base64)

```json
{
  "image": "data:image/jpeg;base64,/9j/4AAQSkZ..."
}
```
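If you are building the Base64 variant from Python, the payload can be assembled with the standard library alone. This is a sketch: the `build_payload` helper and the image path are illustrative, while the field name and data-URI form follow the example above.

```python
import base64
import json

def build_payload(image_path: str) -> dict:
    """Read an image file and wrap it in the data-URI form the endpoint expects."""
    with open(image_path, "rb") as f:
        encoded = base64.b64encode(f.read()).decode("ascii")
    return {"image": f"data:image/jpeg;base64,{encoded}"}

# Serialize for the request body:
# body = json.dumps(build_payload("/path/to/image.jpg"))
```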
### Option 2: Multipart

Send the file under the field name `image`.
## Example (curl, multipart)

```bash
curl -X POST https://api.visionary.ai/v1/nsfw-filter \
  -H "Authorization: Bearer demo-api-key" \
  -F "image=@/path/to/image.jpg"
```
## Response

```json
{
  "nsfw_score": 0.87,
  "classification": "nsfw"
}
```
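Reading the response body in Python is a one-liner per field. This sketch parses the example body above; `raw` stands in for the bytes returned by the HTTP client.

```python
import json

raw = '{"nsfw_score": 0.87, "classification": "nsfw"}'
result = json.loads(raw)

score = result["nsfw_score"]      # float between 0 and 1
label = result["classification"]  # "safe" or "nsfw"
```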
## Fields

| Field | Type | Description |
|---|---|---|
| `nsfw_score` | float | Value between 0 and 1 indicating the probability of NSFW content |
| `classification` | string | Returns `"safe"` or `"nsfw"` |
## Threshold Logic

- `nsfw_score` < 0.50 → `"safe"`
- `nsfw_score` ≥ 0.50 → `"nsfw"`

The threshold can be customized at the integration level.
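With a customizable cutoff, the threshold rule amounts to the following sketch. The function name is illustrative; the default of 0.50 is the documented value.

```python
DEFAULT_THRESHOLD = 0.50

def classify(nsfw_score: float, threshold: float = DEFAULT_THRESHOLD) -> str:
    """Map a probability score to the documented labels."""
    return "nsfw" if nsfw_score >= threshold else "safe"

# classify(0.49)                  -> "safe"
# classify(0.87)                  -> "nsfw"
# classify(0.87, threshold=0.90)  -> "safe" (stricter custom cutoff)
```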
## Error Responses
| Code | Message | Cause |
|---|---|---|
| 400 | "Missing image field" | No file or base64 string sent |
| 401 | "Unauthorized" | Invalid or missing API key |
| 422 | "Image unreadable or unsupported" | File corrupted or format unsupported |
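A client can branch on these status codes. The mapping below is a sketch; the action names and retry policy are the integrator's choice, not something the API prescribes.

```python
def handle_error(status: int) -> str:
    """Return an illustrative client action for each documented error code."""
    if status == 400:
        return "fix-request"    # attach the `image` field and resend
    if status == 401:
        return "check-key"      # refresh or correct the API key
    if status == 422:
        return "replace-image"  # re-encode the file or use a supported format
    return "unexpected"         # not documented for this endpoint
```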
## Notes

- This model uses an open-source NSFW classifier fine-tuned for general image filtering.
- This system does not store or log any images.
- Use responsibly and comply with local moderation laws and app store policies.
## Next Endpoint

Explore `/metadata` to extract EXIF and technical data from an image.