A nudity classifier for web browsers utilizing Yahoo's OpenNSFW model
classify
Run the OpenNSFW model against an image.
Returns: A promise that resolves to a single result or an array of results in the format { nsfw_confidence: (0.0 - 1.0), is_nsfw: (nsfw_confidence > this.nsfw_threshold) }
Parameter image: A single image or an array of images of these types: ImageData, HTMLImageElement, HTMLCanvasElement, ImageBitmap
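A minimal usage sketch for classify, assuming the classifier is exposed as a class with the async classify method documented above. The import path, class name, and constructor are illustrative assumptions; only classify() and the { nsfw_confidence, is_nsfw } result shape come from this page.

```typescript
// Hypothetical import path and class name; only classify() and the result
// shape { nsfw_confidence, is_nsfw } are taken from the documentation above.
import { NSFWClassifier } from 'opennsfw-browser-classifier';

async function checkImage(img: HTMLImageElement): Promise<boolean> {
  const classifier = new NSFWClassifier();

  // Depending on the implementation, the model may need to be loaded first
  // (see the model-loading method documented below).

  // A single image yields a single result; an array of images would yield
  // an array of results instead.
  const result = await classifier.classify(img);

  console.log(`NSFW confidence: ${result.nsfw_confidence.toFixed(3)}`);
  return result.is_nsfw;
}
```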
Load the model either from the project's GitHub repo or from the local cache (if it exists).
Returns: A void promise that resolves once the load has completed
Parameter: Whether or not to save the model locally immediately after loading (see this.save())
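The cache-or-download behaviour described above can be sketched with TensorFlow.js model IO. This assumes the port runs on TensorFlow.js; the model URL, the IndexedDB cache key, and the use of loadLayersModel are illustrative assumptions rather than details of this project.

```typescript
import * as tf from '@tensorflow/tfjs';

// Assumed locations; the real project defines its own URL and cache key.
const REMOTE_MODEL_URL = 'https://example.github.io/opennsfw/model.json';
const LOCAL_MODEL_KEY = 'indexeddb://opennsfw-model';

// Load from the local IndexedDB cache when present, otherwise fetch the
// remote copy; optionally persist it so the next load is local.
async function loadModel(save = false): Promise<tf.LayersModel> {
  try {
    return await tf.loadLayersModel(LOCAL_MODEL_KEY);
  } catch {
    const model = await tf.loadLayersModel(REMOTE_MODEL_URL);
    if (save) {
      await model.save(LOCAL_MODEL_KEY); // mirrors the save flag documented above
    }
    return model;
  }
}
```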
preprocess (Private)
Changes the image into a format and shape that is compatible with the model.
Returns: A tensor representation of the preprocessed image
Parameter: An image-like object on which to run the preprocessing
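As an illustration of the kind of transform this step performs, the sketch below uses TensorFlow.js to produce a model-ready tensor. The 224x224 input size, BGR channel order, and mean values follow the original Caffe OpenNSFW model's conventional preprocessing; whether this port uses exactly these values (or TensorFlow.js at all) is an assumption.

```typescript
import * as tf from '@tensorflow/tfjs';

// Sketch of a typical OpenNSFW-style preprocessing pipeline; the exact resize
// strategy and normalisation constants used by this port may differ.
function preprocess(
  image: ImageData | HTMLImageElement | HTMLCanvasElement | ImageBitmap
): tf.Tensor4D {
  return tf.tidy(() => {
    const rgb = tf.browser.fromPixels(image).toFloat();       // [h, w, 3], RGB
    const resized = tf.image.resizeBilinear(rgb, [224, 224]); // model input size
    const bgr = resized.reverse(2);                           // Caffe-style BGR channel order
    const centered = bgr.sub(tf.tensor1d([104, 117, 123]));   // per-channel mean subtraction
    return centered.expandDims(0) as tf.Tensor4D;             // add a batch dimension
  });
}
```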
An image classifier that determines how inappropriate an image is (with regard to nudity) as a percent certainty. This percentage roughly correlates with the amount of nudity present in the image.
Utilizes a ported version of Yahoo's OpenNSFW model.
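Pulling the documented members together, the public surface suggested by this page could be summarised with a TypeScript interface like the one below. The interface and type names are illustrative; only the member descriptions, the result fields, and the nsfw_threshold reference come from the documentation itself.

```typescript
// Illustrative summary of the documented API; the interface name and any
// undocumented details are assumptions, not the library's actual declarations.
interface ClassificationResult {
  nsfw_confidence: number; // 0.0 to 1.0
  is_nsfw: boolean;        // nsfw_confidence > nsfw_threshold
}

type ClassifiableImage =
  | ImageData
  | HTMLImageElement
  | HTMLCanvasElement
  | ImageBitmap;

interface OpenNSFWClassifier {
  /** Threshold above which a result is flagged as NSFW. */
  nsfw_threshold: number;

  /** Load the model from the project's GitHub repo or from the local cache. */
  load(save?: boolean): Promise<void>;

  /** Run the OpenNSFW model against one image or an array of images. */
  classify(image: ClassifiableImage): Promise<ClassificationResult>;
  classify(images: ClassifiableImage[]): Promise<ClassificationResult[]>;

  /** Save the model locally (referenced by the load documentation). */
  save(): Promise<void>;
}
```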