NSFW JS for React Native
Client-side indecent content checking, now on mobile!

You may have seen our recent announcement about NSFW JS and got excited about the possibility of doing client-side indecent content checking in your web and mobile apps, only to realize mobile apps weren’t supported. Well now, thanks to substantial work by Yannick Assogba, that’s about to change!
How Do I Get Started?
While the TFJS setup process is a bit involved (and may require you to disable auto-linking), the overall process is surprisingly straightforward.
1. Install and configure TensorFlow JS for React Native
2. Install and configure NSFW JS
3. Load the model:
await tf.ready()
this.model = await nsfwjs.load()
4. Load an image:
const imageAssetPath = Image.resolveAssetSource(require("./path/to/image.jpg"))
// fetch here is the binary-aware fetch exported by @tensorflow/tfjs-react-native
const response = await fetch(imageAssetPath.uri, {}, { isBinary: true })
const rawImageData = await response.arrayBuffer()
5. Convert the raw image data to a Tensor:
// jpeg comes from the jpeg-js package: import * as jpeg from 'jpeg-js'
imageToTensor(rawImageData: ArrayBuffer): tf.Tensor3D {
  const TO_UINT8ARRAY = true
  const { width, height, data } = jpeg.decode(rawImageData, TO_UINT8ARRAY)
  // Drop the alpha channel info for mobilenet
  const buffer = new Uint8Array(width * height * 3)
  let offset = 0 // offset into original data
  for (let i = 0; i < buffer.length; i += 3) {
    buffer[i] = data[offset]
    buffer[i + 1] = data[offset + 1]
    buffer[i + 2] = data[offset + 2]
    offset += 4
  }
  return tf.tensor3d(buffer, [height, width, 3])
}

const imageTensor = this.imageToTensor(rawImageData)
6. Classify the Tensor:
const predictions = await this.model.classify(imageTensor)
And there you have it: working client-side indecent content checking in a React Native app!
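Putting the steps together, here is a minimal sketch of what a component might look like. The component, method, and state names (NsfwDemo, classifyImage, this.model) are our own illustration, not prescribed by the libraries, and the setup details should follow the TFJS React Native and NSFW JS READMEs.
import React from 'react'
import { Image, Text, View } from 'react-native'
import * as tf from '@tensorflow/tfjs'
import { fetch } from '@tensorflow/tfjs-react-native'
import * as nsfwjs from 'nsfwjs'
import * as jpeg from 'jpeg-js'

class NsfwDemo extends React.Component {
  state = { predictions: null }

  async componentDidMount() {
    await tf.ready()                  // wait for the TFJS React Native backend
    this.model = await nsfwjs.load()  // loads the default hosted NSFW JS model
    this.classifyImage()
  }

  // Same conversion as step 5 above
  imageToTensor(rawImageData) {
    const { width, height, data } = jpeg.decode(rawImageData, true) // true = return a Uint8Array
    const buffer = new Uint8Array(width * height * 3)
    let offset = 0
    for (let i = 0; i < buffer.length; i += 3) {
      buffer[i] = data[offset]
      buffer[i + 1] = data[offset + 1]
      buffer[i + 2] = data[offset + 2]
      offset += 4
    }
    return tf.tensor3d(buffer, [height, width, 3])
  }

  async classifyImage() {
    const imageAssetPath = Image.resolveAssetSource(require('./path/to/image.jpg'))
    const response = await fetch(imageAssetPath.uri, {}, { isBinary: true })
    const rawImageData = await response.arrayBuffer()
    const imageTensor = this.imageToTensor(rawImageData)
    const predictions = await this.model.classify(imageTensor)
    // predictions looks like [{ className: 'Neutral', probability: 0.98 }, ...]
    this.setState({ predictions })
  }

  render() {
    const { predictions } = this.state
    return (
      <View>
        <Text>{predictions ? JSON.stringify(predictions) : 'Classifying…'}</Text>
      </View>
    )
  }
}

export default NsfwDemo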
Note: We intend to move step 5 into a TFJS helper function in a f̵u̵t̵u̵r̵e̵ pull request, which would significantly simplify this process.
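For reference, newer releases of @tensorflow/tfjs-react-native ship a decodeJpeg helper that covers the same ground. If your installed version has it (something to verify against your version's docs), the conversion can shrink to roughly this:
import { decodeJpeg } from '@tensorflow/tfjs-react-native'

// rawImageData is the ArrayBuffer fetched in step 4
const imageTensor = decodeJpeg(new Uint8Array(rawImageData))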
Getting More Advanced
Classifying a static image is cool, but on its own it's not very useful for real-world use cases. With `react-native-image-picker` we can spice things up a bit and allow the user to select a photo from their phone or take a picture with their camera for evaluation.
1. Install & configure React Native Image Picker
2. Allow the user to select an image
selectImage = () => {
  const options = {
    title: 'Select Image',
    storageOptions: {
      skipBackup: true,
      path: 'images',
    },
  }

  ImagePicker.showImagePicker(options, (response) => {
    console.log('Response = ', response)

    if (response.didCancel) {
      console.log('User cancelled image picker')
    } else if (response.error) {
      console.log('ImagePicker Error: ', response.error)
    } else if (response.customButton) {
      console.log('User tapped custom button: ', response.customButton)
    } else {
      const source = { uri: response.uri }
      // Classify once the new image is in state, since setState is asynchronous
      this.setState({ image: source, predictions: null }, () => this.classifyImage())
    }
  })
}
And trigger the function from somewhere in your UI, e.g. with React Native's Button component: <Button title="Select Image" onPress={this.selectImage} />
3. Update your image loading to use the selected image
const imageAssetPath = Image.resolveAssetSource(this.state.image)
And just like that, you've gone from static image classification to classifying user-selected images!
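Concretely, the updated classifyImage might look something like this minimal sketch (the method and state names are our own, carried over from the earlier example):
async classifyImage() {
  // this.state.image is the { uri } source set by selectImage above
  const imageAssetPath = Image.resolveAssetSource(this.state.image)
  const response = await fetch(imageAssetPath.uri, {}, { isBinary: true })
  const rawImageData = await response.arrayBuffer()
  const imageTensor = this.imageToTensor(rawImageData)
  const predictions = await this.model.classify(imageTensor)
  this.setState({ predictions })
}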
NSFW JS Mobile App
In our example app, we've gone ahead and implemented some additional features, like blurring the image until the classification tells us whether the content is safe, and displaying the classifications to the user. Of course, how you use NSFW JS in your app is completely up to you; you could simply decline to upload indecent content, or go all-out and blacklist users who try to upload indecent content too frequently. We are releasing this open source example app just in time for Digital Ocean's Hacktoberfest, so clone the project and give it a spin. Contributions and issue reports are welcome!
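To give a taste of how the blur-until-classified behavior could work, here is a minimal sketch using React Native's blurRadius prop on Image. The renderImage helper, the class-name list, and the probability threshold are our own illustration, not part of NSFW JS.
// Hypothetical render helper: keep the image blurred until predictions arrive and look safe
renderImage() {
  const { image, predictions } = this.state
  const unsafeClasses = ['Porn', 'Hentai', 'Sexy']
  const isSafe =
    predictions &&
    !predictions.some(p => unsafeClasses.includes(p.className) && p.probability > 0.7)

  return (
    <Image
      source={image}
      style={{ width: 300, height: 300 }}
      blurRadius={isSafe ? 0 : 30}
    />
  )
}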
Update: The example app has been updated to load the NSFW JS model locally, which substantially decreases the loading time and network load.
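If you want to do the same, the general idea is to bundle the model files with the app and load them from there. Below is a rough sketch using tfjs-react-native's bundleResourceIO; the asset paths are placeholders, and whether your installed nsfwjs version accepts a tf.io handler in place of a URL is something to check against its docs.
import * as tf from '@tensorflow/tfjs'
import { bundleResourceIO } from '@tensorflow/tfjs-react-native'
import * as nsfwjs from 'nsfwjs'

// Hypothetical asset paths -- ship model.json and the weight shard(s) with the app bundle
const modelJson = require('./assets/model/model.json')
const modelWeights = require('./assets/model/group1-shard1of1.bin')

async function loadModelLocally() {
  await tf.ready()
  // Assumption: this version of nsfwjs.load accepts a tf.io handler instead of a URL
  return nsfwjs.load(bundleResourceIO(modelJson, modelWeights))
}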

Many thanks to Yannick Assogba for doing the heavy lifting to get TFJS running on React Native and to Gant Laborde for building NSFW JS and providing guidance on the example app.