Company:
Date Published:
Author: Team Clarifai
Word count: 1133
Language: English
Hacker News points: None

Summary

The post walks through a small, entirely client-side JavaScript project that uses the Clarifai API to screen user-uploaded images for Not Safe For Work (NSFW) content before they are submitted. The form's submit button is disabled with JavaScript until an uploaded image is verified as Safe For Work (SFW). To set up the project, users create a Clarifai account to obtain a Client ID and Secret, which the project's JavaScript files need in order to authenticate with the API. Each selected image file is converted to a Base64 string and sent to Clarifai's NSFW model, whose prediction is compared against a predefined safe threshold to decide whether the content is appropriate. The page gives visual feedback on the file's approval status, making the approach useful for applications where inappropriate content is undesirable, such as online contests or platforms with user-uploaded listings.
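The gating logic the summary describes can be sketched in a few lines of JavaScript. This is a minimal, hypothetical sketch, not the post's actual code: the function name `isSafeForWork`, the `0.85` threshold, and the shape of the concept list are all illustrative assumptions about what Clarifai's NSFW model returns.

```javascript
// Assumed cutoff for treating an image as SFW; the post's exact value may differ.
const SAFE_THRESHOLD = 0.85;

// Given a concept list in the shape the NSFW model is assumed to return,
// e.g. [{ name: "sfw", value: 0.97 }, { name: "nsfw", value: 0.03 }],
// decide whether the upload should be allowed.
function isSafeForWork(concepts, threshold = SAFE_THRESHOLD) {
  const sfw = concepts.find((c) => c.name === "sfw");
  return Boolean(sfw) && sfw.value >= threshold;
}

// In the browser, a FileReader converts the chosen file to a Base64 string
// before it is sent to the API (sketch only; form and event wiring omitted):
//
// const reader = new FileReader();
// reader.onload = () => {
//   const base64 = reader.result.split(",")[1]; // strip the "data:...;base64," prefix
//   // ...send base64 to the Clarifai NSFW model, then on response:
//   // submitButton.disabled = !isSafeForWork(response.concepts);
// };
// reader.readAsDataURL(fileInput.files[0]);
```

The submit button stays disabled by default and is only re-enabled once the prediction clears the threshold, so a slow or failed API call never lets an unchecked image through.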