# Automatically Moderate Images
This sample demonstrates how to automatically moderate offensive images uploaded to Firebase Storage. It uses the Google Cloud Vision API to detect whether an image contains adult or violent content and, if so, uses ImageMagick to blur the image.
See `functions/index.js` for the moderation code.
The detection of adult and violent content in an image is done using the Google Cloud Vision API.
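The Vision API's SafeSearch annotation reports each category as a likelihood string (`VERY_UNLIKELY` through `VERY_LIKELY`). A minimal sketch of the decision step, using a hypothetical helper name (`shouldBlur` is not from the sample):

```javascript
// Hypothetical helper (not the sample's code): decide whether an image should
// be blurred, given the SafeSearch annotation returned by the Cloud Vision API.
// Vision reports likelihoods as strings such as 'UNLIKELY' or 'VERY_LIKELY'.
const OFFENSIVE_LIKELIHOODS = ['LIKELY', 'VERY_LIKELY'];

function shouldBlur(safeSearchAnnotation) {
  const {adult, violence} = safeSearchAnnotation;
  return OFFENSIVE_LIKELIHOODS.includes(adult) ||
      OFFENSIVE_LIKELIHOODS.includes(violence);
}

console.log(shouldBlur({adult: 'VERY_LIKELY', violence: 'UNLIKELY'})); // true
console.log(shouldBlur({adult: 'UNLIKELY', violence: 'UNLIKELY'}));    // false
```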
The image blurring is performed using ImageMagick, which is installed by default on all Cloud Functions instances. The image is first downloaded locally from the Firebase Storage bucket to the `tmp` folder using the `google-cloud` SDK.
The dependencies are listed in `functions/package.json`.
The function triggers on upload of any file to your Firebase project's default Cloud Storage bucket.
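Because the trigger fires on every upload to the bucket, the function generally needs to filter what it processes. A minimal sketch of such a guard, with assumed field names (the `blurred` metadata flag is a hypothetical convention, not necessarily what the sample uses):

```javascript
// Hypothetical guard (not the sample's code): given the Storage object metadata
// delivered to the trigger, decide whether the file needs moderation. Skipping
// already-blurred images prevents the function from re-triggering itself when
// it re-uploads the blurred result.
function needsModeration(object) {
  const isImage = Boolean(object.contentType) &&
      object.contentType.startsWith('image/');
  const alreadyBlurred = Boolean(object.metadata) &&
      object.metadata.blurred === 'true';
  return isImage && !alreadyBlurred;
}

console.log(needsModeration({contentType: 'image/jpeg'})); // true
console.log(needsModeration({contentType: 'text/plain'})); // false
```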
To deploy the sample:

 - Clone or download this repo and open the `moderate-images` directory.
 - You must have the Firebase CLI installed. If you don't, install it with `npm install -g firebase-tools` and then configure it with `firebase login`.
 - Configure the CLI locally by using `firebase use --add` and selecting your project in the list.
 - Install dependencies locally by running `cd functions; npm install; cd -`.

To test the sample:

 - Deploy your function using `firebase deploy`.