NsfwSpy.NET and NsfwSpy.js
These projects are ecosystem siblings: two language implementations of the same NSFW content detection logic, one targeting .NET environments (written in C#) and the other targeting JavaScript environments (written in TypeScript).
About NsfwSpy.NET
NsfwSpy/NsfwSpy.NET
A .NET image and video classifier, written in C#, used to identify explicit/pornographic content.
This tool helps online platforms and app developers automatically identify explicit or unwanted content in images and videos uploaded by users. It takes in various image and video formats and outputs a classification indicating whether the content is "Pornography," "Sexy," "Hentai," or "Neutral." This is ideal for content moderators, community managers, and platform administrators who need to maintain a safe and appropriate environment for their users.
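To illustrate the four-category output described above, here is a minimal sketch of how a caller might pick the winning label from a set of per-category scores. The score object shape and lowercase label names are assumptions for illustration, not the library's actual API.

```typescript
// Hypothetical score shape: one probability per category, as described above.
type NsfwScores = {
  pornography: number;
  sexy: number;
  hentai: number;
  neutral: number;
};

// Return the category with the highest score (a simple argmax).
function predictedLabel(scores: NsfwScores): keyof NsfwScores {
  return (Object.keys(scores) as (keyof NsfwScores)[]).reduce((best, label) =>
    scores[label] > scores[best] ? label : best
  );
}

const example: NsfwScores = { pornography: 0.02, sexy: 0.08, hentai: 0.01, neutral: 0.89 };
console.log(predictedLabel(example)); // "neutral"
```

A real classifier result would come from the library itself; this only shows how the four categories relate to a single predicted label.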
About NsfwSpy.js
NsfwSpy/NsfwSpy.js
A JavaScript image classifier, written in TypeScript, used to identify explicit/pornographic content.
This helps website and app developers automatically screen user-submitted images for explicit content. It takes an image file or element and categorizes it as pornography, sexy, hentai, or neutral. Developers would integrate this into their user-generated content moderation systems.
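The moderation-system integration mentioned above usually means mapping classifier scores to an allow/review/block decision. The sketch below shows one such gate; the thresholds and the combined "unsafe" score are illustrative choices a platform might make, not part of the library.

```typescript
// Hypothetical score shape matching the four categories described above.
type NsfwScores = { pornography: number; sexy: number; hentai: number; neutral: number };

type Decision = "allow" | "review" | "block";

// Illustrative thresholds; real systems would tune these to their own policy.
function moderate(scores: NsfwScores, blockAt = 0.8, reviewAt = 0.4): Decision {
  // Treat everything except "neutral" as potentially unsafe.
  const unsafe = scores.pornography + scores.sexy + scores.hentai;
  if (unsafe >= blockAt) return "block";
  if (unsafe >= reviewAt) return "review";
  return "allow";
}

console.log(moderate({ pornography: 0.85, sexy: 0.05, hentai: 0.02, neutral: 0.08 })); // "block"
```

A three-way decision like this lets clearly safe and clearly unsafe uploads be handled automatically while routing borderline cases to human moderators.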