View Issue Details
ID | Project | Category | View Status | Date Submitted | Last Update
---|---|---|---|---|---
0005570 | Composr | galleries | public | 2024-01-21 15:53 | 2024-07-25 19:50

Field | Value
---|---
Reporter | Chris Graham
Assigned To |
Severity | Feature-request
Status | non-assigned
Resolution | open
Product Version |
Fixed in Version |
Summary | 0005570: Generative AI poisoning
Description | Consider optional integration of a library to poison uploaded images so that AI bots screw up their models if they try to use them: https://nightshade.cs.uchicago.edu/whatis.html
Tags | Type: Anti-big-tech
Time estimation (hours) | 4
Sponsorship open |
related to | 0005548 (non-assigned): Allow easy GPT crawl blocking
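The related issue (0005548, easy GPT crawl blocking) is usually handled with robots.txt rules rather than image poisoning. As an illustrative sketch, the publicly documented crawler user-agent tokens can be disallowed like this (crawlers are not obliged to honour robots.txt, so this is advisory only):

```
# Block OpenAI's GPTBot crawler (user-agent token documented by OpenAI)
User-agent: GPTBot
Disallow: /

# Block Common Crawl's bot, whose corpus is widely used for model training
User-agent: CCBot
Disallow: /
```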
Note 0008220 (Patrick Schmalstig, 2024-01-21 17:13)

A typical Composr user is unlikely to have a web server capable of running Glaze or Nightshade: these tools require a large amount of data, substantial RAM, and a decent GPU. I'd suggest we write some sort of AI protection / awareness documentation page that covers Nightshade and Glaze and how to use them in conjunction with Composr, rather than implementing them in Composr directly. Alternatively, if we can find an external SaaS API that does this, we could integrate that into Composr.
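The SaaS-integration idea in the note could be sketched as an upload hook that routes gallery images through an external poisoning service. No such service is named in the issue, so the endpoint URL, the supported image types, and the synchronous request flow below are all illustrative assumptions, not a real API:

```python
import mimetypes
import urllib.request

# Hypothetical endpoint; no real poisoning SaaS is named in the issue.
POISONING_API_URL = "https://example.com/api/v1/poison"

# Image types a gallery upload hook might route through the service
# (an assumption; a real service would document its own list).
SUPPORTED_TYPES = {"image/jpeg", "image/png", "image/webp"}


def is_poisonable(filename: str) -> bool:
    """Return True if the upload is an image type the hypothetical
    poisoning service could process, judged by file extension."""
    guessed, _ = mimetypes.guess_type(filename)
    return guessed in SUPPORTED_TYPES


def poison_image(image_bytes: bytes, content_type: str) -> bytes:
    """POST the raw image to the hypothetical SaaS endpoint and return
    the perturbed image bytes. This call blocks; a real integration
    would queue it as a background task so uploads stay responsive."""
    request = urllib.request.Request(
        POISONING_API_URL,
        data=image_bytes,
        headers={"Content-Type": content_type},
        method="POST",
    )
    with urllib.request.urlopen(request, timeout=60) as response:
        return response.read()
```

Because poisoning is compute-heavy even server-side, such an integration would likely need to be optional per-gallery, consistent with the "optional integration" framing in the issue description.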
Date Modified | Username | Field | Change |
---|---|---|---|
2024-01-21 15:53 | Chris Graham | New Issue | |
2024-01-21 15:54 | Chris Graham | Tag Attached: Type: Anti-big-tech | |
2024-01-21 17:13 | Patrick Schmalstig | Note Added: 0008220 | |
2024-01-21 17:14 | Patrick Schmalstig | Note Edited: 0008220 | View Revisions |
2024-01-21 17:14 | Patrick Schmalstig | Note Edited: 0008220 | View Revisions |
2024-01-21 17:16 | Patrick Schmalstig | Note Edited: 0008220 | View Revisions |
2024-07-25 19:50 | Chris Graham | Relationship added | related to 0005548 |