View Issue Details

ID: 0005570
Project: Composr
Category: galleries
View Status: public
Last Update: 2024-07-25 19:50
Reporter: Chris Graham
Assigned To:
Severity: Feature-request
Status: non-assigned
Resolution: open
Product Version:
Fixed in Version:
Summary: 0005570: Generative AI poisoning
Description: Consider optional integration of a library to poison uploaded images so that AI bots screw up their models if they try to use them:
https://nightshade.cs.uchicago.edu/whatis.html
Tags: Type: Anti-big-tech
Time estimation (hours): 4
Sponsorship: open

Relationships

related to 0005548 non-assigned Allow easy GPT crawl blocking 

Activities

Patrick Schmalstig

2024-01-21 17:13

administrator   ~0008220

Last edited: 2024-01-21 17:16


A typical Composr user will likely not have a web server capable of running Glaze or Nightshade. These tools require quite a large amount of data / RAM and a decent GPU.

I'd suggest we make some sort of AI protection / awareness documentation page that covers Nightshade and Glaze and how to use them in conjunction with Composr, rather than implementing them in Composr directly.

Or, if we can find an external SaaS API that does this, we could incorporate that in Composr.
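If such a SaaS API were found, the integration shape would be straightforward. A minimal sketch, assuming a hypothetical HTTP endpoint that accepts an image and returns a poisoned version (the URL, payload format, and service itself are assumptions for illustration; no such public service is confirmed in this ticket):

```python
# Hypothetical sketch: off-loading image poisoning to an external SaaS API
# at upload time. The endpoint and its protocol are illustrative only.
import urllib.request
import urllib.error

POISON_API_URL = "https://example.com/api/v1/poison"  # hypothetical endpoint


def poison_image(image_bytes: bytes, timeout: float = 10.0) -> bytes:
    """POST an uploaded image to the poisoning service.

    On any failure (service down, timeout, rejection) fall back to the
    unmodified original, so uploads never break if the service is unavailable.
    """
    request = urllib.request.Request(
        POISON_API_URL,
        data=image_bytes,
        headers={"Content-Type": "application/octet-stream"},
        method="POST",
    )
    try:
        with urllib.request.urlopen(request, timeout=timeout) as response:
            return response.read()
    except (urllib.error.URLError, OSError):
        # Service unreachable or image rejected: keep the original bytes.
        return image_bytes
```

The fail-open fallback matters here: since poisoning is an optional extra, a site should never lose an upload because a third-party service is down.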

Issue History

Date Modified Username Field Change
2024-01-21 15:53 Chris Graham New Issue
2024-01-21 15:54 Chris Graham Tag Attached: Type: Anti-big-tech
2024-01-21 17:13 Patrick Schmalstig Note Added: 0008220
2024-01-21 17:14 Patrick Schmalstig Note Edited: 0008220
2024-01-21 17:14 Patrick Schmalstig Note Edited: 0008220
2024-01-21 17:16 Patrick Schmalstig Note Edited: 0008220
2024-07-25 19:50 Chris Graham Relationship added related to 0005548