• 0 Posts
  • 2 Comments
Joined 1Y ago
Cake day: Jul 13, 2023


Respectfully, I worked at Alexa AI on compositional ML, and we were largely able to do exactly this with customer utterances, so saying it is impossible is simply not true. Many companies need some ability to remove troublesome data, and while tracing data inside a trained model is difficult (historically it has been handled while building datasets, or measured at evaluation time), it is definitely something most big tech companies do.
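To make the dataset-level approach concrete, here is a minimal sketch of provenance tracking: each training example carries a source ID, so all records from a given source can be dropped before the model is (re)built. All names here are hypothetical and purely illustrative, not Alexa's actual pipeline.

```python
# Hypothetical provenance-tracking sketch: every example records its
# source, so a deletion request can be honored at dataset-build time.

def remove_source(dataset, source_id):
    """Return a new dataset with all examples from source_id dropped."""
    return [ex for ex in dataset if ex["source"] != source_id]

dataset = [
    {"source": "user_123", "text": "turn on the lights"},
    {"source": "user_456", "text": "play some jazz"},
    {"source": "user_123", "text": "set a timer"},
]

# Honor a deletion request from user_123, then retrain on the result.
cleaned = remove_source(dataset, "user_123")
print(cleaned)  # only the user_456 example remains
```

The point is that tracing happens in the data pipeline, not inside the trained weights: once the offending source is filtered out, the model is rebuilt from the cleaned dataset.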


I’m fine with that, but let’s put some rules around this.

  • Any AI models should be able to determine the source of their data to a defined level of accuracy.
  • There should be a well-defined way to block data from being used by AI. If such a block (e.g. robots.txt) is breached, the model has to be rebuilt without that data, and reparations made to the content owners.
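The robots.txt mechanism mentioned above already has standard tooling: Python's `urllib.robotparser` can check whether a crawler is allowed to fetch a page before its content is collected. The rules and bot name below are hypothetical examples.

```python
# Sketch of honoring robots.txt before collecting training data, using
# the standard-library urllib.robotparser. The rules below are a made-up
# example that disallows a hypothetical AI crawler entirely.
from urllib import robotparser

rules = """\
User-agent: ExampleAIBot
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("ExampleAIBot", "https://example.com/article"))  # False
print(rp.can_fetch("OtherBot", "https://example.com/article"))      # True
```

A compliant scraper would call `can_fetch` for every URL and skip disallowed ones; under the rule proposed above, data collected in violation of such a check would have to be removed and the model rebuilt.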