Key paragraph: «It argued that AI systems do not "understand" what they process in the same way humans do. "As philosopher Luciano Floridi puts it, AI acts without understanding – it follows statistical patterns rather than engaging with meaning. This difference matters legally," it said.»
That analysis distinguishes between human-like understanding and other unanticipated uses. As I read it, it says that using something for human-like understanding is fair use, and that any other use is fair only if it falls in a class anticipated by law. But search engines weren't anticipated by law, nor was the software used by plagiarism hunters, and both involve internal copying in much the same way as AI training does.
Oh, this looks worrying, and perhaps wrong.
Here's a search engine I like: https://books.google.com/ngrams/