Thinking about this more generally ...
"Training AI and LLMs" is the machine version of getting an entity to learn something. The flesh-and-blood analog is "getting a person (student) to learn something."
If a human-person reads your book, presumably you get a bit of a royalty (roughly the proceeds of selling one copy of your book). The human-person then goes on to do whatever with whatever they learned from reading your book. Short of that human-person making additional copies of your book or quoting substantial content from it (to the point of violating copyright), you derive no "residual" income from that human-person who has read your book.
So maybe the AI developer is thinking/arguing "ok, I buy one copy of [your] book from the publisher to train my LLM", akin to "I buy a copy of [your] book from the publisher/author to train my human-employee." You get no further share of whatever economic value is generated by the LLM, just as you get no further share of whatever value is generated by the human-employee.
Of course an LLM is not the same as a human-employee -- the LLM can be duplicated into a billion exact copies, will never quit its job, has perfect memory, and will effectively live forever. I.e., the long-term potential economic value that can be generated by the LLM may be "infinite" compared to that of a human-employee.
Will that argument hold up? Is the arrangement "fair"? I suppose that's the billion-dollar question to be shaken out in the years to come.