Some authors have it backwards. They believe that AI companies should pay them for training AIs on their books. But I predict in a very short while, authors will be paying AI companies to ensure that their books are included in the education and training of AIs. The authors (and their publishers) will pay in order to have influence on the answers and services the AIs provide. If your work is not known and appreciated by the AIs, it will be essentially unknown.
If AIs become the arbiters of truth, and if what they trained on matters, then I want my ideas and creative work to be paramount in what they see. I would very much like my books to be the textbooks for AI. What author would not? I would. I want my influence to extend to the billions of people coming to the AIs every day, and I might even be willing to pay for that, or at least to do what I can to facilitate the ingestion of my work into the AI minds.
If a book can be more easily parsed by an AI, its influence will be greater. Therefore many books will be written and formatted with an eye on their main audience. Writing for AIs will become a skill like any other, and something you can get better at. Authors could actively seek to optimize their work for AI ingestion, perhaps even collaborating with AI companies to ensure their content is properly understood and integrated. The concept of “AI-friendly” writing, with clear structures, explicit arguments, and well-defined concepts, will gain prominence, and will of course be assisted by AI.
I tend to agree. Especially because any attempt to enforce a world in which everyone simply avoids training AI on copyrighted public material is doomed to fail. But also because most great work (and who doesn't want to produce great work?) is designed for influence and impact rather than for seeking rent.
Bigger = better
…is building a shitton of datacenters. “Bigger = better” will turn out to be a fata morgana.