How an AI-written book shows why the tech 'frightens' creatives
For Christmas I received an intriguing present from a friend - my very own "best-selling" book.
"Tech-Splaining for Dummies" (excellent title) bears my name and my image on its cover, and it has glowing reviews.
Yet it was written entirely by AI, with a few simple prompts about me supplied by my friend Janet.
It's an intriguing read, and hilarious in parts. But it also meanders rather a lot, sitting somewhere between a self-help book and a stream of anecdotes.
It mimics my chatty style of writing, but it's also a bit repetitive and very verbose. It may have gone beyond Janet's prompts in drawing on data about me.
Several sentences begin "as a leading technology journalist ..." - cringe - which could have been scraped from an online bio.
There's also a mysterious, recurring hallucination in the form of my cat (I have no pets). And there's a metaphor on almost every page - some more random than others.
There are plenty of firms online offering AI book-writing services. My book was from BookByAnyone.
When I contacted the firm's president, Adir Mashiach, who is based in Israel, he told me he had sold around 150,000 personalised books, mainly in the US, since pivoting from compiling AI-generated travel guides in June 2024.
A paperback copy of your own 240-page bestseller costs £26. The firm uses its own AI tools to generate them, based on an open source large language model.
I'm not asking you to buy my book. In fact you can't - only Janet, who created it, can order any more copies.
There is currently no barrier to anyone producing one in anyone's name, including celebrities - although Mr Mashiach says there are guardrails around violent content. Each book contains a printed disclaimer stating that it is fictional, created by AI, and designed "solely to bring humour and joy".
Legally, the copyright belongs to the firm, but Mr Mashiach stresses that the product is intended as a "personalised gag gift", and the books are not sold on further.
He hopes to broaden his range, generating different genres such as sci-fi, and perhaps offering an autobiography service. It's designed to be a light-hearted form of consumer AI - selling AI-generated goods to human customers.
It's also a little frightening if, like me, you write for a living. Not least because it probably took less than a minute to generate, and it does, certainly in some parts, sound just like me.
Musicians, authors, artists and actors worldwide have expressed alarm about their work being used to train generative AI tools that then churn out similar content based upon it.
"We should be clear, when we are talking about data here, we actually mean human creators' life works," says Ed Newton Rex, founder of Fairly Trained, which campaigns for AI firms to respect creators' rights.
"This is books, this is articles, this is photos. It's works of art. It's records ... The whole point of AI training is to learn how to do something and then do more like that."
In 2023 a song featuring AI-generated voices of Canadian singers Drake and The Weeknd went viral on social media before being pulled from streaming platforms because it was not their work and they had not consented to it. That didn't stop the track's creator trying to nominate it for a Grammy award. And even though the artists were fake, it was still hugely popular.
"I don't think the use of generative AI for creative purposes should be banned, but I do think that generative AI for these purposes that is trained on people's work without consent should be banned," Mr Newton Rex adds. "AI can be really powerful but let's build it ethically and fairly."
In the UK some organisations - including the BBC - have chosen to block AI developers from trawling their online content for training purposes. Others have decided to collaborate - the Financial Times has partnered with ChatGPT developer OpenAI, for example.
The UK government is considering an overhaul of the law that would allow AI developers to use creators' content on the internet to help develop their models, unless the rights holders opt out.
Ed Newton Rex describes this as "madness".
He points out that AI can make advances in areas like defence, healthcare and logistics without trawling the work of authors, journalists and artists.
"All of these things work without going and changing copyright law and destroying the livelihoods of the country's creatives," he argues.
Baroness Kidron, a member of the House of Lords, is also strongly against removing copyright law for AI.
"Creative industries are wealth creators, 2.4 million jobs and a huge amount of joy," says the Baroness, who is also an adviser to the Institute for Ethics in AI at Oxford University.
"The government is undermining one of its best performing industries on the vague promise of growth."
A government spokesperson said: "No move will be made until we are absolutely confident we have a practical plan that delivers each of our objectives: increased control for rights holders to help them license their content, access to high-quality material to train leading AI models in the UK, and more transparency for rights holders from AI developers."
Under the UK government's new AI plan, a national data library containing public data from a wide range of sources will also be made available to AI researchers.
In the US the future of federal rules to regulate AI is now up in the air following President Trump's return to the presidency.
In 2023 President Biden signed an executive order that aimed to boost the safety of AI with, among other things, firms in the sector required to share details of the workings of their systems with the US government before they are released.
But this has now been repealed by Trump. It remains to be seen what Trump will do instead, but he is said to want the AI sector to face less regulation.
This comes as a number of lawsuits against AI firms, and in particular against OpenAI, continue in the US. They have been brought by everyone from the New York Times to authors, music labels, and even a comedian.
They allege that the AI firms broke the law when they took their content from the internet without their consent, and used it to train their systems.
The AI firms argue that their actions fall under "fair use" and are therefore exempt. There are a number of factors which can constitute fair use - it's not a straightforward definition. But the AI sector is under increasing scrutiny over how it gathers training data and whether it should be paying for it.
If all this wasn't enough to ponder, Chinese AI firm DeepSeek has shaken the sector over the past week. It became the most downloaded free app on Apple's US App Store.
DeepSeek claims that it developed its technology for a fraction of the cost of the likes of OpenAI. Its success has raised security concerns in the US, and challenges America's current dominance of the sector.
As for me and a career as an author, I think that at the moment, if I really want a "bestseller" I'll still have to write it myself. If anything, Tech-Splaining for Dummies highlights the current weakness of generative AI tools for larger projects. It is full of errors and hallucinations, and it can be quite difficult to read in parts because it's so long-winded.
But given how quickly the tech is evolving, I'm not sure how long I can remain confident that my considerably slower human writing and editing skills are any better.