For Christmas, I received an amusing gift from a friend – a "bestselling book" all about me.
The book, titled "Tech Explanations for Dummies" (a great title, indeed), featured my name and photo on the cover, along with glowing reviews. However, the book was entirely written by artificial intelligence, with my friend Janet only providing a few simple prompts about me.
The book was amusing to read, and parts of it were quite comical. But it also rambled, landing somewhere between a self-help book and a collection of anecdotes. It mimicked my chatty writing style, but it was repetitive and very verbose. It also appeared to have gathered data about me beyond Janet’s prompts. Several sentences began with “As a leading tech journalist…” (embarrassing), a phrase likely lifted from my online bios.
There was also a mysterious, recurring hallucination of my cat (I don’t have a pet). And almost every page featured a metaphor – some more random than others.

There are dozens of companies online offering AI book-writing services. Mine came from BookByAnyone.
I contacted the CEO, Adil Masiah, based in Israel, who told me that since pivoting from compiling AI-generated travel guides in June 2024, he has sold around 150,000 personalized books, mainly in the US. A 240-page paperback bestseller costs £26. The company uses its own AI tools to generate the books, which are based on open-source large language models.
I’m not asking you to buy my book. You can’t actually – only Janet, who created it, can order more copies. Currently, anyone can create a book in anyone’s name, including celebrities – though Mr. Masiah says there are safeguards in place for abusive content. Each book contains a printed disclaimer stating that it is fictional, created by AI, and intended for “pure humour and joy”.
Legally, the copyright belongs to the company, but Mr. Masiah stresses that the product is intended as a “personalized gag gift” and the books are not sold on further. He hopes to expand the product range to generate different kinds of books, such as science fiction, and perhaps offer an autobiography service. It is intended to be a light-hearted form of consumer AI – selling AI-generated products to human customers.
It’s also a little scary if you make a living from writing, like I do. Especially as it probably took less than a minute to generate, and in parts, it really did sound like me. Musicians, writers, artists and actors around the world have raised concerns about their work being used to train generative AI tools, which then create similar content based on it.
“We should be clear that when we talk about data here, we’re actually talking about the life’s work of human creators,” says Ed Newton-Rex, founder of Fairly Trained, an organisation campaigning for AI companies to respect creators’ rights. “It’s books, it’s articles, it’s photographs. It’s artwork. It’s recordings… The whole point of training AI is to learn how to do something, and then do more of that.”
In 2023, a song featuring AI-generated voices of Canadian artists Drake and The Weeknd went viral on social media, before being removed from streaming platforms because it was not their work and they had not consented to it. That didn’t stop the song’s creator from trying to get it nominated for a Grammy. Despite the vocals being fake, the song remained hugely popular.
“I don’t think generative AI should be banned for creative purposes, but I do think generative AI that is trained on people’s work without permission should be banned,” adds Mr. Newton-Rex. “AI can be incredibly powerful, but let’s build it ethically and fairly.”
In the UK, some organisations – including the BBC – have chosen to block AI developers from scraping their online content for training. Others have decided to collaborate – for example, the Financial Times has partnered with OpenAI, the creators of ChatGPT. The UK government is considering changing the law to allow AI developers to use creators’ content from the internet to help develop their models, unless rights holders opt out.
Ed Newton-Rex describes this as “insane”. He points out that AI can make progress in areas such as defence, healthcare and logistics, without scraping the work of writers, journalists and artists. “All of those things can work without changing copyright law and destroying the livelihoods of the country’s creative people,” he argues.
Baroness Kidron, a cross-party peer in the House of Lords, is also strongly opposed to removing copyright law for AI. “The creative industries are wealth creators, with 2.4 million jobs and countless happiness,” says the Baroness, who is also an advisor to the Oxford Institute for Ethics in AI. “The government is undermining one of its best performing industries on a vague promise of growth.”
A government spokesperson said: “No action will be taken until we are fully confident that we have a workable plan that delivers on each of our objectives: enhancing rights holders’ control and helping them to license their content; access to high quality material to train the UK’s leading AI models; and greater transparency from AI developers to rights holders.”
Under the UK government’s new AI plan, a national data library of public data from various sources will also be made available to AI researchers. In the US, the future of federal AI regulation is now up in the air following President Trump’s return to the White House.
In 2023, then-President Biden signed an executive order aimed at improving the safety of AI, which required companies in the sector to share details of how their systems work with the US government before releasing them. Trump has now revoked that order. It remains to be seen what measures he will take instead, but he is said to favour lighter regulation of the AI industry.
Meanwhile, a number of lawsuits against AI companies, particularly against OpenAI, are continuing in the US. Lawsuits have been filed by everyone from the New York Times to writers, record labels, and even a comedian. They claim that AI companies are breaking the law by taking their content from the internet without their consent, and using it to train their systems.
The AI companies argue that their actions fall under “fair use” and are therefore exempt. There are many factors that can constitute fair use – it’s not a straightforward definition. But the AI industry is facing increasing scrutiny over how it gathers training data, and whether it should be paying for it.
If that wasn’t enough to ponder, the past week has seen the industry shaken by the Chinese AI company, DeepSeek. It became the most downloaded free app on Apple’s US app store. DeepSeek claims it developed its technology for a fraction of the cost of companies such as OpenAI. Its success has sparked security concerns in the US, and threatens US dominance in the sector.
As for me and my writing career, I think for now, if I really want a “bestselling book”, I still have to write it myself. If anything, "Tech Explanations for Dummies" highlighted the current weaknesses of generative AI tools for large projects. It was full of inaccuracies and hallucinations, and it was rather difficult to read in parts because it was so verbose. But given how quickly the technology is developing, I’m not sure how much longer I can confidently say that my much slower, human-dependent writing and editing skills are better.