Silicon Valley’s “move fast and break things” mantra propelled tech innovation for the internet age. In the era of artificial intelligence, it should take a leaf out of Japan’s playbook and slow down.
The rush to deploy AI tools to the public has produced embarrassing blunders, from an AI-powered Google search feature recommending that users put glue on pizza, to consequences that can affect real people’s livelihoods — such as the technology behind OpenAI’s ChatGPT showing signs of racial bias when ranking job applicants, as a Bloomberg analysis found.
The rush has also led tech companies to consume enormous amounts of energy to power AI. The International Energy Agency estimates that by 2026, total electricity consumption by data centers worldwide will be roughly equivalent to the power demand of Japan. Other forecasters say that by 2030, these centers are on course to use more energy than India, the world’s most populous country. Large language models (LLMs), the technology underpinning the latest crop of generative AI tools, require gargantuan troves of data, and training them demands immense amounts of computing power and energy.