Statement from The Center for A.I. Safety forgot the assignment:
A few days ago, on July 21, 2023, the White House released a set of voluntary guidelines agreed to by some leading A.I. companies to protect against the risks of A.I. These largely inconsequential suggestions stem from a joint statement made in May.
On May 30 of this year, The Center for A.I. Safety (TCAIS) issued the following 22-word hyperbolic statement:
“Mitigating the risk of extinction from A.I. should be a global priority alongside other societal-scale risks, such as pandemics and nuclear war.”
Are the top scientists in the world apprehensive about sentient 3D-printed Roombas vacuuming up all eight billion humans? Pray tell, how exactly will A.I. cause the extinction of the entire human race?
If we do foolish things, such as telling A.I. to launch nuclear weapons whenever it sees fit or allowing it to release a deadly virus when it feels threatened, then Darwinism is kicking in. We will disappear from the evolutionary chain. To quote Forrest Gump, “Stupid is as stupid does.”
A.I., especially generative A.I., is a set of powerful tools. It will be used for good and for bad purposes.
We should restrict “bad” uses of A.I., most notably dissemination of mass disinformation, fragmentation of society based on preconceived biases, and civil unrest as tensions among inhabitants of various information islands clash. This is a worthwhile and necessary endeavor.
However, the statement from TCAIS, intentionally or unintentionally, misses A.I.’s potential to solve some of society’s largest problems: waste, water, food, housing, productivity, efficient economic growth, and the environment. It also overlooks A.I.’s potential to enhance creative expression, research, innovation, music, literature, and art.
To boldly go where TCAIS has never gone, it is important to back up the claim that “A.I. can create good” with specific examples. Here are a few:
- Generative A.I. is enabling the discovery of life-saving drugs through understanding how proteins fold – something that was previously slow or, in many cases, unachievable and likely will play a crucial part in stopping future pandemics – See AlphaFold from www.DeepMind.com
- Generative A.I. is helping architects imagine beautiful and optimally designed buildings to meet the space’s function – See ArchiGAN from www.Nvidia.com
- Generative A.I. is helping companies dramatically reduce waste in supply chains – reducing excess inventory in warehouses and ensuring products efficiently get to where they need to be – See Noodle.ai from www.noodle.ai
To quickly double-click on the last one:
Supply chains have become incredibly complex and fast-paced, making it difficult for humans and non-A.I. technologies to manage them efficiently. By some estimates, as much as half of the world’s harvested food goes to waste each year due to failures in the supply chain. Billions of dollars are lost to excessive inventories in some locations and empty shelves in others. This problem poses a significant challenge to economic growth, profits, environmental sustainability, efficient resource utilization, and quality of life. Many of us have likely experienced the negative effects of poorly planned supply chains firsthand.
Generative A.I. represents a supply chain as a graph, and the algorithms (Graph Neural Networks combined with Reinforcement Learning) generate a series of inventory movements to reduce waste and increase profits. The algorithms optimize the entire supply chain simultaneously while accounting for uncertainty and probabilities – something unachievable with non-A.I. approaches.
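To make the graph framing concrete, here is a minimal sketch in Python. All locations, quantities, and the simple greedy policy are invented for illustration; a real system like the one described above would replace the greedy rule with a Graph Neural Network policy trained via Reinforcement Learning over probabilistic demand, not a deterministic heuristic.

```python
# Toy sketch: a supply chain as a directed graph of stock locations.
# A greedy heuristic stands in for the learned GNN+RL policy, purely
# to illustrate the problem framing. All names and numbers are made up.

from dataclasses import dataclass

@dataclass
class Node:
    name: str
    stock: int    # units on hand
    demand: int   # units needed this period

# Nodes: locations holding inventory.
nodes = {
    "plant":   Node("plant", stock=120, demand=0),
    "dc":      Node("dc", stock=30, demand=0),
    "store_a": Node("store_a", stock=5, demand=40),
    "store_b": Node("store_b", stock=60, demand=20),
}

# Edges: (source, destination) lanes along which inventory may move.
edges = [("plant", "dc"), ("dc", "store_a"), ("dc", "store_b"),
         ("store_b", "store_a")]

def plan_moves(nodes, edges):
    """Greedily ship surplus toward shortage along each edge.

    A learned policy would instead score candidate moves over the whole
    graph at once and pick them to maximize expected long-run reward.
    """
    moves = []
    for src, dst in edges:
        s, d = nodes[src], nodes[dst]
        surplus = s.stock - s.demand    # units src can spare
        shortage = d.demand - d.stock   # units dst is missing
        qty = min(surplus, shortage)
        if qty > 0:
            s.stock -= qty
            d.stock += qty
            moves.append((src, dst, qty))
    return moves

moves = plan_moves(nodes, edges)
for src, dst, qty in moves:
    print(f"move {qty} units {src} -> {dst}")
```

In this toy run the distribution center covers most of store A’s shortage and store B’s surplus covers the rest – the kind of waste-reducing rebalancing the article describes, minus the uncertainty handling that makes the real problem hard.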
The supply chain planners also go from being blamed when things go wrong – and they always go wrong – to being heroes who can measure the waste they reduced and the profits they increased.
Instead of focusing only on preventing “Bad A.I.,” let’s all appeal to our better angels and figure out how to use “Good A.I.”
TCAIS should amend its 22-word statement to:
“Accelerating benefits from A.I. should be a global priority to mitigate societal-scale risks, such as wasted resources, economic stagnation, and pandemics.”