Monday, May 29, 2023

Technology bubbles create a lot of emotional energy – both excitement and fear – but they are bad information environments

--- Lee Vinsel, historian of technology at Virginia Tech, quoted in Jeremy Hsu, "How this moment for AI will change society forever (and how it won't)" (paywall), New Scientist, 18 April 2023

Context

The powerful AIs released by large technology companies tend to be closed systems that restrict access for the public and outside developers. Closed systems can help contain the potential risks and harms of letting anyone download and use the AIs, but they also concentrate power in the hands of the organisations that developed them, without allowing input from the many people whose lives the AIs could affect.

“The most pressing concern in closedness trends is how few models will be available outside a handful of developer organisations,” says Irene Solaiman, policy director at Hugging Face, a company that develops tools for sharing AI code and data sets.

Such trends can be seen in how OpenAI has moved towards a proprietary and closed stance on its technology, despite starting as a non-profit organisation dedicated to open development of AI. When OpenAI upgraded ChatGPT's underlying AI technology to GPT-4, the company cited "the competitive landscape and safety implications of large-scale models like GPT-4" as its reason for not disclosing how the model works.

This type of stance makes it hard for outsiders to assess the capabilities and limitations of generative AIs, potentially fuelling hype. “Technology bubbles create a lot of emotional energy – both excitement and fear – but they are bad information environments,” says Lee Vinsel, a historian of technology at Virginia Tech.

Many tech bubbles involve both hype and what Vinsel describes as "criti-hype" – criticism that amplifies technology hype by taking companies' most sensational claims at face value and flipping them to warn of hypothetical risks.