Brian Armstrong, CEO of the crypto exchange Coinbase, shared his views on Artificial Intelligence on X (formerly Twitter). He stated that AI should not be regulated because the space needs to develop as quickly as possible.
He added that national security is among the reasons why innovation and competition should be incentivized in the new industry. The Coinbase CEO argued that regulation could have unintended consequences for the new technology by killing competition and innovation.
Coinbase CEO Calls for AI Non-regulation
Brian Armstrong compared the AI boom to the Internet's golden age. He noted that software, and the Internet at large, grew largely unregulated, and argued that the same approach should be applied to Artificial Intelligence.
Armstrong suggested that the best alternative would be to decentralize the space and make it open source, or, as he put it, “to let the cat out of the bag.”
Several countries have already expressed concerns over the use of Artificial Intelligence, and some have begun regulating it. China, for instance, introduced provisional guidelines for the management of AI services that took effect on August 15. The regulations were drafted jointly by six government agencies and were published on July 10 amid the AI boom.
The United Kingdom's Competition and Markets Authority (CMA) studied the effects of artificial intelligence on competition and consumers in the country. The CMA stated that while AI could potentially change lives, the changes need to be managed: if they happen too quickly, they may adversely affect competition.
Well, should AI be regulated?
Since OpenAI released ChatGPT late last year, there has been much discussion about the potential impact of Artificial Intelligence. Crypto enthusiasts and tech fans welcome the idea as a logical continuation of the digital world, which has so far been instrumental in generating enormous wealth.
Skeptics, by contrast, are more wary of its possible effects. They argue that technology is a tool without a mind of its own: it is up to the user to decide whether to use it for good, and that choice is inherently discretionary. Therefore, they say, governments must step in and create policies to regulate its use.
OpenAI's own leadership is, surprisingly, among the parties arguing for regulation. They have highlighted the need to regulate super-intelligent AI, calling for an equivalent of the International Atomic Energy Agency to protect the general population from creating something that may ultimately destroy them.
The co-founders of OpenAI called for international regulation focused on system inspections, compliance tests, and audits to reduce the risks the technology could pose.
The crypto industry has been a significant beneficiary of artificial intelligence, from AI-based tokens to intelligent trading bots, and has already begun exploring the possibility of integrating the two technologies.
AI-powered algorithms are also used for anomaly detection and market analysis on various exchanges such as Coinbase. These tools can help investors make better decisions and provide security solutions for the crypto industry.
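To make the anomaly-detection idea concrete, here is a minimal illustrative sketch (not Coinbase's actual system, whose methods are not public): a rolling z-score check that flags price moves deviating sharply from recent history, one of the simplest techniques such systems build on.

```python
# Illustrative sketch, NOT any exchange's real detector: flag prices
# that deviate more than `threshold` standard deviations from the
# trailing window's mean (a rolling z-score check).
from statistics import mean, stdev

def detect_anomalies(prices, window=5, threshold=3.0):
    """Return indices whose price is anomalous vs. the prior window."""
    anomalies = []
    for i in range(window, len(prices)):
        recent = prices[i - window:i]          # trailing window only
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(prices[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# A steady series with one sudden spike at index 8.
prices = [100, 101, 100, 102, 101, 100, 101, 102, 150, 101]
print(detect_anomalies(prices))  # -> [8]
```

Production systems replace the z-score with learned models, but the principle is the same: score each observation against recent behavior and flag outliers.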
Armstrong’s remarks respond to the AI debate currently ongoing globally, as governments and other regulators try to craft policies that promote competition and innovation while mitigating the possible risks associated with the technology.
Some experts argue that regulation is needed to ensure accountability; others counter that regulation could impede innovation and progress, thereby compromising national security.
Governments are still pondering the billion-dollar question of where the technology is headed. Armstrong’s tweet introduces a new perspective to a conversation so far dominated by dystopian fears. As CEO of the exchange holding the most BTC, his views may carry weight in both the tech and crypto industries.