OpenAI has added retired US Army general Paul M. Nakasone, former director of the National Security Agency (NSA), to its board of directors.
Nakasone was also head of US Cyber Command, a separate entity customarily led by the same individual who directs the NSA. The retired general will join OpenAI's Safety and Security Committee to help secure the artificial intelligence firm's technology. The committee's primary purpose is to oversee critical safety issues and security decisions at OpenAI, the company building ChatGPT.
OpenAI is strengthening its board
Nakasone stepped down as NSA director earlier this year, a position to which he had been appointed by President Trump. OpenAI said in a recent blog post that his responsibilities will include making "safety and security decisions for all OpenAI projects and operations." The post noted that:
"Mr. Nakasone's insights will also contribute to OpenAI's efforts to better understand how AI can be used to strengthen cybersecurity by quickly detecting and responding to cybersecurity threats."
OpenAI said that the committee is evaluating the firm's procedures and safety measures. After three months, the committee will make recommendations to the board, after which the company will inform the public about the updates.
OpenAI is strengthening its board after concerns arose about the firm's safety practices. The company was a hot topic of discussion for many news outlets after the departure of top-level employees, including the chief scientist and co-founder, Ilya Sutskever.
Nakasone's experience in cybersecurity
Bret Taylor, chair of the OpenAI board, said that AI has the capacity to significantly benefit people's lives. However, he noted that this is possible only if the innovations built with the technology are secure before they are deployed.
Taylor said that Nakasone's experience in cybersecurity will help guide the artificial intelligence company in achieving its mission. He said that OpenAI wants the technology to benefit humanity as a whole. Meanwhile, General Nakasone, on joining the company, said,
"I look forward to contributing to OpenAI's efforts to ensure artificial general intelligence is safe and beneficial to people around the world."
Nakasone brings experience in global cyber defense and technology advancement from his Army career. He was also the longest-serving head of US Cyber Command (USCYBERCOM) and was responsible for protecting the country's digital infrastructure.
Nakasone and OpenAI share common beliefs
Along with board chair Bret Taylor, OpenAI CEO Sam Altman and former Treasury Secretary Larry Summers also sit on the board. Other members include Tasha McCauley, a tech entrepreneur; Nicole Seligman, former vice president of Sony Corporation; Adam D'Angelo, CEO of Quora Inc.; and Dr. Sue Desmond-Hellmann, former CEO of the Bill and Melinda Gates Foundation.
Fidji Simo, Instacart's chief executive, is also a member, along with Dee Templeton from Microsoft. Templeton holds an observer seat on the board but has no voting power.
TechCrunch drew an interesting comparison to Nakasone's statement that "OpenAI's dedication to its mission aligns closely with my own values." The site suggested he may well be right, noting that he defended the NSA's practice of buying internet data of "questionable provenance" for the agency's surveillance networks.
Nakasone argued that the practice wasn't against the law. TechCrunch said the two "seem to be of one mind" in believing it's easier to ask forgiveness than permission. Many news outlets are characterizing Nakasone's addition to OpenAI's board as an effort to quiet the safety allegations. As Jan Leike, who headed the superalignment team at OpenAI, said after leaving the company, "Safety culture and processes have taken a backseat."
Cryptopolitan reporting by Aamir Sheikh

