The Futures Industry Association (FIA) has urged the Commodity Futures Trading Commission (CFTC) to focus any AI regulation on the use cases in which AI is employed rather than on the technology itself. The recommendation, made in a joint letter submitted alongside the FIA Principal Traders Group (FIA PTG) and other industry associations, calls for a technology-neutral approach that gives primary weight to inputs and use cases.
Technology-neutral regulation
The central argument of the FIA letter is that any regulatory framework should focus on outcomes rather than on the specific AI technique used. In our view, this is a sensible and efficient way to address the risks AI introduces in derivatives markets without tying rules to any particular technology. The FIA goes further, arguing that no separate set of rules, standards, or guidance is needed to spell out controls or oversight for market integrity and resilience, because the CFTC's existing framework already covers these concerns. On that basis, it would be prudent to assess whether existing, generally applicable regulations already cover a given AI use case before deciding that new AI-specific rules are necessary.
Regulation of AI technology
Among other concerns, the FIA identified the problem of defining AI as one of the most daunting aspects of technology-focused regulation. Regulating AI as a broad category would require a definition that is both comprehensive and precise, a task that is not only extremely difficult but also resource-intensive. The letter stresses that drawing a clean line between AI and other technologies is hard to do and would likely be costly.
Rather than regulating the technology in the abstract, the FIA suggests that the CFTC take a practical, use-case-oriented view. According to the FIA, AI in derivatives markets is not confined to any single application and is evolving rapidly. Current use cases range from data transcription and trading strategies to compliance procedures and systems controls. Moreover, early applications of AI mostly involve operational, non-customer-facing activities rather than services aimed directly at end customers, and they span a wide range of market participants rather than being limited to particular segments of the industry.
Tailored governance and controls
Another major point the FIA raises in the same context is the need to tailor AI governance and controls to the particular nature of each firm. The letter argues that governance arrangements should depend on the type of business conducted, the size and scale of its operations, and the complexity of its activities.
For example, rather than designating a single executive responsible for AI, some firms may prefer to delegate that role to an AI committee, depending on their particular circumstances. This flexibility is seen as crucial to keeping AI risks manageable, since it lets companies shape their AI governance around their own business conditions. The FIA's letter to the CFTC makes clear why the question of how to regulate AI applications in the derivatives market is so significant.
By adopting a technology-neutral approach grounded in how AI is applied and what outcomes it produces, regulators can address the risks without getting bogged down in defining and regulating AI itself. Crafting rules flexible enough to fit firms of different types, sizes, and business models remains one of the harder problems facing AI in this sector. The CFTC will continue gathering information and monitoring developments, and as the FIA and other stakeholders contribute their views, the implications of AI and the shape of its regulatory environment will become clearer.