Foundation models such as those underpinning ChatGPT are having a significant impact on society, and their capabilities and risks are widely debated. The European Union is finalizing the AI Act, set to be the first comprehensive regulation of AI. However, a recent evaluation found that major foundation model providers, including OpenAI and Google, largely fail to meet the Act's draft requirements, particularly around transparency and the disclosure of key information. The evaluation recommends that policymakers prioritize transparency and that providers improve their compliance with the AI Act. Overall, the assessment highlights the need for regulation and transparency across the foundation model ecosystem.
Signal | Change | 10-year horizon | Driving force |
---|---|---|---|
Foundation models transforming society | Transformation of society | Increased integration and impact of foundation models | Advancements in AI technology |
EU finalizing AI Act as comprehensive regulation | Regulation of AI | Global adoption of AI regulation | Need for oversight and accountability in AI development |
Foundation model providers lack transparency | Lack of transparency | Increased transparency and disclosure of model development and use | Push for accountability and adherence to regulatory requirements |
Compliance with AI Act requirements varies | Varying compliance | Improved compliance with AI Act requirements | Incentives and potential fines for noncompliance |
Dichotomy in compliance based on release strategy | Compliance diverges by release strategy | Strengthened deployment requirements for foundation model providers | Accountability in the digital supply chain |
Waning transparency in foundation model releases | Decreased transparency | Collective action to set industry transparency standards | Recognition of the importance of transparency in AI technology |