Silicon Republic May 13, 2024
Leigh Mc Gowran

The UK’s AI Safety Institute said its Inspect platform will help testers around the world to evaluate AI models.

The UK has released its own safety testing platform to help organisations around the world develop safe AI models.

The country’s AI Safety Institute said this platform – called Inspect – is a software library that lets testers assess the capabilities of AI models and produce scores for various criteria based on the results. Inspect has been released as an open-source platform for global testers, such as AI start-ups, researchers and governments.

The AI Safety Institute said Inspect can evaluate AI models in areas such as their core knowledge, reasoning ability and autonomous capabilities. The organisation said the platform...
