Silicon Republic, May 13, 2024
The UK’s AI Safety Institute said its Inspect platform will help testers around the world to evaluate AI models.
The UK has released its own safety testing platform to help organisations around the world develop safe AI models.
The country’s AI Safety Institute said this platform – called Inspect – is a software library that lets testers assess the capabilities of AI models and produce scores for various criteria based on the results. Inspect has been released as an open-source platform for global testers, such as AI start-ups, researchers and governments.
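To illustrate the general idea of a scoring library of this kind, the sketch below shows how a test harness might compare a model's answers against expected targets and produce an accuracy score. The names used here (`run_eval`, `Sample`, `toy_model`) are hypothetical and do not reflect Inspect's actual API; this is a generic illustration only.

```python
# Generic sketch of model evaluation scoring, not Inspect's real API.
from dataclasses import dataclass

@dataclass
class Sample:
    prompt: str   # question posed to the model
    target: str   # expected answer

def run_eval(samples, model):
    """Ask the model each prompt and score exact matches against the target."""
    correct = sum(1 for s in samples if model(s.prompt).strip() == s.target)
    return correct / len(samples)

# A toy stand-in for a real model call, for demonstration purposes.
def toy_model(prompt):
    answers = {"2 + 2 = ?": "4", "Capital of France?": "Paris"}
    return answers.get(prompt, "")

samples = [
    Sample("2 + 2 = ?", "4"),
    Sample("Capital of France?", "Berlin"),
]
print(run_eval(samples, toy_model))  # 0.5: one of two answers matches
```

A real evaluation platform layers dataset loading, model adapters and richer scorers on top of this basic pattern, but the core loop of prompting, comparing and aggregating is the same.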
The AI Safety Institute said Inspect can evaluate AI models in various areas, such as their core knowledge, ability to reason and autonomous capabilities. The organisation said the platform...