Russell's definition of intelligence
The previous article in the sequence is Some problematic definitions of intelligence.
I find great value in Stuart Russell’s definition of intelligence, as it addresses many of the limitations discussed in the previous article. Here is my recollection of his definition:
intelligence: the ability of an agent to solve some particular task [1]
An agent is anything that makes decisions: a human, an animal, an algorithm, a system of any kind. This includes complex systems such as nuclear reactor control systems, ant colonies, and school boards. The definition intentionally avoids the following concepts: (a) humanity; (b) consciousness; (c) moral worth; (d) using an ability threshold to draw a sharp boundary; (e) generalization across tasks. All of these concerns can be handled separately; each is already difficult enough without being bundled into the definition.
This usage bypasses many of the traps that bog down other conceptualizations of intelligence. I recommend shifting conversations toward it wherever possible.
In the next article, I explain why I care about definitions so much.
Later, in the sixth article in the sequence, I return to Criticisms of Russell’s definition.
Endnotes
[1] I haven’t checked that this is verbatim, but I think it is at least very close to how Russell defines the term in his book Human Compatible: Artificial Intelligence and the Problem of Control (2019).
The next article in the sequence is Why care about definitions?