Cooperation in human groups is challenging, and various mechanisms are required to sustain it; even so, it usually decays over time. Here, we perform theoretically informed experiments involving networks of humans playing a public-goods game to which we sometimes added autonomous agents (bots) programmed to use only local knowledge. This experiment shows that cooperation can not only be stabilized, but even promoted, when the bots intervene in the partner selections made by the humans, reshaping social connections locally within a larger group. This network-intervention strategy outperformed other strategies, such as adding bots playing tit-for-tat. On the other hand, we also found that personalized intervention strategies did not work and sometimes even diminished human cooperation. Overall, this work sheds light on hybrid systems of humans and machines that are embedded in networks, and it shows that simple machine intelligence can be used to help humans to help themselves.