AI in Tetris Game Alters Player Perception and Performance

In a new study, players of a modified two-person Tetris game who received fewer turns perceived their partners as less likable, regardless of whether a human or an algorithm allocated the turns. The researchers aimed to explore how AI-driven decisions can influence human relationships and interactions.

Check out this cool experiment led by Cornell University! Researchers had pairs of participants play a modified version of Tetris and discovered something interesting: when one player gets fewer turns, they tend to perceive the other player as less likable. Surprisingly, it didn’t matter whether a person or an algorithm decided the turns.

Usually, studies on algorithmic fairness focus solely on the decision-making process itself, but these researchers wanted to dig deeper into how those decisions affect relationships among the people involved.

According to Malte Jung, associate professor of information science and head of the study, “AI is increasingly making decisions on resource distribution among people. We want to understand how this influences our perceptions and behavior towards each other. There’s mounting evidence that machines can mess with our interactions.”

In a previous study, they had a robot choose whom to give a block to and observed how individuals reacted to the robot’s decisions. Every time the robot seemed to favor one person, the other person became upset. This piqued their interest, leading them to explore the impact of decision-making machines, whether it’s a robot or an algorithm, on human emotions.

To conduct their research, they created a two-player version of Tetris called Co-Tetris. Houston Claure, the study’s first author and a postdoctoral researcher at Yale University, developed the game using open-source software. Players work together to stack falling geometric blocks without leaving gaps.

In the experiment, an “allocator,” either human or AI, determined who took each turn. The researchers designed three conditions: one player received 90% of the turns (the “more” condition), 10% (“less”), or 50% (“equal”).
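To make the setup concrete, here is a minimal sketch in Python of how such an allocator could assign turns under the three conditions. The function name, the random per-block assignment, and the exact use of probabilities are assumptions for illustration only; the study’s actual Co-Tetris implementation is not described in that level of detail here.

    import random

    # Hypothetical sketch (not the study's actual Co-Tetris code): a turn
    # allocator that assigns each falling block under one of three conditions.
    CONDITIONS = {
        "more": 0.9,   # player A controls ~90% of the blocks
        "less": 0.1,   # player A controls ~10% of the blocks
        "equal": 0.5,  # blocks split roughly evenly
    }

    def allocate_turns(num_blocks, condition, seed=None):
        """Return, for each block, which player ("A" or "B") controls it."""
        rng = random.Random(seed)
        share_a = CONDITIONS[condition]
        return ["A" if rng.random() < share_a else "B" for _ in range(num_blocks)]

    turns = allocate_turns(100, "more", seed=42)
    print(f"Player A controlled {turns.count('A')} of {len(turns)} blocks")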

Unsurprisingly, participants who received fewer turns were acutely aware of their partner getting more opportunities. However, what surprised the researchers was that the feelings about this discrepancy were similar, regardless of whether a human or an AI made the allocations.

They referred to this as “machine allocation behavior”: the distinct response people show when a machine, rather than a person, decides how resources are divided. It parallels the established concept of “resource allocation behavior,” which describes how people’s observable behavior changes in response to allocation decisions.

Interestingly, the researchers also found that equal turn allocation didn’t necessarily lead to better gameplay or performance. In fact, rounds with unequal allocation often resulted in better scores. As Houston Claure pointed out, “If a strong player receives most of the blocks, the team performs better. If one person gets 90% of the turns, they eventually become better at it than if two average players split the blocks.”

Fascinating, right? This study sheds light on the impact of machine decision-making on our perceptions and behaviors. It’s a reminder that fairness isn’t always synonymous with optimal outcomes in cooperative endeavors.

Author: Neurologica
