(Image credit: Tech Xplore)
Researchers have gained new insight into how unfair AI allocation, and people's responses to it, might shape social dynamics, using an experiment built on a modified version of the classic game Tetris.
The study, from the Cornell Ann S. Bowers College of Computing and Information Science, illuminates the attitudes and actions of people affected by allocation decisions made by either humans or artificial intelligence.
The study, directed by Associate Professor Malte Jung, looked at the relationships between those affected by allocation decisions, a topic frequently ignored in research on algorithmic fairness.
While most research concentrates on the algorithm or the decision itself, Jung and his team studied how these decisions affect the social dynamics between the people subject to them.
According to Jung, “We are starting to see a lot of situations where AI decides how resources should be distributed among people.”
“We want to comprehend how that affects how individuals view and interact with one another. More and more data suggests that technology interferes with how we communicate with one another.”
According to Houston B. Claure, first author of the study, titled “The Social Consequences of Machine Allocation Behavior: Fairness, Interpersonal Perceptions, and Performance,” the team's earlier investigation of a robot's allocation choices prompted them to keep exploring how people respond to machines' decisions.
The team conducted the experiment with Tetris, the well-known block-stacking video game that has long served as a research tool for studying human behavior and cognition.
Claure developed Co-Tetris, a two-player variation of the game, as a platform for joint play. Using open-source software, he built an “allocator” mechanism that decided how turns were assigned between players. Participants were told the identity of the allocator, which was presented as either a human or an AI.
The researchers varied the allocation conditions: in one, a player received 90% of the turns while the other received 10% (the “less” condition); in another, each player received 50% of the turns (the “equal” condition).
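The study's actual Co-Tetris allocator is not public, so as an illustration only, a turn assigner implementing the 90/10 and 50/50 conditions described above might look like the following minimal sketch. The condition names, player labels, and function are assumptions, not the researchers' code.

```python
import random

# Hypothetical condition table (assumed, based on the splits reported
# in the article): fraction of turns given to player "A".
CONDITIONS = {
    "less": 0.9,   # player A gets 90% of turns, player B gets 10%
    "equal": 0.5,  # each player gets 50% of turns
}

def allocate_turns(condition: str, n_turns: int, seed=None) -> list[str]:
    """Return a shuffled list of 'A'/'B' turn assignments.

    Builds an exact split matching the condition's proportion, then
    shuffles the order so the advantaged player's turns are interleaved.
    """
    share_a = CONDITIONS[condition]
    n_a = round(share_a * n_turns)
    turns = ["A"] * n_a + ["B"] * (n_turns - n_a)
    random.Random(seed).shuffle(turns)
    return turns

# Example: 100 turns under the "less" condition.
turns = allocate_turns("less", 100, seed=42)
```

With this exact-split design, `turns.count("A")` is 90 for the "less" condition over 100 turns; only the ordering is randomized.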
Whether the turn allocator was a human or an AI, the results showed that players who received fewer turns were keenly aware of their partner's advantage.
Unexpectedly, however, the researchers found that perceptions of power varied with the allocator's identity: the player who received more turns saw their partner as less dominant when the allocation was handled by an AI, but not when it was handled by a human.