November 10, 2021, 11:00 AM - 12:00 PM
Location:
Online Event
Grant Schoenebeck, University of Michigan
Crowdsourcing, peer grading, peer prediction, and surveys all require eliciting information from agents. Without a reward, agents may not participate, but providing rewards can distort incentives. For example, sophisticated agents may strategically withhold effort or information when they believe their payoff will be based on comparison with other agents whose reports will likely omit this information due to lack of effort or expertise. The problem is even more difficult when answers cannot be directly verified.
This talk will argue that information theory provides a powerful way to think about these problems, because measurements of information-theoretic properties can be translated directly into incentive-compatible mechanisms. Moreover, this talk will show how a soft predictor for an agent's report (given the other agents' reports) can typically be leveraged to measure the relevant information-theoretic properties, and thus to construct such mechanisms.
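As a rough illustration of the idea (not the speaker's specific mechanism), the sketch below pays each agent the pointwise mutual information between their report and a peer's report, estimated from empirical frequencies standing in for a learned soft predictor. The payment's average across tasks estimates the mutual information between the two agents' reports, so truthful, informative reporting is rewarded; the data, function name, and estimation method are all illustrative assumptions.

```python
import math
from collections import Counter

def pmi_payments(reports):
    """Illustrative log-score payments whose average estimates the mutual
    information between an agent's report and a peer's report.

    `reports` is a list of (report_i, report_j) pairs, one per task.
    Probabilities are estimated empirically from the data itself, a toy
    stand-in for a learned soft predictor of agent i's report.
    """
    n = len(reports)
    marg_i = Counter(r for r, _ in reports)   # marginal counts of agent i's reports
    marg_j = Counter(r for _, r in reports)   # marginal counts of the peer's reports
    joint = Counter(reports)                  # joint counts of report pairs

    payments = []
    for r_i, r_j in reports:
        # Soft prediction of agent i's report given the peer's report:
        # P(r_i | r_j) = P(r_i, r_j) / P(r_j), estimated from counts.
        p_cond = joint[(r_i, r_j)] / marg_j[r_j]
        p_marg = marg_i[r_i] / n
        # Pointwise mutual information: higher when the peer's report
        # helps predict agent i's report beyond its base rate.
        payments.append(math.log(p_cond / p_marg))
    return payments

if __name__ == "__main__":
    # Toy data: two agents label the same tasks; correlated reports earn more.
    data = [("A", "A"), ("A", "A"), ("B", "B"), ("B", "A"), ("A", "A"), ("B", "B")]
    for pay in pmi_payments(data):
        print(round(pay, 3))
```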
Special Note: The Theory of Computing Seminar is being held online. Contact the organizers for the link to the seminar.