Browsing by Author "Greene, Kristen K."
Now showing 1 - 3 of 3
Item
Effects of Multiple Races and Header Highlighting on Undervotes in the 2006 Sarasota General Election: A Usability Study and Cognitive Modeling Assessment (2011)
Greene, Kristen K.; Byrne, Michael D.
Large-scale voting usability problems have changed the outcomes of several recent elections. The 2006 election in Sarasota County, Florida was one such incident, where the number of votes lost was nearly 50 times greater than the margin of victory for the US Representative race. Multiple hypotheses were proposed to explain this incident, with prevailing theories focused on malicious software, touchscreen miscalibration, or poor ballot design. Study 1 aimed to empirically determine whether Sarasota voters unintentionally skipped the critical US Representative race due to poor ballot design. The Sarasota ballot was replicated initially, then header highlighting and the number of races presented on the first screen were manipulated. While the presentation of multiple races had a significant effect on undervotes in the US Representative race, header highlighting did not. Nearly 20% of all voters (27 of 137) skipped the race on their first visit to that screen, an even greater undervote rate than that originally seen in Sarasota. In conjunction with other research, Study 1 results strongly suggest that the 2006 Sarasota undervote was a human factors problem. A cognitive model of human voters was developed based on Study 1 data. Model predictions were then compared with behavioral data from Study 2, in which participants voted on a replica of the Charlotte County, Florida 2006 ballot.

Item
How To Build an Undervoting Machine: Lessons from an Alternative Ballot Design (USENIX, 2013-08)
Greene, Kristen K.; Byrne, Michael D.; Goggin, Stephen N.
Despite the importance of usability in ensuring election integrity, it remains an under-studied aspect of voting systems. Voting computers (a.k.a. DREs) offer the opportunity to present ballots to voters in novel ways, yet this space has not been systematically explored. We constructed a DRE that, unlike most commercial DREs, does not require voters to view every race, but instead starts at the “review screen” and lets voters navigate directly to races. This was compared with a more traditional, sequentially navigated DRE. The direct access navigation model had two effects, both of which were quite large. First, voters made omission (undervote) errors markedly more often. Second, voters who were free to choose whom to vote for chose to vote in substantially fewer races. We also examined the relationship between the true error rate, which is not observable in real elections, and the residual vote rate, a measure of effectiveness commonly used for real elections (the sketch after this listing illustrates the distinction). Replicating the findings of [Campbell and Byrne 2009a], the mean residual vote rate was close to the mean true error rate, but the correlation between the two was low, suggesting a loose coupling between these measures.

Item
Usability of New Electronic Voting Systems and Traditional Methods: Comparisons Between Sequential and Direct Access Electronic Voting Interfaces, Paper Ballots, Punch Cards, and Lever Machines (2008)
Greene, Kristen K.; Byrne, Michael D.; Kortum, Philip T.; Lane, David M.
It has been assumed that new Direct-Recording Electronic voting machines (DREs) are superior to the older systems they are replacing, despite a lack of supporting research. The current studies contribute much-needed data on the usability of both older and newer voting systems.
Study 1 compared a DRE with a sequential navigation model to paper ballots, punch cards, and lever machines; a DRE with a direct access navigation model was added in Study 2. Changing the navigation style from sequential to direct decreased voter satisfaction and greatly increased undervote errors and intentional abstentions. Premature ballot casting was seen with the direct DRE only. Across both studies, participants were neither faster nor less error-prone with the DREs than with the older methods. Nonetheless, they found the sequential DRE significantly more satisfying, an interesting dissociation between preference and performance. Despite voter preferences, the assumption that DREs are superior may be unfounded.
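
The second item above contrasts the true error rate, which requires knowing voter intent, with the residual vote rate, which can be computed from election returns alone. The following Python sketch is purely illustrative and uses invented names and data (ContestRecord, true_error_rate, residual_vote_rate, and the four example contests are not taken from the studies): it shows how the two rates can have identical means while being driven by different contests, since a wrong-candidate error raises only the true error rate and an intentional abstention raises only the residual vote rate.

# Illustrative sketch only: hypothetical data and helper names, not code or data
# from the papers listed above.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ContestRecord:
    intended: Optional[str]  # candidate the voter meant to choose (None = intentional abstention)
    recorded: Optional[str]  # candidate actually recorded on the cast ballot (None = no vote recorded)

def true_error_rate(records: List[ContestRecord]) -> float:
    """Fraction of contests where the recorded vote differs from voter intent.
    Requires knowing intent, so it can only be measured in a lab study."""
    return sum(1 for r in records if r.recorded != r.intended) / len(records)

def residual_vote_rate(records: List[ContestRecord]) -> float:
    """Fraction of contests with no vote recorded, regardless of intent.
    Computable from election returns alone, hence its use for real elections."""
    return sum(1 for r in records if r.recorded is None) / len(records)

# Four hypothetical contests: the two rates come out equal (0.50 each), yet they
# are driven by different contests, which is how the means can match while the
# measures correlate weakly.
records = [
    ContestRecord(intended="Smith", recorded="Smith"),  # correct vote
    ContestRecord(intended="Smith", recorded="Jones"),  # wrong-candidate error: error, not residual
    ContestRecord(intended="Smith", recorded=None),     # unintended undervote: error and residual
    ContestRecord(intended=None, recorded=None),        # intentional abstention: residual, not error
]

print(f"true error rate:    {true_error_rate(records):.2f}")    # 0.50
print(f"residual vote rate: {residual_vote_rate(records):.2f}")  # 0.50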