An audio CAPTCHA to distinguish humans from computers

Gao, Haichang, Liu, Honggang, Yao, Dan, Liu, Xiyang and Aickelin, Uwe (2010) An audio CAPTCHA to distinguish humans from computers. In: Third International Symposium on Electronic Commerce and Security, ISECS2010: Guangzhou, China, 29-31 July 2010. IEEE Computer Society, Los Alamitos, Calif., pp. 265-269.

Full text not available from this repository.

Abstract

CAPTCHAs are employed as a security measure to differentiate human users from bots. This paper proposes a new sound-based CAPTCHA that exploits the gaps between human and synthetic voices rather than relying on human auditory perception. The user is required to read aloud a given sentence, selected at random from a specified book. The generated audio file is then analysed automatically to judge whether the user is human. The paper describes in detail the design of the new CAPTCHA, the analysis of the audio files, and the choice of the audio frame window function. Experiments are also conducted to determine the critical threshold and the coefficients of three indicators that ensure security. The proposed audio CAPTCHA proved accessible to users: a user study showed that the human success rate reaches approximately 97%, while the pass rate of attack software using the Microsoft Speech SDK 5.1 is only 4%. The experiments also indicated that most human users can solve it in less than 14 seconds, with an average time of only 7.8 seconds.
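The full text is not available here, so the authors' exact analysis pipeline is unknown. As a generic illustration only (not the paper's implementation), short-time speech analysis of the kind the abstract alludes to typically splits the recorded audio into overlapping frames and multiplies each frame by a window function (a Hamming window is a common choice) before computing per-frame indicators. A minimal sketch, with frame length and hop size as assumed example values:

```python
import numpy as np

def frame_signal(signal, frame_len=400, hop=160):
    """Split a 1-D audio signal into overlapping frames and apply a
    Hamming window to each frame. frame_len=400 and hop=160 correspond
    to 25 ms frames with a 10 ms hop at 16 kHz (assumed values, not
    taken from the paper)."""
    n_frames = 1 + max(0, len(signal) - frame_len) // hop
    window = np.hamming(frame_len)  # tapers frame edges to reduce spectral leakage
    frames = np.stack([
        signal[i * hop : i * hop + frame_len] * window
        for i in range(n_frames)
    ])
    return frames

# Example: one second of a 440 Hz tone sampled at 16 kHz
sr = 16000
t = np.arange(sr) / sr
sig = np.sin(2 * np.pi * 440 * t)
frames = frame_signal(sig)
print(frames.shape)  # (98, 400): 98 windowed frames of 400 samples each
```

Each windowed frame can then feed whatever per-frame indicators a classifier uses to separate natural speech from synthesized speech; the three indicators and the threshold mentioned in the abstract are specific to the paper and are not reproduced here.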

Item Type: Book Section
RIS ID: https://nottingham-repository.worktribe.com/output/1012688
Schools/Departments: University of Nottingham, UK > Faculty of Science > School of Computer Science
Related URLs:
- http://isecs2010.gdcc.edu.cn/4219z003.pdf (URL type: Unspecified)
- http://www.ieee.org/index.html (URL type: Publisher)
- 10.1109/isecs.2010.65 (URL type: Publisher)
Depositing User: Aickelin, Professor Uwe
Date Deposited: 23 Aug 2010 15:01
Last Modified: 06 Nov 2024 14:52
URI: https://eprints.nottingham.ac.uk/id/eprint/1343
