
Online parallel dynamic time warping algorithm for speech recognition in embedded system*
Cite this article: JIANG Gan-xin, CHEN Wei. Online parallel dynamic time warping algorithm for speech recognition in embedded system[J]. Application Research of Computers, 2010, 27(3): 977-980.
Authors: JIANG Gan-xin  CHEN Wei
Affiliation: 1. College of Computer Science and Technology, Zhejiang University, Hangzhou 310027, China;
2. Zhejiang Key Laboratory of Service Robot, Zhejiang University, Hangzhou 310027, China
Foundation item: Key Laboratory Construction Project of the Science and Technology Department of Zhejiang Province (2008E10004)
Abstract: To improve the real-time performance of speech recognition systems, this paper applies dynamic programming and parallel computing to propose an online parallel DTW (dynamic time warping) algorithm suited to embedded speech recognition systems. After analyzing standard DTW and its major derivative algorithms, the data structures of DTW are modified to meet the requirements of an online algorithm; while searching for the optimal warping path, memory is either allocated and released dynamically and continuously or pre-allocated as a fixed-size block, and the DTW computations of multiple keywords are distributed across multiple computing units; finally, the results of all computing units are gathered to obtain the recognition result. Experiments show that, compared with classical DTW, the algorithm reduces memory usage and recognition time and brings the real-time factor of speech recognition to 1.17, giving good real-time performance.
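The memory-saving idea in the abstract, keeping only the part of the DTW cost matrix that the online recurrence still needs while input frames stream in, can be illustrated with a rolling two-column formulation. The sketch below is not the authors' code: the names (Frame, frameDistance, OnlineDTW, push) and the Euclidean frame distance are illustrative assumptions.

```cpp
// Minimal sketch of online DTW with O(template length) memory, assuming
// pre-extracted feature frames (e.g. MFCC vectors) and a non-empty template.
#include <cstddef>
#include <cmath>
#include <limits>
#include <vector>

using Frame = std::vector<float>;

// Euclidean distance between two feature frames.
static float frameDistance(const Frame& a, const Frame& b) {
    float s = 0.0f;
    for (std::size_t k = 0; k < a.size(); ++k) {
        float d = a[k] - b[k];
        s += d * d;
    }
    return std::sqrt(s);
}

// The reference keyword template is fixed; input frames arrive one at a time,
// and only two columns of the cumulative-cost matrix are kept in memory.
class OnlineDTW {
public:
    explicit OnlineDTW(std::vector<Frame> tmpl)
        : tmpl_(std::move(tmpl)),
          prev_(tmpl_.size(), std::numeric_limits<float>::infinity()),
          curr_(tmpl_.size(), 0.0f),
          nInput_(0) {}

    // Feed one incoming frame; returns the cumulative distance to the
    // end of the template after this frame.
    float push(const Frame& x) {
        for (std::size_t i = 0; i < tmpl_.size(); ++i) {
            float d = frameDistance(tmpl_[i], x);
            float best;
            if (nInput_ == 0) {
                // First input column: only the vertical predecessor exists.
                best = (i == 0) ? 0.0f : curr_[i - 1];
            } else {
                best = prev_[i];                          // horizontal step
                if (i > 0) {
                    best = std::min(best, prev_[i - 1]);  // diagonal step
                    best = std::min(best, curr_[i - 1]);  // vertical step
                }
            }
            curr_[i] = d + best;
        }
        prev_.swap(curr_);   // current column becomes the previous one
        ++nInput_;
        return prev_.back();
    }

private:
    std::vector<Frame> tmpl_;
    std::vector<float> prev_, curr_;  // rolling pair of cost columns
    std::size_t nInput_;
};
```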

Keywords: speech recognition    dynamic time warping    online algorithm    parallel algorithm    embedded system

Online parallel dynamic time warping algorithm for speech recognition in embedded system
JIANG Gan-xin, CHEN Wei. Online parallel dynamic time warping algorithm for speech recognition in embedded system[J]. Application Research of Computers, 2010, 27(3): 977-980.
Authors: JIANG Gan-xin  CHEN Wei
Affiliation: a. College of Computer Science & Technology; b. Zhejiang Key Laboratory of Service Robot, Zhejiang University, Hangzhou 310027, China
Abstract: To improve the real-time performance of embedded speech recognition systems, this paper introduces an online parallel DTW algorithm based on dynamic programming and parallel computing. After a comprehensive analysis of classical DTW and its major derivatives, the algorithm adopts data structures that satisfy the requirements of an online algorithm. While searching for the optimal warping path, it either allocates and releases memory dynamically or pre-allocates a fixed-size block, distributes the DTW computation of each keyword to multiple computing units, and finally gathers their results to obtain the recognition decision. Experimental results indicate that the algorithm reduces both memory usage and recognition time compared with classical DTW, and brings the real-time factor of speech recognition to 1.17, which represents good real-time performance.
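To illustrate the parallel stage described in the abstract, the following sketch spreads the per-keyword DTW computations over several worker threads and then gathers the scores. It reuses the Frame and OnlineDTW types from the sketch above; std::thread stands in for whatever computing units the embedded platform provides, and Keyword and recognize are hypothetical names, not the paper's API.

```cpp
// Sketch: distribute per-keyword DTW computations over worker threads,
// then pick the keyword with the smallest cumulative distance.
#include <algorithm>
#include <string>
#include <thread>
#include <vector>

struct Keyword {
    std::string label;          // e.g. the command word
    std::vector<Frame> templ;   // reference template frames (assumed non-empty)
};

std::string recognize(const std::vector<Keyword>& keywords,
                      const std::vector<Frame>& input,
                      unsigned nUnits = std::thread::hardware_concurrency()) {
    if (keywords.empty()) return "";
    if (nUnits == 0) nUnits = 1;
    if (nUnits > keywords.size()) nUnits = static_cast<unsigned>(keywords.size());

    std::vector<float> score(keywords.size());
    std::vector<std::thread> workers;

    // Round-robin partition: worker u handles keywords u, u+nUnits, u+2*nUnits, ...
    for (unsigned u = 0; u < nUnits; ++u) {
        workers.emplace_back([&, u]() {
            for (std::size_t k = u; k < keywords.size(); k += nUnits) {
                OnlineDTW dtw(keywords[k].templ);
                float d = 0.0f;
                for (const Frame& f : input) d = dtw.push(f);  // feed frames online
                score[k] = d;
            }
        });
    }
    for (auto& t : workers) t.join();

    // Gather: the keyword closest to the input under DTW wins.
    std::size_t best = std::min_element(score.begin(), score.end()) - score.begin();
    return keywords[best].label;
}
```

A round-robin partition keeps the per-thread load roughly balanced when the keyword templates have similar lengths; on a platform with heterogeneous computing units, a work queue would be a natural alternative.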
Keywords: speech recognition  dynamic time warping (DTW)  online algorithm  parallel algorithm  embedded system