Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Tsai, Shi-Chun | |
dc.contributor.author | Tzeng, Wen-Guey | |
dc.contributor.author | Wu, Hsin-Lung | |
dc.date.accessioned | 2009-06-02T06:37:40Z | |
dc.date.accessioned | 2020-05-25T06:41:31Z | - |
dc.date.available | 2009-06-02T06:37:40Z | |
dc.date.available | 2020-05-25T06:41:31Z | - |
dc.date.issued | 2006-10-18T07:41:50Z | |
dc.date.submitted | 2004-12-15 | |
dc.identifier.uri | http://dspace.lib.fcu.edu.tw/handle/2377/1827 | - |
dc.description.abstract | Abstract: We study the distance between two probability distributions via two different metrics: a new metric induced from the Jensen-Shannon divergence [4], and the well-known L1 metric. First we show that the bounds between these two metrics are tight for some particular distributions. Then we show that the L1 distance of a binomial distribution does not imply the entropy power inequality for the binomial family proposed in [5]. Moreover, we show that several important results and constructions in computational complexity under the L1 metric carry over to the new metric, such as Yao's next-bit predictor [13], the existence of extractors [11], the leftover hash lemma [?], and the construction of an expander-graph-based extractor. Finally, we show that the parity lemma [12], which is useful in studying pseudo-randomness, does not hold in the new metric. | |
dc.description.sponsorship | Tatung University, Taipei | |
dc.format.extent | 6p. | |
dc.format.extent | 353100 bytes | |
dc.format.mimetype | application/pdf | |
dc.language.iso | zh_TW | |
dc.relation.ispartofseries | 2004 ICS Conference | |
dc.subject | Jensen-Shannon Divergence | |
dc.subject | variational distance | |
dc.subject | extractors | |
dc.subject.other | Miscellaneous | |
dc.title | On the Jensen-Shannon Divergence and Variational Distance | |
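The abstract compares two ways of measuring the distance between probability distributions: the metric induced from the Jensen-Shannon divergence and the L1 (variational) distance. As a minimal illustrative sketch (not from the paper; function names are my own), both quantities can be computed for discrete distributions like so, using base-2 logarithms so that the JSD lies in [0, 1]:

```python
import math

def jsd(p, q):
    """Jensen-Shannon divergence of two discrete distributions,
    base-2 logs, so 0 <= jsd(p, q) <= 1."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]  # midpoint distribution

    def kl(a, b):
        # Kullback-Leibler divergence; 0 * log(0) is taken as 0
        return sum(ai * math.log2(ai / bi) for ai, bi in zip(a, b) if ai > 0)

    return (kl(p, m) + kl(q, m)) / 2

def l1(p, q):
    """L1 (twice the total variation) distance, in [0, 2]."""
    return sum(abs(pi - qi) for pi, qi in zip(p, q))

# Disjoint distributions attain the extremes of both measures:
p, q = [1.0, 0.0], [0.0, 1.0]
print(jsd(p, q))            # 1.0
print(l1(p, q))             # 2.0
print(math.sqrt(jsd(p, q))) # the JSD-induced metric the abstract refers to
```

The square root of the JSD is what makes it a metric in the usual sense, which is the form the abstract's comparison with the L1 metric concerns.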
Collections: | 2004 International Computer Symposium (ICS)
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
ce07ics002004000239.pdf | | 344.82 kB | Adobe PDF | View/Open |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.