1 INTRODUCTION Information theory, founded by C. E. Shannon in 1948, measures the amount of information as the negative logarithm of the probability of a message, that is, I = -log P. Over the past thirty years information theory has developed considerably, and many scholars have investigated the measurement of information further. Reference [2] gives a comprehensive measure and formulas for syntactic, semantic, and pragmatic information, which is a development of traditional information theory. Traditional information measures take probability as the basic element of their metric; this is reasonable when only syntactic information is to be measured. In many complex situations, however, and in the many domains that involve subjective factors, the statistical estimation of probabilities is often difficult and sometimes impossible. Even where such estimation is feasible, it requires a large sample of data; moreover, even with a large sample, the statistical results may not agree with the actual situation.
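Shannon's self-information formula I = -log P can be illustrated with a minimal sketch (the function name and the base-2 "bits" convention are illustrative assumptions, not from the paper):

```python
import math

def self_information(p: float, base: float = 2.0) -> float:
    """Shannon self-information I = -log(P) of an event with probability p."""
    if not 0.0 < p <= 1.0:
        raise ValueError("probability must be in (0, 1]")
    return -math.log(p, base)

# An event with probability 1/2 carries 1 bit of information;
# rarer events carry more.
print(self_information(0.5))    # ~1.0 bit
print(self_information(0.125))  # ~3.0 bits
```

Note that the formula presupposes a known probability P, which is precisely the quantity the paper argues is hard to estimate in subjective or data-poor settings.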