Notes on Audio Signal Processing

MINES A3 course; updated irregularly.

Course code repository: boisgera/audio-notebooks

Course page: Digital Audio Coding

02/09/2018

Installing audio on macOS

First install PyAudio (which requires PortAudio), then install audio:

brew install portaudio
pip2.7 install pyaudio
sudo pip2.7 install --upgrade audio

Test the installation:

import audio.wave
from audio.bitstream import BitStream
BitStream([False, True])

  • Binary Data

In [1]: bin(42)
Out[1]: '0b101010'

In [2]: hex(42)
Out[2]: '0x2a'

In [5]: 5 // 2
Out[5]: 2

In [6]: 5 % 2
Out[6]: 1

In [7]: 5 ** 2
Out[7]: 25

  • Binary Arithmetic

In [8]: 42 | 7
Out[8]: 47

In [9]: 42 ^ 7
Out[9]: 45

In [10]: 0b101010 ^ 0b000111  # where ^ is XOR
Out[10]: 45

In [11]: 42 & 7
Out[11]: 2

In [12]: 0b101010 & 0b000111
Out[12]: 2

In [13]: bin(0b101010 << 3)
Out[13]: '0b101010000'

In [14]: bin(0b101010 >> 3)
Out[14]: '0b101'

  • Integer Types

>>> import numpy
>>> numpy.int8(255)
-1
>>> numpy.int8(128)
-128

Under the int8 (two's complement) rules, as verified in the sketch after this list:

0b10000000 represents -2^7

0b11111111 represents -1

0b01111111 represents 2^7 - 1
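
A quick way to check these values by hand (int8_value is a hypothetical helper, not part of the audio library): in two's complement the most significant bit carries weight -2^7 and the remaining bits their usual positive weights.

def int8_value(bits):
    # interpret an 8-character bit string as a two's complement int8:
    # the MSB has weight -2**7, the other bits their usual positive weights
    value = -128 * int(bits[0])
    for i, b in enumerate(bits[1:]):
        value += int(b) << (6 - i)
    return value

assert int8_value("10000000") == -128  # -2**7
assert int8_value("11111111") == -1
assert int8_value("01111111") == 127   # 2**7 - 1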

  • BitStream

# a sample transform: integers to a bit stream, MSB first
import numpy as np
from audio.bitstream import BitStream

def write_uint8(stream, integers):
    integers = np.array(integers)
    for integer in integers:
        mask = 0b10000000
        while mask != 0:
            stream.write((integer & mask) != 0)
            mask = mask >> 1
    print(stream.read())

if __name__ == "__main__":
    write_uint8(BitStream(), [1, 2, 32, 54, 225])
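
Going the other way, a minimal decoding sketch (plain Python, independent of the BitStream API; read_uint8 is a hypothetical helper): rebuild each uint8 from eight MSB-first bits.

def read_uint8(bits):
    # consume a list of booleans, 8 at a time, MSB first
    integers = []
    for k in range(0, len(bits), 8):
        value = 0
        for bit in bits[k:k + 8]:
            value = (value << 1) | int(bit)
        integers.append(value)
    return integers

# round trip for 225 = 0b11100001
bits = [(225 & (0b10000000 >> i)) != 0 for i in range(8)]
print(read_uint8(bits))  # [225]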

02/15/2018

Information content axioms

Positive: I : \zeta \rightarrow [0, +\infty], where \zeta denotes the set of events

Additive: I(E_1 \wedge E_2) = I(E_1) + I(E_2) \quad \text{if } E_1, E_2 \text{ are independent}

Neutral: I(E) = F \circ P(E), where P denotes the probability; I depends on E only through its probability

Normalized: I(E) = 1 \quad \text{if} \quad P(E) = 0.5
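
Taken together, these axioms determine I; a short sketch (assuming F is monotone, which the axioms above leave implicit): for independent events, P(E_1 \wedge E_2) = P(E_1) P(E_2), so Additive plus Neutral give

F(pq) = F(p) + F(q)

whose monotone solutions are F(p) = c \log p. Normalized (F(1/2) = 1) forces c = -1/\log 2, hence

I(E) = -\log_2 P(E)

which is why the entropy below uses \log_2.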

With "information content" in place, we define "entropy" as the expectation of the information content:

  • Entropy

If X is a source, the entropy of X is the mean information content of X: H(X) = E(I(X))

or explicitly,

H(X) = -\sum_{x} p(x) \log_2 p(x)

Among distributions with fixed variance, a normal distribution gives the largest (differential) entropy; for the discrete entropy above, the uniform distribution on a finite alphabet is the maximizer.

A Dirac distribution (all probability mass on one outcome) gives 0 entropy.
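
A quick numerical check of both extremes using the formula above (a sketch; entropy is an ad hoc helper, not part of the audio package):

import numpy as np

def entropy(p):
    # H(X) = -sum_x p(x) log2 p(x), with the convention 0 * log2(0) = 0
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: uniform on 4 symbols
print(entropy([0.7, 0.1, 0.1, 0.1]))      # ~1.357 bits: less than uniform
print(entropy([1.0]))                     # 0.0 bits: Dirac distribution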


Recommended reading:

Information and Shannon...
