I'm out of my depth with this.

Discussion in 'General Chat' started by amazingtrade, Feb 26, 2005.

  1. amazingtrade Mad Madchestoh fan

    Joined:
    Jun 19, 2003
    Messages:
    5,139
    Likes Received:
    0
    Location:
    Manchester
    amazingtrade, Feb 26, 2005
    #1
  2. Paul Ranson

    Joined:
    Sep 4, 2003
    Messages:
    1,602
    Likes Received:
    0
    Location:
    An octopus's garden.
    Any compression system involves the concept of entropy, since entropy is a measure of information, and maximising information density or minimising redundancy is what compression is all about.
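Paul's point can be seen directly with a short Python sketch (my own illustration, not part of the original post): redundant, low-entropy data compresses well, while random, high-entropy data barely compresses at all.

```python
# Low-entropy (redundant) vs high-entropy (random) data under zlib.
import zlib
import random

random.seed(0)

low_entropy = b"abab" * 2500  # 10,000 bytes, highly redundant
high_entropy = bytes(random.randrange(256) for _ in range(10000))  # ~8 bits/byte

print(len(zlib.compress(low_entropy)))   # much smaller than 10,000 bytes
print(len(zlib.compress(high_entropy)))  # close to 10,000 - little left to squeeze
```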

    Can you enlarge on your actual assignment?

    Paul
     
    Paul Ranson, Feb 26, 2005
    #2
  3. amazingtrade Mad Madchestoh fan

    I haven't got the assignment yet; that's next week. This week we are supposed to be doing background reading on entropy, so I think I need to read a bit about Huffman coding.

    I've not been taught any of this stuff; we just get a little bit of background reading each week, and this week it is entropy.

    One thing is to describe the properties of the entropy function. I am guessing logarithms have something to do with it.
     
    amazingtrade, Feb 26, 2005
    #3
  4. Paul Ranson

    Say we had 32 equally likely symbols. The Shannon entropy function says this requires H = -(sum over n = 0..31 of p_n * log2(p_n)) bits per symbol. Since each symbol is equally likely, this resolves to -32 * (1/32) * log2(1/32) = log2(32) = 5 bits per symbol, which is convenient.
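Paul's arithmetic can be checked with a few lines of Python (a sketch of my own, not from the thread):

```python
# Shannon entropy H = -sum(p * log2(p)) over the symbols of a distribution.
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

probs = [1 / 32] * 32           # 32 equally likely symbols
print(shannon_entropy(probs))   # 5.0 bits per symbol, as Paul says
```

A fair coin gives 1 bit per toss by the same formula, and any skew in the probabilities only lowers the entropy below log2(number of symbols).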

    Paul
     
    Paul Ranson, Feb 26, 2005
    #4
  5. domfjbrown live & breathe psy-trance

    Joined:
    Jun 20, 2003
    Messages:
    2,641
    Likes Received:
    0
    Location:
    Exeter (not quite Cornwall!)
    Aaaaaahhhghhghgghghhhhhh - the flashbacks are starting - the HELL of cybernetics has finally come back to haunt me! Information Theory - I *knew* I'd heard that "entropy" word somewhere before...

    Good luck AT - it's not as scary as all that (but I've forgotten all of what I was taught on it!)
     
    domfjbrown, Mar 1, 2005
    #5
  6. technobear Ursine Audiophile

    Joined:
    Jun 22, 2003
    Messages:
    2,099
    Likes Received:
    0
    Location:
    Glastonbury
    Huffman? Sounds familiar. Isn't that the one used for fax compression? I seem to remember writing a codec for that once. My boss at the time implemented an LZW codec in assembler in about two weeks, so it can't be all that hard. Don't panic, AT!
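For anyone curious, the basic Huffman construction is compact enough to sketch in a few lines (my own illustration; Group 3 fax machines actually use fixed Huffman-style tables from the ITU-T T.4 standard rather than building the tree per message, but the idea is the same): repeatedly merge the two least-frequent symbols until one tree remains.

```python
# Minimal Huffman code construction: symbols with higher frequency
# end up with shorter bit-strings.
import heapq
from collections import Counter

def huffman_codes(text):
    """Return a dict mapping each symbol in text to its Huffman bit-string."""
    # Each heap entry: [total frequency, tie-break id, {symbol: code-so-far}]
    heap = [[freq, i, {sym: ""}] for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)   # least frequent subtree
        hi = heapq.heappop(heap)   # next least frequent
        codes = {s: "0" + c for s, c in lo[2].items()}
        codes.update({s: "1" + c for s, c in hi[2].items()})
        heapq.heappush(heap, [lo[0] + hi[0], count, codes])
        count += 1
    return heap[0][2]

codes = huffman_codes("abracadabra")
# The frequent symbol 'a' gets a short code; rare 'c' and 'd' get longer ones.
print(sorted(codes.items()))
```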
     
    technobear, Mar 1, 2005
    #6
  7. I-S Good Evening.... Infidel

    Joined:
    Jun 25, 2003
    Messages:
    4,842
    Likes Received:
    1
    Location:
    In a world of pain
    From Flanders and Swann's song about thermodynamics:
     
    I-S, Mar 3, 2005
    #7