Discussion:
Questions 5 and 6
Jack
2005-09-26 01:57:10 UTC
Permalink
Has anybody made significant headway on these problems? I'm doing a great job
spinning my wheels and getting nowhere :D.

It seems like the thing to do is manipulate the notion of entropy to fit
these formulas... is that what you guys are doing?
Siddharth Natarajan
2005-09-26 05:49:16 UTC
Permalink
Yeah, I got no. 5... the basic idea is that the expected codeword length
(the "weight") satisfies weight <= 1 + H, where H is the entropy of the alphabet.
As for no. 6, I'm stumped as well.
Cheers
Sidd
Jamie Hargrove
2005-09-26 08:34:52 UTC
Permalink

I think I've got both 5 and 6 done. For anyone still working on this,
I'd suggest trying to find the quantities involved first rather than
staring at the arcane-looking formulas; I made much more progress working
from that end. For number 6, some exposure to combinatorics helps, in my
opinion, as does going back to the definition of entropy and thinking
about what is different in this case.
max funderburk
2005-09-26 21:44:30 UTC
Permalink
hmmm...

Maybe there's a misunderstanding about the entropy of the input file as
specified in the question vs. entropy as defined in the course packet.

Look at the example given on page 11 of the course packet: a two-symbol
alphabet with probabilities 0.9 and 0.1 has entropy h = 0.469. However,
using the formula given in question 6, a 10-character message from this
same alphabet (including frequencies) yields an entropy of 3.3219, far
beyond the range of h + 1.

Can anyone explain this without giving away too much? Is there a
fundamental misunderstanding here?
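For concreteness, both numbers can be reproduced (a Python sketch of my own, not from the packet). Note that 3.3219 happens to equal -log2(0.1), the self-information of a single occurrence of the rare symbol — so I suspect a per-message quantity is being compared against the per-symbol entropy h, which would explain the apparent violation of the h + 1 bound:

```python
# Reproducing the two numbers from the page-11 example.
import math

# per-symbol entropy of the {0.9, 0.1} alphabet
h = -(0.9 * math.log2(0.9) + 0.1 * math.log2(0.1))
print(round(h, 3))  # 0.469, matching the course packet

# -log2(0.1) is the information content of ONE occurrence of the
# rare symbol, not an average over the whole message
print(round(-math.log2(0.1), 4))  # 3.3219
```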