Conference EVOLUTION, 1335 texts
Text 86, 207 lines
Written 2004-09-17 16:37:00 by Michael Ragland (1:278/230)
Subject: Re: Dawkins gives incorre
=================================



Guy Hoelzer <hoelzer@unr.edu> wrote or quoted:
In article ci7mqk$24qd$1@darwin.ediacara.org, Tim Tyler <tim@tt1lock.org> wrote:
Guy Hoelzer <hoelzer@unr.edu> wrote or quoted:
In article chvng2$2hqs$1@darwin.ediacara.org, Tim Tyler <tim@tt1lock.org> wrote:
Guy Hoelzer <hoelzer@unr.edu> wrote or quoted:
In article chsg65$1hqg$1@darwin.ediacara.org, Tim Tyler <tim@tt1lock.org> wrote:
Guy Hoelzer <hoelzer@unr.edu> wrote or quoted:

GH:
Are you arguing that treating p_i as frequency is almost never done, or
that this practice has not increased in frequency? Or are you just
arguing that you don't think it has become sufficiently common to call
it a transition? 
p_i is /always/ the probability of the i'th symbol arising. 

TT:
Sometimes the probabilities are determined completely by symbol
frequencies - but the p_i's are never frequencies. 
If they are "determined completely by symbol frequencies" then they
are frequencies. 
A frequency is normally a measurement of the number of times that a
repeated event occurs per unit time. 
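
A minimal Python sketch of the distinction being drawn here, using a made-up
string of symbols: raw symbol counts can be arbitrarily large, while the p_i
that enter Shannon's formula are those counts normalized into probabilities
that sum to 1.0.

  from collections import Counter
  from math import log2

  # Made-up data set of symbols; any sequence of discrete items would do.
  data = list("ABRACADABRA")

  counts = Counter(data)            # raw counts: A=5, B=2, R=2, C=1, D=1
  total = sum(counts.values())      # 11 -- the counts do not sum to 1.0

  # p_i: the probability of the i'th symbol, here estimated from the counts.
  p = {sym: n / total for sym, n in counts.items()}
  assert abs(sum(p.values()) - 1.0) < 1e-9   # the probabilities do sum to 1.0

  # Shannon entropy H = -sum_i p_i * log2(p_i), in bits per symbol.
  H = -sum(p_i * log2(p_i) for p_i in p.values())
  print(counts, p, H)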

GH:
I am aware of that definition, but I am using a different conventional
meaning. This distinction might be a source of some of our differences.
The definition I am using is the one I believe to be most commonly used
in the biological sciences, and it is well represented by the one expressed
by "A Dictionary of Ecology, Evolution, and Systematics." It reads: 
"The number of items belonging to a category or class; the number of
occasions that a given species occurs in a series of examples." 
This dictionary does not list any other definitions for "frequency." 

TT:
I note that that still doesn't result in a series of numbers that add up
to 1.0. 

GH:
How do you explain the information-theoretic methods of analysis, such
as the Akaike Information Criterion (AIC), that have been growing rapidly
in application? It is fundamental to these methods that they yield
precisely the same result in the hands of every scientist, so that they
are repeatable and verifiable. The role of perceiver, which was
Shannon's initial concern, has been dropped from information theory by
many. 

TT:
I'm not sure about the Akaike Information Criterion, but - as far as I
can tell - it escapes observer-dependence by completely specifying a
particular hypothetical observer (its model) and then asking how
effective that observer is at predicting the data. 
In other words, the term "information" in its title appears to refer not
to the information gained by someone measuring its value - but to the
information that can be expected to be gained by a completely specified
hypothetical observer witnessing the data stream. 
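
A minimal Python sketch of that point, with made-up data: once the model
(here, a Gaussian fitted by maximum likelihood) and the data are fixed, the
log-likelihood, and hence AIC = 2k - 2 ln L, comes out the same for whoever
computes it.

  import math

  # Made-up observations and a fully specified model: a Gaussian whose mean
  # and variance are the maximum-likelihood estimates from the data.
  data = [2.1, 1.9, 2.4, 2.0, 2.2, 1.8, 2.3]
  mu = sum(data) / len(data)
  var = sum((x - mu) ** 2 for x in data) / len(data)

  # Log-likelihood of the data under that completely specified "observer".
  log_L = sum(-0.5 * math.log(2 * math.pi * var) - (x - mu) ** 2 / (2 * var)
              for x in data)

  k = 2                      # number of estimated parameters (mean, variance)
  AIC = 2 * k - 2 * log_L    # the same value for every analyst
  print(log_L, AIC)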

GH:
A good resource for learning about AIC and its application (IMHO) is the
book: 
Burnham, K. P., and D. R. Anderson. 1998. Model selection and inference:
a practical information-theoretic approach. Springer-Verlag, New York,
New York, USA. 353 pp. 
The authors explain why Kullback-Leibler information is more fundamental
than Shannon information and show that it is more general (it includes
Shannon information). It is Kullback-Leibler that is assumed under the
AIC paradigm, which does not posit an hypothetical observer, according
to the authors. Instead, they argue, the set of AIC values (or adjusted
analogues, such as AICc) that you get out of a comparative analysis
express the relative distance of competing models from objective Truth.
That claim took me by surprise when I first ran across it, but you
really have to examine the theory closely to make an informed judgment
about it. 
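
For reference, a minimal Python sketch of Kullback-Leibler information
(relative entropy), with two made-up distributions, and of one sense in which
Shannon entropy sits inside it as a special case: the K-L distance from a
uniform model is log2(n) minus the Shannon entropy.

  from math import log2

  # Two made-up discrete distributions over the same four outcomes:
  # p plays the role of "truth", q that of an approximating model.
  p = [0.5, 0.25, 0.125, 0.125]
  q = [0.25, 0.25, 0.25, 0.25]

  # Kullback-Leibler information, in bits: D(p || q) = sum_i p_i * log2(p_i / q_i)
  D_pq = sum(pi * log2(pi / qi) for pi, qi in zip(p, q))

  # Shannon entropy of p, and the identity D(p || uniform) = log2(n) - H(p).
  H_p = -sum(pi * log2(pi) for pi in p)
  print(D_pq)                  # 0.25 bits
  print(log2(len(p)) - H_p)    # also 0.25 bits, since q is uniform here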

TT:
I had never heard of Kullback-Leibler information. 

MR:
Here's a brief source on it: "Date: 13. - 16. September 04
Location: Institute of Environmental Sciences, UniZH 
STATISTICAL MODEL SELECTION AND INFERENCE: A PRACTICAL COURSE 
Prof. David Anderson, author of the book "Model Selection and
Multi-Model Inference" 
Model selection using information criteria is an alternative to
traditional null hypothesis testing that connects information theory and
likelihood theory. Traditional statistical models like regression and
ANOVA use null hypothesis tests and an arbitrary probability value P of
0.05 to decide whether a factor has an effect or not. These models are
sensitive to Type I and Type II errors. Model selection uses
Akaike's information criterion (AIC) to choose the best model from
a set of candidate models, and AIC is used to decide whether a factor
should be included in a model that describes the structure of the
data. AIC-based model selection can be applied to experimental and
observational data. The course is a practical course; the aim is to
learn about model selection and how to use it. The course focuses on
application, not theory. David Anderson, the teacher of the course, is
the leading expert in the field of model selection. Because he has worked
in Fish and Wildlife Departments, he knows the needs of biologists. During
the course, participants will have the opportunity to analyse their
data. 
In particular, the course will cover the following topics:
- Some philosophy about science and data analysis issues
- Kullback-Leibler information and its centrality in the sciences
- Estimators of K-L information (AIC, AICc, QAICc, and TIC)
- Model selection, the principle of parsimony, bias/variance trade-offs
- Strength of evidence for models in the candidate set
- Scaling models (delta values)
- Akaike weights (the likelihood of model i, given the data; see the sketch after this list)
- Incorporating model selection uncertainty into estimates of precision
- Multi-model Inference (MMI) -- making formal inference from several
models, with special sessions on:
- Likelihood theory, maximum likelihood estimates, etc.
- Model building
- Null hypothesis testing, problems, limitations 
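
A minimal Python sketch of three of the quantities listed above (AICc, delta
values, and Akaike weights), using made-up AIC values and parameter counts for
three hypothetical candidate models fitted to the same data:

  import math

  # Made-up candidate set: model name -> (AIC, number of parameters k).
  models = {"M1": (100.0, 2), "M2": (97.5, 4), "M3": (103.2, 3)}
  n = 40                                   # made-up sample size

  def aicc(aic, k, n):
      # Small-sample correction: AICc = AIC + 2k(k+1) / (n - k - 1)
      return aic + 2 * k * (k + 1) / (n - k - 1)

  aicc_vals = {m: aicc(a, k, n) for m, (a, k) in models.items()}

  # Delta values: distance of each model from the best (lowest AICc) model.
  best = min(aicc_vals.values())
  delta = {m: v - best for m, v in aicc_vals.items()}

  # Akaike weights: relative likelihood of model i given the data,
  # exp(-delta_i / 2), normalized to sum to 1 over the candidate set.
  rel_lik = {m: math.exp(-0.5 * d) for m, d in delta.items()}
  total = sum(rel_lik.values())
  weights = {m: r / total for m, r in rel_lik.items()}
  print(delta, weights)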

TT:
I visited http://googleduel.com/ with the terms
"shannon information" and "Kullback-Leibler information".
Shannon information won by more than 100 to 1. 

MR:
I tried a variant on Google Duel, leaving the word "Information" out of
"Kullback Leibler". The results were:

And the Winner Is...
Kullback Leibler (7,430)
Shannon Information (3,510)

TT:
Maybe an option for you would be to use one of the terms referring to
this quantity - if it is what you are talking about. 
The terms "relative entropy", "divergence", "directed divergence", and
"cross entropy" all appear to refer to this metric. 

TT:
The metric represents a measure of distance between two probability
distributions. If the distributions are given, then the metric does not
depend on who measures it. 
However Shannon information does not normally consider the probabilities
it is considering to be given and agreed-upon in advance - instead it
allows the possibility that different observers may have different
information about the events and may make different estimates of their
probabilities. In the terminology of relative entropy, they would be
said to be considering different models. 
If you calculate the /relative entropy/ between the predictions of
different models and some fixed set of observations then you would
indeed arrive at different values. 
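
A minimal Python sketch of that last point, with made-up numbers: the same
fixed set of observations, scored against two different models, yields two
different relative-entropy values.

  from math import log2

  # A fixed set of observed outcome proportions (made up for illustration).
  observed = [0.6, 0.3, 0.1]

  # Two different "observers": two models assigning their own probabilities
  # to the same three outcomes.
  model_a = [0.5, 0.3, 0.2]
  model_b = [0.34, 0.33, 0.33]

  def relative_entropy(p, q):
      # D(p || q) = sum_i p_i * log2(p_i / q_i), in bits
      return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

  # Same observations, different models, different values.
  print(relative_entropy(observed, model_a))   # smaller: model A predicts better
  print(relative_entropy(observed, model_b))   # larger: model B predicts worse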

GH:
They always add up to 1.0 - like probabilities do. 

TT:
Like frequencies always do. 

GH:
Frequencies are usually measured in Hertz - and never add up to a
dimensionless quantity such as 1.0. 

TT:
Indeed, adding the values of frequencies together is usually a bad move:
since 1hz+2hz != 3hz. 

GH:
Under the definition provided above frequencies must always add to one
if you have included all possible types in your data. For example, if
you consider the frequency of each allele present in a data set, those
frequencies must add to one. 

TT:
How could they possibly - if the frequency is defined to be a count of
the number of occurrences of an item in a set? 
Frequencies have no upper bound. They can become as large as you like. 
You appear to be talking about a proportion of some sort - not a
frequency. 
Your unorthodox definition of frequency appears to match your unusual
definition of information. This sort of thing seems bound to cause
communication problems :-| 

GH:
It doesn't appear to be what you are talking about - but it shares the
element of observer-independence (though it tends to become
language-dependent in the process). 
You are correct that this is not exactly what I am talking about, but I
do not see how it is observer-dependent. [...] 

TT:
I said it had "observer-*in*dependence" not "observer-dependence". 
--
__________ 
  |im |yler http://timtyler.org/ tim@tt1lock.org Remove lock to
reply.

"It's uncertain whether intelligence has any long term survival value.
Bacteria do quite well without it."
 Stephen Hawking
---
 * RIMEGate(tm)/RGXPost V1.14 at BBSWORLD * Info@bbsworld.com

---
 * RIMEGate(tm) V10.2 * RelayNet(tm) NNTP Gateway * MoonDog BBS
 * RgateImp.MoonDog.BBS at 9/17/04 4:37:38 PM
 * Origin: MoonDog BBS, Brooklyn,NY, 718 692-2498, 1:278/230 (1:278/230)