


Abstract: This paper names structural fundaments in ‘information’, to cover an issue seen by Claude Shannon and Warren Weaver as a missing “theory of meaning”. First, varied informatic roles are noted as likely elements for a general theory of meaning. Next, Shannon Signal Entropy, as a likely “mother of all models”, is deconstructed to note the signal literacy (logarithmic Subject-Object primitives) innate to ‘scientific’ views of information. It therein marks GENERAL intelligence ‘first principles’ and a dualist-triune (2-3) pattern. Lastly, it notes ‘intelligence building’ as named contexts wherein one details meaningful content, rendered via material trial-and-error, that we later extend abstractly. This paper thus tops today’s vague sense of Open World ‘agent intelligence’ in artificial intelligence, framed herein as a multi-level Entropic/informatic continuum of ‘functional degrees of freedom’; all as a mildly-modified view of Signal Entropy. Related video found at: $\href{https://youtu.be/11oFq6g3Njs?si=VIRcV9H3GNJEYzXt}{The Advent of Super-Intelligence}$.
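
For reference, the canonical Shannon entropy that the abstract’s “Signal Entropy” and “logarithmic primitives” point to is the standard discrete formula below; the paper’s own “mildly-modified view” is not reproduced here, only the baseline definition it starts from.

$$H(X) \;=\; -\sum_{i=1}^{n} p_i \log_2 p_i$$

Here $p_i$ is the probability of the $i$-th symbol in the signal source $X$, and the logarithm base 2 gives $H$ in bits per symbol.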
