Keywords: information, information theory, semantics, meaning, entropy, intelligence, general intelligence, Shannon, nature, open world, cosmos
TL;DR: An information-theoretic approach to framing 'first principles' for artificial intelligence and beyond.
Abstract: This paper names structural fundaments of ‘information’ to address an issue that Claude Shannon and Warren Weaver saw as a missing “theory of meaning”. First, varied informatic roles are noted as likely elements of a general theory of meaning. Next, Shannon Signal Entropy, a likely “mother of all models”, is deconstructed to note the signal literacy (logarithmic Subject-Object primitives) innate to ‘scientific’ views of information. It thereby marks GENERAL intelligence ‘first principles’ and a dualist-triune (2-3) pattern. Lastly, it frames ‘intelligence building’ as named contexts wherein one details meaningful content, rendered via material trial-and-error, that we later extend abstractly. This paper thus goes beyond today’s vague sense of Open World ‘agent intelligence’ in artificial intelligence, framed herein as a multi-level Entropic/informatic continuum of ‘functional degrees of freedom’; all as a mildly modified view of Signal Entropy.
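For reference, the “Signal Entropy” named above is Shannon’s standard measure from his 1948 paper; the formula below is that standard statement, included only as an anchor (the abstract’s reading of its log term as ‘logarithmic Subject-Object primitives’ is the submission’s own):

$$H(X) = -\sum_{i=1}^{n} p(x_i)\,\log_2 p(x_i)$$

Here $p(x_i)$ is the probability of symbol $x_i$ in the source alphabet, and the base-2 logarithm gives entropy in bits.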
Related video: $\href{https://youtu.be/11oFq6g3Njs?si=VIRcV9H3GNJEYzXt}{The Advent of Super-Intelligence}$.
Submission Number: 4