Keywords: information theory
TL;DR: A framework for General (Universal/Natural) Intelligence, beyond notions of Human Level Intelligence.
Abstract: This paper explores a key issue in information theory, identified by Claude Shannon and Warren Weaver as a missing "theory of meaning". It names structural fundaments to address that gap. Varied informatic roles are first noted as likely elements of a general theory of meaning. It then deconstructs Shannon Signal Entropy in a priori terms to mark the signal literacy (contiguous logarithmic Subject-Object primitives) innate to 'scientific' notions of information. It thereby initiates general-intelligence 'first principles' alongside a dualist-triune (2-3) pattern. This study thus moves beyond today's vague sense of 'meaningful intelligence' in artificial intelligence, framed herein via an Entropic/informatic continuum of serially varied 'functional degrees of freedom', all as a mildly modified view of Signal Entropy.
Submission Number: 25