I got to thinking about the oddity that only humans use characters.
The four characters "abcd" never exist in the real world as such. They are marks on a page when we draw them, or arrays of pixels on a computer screen, produced by electronics that respond to binary codes from a computer.
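To make the point concrete, here is a minimal sketch (in Python, which I'm using purely for illustration) of what "abcd" looks like to a machine: just numbers, and numbers only under a humanly chosen convention (ASCII/UTF-8 here).

```python
# "abcd" as a machine handles it: integer codes under a chosen encoding.
# The mapping from marks to numbers (ASCII here) is a human convention;
# the electronics only ever see the binary patterns.
codes = [ord(c) for c in "abcd"]
print(codes)      # the integer codes: [97, 98, 99, 100]
print(bin(97))    # the letter 'a' as a binary pattern
```

Nothing in the hardware "knows" these numbers are letters; that interpretation lives entirely with us.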
Even Godel's famous numbering doesn't actually exist. It relies upon a human operator to choose which encoding to use. Computers are famously bad at character recognition: the OCR in modern computers that turns scanned text into actual character codes is far beyond the scope of the Godel numbering formula, yet this is what would be needed if Godel numbering were really to be automated! And even here the scanned page is converted into a numerical arrangement and processed by programs that humans have written and calibrated to replicate their own vision processes, so that computers end up encoding things as humans do (or seem to do).
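The dependence on a chosen encoding is easy to see if we write the numbering out. A minimal sketch, assuming the common prime-power scheme: the code of the i-th symbol becomes the exponent of the i-th prime. The symbol-to-number table (here just ASCII codes via `ord`, my choice for illustration) is exactly the human decision the formula itself cannot make.

```python
def primes(n):
    """Return the first n primes by trial division."""
    found = []
    candidate = 2
    while len(found) < n:
        if all(candidate % p for p in found):
            found.append(candidate)
        candidate += 1
    return found

def godel_number(s, code=ord):
    """Godel number of string s: product of p_i ** code(s[i]).

    `code` is the encoding -- a human choice. Swap in a different
    table and every 'Godel number' changes."""
    result = 1
    for p, ch in zip(primes(len(s)), s):
        result *= p ** code(ch)
    return result

print(godel_number("ab"))   # 2**97 * 3**98 under the ASCII choice
```

The arithmetic is mechanical; picking `code` is not.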
My thinking went down this road. Humans work in the world of characters; at no point do we encode them. Yet computer scientists and engineers seek to model the human brain on computers, and so presume that at a level beneath our human world what is really going on is encoding and logic processing. But do we know that neurons work by logic?
At college I began (and, as was the pattern, never finished) a second-year assignment on "Whether recent research in neural networks could inform our understanding of brain physiology". I got stuck into the reading and never finished. But what I discovered was that neurons were understood to sum voltages at their dendrites and, at a threshold, start their own propagation of a wave. I note here that this process is the summation of real (i.e. continuous) voltages, not discrete voltages. The only discrete element is the propagation of the action potential. But are all action potentials the same size? I now wonder (purely hypothetically) whether neurons only sum, or whether other, more complex inhibitions and promotions go on. Whatever the answer, the point is that the main processing may be occurring in real numbers at the neuron's inputs, and even the supposedly discrete, countable neuron firings might involve real values too. Is this what "real computing" looks like? Is this why humans work in symbols and don't at any point encode what they see?
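The summation-and-threshold picture described above corresponds to the textbook "leaky integrate-and-fire" idealization. A toy sketch (my own simplification, with made-up parameter values) shows where the continuous and discrete parts sit: the membrane voltage is a real number throughout, and only the spike is a discrete event.

```python
def simulate(inputs, threshold=1.0, leak=0.9):
    """Toy leaky integrate-and-fire neuron.

    Sums real-valued inputs into a continuous membrane voltage
    (with decay); emits a discrete spike and resets whenever the
    voltage crosses the threshold. Parameter values are arbitrary."""
    v = 0.0
    spikes = []
    for t, x in enumerate(inputs):
        v = leak * v + x          # continuous, real-valued summation
        if v >= threshold:        # the only discrete event
            spikes.append(t)
            v = 0.0               # reset after the action potential
    return spikes

print(simulate([0.4, 0.4, 0.4, 0.1, 0.9]))  # -> [2]
```

All the "computation" here happens in the real-valued line `v = leak * v + x`; the spike train is just a thresholded shadow of it, which is the asymmetry the paragraph above is pointing at.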
I need to revise my knowledge of neurons and think some more.