The original thirty-four mind-modules are described in the AI4U Amazon Kindle e-book, originally published in 2002. The e-book Artificial Intelligence in German, published in English in 2013, gives an updated description of how all the AI Minds in German, English and Russian generate thoughts and perform automated reasoning with inference.
The Amazon Kindle e-book InFerence (Artificial Intelligence) gives the mind-module source code with a line-by-line explanation of how the AI implements automated reasoning with inference.
    ___________                          ___________
   /           \                        /           \
  / MotorOutput \                      /  SeCurity   \
  \_____________/\        ______      /\_____________/
    __________    \      /      \    /    ____________
   /          \    \    /        \  /    /            \
  ( FreeWill  )--------< MainLoop >--------( SensoryInput )
   \__________/       /  \        /  \    \____________/
    _____________    /    \______/    \    _____________
   /             \  /                  \  /             \
   \    ThInk    /                      \   EmotiOn    /
    \___________/                        \____________/
1. Code the MainLoop shown above in your chosen programming language. Use either an actual loop with subroutine calls, or make a ringlet of (perhaps object-oriented) module stubs, each calling the next stub. Provide the ESCAPE key or some other mechanism for the user to stop the AI. Spread your code around the Web and invite AI coders to expand on it. Watch for a proliferation of unique AI Mind entities evolving rapidly on the Web and competing genetically for the survival of the fittest.
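The MainLoop described above might be sketched as follows in Python. This is only a minimal sketch, not the definitive implementation: the snake_case function names are assumed renderings of the module names in the diagram, and real keyboard handling is replaced by a scripted queue of key-presses so that the sketch runs (and stops) unattended.

```python
ESCAPE = "\x1b"  # ASCII 27, the ESCAPE key that stops the AI

def security(state):     pass  # SeCurity stub
def think(state):        pass  # ThInk stub
def emotion(state):      pass  # EmotiOn stub
def free_will(state):    pass  # FreeWill stub
def motor_output(state): pass  # MotorOutput stub

def sensory_input(state):
    # One simulated key-press per cycle; an exhausted script counts
    # as ESCAPE so the sketch cannot loop forever.
    state["key"] = state["script"].pop(0) if state["script"] else ESCAPE

def main_loop(script):
    """Cycle through the ringlet of mind-module stubs until ESCAPE."""
    state = {"script": list(script), "key": "", "cycles": 0}
    while True:
        for module in (security, sensory_input, think,
                       emotion, free_will, motor_output):
            module(state)
        state["cycles"] += 1
        if state["key"] == ESCAPE:  # user pressed ESCAPE: stop the AI
            break
    return state["cycles"]
```

In a console build, the scripted queue would be replaced by a real non-blocking key read, but the ringlet structure stays the same.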
2. Code the SensoryInput module.
Start a subroutine or module that is able to sense something coming in from the outside world, e.g., a key-press on the keyboard.
Take the ESCAPE-key handler or other mechanism of stopping the AI out of the MainLoop and transfer it to the SensoryInput stub. Keep in mind that you must keep a quit-mechanism in the Mind. Flesh out the SensoryInput stub into a working mind-module that calls its own stubs, such as AudInput, VisRecog, TacRecog, GusRecog and OlfRecog. Test the embryonic robot mind by demonstrating that the MainLoop either waits briefly for SensoryInput during each cycle, or generates an event-driven response to input detected by a SensoryInput module. The proper response will be to keep cycling upon normal input or to terminate execution upon halt [ESCAPE] input. Share your code on the Web.
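One possible Python sketch of the step above follows. It is an assumption-laden outline, not the canonical code: the stub names (aud_input, vis_recog, etc.) are snake_case renderings of the module names listed above, and keystrokes are again simulated with a scripted queue so the sketch can be tested.

```python
ESCAPE = "\x1b"  # the halt [ESCAPE] input

def aud_input(state):
    # Auditory channel: one simulated key-press per cycle; an empty
    # script counts as ESCAPE so the sketch always terminates.
    state["key"] = state["script"].pop(0) if state["script"] else ESCAPE

def vis_recog(state): pass  # vision stub
def tac_recog(state): pass  # touch stub
def gus_recog(state): pass  # taste stub
def olf_recog(state): pass  # smell stub

def sensory_input(state):
    """SensoryInput fleshed out to call its own sensory stubs."""
    for stub in (aud_input, vis_recog, tac_recog, gus_recog, olf_recog):
        stub(state)
    if state["key"] == ESCAPE:
        state["running"] = False  # the quit-mechanism now lives here

def main_loop(script):
    state = {"script": list(script), "key": "", "running": True, "cycles": 0}
    while state["running"]:       # keep cycling upon normal input
        sensory_input(state)
        state["cycles"] += 1
    return state["cycles"]
```

Note the required behavior: the loop keeps cycling on normal input and terminates only when SensoryInput detects the halt input.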
Now you have two modules: a MainLoop module and a subordinate SensoryInput module. But what should come next in AI evolution? We need a reaction module, so that the organism may react to its environment. Let's call the reaction module "ThInk".
3. Stub in the ThInk module.
Now, of course, the simple organism is not truly thinking yet, but we have stubbed in the ThInk module and we need to show that it has been called.
In a Tutorial mode to be selected by pressing the Tab-key, present a simple message to the effect that ThInk has been called.
You should now be able to run your AI program, watch it wait (briefly) for keyboard input ending with a press of the Enter key, and see a message (in Tab-selected Tutorial mode) that ThInk has been called. You have a partly functional AI program, but it has not yet quickened; that is, it has not yet begun to think as a mind. It should, however, run indefinitely (until you press the Escape key to terminate it), looping forever through the brief wait for human entry, either during the action of the SensoryInput module or upon the event-driven recognition of a key-press. If you do not have this organic functionality, your organism is not viable, and you must go back and reengineer your stem cells, as it were, of AI.
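The Tab-selected Tutorial mode and the ThInk stub might be sketched as follows. This is a hedged illustration: the Tab key is modeled as toggling a boolean flag, and tutorial messages are collected in a list rather than printed, purely so the sketch is testable.

```python
TAB = "\t"  # the Tab-key that selects Tutorial mode

def think(state):
    """ThInk stub: in Tutorial mode, announce that it has been called."""
    if state["tutorial"]:
        state["messages"].append("ThInk has been called.")

def main_loop(keys):
    state = {"tutorial": False, "messages": []}
    for key in keys:               # each key-press drives one cycle
        if key == TAB:
            state["tutorial"] = not state["tutorial"]  # toggle Tutorial mode
        think(state)
    return state["messages"]
```

Pressing Tab mid-run switches the tutorial messages on for every subsequent cycle.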
With the proper looping functionality, you now have a stimulus-response organism. There is no knowledge being accumulated, because the animal has no memory. Therefore our next step is to create an AudInput module that will feed into auditory memory (AudMem).
4. Initiate the AudInput module for keyboard or acoustic input.
Drop the [ESCAPE] mechanism down by one tier, into the AudInput module, but do not eliminate or bypass the quite essential SensoryInput module, because another programmer may wish to specialize in implementing some elaborate sensory modality among your SensoryInput stubs. Code the AudInput module initially to deal with ASCII keyboard input. If you are an expert at speech recognition, extrapolate backwards from the storage requirements (space and format) of the acoustic input of real phonemes in your AudInput system, so that the emerging robot Mind may be ready in advance for the switch from hearing by keyboard to hearing by microphone or artificial ear. Anticipate evolution.
Stub in a new module and call it the AudListen module. Have the AudInput module call the AudListen module as a separation of the state of readiness to hear, or listening, from the actual act of hearing, or audition. By having separate AudListen and AudInput modules that distinguish the two functions, you could have an AI Mind that listened throughout an entire building, or the Pacific Ocean, or a SETI galaxy.
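The separation of listening from audition might be sketched like this in Python. The design is only one interpretation of the description above: the "environment" is modeled as a simple iterable of characters standing in for key-presses or phonemes, and the normalization step inside aud_input is an assumption.

```python
def aud_input(unit):
    """AudInput: receive one discrete unit of auditory input and
    normalize it (here, to uppercase ASCII) for later storage in AudMem."""
    return unit.upper()

def aud_listen(environment):
    """AudListen: remain in a state of readiness to hear, handing each
    arriving unit of sound over to AudInput for actual audition."""
    for unit in environment:      # could be a keyboard, a microphone,
        yield aud_input(unit)     # an ocean of hydrophones...

heard = list(aud_listen("cat"))   # simulated acoustic environment
```

Because AudListen is a generator over any environment, the same code could in principle listen to one keyboard or to a whole array of distant sensors.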
5. Create the AudMem (Auditory Memory) module.
Create an array for the sequential capture and retrieval of each discrete unit of auditory input, be it an ASCII key-press or a phoneme of acoustic sound. Plan and coordinate your engram array to simulate any required feature of a neuronal memory synapse -- spiking connectivity, rapid excitation and gradual signal-decay, etc. Do not mimic what everybody else in avant-garde AI is doing, but rather fling your own line of AI evolution out onto the Web and nearby parsecs with the most advanced I/O that you can devise.
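A sketch of such an engram array follows. The particulars are assumptions, not the prescribed design: activation is modeled as a float that decays by a fixed factor each cycle, whereas a more ambitious AI coder might simulate spiking connectivity instead.

```python
DECAY = 0.5  # fraction of activation surviving each cycle (an assumption)

class AudMem:
    """Sequential capture and retrieval of discrete units of
    auditory input, with gradual signal-decay."""

    def __init__(self):
        self.engrams = []  # lifelong auditory memory channel

    def store(self, unit):
        # each engram: [time-index, unit of sound, activation level]
        self.engrams.append([len(self.engrams), unit, 1.0])

    def decay(self):
        for engram in self.engrams:
            engram[2] *= DECAY  # gradual signal-decay per cycle

    def recall(self, threshold=0.2):
        # retrieve only units whose activation still exceeds threshold
        return [unit for _, unit, act in self.engrams if act > threshold]
```

Here, an older engram fades below the recall threshold after a few cycles unless it is re-activated.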
6. Create the MotorOutput (Motor Memory) module.
As soon as you have sensory memory for audition, it is imperative to include motor memory for action. The polarity of robot-to-world is about to become a circularity of robot - motorium - world - sensorium - robot. If you have been making robots longer than you have been making minds, you now need to engrammatize whatever motor software routines you may have written for your particular automaton. You must decouple your legacy motor output software from whatever mindless stimuli were controlling the robot and you must now associate each motor output routine with memory engram nodes accreting over time onto a lifelong motor memory channel for your mentally awakening robot. If you have not been making robots, implement some simple motor output function like emitting sounds or moving in four directions across a real or virtual world.
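For the robot-less builder, the "simple motor output function" suggested above might look like this: motion in four directions across a virtual grid world, with each action accreting as an engram onto a motor memory channel. The names and the grid model are illustrative assumptions.

```python
MOVES = {"north": (0, 1), "south": (0, -1),
         "east": (1, 0), "west": (-1, 0)}

class MotorOutput:
    """Motor output into a virtual world, with motor memory."""

    def __init__(self):
        self.position = (0, 0)  # location in the virtual world
        self.engrams = []       # lifelong motor memory channel

    def act(self, direction):
        dx, dy = MOVES[direction]
        x, y = self.position
        self.position = (x + dx, y + dy)   # act upon the world
        self.engrams.append(direction)     # engrammatize the action
```

A robot builder would replace the grid arithmetic with calls into legacy motor routines, while keeping the engram accretion.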
8. Stub in the FreeWill module for volition.
In your robot software, de-link any direct connection that you have hardcoded between a sensory stimulus and a motor initiative. Force motor execution commands to transit through your stubbed-in FreeWill module, so that future versions of your thought-bot will afford at least the option of incorporating a sophisticated algorithm for free will in robots. If you have no robot and you are building a creature of pure reason, nevertheless include a FreeWill stub for the sake of AI-Complete design patterns.
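The de-linking of stimulus from motor initiative might be sketched as follows. Volition here is crudely modeled as a veto list, which is purely a placeholder assumption; a sophisticated algorithm for free will would go inside free_will().

```python
class Robot:
    """All motor execution commands must transit through FreeWill;
    no stimulus is hardcoded directly to the motorium."""

    def __init__(self, vetoed=()):
        self.vetoed = set(vetoed)  # initiatives the will refuses
        self.executed = []

    def free_will(self, command):
        # FreeWill stub: the option-point for future volition algorithms
        if command not in self.vetoed:
            self.motor_output(command)

    def motor_output(self, command):
        self.executed.append(command)  # motor execution

    def on_stimulus(self, stimulus, reflex):
        # de-linked: the stimulus proposes, FreeWill disposes
        self.free_will(reflex)
```

Even a creature of pure reason keeps this stub, so that future versions can swap in real volition without rewiring the stimulus paths.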
The SeCurity module is not a natural component of the mind, but rather a machine equivalent of the immune system in a human body. When we have advanced AI robots running factories to fabricate even more advanced AI robots, let not the complaint arise that nobody bothered to build in any security precautions. Stub in a SeCurity module and let it be called from the MainLoop by uncommenting any commented-out mention of SeCurity in the MainLoop code. Inside the new SeCurity module, insert a call to ReJuvenate but immediately comment-out the call to the not-yet-existent ReJuvenate module. Also insert into SeCurity any desired code or diagnostic messages pertinent to security functions.
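The SeCurity arrangement described above might be sketched as follows. This is a minimal outline under stated assumptions: diagnostic messages are collected in a list rather than printed, and the call to the not-yet-existent ReJuvenate module is left commented out, exactly as the step prescribes.

```python
def security(state):
    """SeCurity: machine equivalent of an immune system."""
    state["diagnostics"].append("SeCurity: checks passed.")  # placeholder
    # rejuvenate(state)  # commented out until ReJuvenate exists

def main_loop(cycles):
    state = {"diagnostics": []}
    for _ in range(cycles):
        security(state)  # the un-commented call from the MainLoop
    return state
```

When a ReJuvenate module is eventually written, un-commenting one line activates it.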
It is a simple matter to create a stub for the TuringTest module and to call it from within the SeCurity module. Before the native quickening of the AI Mind, it may be necessary, or at least advisable, to have the AI program pause by default in the TuringTest module, so that the user must press the Enter key for the AI to continue operating. If it becomes obvious that the still rudimentary program is pausing disruptively within any module other than TuringTest, then it is time to remove the disruptive code and to ensure that the budding AI pauses only once per cyclical calling of the TuringTest InterFace. If the AudListen or AudInput modules have only been stubbed in and have not been fleshed out, it may be time now to complete the full coding of AudListen, AudInput and AudMem for storage of the input. ...to be continued.
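The once-per-cycle pause in TuringTest might be sketched like this. As an assumption for testability, the pause is modeled with an injectable wait function; in a console build that function would simply be input("Press Enter to continue...").

```python
def turing_test(state, wait):
    """TuringTest stub: pause once per cycle for the user."""
    state["pauses"] += 1
    wait()  # console build: input("Press Enter to continue...")

def security(state, wait):
    turing_test(state, wait)  # TuringTest called from within SeCurity

def main_loop(cycles, wait=lambda: None):
    state = {"pauses": 0}
    for _ in range(cycles):
        security(state, wait)  # the AI pauses only once per cycle
    return state["pauses"]
```

Because the pause lives only in TuringTest, no other module can pause disruptively.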
Many thanks to NeoCities.