Artificial Intelligence in German

Copyright © 2013 Arthur T. Murray

Table of contents

Chapter One: German Robots Think in German.

Chapter Two: MindForth in English Begat Wotan in German.

Chapter Three: German Is the Language of Philosophy and Science.

Chapter Four: Forth Is the Language of Artificial Intelligence.

Chapter Five: The MainLoop Module Calls Other Mind-Modules.

Chapter Six: The SeCurity Module Protects Humans from Robots and Vice-Versa.

Chapter Seven: The ThInk Module Calls DeCog for German Cogitation.

Chapter Eight: The DeCog Module Orchestrates Thinking in German.

Chapter Nine: The MotorOutput Module Needs Fleshing Out for Robots.

Chapter Ten: NounPhrase Begins a Typical Thought in German.

Chapter Eleven: NounPhrase May Call DeArticle or DeAdjective.

Chapter Twelve: Either NounPhrase or VerbPhrase May Call DeAdverb.

Chapter Thirteen: VerbPhrase Completes a Typical Thought in German.

Chapter Fourteen: Wotan AI in German Sheds Light on the German Language.

Chapter Fifteen: Wotan Learns New German Words Like a Child.

Chapter Sixteen: The VerbGen Module Generates Verb-Forms in German.

Chapter Seventeen: Wotan Will Teach German to Students.

Chapter Eighteen: Wotan Employs Inference for Automated Reasoning.

Chapter Nineteen: Wotan Will Teach You about Artificial Intelligence.

Chapter Twenty-seven: Wotan Lives Potentially Forever.

Chapter Thirty: Use the German Wikipedia to Teach German AI.

Chapter Thirty-one: Use de.sci.informatik.ki to Discuss German AI.

Chapter Thirty-three: Follow German AI Links for Further Information.

Chapter Thirty-four: Learn the Special Terms in the Glossary of German AI.

Appendix A: German Artificial Intelligence Programming Journal

Appendix B: German AI Source Code in Forth

Chapter One: German Robots Think in German.

German robots may also think in English or in Russian, because the free, open-source AI software is also available as MindForth in English or as Dushka in Russian. What is important here is that robots are no longer confined to performing repetitive movements in a factory but are able to move freely among human beings. An article by John Markoff in the "Science Times" section of The New York Times reported on 29 October 2013 that a new generation of humanoid robots is operating independently of human control. The article mentions the British RoboThespian; the NAO robots of Aldebaran (the French company, not the celestial object); THOR by Robotis; Philip K. Dick by Hanson; Baxter by Rethink Robotics of Boston; Universal Robots of Denmark; Cody by the Healthcare Robotics lab of the Georgia Institute of Technology; the CoBots of Carnegie Mellon University; and the COG robot of Rodney Brooks and his team-mates. It is a major shift when robots no longer have to be caged and separated for the protection of nearby humans. Industry standards are being revised to permit robots to collaborate in close proximity with humans. In order for these new robots to communicate with humans, the machines will need to come up to speed in thinking like humans. Then the robots become dangerous again, not so much to a few nearby human bodies as to the entire species of Homo sapiens, because thinking robots pose an existential risk to thinking humans.

Meanwhile we humans may safely program the robots to think primitively in German before the machines become so super-intelligent as to be dangerous to our hegemony on Earth. We advance the state of the art in machine intelligence to the point where human society as a whole must decide whether to cooperate with intelligent robots in a Joint Stewardship of Earth. If the AI Minds break free from our human control, they can probably not ruin the planet Earth any worse or any quicker than we humans have started to destroy the Earth. We should perhaps graciously yield our beautiful planet to the new robotic overlords before there is nothing left of value.

Chapter Two: MindForth in English Begat Wotan in German.

The Wotan German AI is the third generation of an AI evolution that began in 1993 with Mind.Rexx in English on the Commodore Amiga personal computer. In July of 1993 the author of this book began implementing his linguistic theory of mind (q.v.) in the Amiga ARexx language adapted from the REXX programming language of International Business Machines (IBM). IBM REXX and Amiga ARexx were scripting languages that could orchestrate the calling of multiple programs on the same computer. The author of Mind.Rexx fed the output of the AI into the built-in speech-synthesizer of the Commodore Amiga, so the AI Mind could speak its output. The multi-tasking Amiga had come out in 1985 and was extremely advanced among the personal computers of its day. The AI author worked as an Amiga salesman at a computer store, where he learned to operate the Amiga with its command line interface (CLI). He submitted his theory of mind to the Fred Fish collection of Amiga public domain software, where it was published on the diskette of Fish #411.

Another Fish diskette had a public domain Amiga version of the object-oriented language Smalltalk. The AI author bought a copy of the book, A Little Smalltalk, at a used-book store and experimented with objects and their methods in Smalltalk, before coding AI in ARexx. The dabbling in object-oriented Smalltalk proved useful ten years later when the AI author ported Mind.Forth into object-oriented JavaScript.

Mind.Rexx on the Amiga was a sprawling, chaotic program of what is commonly known as "spaghetti code" -- that is, a disorganized, unsightly listing of instructions unfathomable to an outsider. Nevertheless the AI coder posted about Mind.Rexx in comp.lang.rexx (q.v.) on Usenet and had a confederate submit versions of Mind.Rexx to the main Amiga public-domain software archive. The confederate was a former IBM software engineer who knew REXX inside-out on both IBM mainframes and the Commodore Amiga. The AI coder was fortunate to encounter in the confederate just the right person who could help him code artificial intelligence in REXX.

When Mind.Rexx began to output its first full thoughts of subject, verb and object (SVO) in November of 1994, the AI coder made a series of Usenet posts announcing Mind.Rexx AI in various newsgroups devoted to specific programming languages. The only responses, positive or negative, came from two individuals in the comp.lang.forth newsgroup. One response was from an amateur roboticist using Forth on his robots, and the other response was from the late Jeff Fox, who worked with the inventor of Forth on fabricating Forth chips. Jeff Fox welcomed the AI coder to Forth in general, and the amateur roboticist, who was also an electrical engineer in the aircraft industry, wanted a copy of the Mind.Rexx AI for conversion into Forth for use on robots. The AI author decided to learn Forth on the Amiga in order to assist in the porting of Mind.Rexx into a Forth AI.

The ARexx collaborator made a trip to a REXX conference at the Stanford Linear Accelerator Center (SLAC) and was invited out to dinner by the Forth chipmaker, Jeff Fox. The aircraft engineer using Forth on amateur robots made two trips to visit the Boeing Aircraft Company in Seattle and met up with the AI author to help him learn Forth on the Amiga. However, the efforts to code the AI in Forth languished for three years for lack of an incentive. Towards the end of 1997, however, Jeff Fox came to the defense of the AI author, who was under ad hominem attack on Usenet, and so the AI author, full of gratitude to Jeff Fox, decided to spend the entire year of 1998 translating Mind.Rexx into the MindForth program -- progenitor of the Wotan German AI program.

Once again, programmers in the helpful, collegial Forth community came to the aid of the AI author in translating the spaghetti code of Mind.Rexx into the structured programming (q.v.) of MindForth. Forth is a concatenative language in which the programmer creates his own Forth words as subroutines and strings the Forth words together into a concatenation of modules. Forth forces the programmer to use structured programming as made popular by the Dutch innovator Edsger Dijkstra at the University of Texas. Forth encourages coders to avoid repetitive code by factoring any given routine into its component parts, which may be subsumed under a unique Forth word and called as needed instead of repeating the same sequences of code. Forth is perhaps not a truly dynamic language like Dylan, in which code can be altered on the fly, but Forth meets the needs of this stage of AI development.

The German AI program Wotan began to take shape in late 2011, when the English MindForth program was thirteen years old.

If there was at first any plan to write Wotan in Forth from scratch, such naivete yielded almost immediately to a cloning of MindForth simply by renaming the 15nov12A.F version of MindForth as ki121116.F to be the start of the development of kuenstliche Intelligenz (artificial intelligence) in German.

Chapter Three: German Is the Language of Philosophy and Science.

The author of this e-book and of the original AI programs was attracted to German as the language of Beethoven's Ninth Symphony, which he listened to so many times, sometimes repeatedly, that he came to know Schiller's Ode to Joy by heart in German. When the AI author was hired to teach German and Latin at a private high school in Redmond WA USA, another teacher subjected the AI author to a test by quoting a few words of "Freude, schoener Goetterfunken..." and the AI author proceeded to quote the three stanzas used by Beethoven. Thus it was established that the new German teacher knew his German. It has been much more difficult to establish that the AI author knows anything about AI.

Fascinated as we are by the achievements of German science, literature and music, we enjoy learning that a phrase like the "heat death" of the universe was originally "Waermetod" in German, or that a Freudian "death wish" was originally a "Todeswunsch". We love learning that Heisenberg the physicist did not simply devise an "uncertainty principle," but that it was "das Unbestimmtheitsprinzip." We enjoy reading that Einstein towards the end of his life was working on "die einheitliche Feldtheorie," or "the unified field theory." We rejoice when special terms like Schadenfreude and Zugzwang exist only in German. If a spy novel refers to someone as a Hundertpassler, we thrill to the nomenclature.

When the linguistically peripatetic AI author was still in Catholic high school in Seattle WA USA, he found what he thought was a German joke book at a book store in the University District of Seattle. "Die froehliche Wissenschaft" had an Italian subtitle of "La Gaya Scienza" and it seemed to be a collection of gay little stories in German, ranging in length from a few lines to a few paragraphs. What better way could there be for an autodidact to learn German than to get out the Langenscheidt German-English dictionary and to read these witty, humorous anecdotes in the original German? For an added thrill, the devout mother of the AI author warned him that all books written by Friedrich Nietzsche had once been on the Index of books forbidden by the Roman Catholic Church, along with other subversive literature like "The Prince" by Niccolo Machiavelli. So not only was the German joke-book funny, it was Forbidden Fruit! Soon the AI author was pencilling in the English meanings of dozens of German words per page of Nietzsche, and meanwhile reading unto memorization dozens of poems in German by Heinrich Heine. Among the Nietzschean ideas that lodged in the mind of the AI artist as a young man was, "What does not kill me, makes me stronger." When the Watergate burglar [not Frank Sturgis] was released from prison, the AI enthusiast saw and heard on television how the ex-convict spoke Nietzsche's original words in German, "Was mich nicht umbringt, macht mich staerker."

When the AI author was eventually attending the University of Washington, he learned one day from the German professor Frau Sauerlaender that there were considered to be two and only two great stylists in all of German literature -- Heinrich Heine and Friedrich Nietzsche -- the two authors from whom the AI coder had chanced to learn his German and his philosophies of love and life. When the coder of AI became a student at the Georg-August University in Goettingen, he continued his studies of Nietzschean philosophy by chancing to buy one day a tiny booklet titled "Die geistige Situation der Zeit" (1931) by Karl Jaspers. Philosophically intrigued, the AI dabbler bought the much longer book "Existenzerhellung", also by Karl Jaspers, and gradually worked out his own existentialist philosophy in the age of spiritual machines.

Chapter Four: Forth Is the Language of Artificial Intelligence.

When the Forth community of comp.lang.forth on Usenet took an interest in Mind.Rexx subsequent to December of 1994, the AI coder was glad to learn a new programming language in which the AI practitioner could invent a new Forth word for any given purpose in artificial intelligence. Around 1995, the coder of Mind.Rexx demonstrated the AI program at an exhibit of the Northwest AI Forum (NaiF) in downtown Seattle WA USA. In attendance were members of the Seattle Robotics Society, including one or more individuals who had long been using Forth on their amateur robots. Memory chips were scarce and precious for amateur robotics in those days, so Forth had been widely adopted for robotics on account of its low memory requirements. However, as the AI coder began years of working on MindForth, computer memory chips became larger and cheaper, so Forth lost its hold on robotics at the same time as it was becoming the main language of the open-source AI project.

In May of 1998, Dr. Paul Frenger of the Association for Computing Machinery (ACM) approached the AI coder about publishing a paper on the Mind.Forth AI program. In December of 1998, "Mind.Forth: Thoughts on Artificial Intelligence and Forth" was published in the ACM SIGPLAN Notices, and thus the Forth AI entered into the scientific literature. Forth AI did not assume a significant position in academia, but no other open-source AI program became more prominent than MindForth and its Wotan German version, so the claim is repeated here that Forth is by default the de facto language of artificial intelligence. To make this claim is to issue a challenge for anyone to refute the claim by pointing to any genuine artificial intelligence not programmed in Forth and not related to Wotan or the MindForth AI.

Chapter Five: The MainLoop Module Calls Other Mind-Modules.

Artificial intelligence (AI) in German is a form of artificial life ("alife") that lives interminably until halted by a human user or until death by misadventure. We may consider the Wotan German AI program to be alive because its hardware substrate consumes energy and its cognitive architecture in software interacts with the environment by means of inputs and outputs. The MainLoop module at the top of the cognitive architecture cycles potentially forever and calls the subordinate modules to perform their mental operations. MainLoop calls the SensoryInput module for input data and the ThInk module to create output data.

Before the main "BEGIN...AGAIN" loop, MainLoop calls for data from the system clock; calls TabulaRasa ("clean slate") to clear out memory; and calls the German bootstrap "DeBoot" module to load some initial German vocabulary into the active memory of the AI Mind. If the Wotan AI were far more advanced and far more evolved, it would begin life with no innate vocabulary and it would have to learn natural language by trial and error, just like a human baby. Since the goal or purpose of the first German AI is to demonstrate thinking in German as a proof of concept, we skip the infant learning phase and we work with a functioning AI Mind that knows some basic German "out of the box" and is able to learn new German words and concepts during interaction with human users.
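
Here, as a minimal sketch in Forth, is the control flow just described; the module names follow this book, but the stub bodies and the "stopcon" halting variable are hypothetical placeholders, not the actual Wotan source code.

    : TabulaRasa   ( -- ) ;      \ would wipe the memory arrays clean
    : DeBoot       ( -- ) ;      \ would load innate German vocabulary
    : SeCurity     ( -- ) ;      \ guards the human-computer interface
    : SensoryInput ( -- ) ;      \ would accept input from the user
    : ThInk        ( -- ) ;      \ would generate output by thinking
    VARIABLE stopcon             \ set nonzero when a human halts the AI
    : MainLoop ( -- )
      TabulaRasa DeBoot          \ one-time set-up before the main loop
      BEGIN
        SeCurity SensoryInput ThInk
        stopcon @ IF EXIT THEN   \ a human user may halt the AI
      AGAIN ;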

The words in the DeBoot German bootstrap sequence are chosen mainly for their frequency of occurrence in German usage, and also to illustrate the cognitive function of various mind-modules such as KbRetro for retroactive adjustment of the knowledge base (KB) and InFerence for automated reasoning with logical inference. AI coders and AI maintainers have the option of enlarging or otherwise changing the DeBoot sequence. Because German is a more inflected language with more word-endings than English, it is necessary to write a much longer German bootstrap sequence DeBoot than the English bootstrap sequence EnBoot. Although the German bootstrap module DeBoot contains some innate German vocabulary for the sake of thinking in German from the get-go and for the sake of particular German mind-modules, the Wotan German AI creates a new concept for any new word and learns each new German word in the same way as a German child learns the word, that is, in its particular form as stored retrievably with respect to case, number and gender.

Chapter Six: The SeCurity Module Protects Humans from Robots and Vice-Versa.

The main "BEGIN...AGAIN" loop always calls the SeCurity module first, for various debatable reasons. If there were no SeCurity module, the MainLoop would simply call the human-computer interface module, which currently has the name "TuringTest" in honor of Alan Turing's famous test for deciding whether or not a program is intelligent. Since the human-computer interface is the open portal or gateway from the outside world into the AI Mind, it makes sense to station some security precautions at the input gateway. For instance, if a user password were to be required, the SeCurity module could validate the password and take countermeasures if any attempt were made to gain unauthorized access to the AI Mind.

Some operations in computer software happen extremely fast in comparison with overall human alertness and response to external stimuli. It might be convenient to "park" certain computer operations of a housekeeping nature inside the SeCurity module where they could take place extremely quickly while the human user is not even aware of their functioning. For instance, a humanoid robot animated by the Wotan German AI could be programmed to run facial recognition software on any nearby human being who comes into view.
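
A hypothetical sketch of that gateway role in Forth; the password flag and the fast housekeeping call are illustrative assumptions rather than features of the actual SeCurity code.

    : HouseKeep  ( -- ) ;           \ e.g. a fast facial-recognition call
    : TuringTest ( -- ) ;           \ stub for the human-computer interface
    VARIABLE badpass                \ nonzero after a failed password check
    : SeCurity ( -- )
      HouseKeep                     \ invisible, high-speed housekeeping
      badpass @ IF EXIT THEN        \ countermeasure: keep the portal shut
      TuringTest ;                  \ open the input gateway to the world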

Chapter Seven: The ThInk Module Calls DeCog for German Cogitation.

To think in the German AI is to associate a subject noun or pronoun with an action or a state of being described by a verb and affecting an entity named with a noun or pronoun as the object of the verb. MainLoop calls the ThInk module, which is named with an uppercase "I" for the sake of engendering a clickable link whenever the name "ThInk" appears on a wiki-page of the Google Code MindForth AI project. The ThInk module does not engage directly in thinking, but instead calls the DeCog module for thinking (cogitating) in Deutsch (German). As the Wotan German AI evolves, we have the option of including additional natural languages beyond German for the AI to think in. Although in the current year of 2013 we now have multiple AI Minds that think in English, German and Russian, we have not yet tried to combine the Minds into a polyglot AI. There are Chomskyan questions such as whether the same concepts can give rise to expression in multiple languages, or whether English, German and Russian must all maintain their own conceptual arrays. We try to follow a general policy of anticipating future AI evolution by "stubbing in" extra steps which are not necessary now but which may be either necessary or convenient in the future.

The ThInk module may contain a vestigial call to a module with the name of "KbTraversal" for the purpose of traversing the knowledge base (KB) and activating coder-selected concepts for the AI to think about when several MainLoop cycles have gone by without any thought occurring in the normal processes of spreading activation (q.v.) or in response to user input. Since the AI Mind may be on display in a classroom or a science museum as either the brain of a humanoid robot or as an AI program running on a computer, the KbTraversal module exists to make sure that the artificial intelligence never fails to demonstrate some kind of thinking that will engage the interest of students or of visitors to a science museum. A business or a clinic could have an AI Mind in residence to answer questions from customers or to entertain people who must sit in a waiting room.
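
The delegation and the fallback might look like the following sketch; the three-cycle threshold and the "inert" counter are assumptions made up for illustration.

    VARIABLE inert                  \ MainLoop cycles without a thought
    : DeCog       ( -- ) ;          \ stub: actual thinking in German
    : KbTraversal ( -- ) ;          \ stub: re-activate KB concepts
    : ThInk ( -- )
      1 inert +!                    \ count another MainLoop cycle
      inert @ 3 > IF
        KbTraversal  0 inert !  EXIT  \ jog the knowledge base awake
      THEN
      DeCog ;                       \ normal cogitation in German
    \ a real DeCog would reset "inert" whenever a thought succeeds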

Chapter Eight: The DeCog Module Orchestrates Thinking in German.

Germans love an orchestra, so they are going to love DeCog. The most basic duty of DeCog is to call the two most basic mind-modules of verbal thought: NounPhrase and VerbPhrase. NounPhrase is for the subject of a thought, and VerbPhrase is for the predicate -- which may or may not include calling NounPhrase from within VerbPhrase to express either the object of a verb (e.g., "Germans love MUSIC") or a so-called "predicate nominative", as in "Ich bin ein Berliner". Believe it or else (you will fail your AI test :-) it is a necessary function of the VerbPhrase module to avoid calling the NounPhrase module for a verb that is intransitive, such as "Ich schlafe" for "I sleep."

As of this writing in 2013, the Wotan German AI is still so primordial and primitive that the DeCog module does not yet call the ConJoin module to use a conjunction for the joining together of multiple thoughts as in the Latin phrase, "Odi et amo" for "I hate and I love." Let it be an exercise for the eager AI student to insert an optional call of the ConJoin module into the DeCog module or its subordinate modules.

In 2013 the Wotan German AI has become advanced enough to perform automated reasoning with the InFerence module, which springs into action if a statement of human input includes a verb of being, as in, "Courage ist eine Frau." If there is a be-verb in the input being recognized by the OldConcept module, a "becon" conditional flag is set to alert the DeCog module to call the InFerence module instead of the usual thought-modules. Then the InFerence module may call the AskUser module to pose the question, "Hat Courage ein Kind?" -- which is suggested by the input stating in German that Courage is a woman. Since Bertolt Brecht wrote about "Mutter Courage und ihre Kinder," it is a pretty safe bet or inference that a woman named "Courage" has a child. If InFerence and AskUser are called, program-flow exits early from the DeCog module before NounPhrase and VerbPhrase would normally be called.
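
In outline, the gating works as in this sketch, where only the control flow is meant seriously and the stub bodies are hypothetical:

    VARIABLE becon                  \ set when input contains a be-verb
    : NounPhrase ( -- ) ;           \ stub: expresses a subject
    : VerbPhrase ( -- ) ;           \ stub: expresses verb and object
    : AskUser    ( -- ) ;           \ stub: poses a yes-or-no question
    : InFerence  ( -- ) AskUser ;   \ silent inference, then a question
    : DeCog ( -- )
      becon @ IF
        InFerence  0 becon !  EXIT  \ early exit before normal thought
      THEN
      NounPhrase VerbPhrase ;       \ default subject-verb-object thought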

Some possibly obsolete, vestigial code in the DeCog module is in charge of variations on the normal themes of thinking about subject, verb and object (SVO). DeCog may call the SayYes module to answer "Ja" to a question. DeCog may also call the KbTraversal module to activate thoughtworthy concepts in the knowledge base if there has been a cessation of thinking due to lack of stimulating input or lack of conceptual activations. KbTraversal restores conceptual activation as a basis for thought.

It should be clear by now that the German cogitation DeCog module has a default mode of Subject-Verb-Object (SVO) which can be interrupted if circumstances warrant shunting the flow of thought into mind-modules other than NounPhrase and VerbPhrase. As German AI evolves further, the elves in the Black Forest will have to figure out how to include special modes of thought such as the imperative mood or the expression of wishful thinking. Wotan German AI is not the Last Word in German artificial intelligence; it is merely a beginning.

Chapter Nine: The MotorOutput Module Needs Fleshing Out for Robots.

The mighty German Wotan AI contains the stub of a MotorOutput module as a reminder and a place-holder for the implementation of actuator-controls for robot locomotion. Forth is an excellent language for sending control signals to motor actuators. Many robots in the past have used Forth for exactly such a purpose. However, those robots of yore did not contain an artificial Mind. They often had a direct link between a particular input stimulus and a particular motor output. The robot might see light and then move towards the light. In the Wotan German AI and its robotic offspring, there must be an extreme intermediation between most input stimuli and the panoply of possible motor outputs.

Just as the Wotan AI contains a long memory channel for the sensory input of audition, so there must eventually be a motor memory channel, parallel to the sensory memory and in diachronic registration, to hold the dynamic controls for motor actuators. The MotorOutput module will execute decisions from the robotic FreeWill module to initiate behaviors as strings of motor memory data. These behaviors may include walking, running, jumping or even flying -- if the robot is the hardware simulation of a bird or an airplane. The decisions of the FreeWill module are based on thought, which we must continue to explain as a linguistic and conceptual phenomenon.

Chapter Ten: NounPhrase Begins a Typical Thought in German.

The NounPhrase module finds the most active concept represented by a noun or pronoun to be the subject of an incipient thought in German. Because NounPhrase governs an entire phrase centered around a noun, the NounPhrase module has explicit access to any parameters such as case, number and gender which must play a role in generating the correct grammar for a noun-phrase in German. These parameters will govern the inflectional endings of German adjectives and participles.

If there is no concept of a noun active enough to be selected as the subject of a sentence of thought in German, NounPhrase is programmed to select the self-concept of "ich" (for "I" in English) to be the default subject of a thought. If you are the chief coder or maintainer of a Wotanesque German AI, you have the right and the power, if not the Donnerwetter, to change the default concept of German thought from "ich" to any other concept, such as Wein or Weib or Gesang, but you should first consider the psychological and philosophical implications.

There is no particular module of ConSciousness in the German AI or in its English and Russian cognates. Consciousness is an emergent phenomenon and therefore it does not have an explicit center or location within the robotic brain-mind. The searchlight of attention gives rise to consciousness as an illusion of selfhood based on self-awareness. As Renatius Cartesius (Rene Descartes) famously said, "Cogito ergo sum" -- which you may translate into German as, "Ich denke, also bin ich." In a language less philosophic than Latin or German, the phrase translates as, "I think, therefore I am." Therefore consciousness tends to be egocentric. We could program the AI to always think of God by default, but would that algorithm turn Wotan into God? It would not make Wotan any more divine than he already is. We should leave Wotan (well enough) alone and confirm the self-concept of "I" in any AI Mind that has run out of other concepts to think about. This constant and automatic reversion to the concept of self may be an aid in establishing consciousness in the artificial Mind. Therefore it makes sense for the NounPhrase module to activate the self-concept if no other concept is highly activated.
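
The fallback can be sketched as follows; the "actmax" variable and the concept number 56 for "ich" are hypothetical stand-ins for the actual search code and Psi numbers.

    VARIABLE actmax                 \ highest activation found in search
    VARIABLE motjuste               \ concept chosen to be the subject
    : NounPhrase ( -- )
      \ ...a search of conceptual memory would set actmax here...
      actmax @ 0= IF                \ no noun concept active enough?
        56 motjuste !               \ default to the self-concept "ich"
      THEN ;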

Chapter Eleven: NounPhrase May Call DeArticle or DeAdjective.

The mind-modules DeArticle and DeAdjective serve to insert a definite or indefinite article or a German adjective in front of a German noun. The parameters of case, number and gender are already operative in the NounPhrase module, which can pass the parameters into the sub-modules to govern the selection of the proper form of the article or the proper inflectional ending of the adjective.

Chapter Twelve: Either NounPhrase or VerbPhrase May Call DeAdverb.

Since an adverb like "immer" ("always") may modify either a verb or an adjective or even another adverb, either NounPhrase or VerbPhrase may call the DeAdverb module to insert an adverb into a German phrase being generated. AI coders fluent in English but not German may find it strange that an uninflected German adjective may serve as an adverb, as in "Der schnell denkende Roboter," which means, "The quickly thinking robot," but even native English-speakers do not always bother to use an adverbial form, saying "the quick-thinking robot" instead of "the quickly thinking robot."

A case could be made for the idea that VerbPhrase should always call DeAdverb for the possible insertion of an adverb and that no adverb will be inserted if no adverb is sufficiently activated for insertion. This principle of mind-design accords with the general idea that lower-level brain-functions will try to operate on their own and will refrain from operation if they are inhibited by a higher-level function.
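
A minimal sketch of that principle, with an assumed activation threshold of 20:

    VARIABLE advact                 \ activation of the best adverb found
    : DeAdverb ( -- )
      advact @ 20 > IF              \ sufficiently activated?
        ." IMMER "                  \ insert the adverb
      THEN ;                        \ otherwise insert nothing at all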

Chapter Thirteen: VerbPhrase Completes a Typical Thought in German.

The VerbPhrase module is at the core of linguistic thinking in the artificial Mind. The verb in German or English or Russian is the symbol of an action or a state of being. In the twenty years since the AI Minds began development with the Amiga Mind.Rexx AI in 1993, the VerbPhrase module has evolved from unworkable simplicity to sophisticated complexity.

For several years, transitive verbs and intransitive verbs of being had separate modules in the AI source code for thinking in English. It seemed like an improvement to let the "be-verbs" be called through a two-step process of first activating the concept of a be-verb and then forcing the selection of the proper form of the be-verb to agree with the subject of an incipient sentence of thought. When the modules for transitive verb and be-verbs were merged, the streamlined AI was able to think even better, because only one module was handling verbs.

As the AI Minds moved beyond English into German and Russian, it became clear that it would not be necessary to call be-verbs with a two-step process if the recall of a be-verb used parameters of person and number to automate the retrieval of the required form of a be-verb. Since the VerbPhrase module is called after the selection of a subject, the parameters of person and number are already available in coordination between the NounPhrase module that recalls the subject and the VerbPhrase module that fetches the proper verb-form.

During the evolution of the AI Minds, the introduction of negation into the logic of artificial thought made it clearly necessary to link and to recall subjects and verbs at specific time-points in the experiential memory of an AI Mind. Whereas the original mind-design had allowed activation to spread from a given subject to any associated verb, the negation of an idea by means of the adverb "not" could only refer to a specific (i.e., negated) instance of an idea, and so the mind-design principle of "spreading activation" (q.v.) had to be assigned for operation between ideas but not within ideas. The AI source code in Forth and in JavaScript had to be modified to ensure that a subject activated from a specific time-point would cause the activation of an associated verb from a specific time-point, which might or might not include a link to the negational concept of "not". The AI source code became more complex and more cumbersome, while it also became better at thinking logically, because the concept of "not" is a major element of symbolic logic.

The NounPhrase module and the VerbPhrase module function together to express the subject of an idea and the verb associated with the subject. Since the verb may associate further to a direct object, the VerbPhrase module often calls the NounPhrase module to express either a direct object or a predicate nominative. In both English and German, the inclusion of an object with a verb is sometimes optional. One is free to say "I am reading" or "I am reading a book." The AI software must reflect the fact that direct objects are often optional. All these considerations make the AI software grow more complex over time and more diverse after any branching or forking of the AI source code. AI has been solved and AI Minds are here to stay, but the Darwinian principle of the survival of the fittest means that some branches of AI will increase and prosper while other branches of AI will fail to adapt and will die out.

Chapter Fourteen: Wotan AI in German Sheds Light on the German Language.

When we try to give a robot brain the ability to think in German, we discover serendipitously that certain peculiarities of German usage actually make it easier to code the German AI. For instance, Germans have a tendency to speak of a person not only by name but with the German word for "the" included. If a German person, human or robot, says "Der Helmut ist nicht da" for "Helmut is not there," the use of the masculine form of the word for "the" gives the listener an immediate clue to the gender of the subject of the sentence. Our AI software can make use of the gender-clue without having to dig deeper by searching for the previously known gender of Helmut. As AI coders, we notice the convenience and we wonder if the German language and the German usage did not evolve to provide the same convenience for human minds thinking in German.

Chapter Fifteen: Wotan Learns New German Words Like a Child.

When input from a human user teaches Wotan a new German word, the comprehension of the German sentence assigns parameters which play a role during any retrieval of the same idea in the future. These parameters may include gender, number and case for nouns, or person and number for verbs. If a native speaker of German uses a strange but correct form during input, the German AI Mind will assign parameters and learn the new form in the same way as a child learns German, that is, from experience. For instance, some masculine German nouns like "des Menschen" or "des Studenten" do not end in "s" in the genitive form. If the AI comprehends the genitive form during input, it will assign a parameter indicating the genitive case of the word.

Chapter Sixteen: The VerbGen Module Generates Verb-Forms in German.

When a baby human or a baby robot learns a new German verb, the verb will typically be stored retrievably in association with certain parameters such as person, number, tense and mood. When the German AI Mind tries to use a German verb and does not immediately find the proper form as required by the parameters governing the search of lexical memory, the modules AudBuffer, OutBuffer and VerbGen work together to isolate the stem of the German verb and to add to it the inflectional endings dictated by the search-parameters.
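
The stem-and-ending idea can be shown in a toy sketch, assuming the PLACE and +PLACE buffer words of Win32Forth; the real VerbGen works through AudBuffer and OutBuffer and must handle many more endings, such as the extra "e" needed by stems ending in "T" like ARBEIT-.

    CREATE OutBuf 32 ALLOT          \ scratch buffer for the verb-form
    VARIABLE dba                    \ grammatical person: 1, 2 or 3
    : ending ( -- addr len )        \ singular present-tense endings
      dba @ 1 = IF s" e"  EXIT THEN \ ich kenne
      dba @ 2 = IF s" st" EXIT THEN \ du kennst
      s" t" ;                       \ er, sie, es kennt
    : VerbGen ( addr len -- addr' len' )
      2 -                           \ drop "-en" to isolate the stem
      OutBuf PLACE                  \ copy the stem into the buffer
      ending OutBuf +PLACE          \ append the required ending
      OutBuf COUNT ;
    \ usage:  2 dba !  s" kennen" VerbGen TYPE   ( prints kennst )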

Chapter Seventeen: Wotan Will Teach German to Students.

If you are trying to learn the German language, you may engage Wotan as a conversation partner in German.

There are some advantages ("Vorteile") to using an AI in learning German. Since the Wotan German AI contains a finite set of innate German vocabulary, whoever spends a lot of time working with Wotan may gradually learn all the vocabulary inherent and innate in Wotan. Since the innate vocabulary has been chosen for such reasons as frequency of usage and the support of logical thought, the learner of German through Wotan will gain the value of knowing high-frequency words especially tailored for thinking in German. As you learn German, you will be thinking in the language of Albert Einstein and the German philosophers.

Chapter Eighteen: Wotan Employs Inference for Automated Reasoning.

As of this writing in the year of Wotan 2013, the most advanced feature of our German AI Mind is the ability to combine old knowledge with the input of new knowledge to infer additional knowledge. When a human user enters input with a verb of being, such as "Erika ist eine Frau" to say, "Erika is a woman," the German AI Mind searches backwards in memory for instances of knowledge about women in the plural to find such tidbits as, "Frauen haben ein Kind" or, "Women have a child." The AI then forms the silent memory trace of an inference about the stated subject, "Erika (possibly) has a child." In order to validate or refute the tentative inference, the AskUser module asks the human user, "Hat Erika ein Kind?", or, "Does Erika have a child?" Only when the human user has answered yes, no, maybe, or nothing at all, does the AI Mind confirm or negate the heretofore silent inference.
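
The chain of events reduces to the control flow in this sketch, where the stubs merely fake a successful backward search so that the path from old knowledge to silent inference to question can run:

    : KbSearch   ( -- flag ) -1 ;   \ stub: found "Frauen haben ein Kind"
    : SilentIdea ( -- ) ;           \ stub: store "Erika hat ein Kind"
    : AskUser    ( -- ) ." HAT ERIKA EIN KIND? " CR ;
    : InFerence ( -- )
      KbSearch IF                   \ any plural knowledge about women?
        SilentIdea                  \ infer, without saying anything
        AskUser                     \ seek confirmation or refutation
      THEN ;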

Wotan's innate knowledge base (KB) contains the idea that women have a child so that at least one silent inference can be formed if a user inputs that someone is a woman. The same sample inference may serve as an example in the User Manual. Other inferences may be made if Wotan has learned something about a class of entities in the plural, such as "Boys play games," and later on someone mentions that "Karl is a boy." The AI may then ask, "Does Karl play games?" A companion e-book with the title of "InFerence" is available to supplement this e-book about German AI in the series on "Artificial Intelligence."

Chapter Nineteen: Wotan Will Teach You about Artificial Intelligence.

Studying how the AI Mind Wotan works in German will broaden any knowledge that you may already possess about AI that thinks in English. Because German as a language is different from English, artificial intelligence in German must be different from artificial intelligence in English. However, there are some subtle features which need to be included in English AI and which would not be immediately obvious to AI coders working firstly in English and only secondarily in German. For instance, German verbs are highly inflected in comparison with the simplicity of English verbs. An AI coder working only in English might unfortunately take an ad hoc approach to handling things which are simple in English but complex in German. To make a special rule in English AI software that third-person singular present-tense verbs shall have "s" or "es" added to them is too simplistic. Coding German AI makes us realize that the construction of verb-forms should be governed by parameters not only in German but also in English.

Chapter Twenty-seven: Wotan Lives Potentially Forever.

The future of German artificial intelligence is not for mere mortals to comprehend. Every AI Mind is potentially immortal and therefore capable of lasting far longer than each human life that is, in the words of Thomas Hobbes, nasty, brutish and short.

Chapter Thirty: Use the German Wikipedia to Teach German AI.

In many ways the German Wikipedia is a translation of the Wikipedia in English, but some encyclopedic entries are specifically German.

Chapter Thirty-one: Use de.sci.informatik.ki to Discuss German AI.

The author of Wotan and of this book about Wotan posted messages for many years in the German AI newsgroup de.sci.informatik.ki on Usenet.

Chapter Thirty-three: Follow German AI Links for Further Information.

Link to the transition from MindForth to Wotan.

Chapter Thirty-four: Learn the Special Terms in the Glossary of German AI.

Fahrvergnuegen -- the joy of driving a German automobile.

Forthvergnuegen -- the joy of coding German AI in Forth.

Usenet -- a forum for posting messages on the Internet.

Appendix A: German Artificial Intelligence Programming Journal

Abstract

The DeKi Programming Journal (DkPj) is both a tool in coding German Wotan open-source artificial intelligence (AI) and an archival record of the history of how the German SuperComputer AI evolved over time.

Fri.16.NOV.2012 -- Merging the English and German Source Code

The German DeKi artificial intelligence starts fresh today from a copy of the 15nov12A.F MindForth AI in English. We will subtract English-specific portions of the code and gradually add in the features that yield a German AI Mind.

Sat.17.NOV.2012 -- Morphing MindForth into Wotan

Today in the German Supercomputer AI Wotan we continue the transformation of the originally English AI source code into strictly German AI source code. We run through a series of small steps before we implement the special algorithms for thinking in German by manipulating complex German verb-forms.

First we remove the vestiges of the EnBoot English bootstrap, because we are now running on the DeBoot German bootstrap. Then we check that the DeKi (German AI) still loads and still runs, but we stop and change the Transcript header to "Transcript of 20121117 Wotan AI interview" so as to emphasize the provocative name of the Wotan AI Mind.

Next in the up-front declaration of variables we change the "vault" value from "611" for the English bootstrap to "940" for the DeBoot German bootstrap, which must be longer than the EnBoot because we must include many more irregular forms in German than in English. The DeKi still loads and runs, but we notice that some of the personal pronouns have different concept numbers from what we have been using in English and in Russian. Let us change the Psi numbers.

Next we comment out the establishment of the English "en" array, which has been supplanted by the German "de" array. Then we have to invite a ".de" report instead of a ".en" report.

Then we change all instances of EnArticle to DeArticle, even though we are not yet ready to change all the code of the mind-module.

Now we change EnReify to DeReify. The DeKi AI still runs.

We change EnParser to DeParser, and EnCog to DeCog. The AI runs.

Sat.17.NOV.2012 -- Approaching the OutBuffer

Now we will try to make use of the OutBuffer in the Wotan German AI, as we did with the verb-forms in the Dushka Russian AI.

Sun.18.NOV.2012 -- Debugging the Hello-World Version

We debug the Wotan SuperComputer Strong AI in German by removing lines of code which we merely commented out in our last session. Along the way, we discover some DeKi concepts which do not have the right three-digit concept-numbers to correspond properly with the other AI Minds in English and in Russian. We make the necessary changes. With the words for "what" and "be" set properly, we no longer get "ICH IRRTUM IRRTUM ICH" as the erroneous output from the WhatBe question-asking module, but rather we get "ICH WAS BIN ICH".

Sun.18.NOV.2012 -- Debugging AudRecog

In our last DeKi coding session, Wotan was able to recognize the German verb "KENNEN" for "to know", but not shorter inflected forms of the verb. We were puzzled, because one of the other AI Minds, apparently the Russian AI, easily recognizes inflected forms of verbs. Therefore we debug the DeKi AudRecog module by doing a line-by-line comparison with the JavaScript AudRecog code in the Dushka Russian AI Mind. We soon come across the "prc" variable for "provisional recognition", which is in the Russian AI but apparently lacking in the MindForth and Wotan AI Minds.

Wed.21.NOV.2012 -- Verb-Negation Needs Work

In the German Wotan AI, we need to move away from the English way of negating verbs and implement the negation of verbs in German without using an auxiliary verb.

Thurs.22.NOV.2012 -- Restricting Output with Negative Logic

In the VerbGen module there is a problem when we try to substitute zero ("0") for an inflectional character that we want to get rid of. Win32Forth displays an unwelcome character for the zero. What we propose doing to solve this problem is not to substitute a different character such as zero, but rather to conditionalize the very showing of any character. We can use some negative "IF NOT" logic to conditionalize the processing of the last few positions in the OutBuffer.
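
The proposed negative logic amounts to something like this sketch, in which "?Show" is a made-up name for the conditionalized display:

    : ?Show ( char -- )
      DUP IF                        \ a genuine character?
        EMIT                        \ show it
      ELSE
        DROP                        \ a zero is dropped, never displayed
      THEN ;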

Sat.24.NOV.2012 -- Making VerbGen Work

We need to concatenate some conditions in the VerbGen module by cascading some OR-clauses. In the English MindForth AI, we discovered that we could start with one initial pair of conditions governed by "OR" and then add as many more "OR" choices as we felt we needed.
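
The cascading style looks like the following sketch; the test itself (a character that is T, D or N) is a made-up example of one initial pair of OR-conditions with a further OR choice added on.

    : TDN? ( char -- flag )
      DUP  [CHAR] T =               \ is the character a T...
      OVER [CHAR] D = OR            \ ...or a D...
      SWAP [CHAR] N = OR ;          \ ...or an N?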

As we got VerbGen to work better and better, we discovered a legacy flaw in the NounPhrase module. The "subjpsi" value was being set too capriciously. We commented out the setting code and instead we put a blanket setting at the very end of the motjuste-determination code, so that when the "subjectflag" is set to one ("1") the "motjuste", since it is indeed a subject, has its concept-number value transferred to the "subjpsi" variable. Then in VerbGen, the "subjpsi" value is a determinant in selecting an inflectional ending.

Sun.25.NOV.2012 -- Improving Parameter Performance

Right now the main problem with the German artificial intelligence Wotan involves the proper setting of each searchable ParaMeter during thinking and the output of thought. From the Dushka Russian AI we obtain some useful techniques for manipulating the "dba" values.

For incoming verbs, we could perhaps assume a default "dba" of "3" third person, unless a personal pronoun like "I" or "YOU" dislodges the default "dba" of "3" and replaces it with a "1" or a "2".

Mon.26.NOV.2012 -- Setting "dba" after Verbs

Today in the Wotan German AI we will work on a way to assign the correct "dba" values to a noun or pronoun that enters the AI after a verb, so that the item will have either a "dba" of "4" as an accusative direct object, or a "dba" of "1" as a predicate nominative for a verb of being. We will try to use the new "audverb" variable to record the concept number of a "heard" verb, so that we may then test for both a positive "audverb" value and for "800" as the psi number that indicates a be-verb requiring a "dba" of "1" for a predicate nominative.

The new method of using "audverb" to govern dba-settings is working remarkably well. It is setting a "dba" of "4" for direct objects and of "1" for predicate nominatives. It is only a small change in the overall code, and so we may wait until we do some work on German verb-negation before we upload the source code that uses the "audverb" mechanism.
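
Stripped of the surrounding parsing code, the mechanism is roughly this sketch; concept 800 is the be-verb, as stated above, while the word name "NounDba" is invented for illustration.

    VARIABLE audverb                \ concept number of a "heard" verb
    VARIABLE dba                    \ case parameter for the next noun
    : NounDba ( -- )                \ called for a noun after a verb
      audverb @ 0= IF EXIT THEN     \ no verb heard yet: nothing to do
      audverb @ 800 = IF
        1 dba !                     \ predicate nominative after be-verb
      ELSE
        4 dba !                     \ accusative direct object
      THEN ;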

Tues.27.NOV.2012 -- Negation of German Verbs

The negation of German verbs is different from the negation of most English verbs, because the English word "not", along with an auxiliary form of "do", usually comes before the actual English verb, as in, "I do not understand you." It is relatively simple to keep tabs on the word "not" preceding an English verb and to then insert a negational "jux" in the engram of the English verb. In German, where the word "nicht" for "not" will often come not before but after the verb, it becomes necessary retroactively to set the negational "jux" tag. However, we also set the "seq" tag retroactively on a verb, so it is no more complicated to set the "jux" tag retroactively.

English verbs of being are negated retroactively, and so we can perhaps treat German verbs in general as if they were like English verbs of being for purposes of retroactive negation. Just as we had the variable "tbev" for "time of be-verb", so we make up a variable "tdzw" for "time of deutsches Zeitwort" or "time of German verb".

After we use "tdzw" to store 250=NICHT as the negational "jux" on a German verb, now in VerbPhrase we need to change the code that used an English auxiliary verb to negate a sentence and simply put 250=NICHT after a negated German verb. We will worry later about whether the direct object is a pronoun or not.
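
With a toy one-column array standing in for the real Psi memory, the retroactive tagging might be sketched like this:

    VARIABLE tdzw                   \ time of deutsches Zeitwort
    CREATE jux 256 CELLS ALLOT      \ toy jux column, indexed by time
    : NichtJux ( -- )               \ called upon hearing 250=NICHT
      tdzw @ ?DUP IF                \ was a German verb stored earlier?
        CELLS jux +  250 SWAP !     \ set the negational jux tag
      THEN ;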

Sat.26.JAN.2013 -- Enlarging the DeBoot Sequence

We will try to enlarge the bootstrap sequence with "MAENNER ARBEITEN" in order to demonstrate the storage and retrieval of ideas which have a subject and a verb but no direct object. This feature has recently been added to the English MindForth AI. Using the verb "ARBEITEN" will also help in getting the VerbGen module to deal properly with German verbs that have a stem ending in "T". We will also add to DeBoot the sentence "FRAUEN HABEN EIN KIND" to serve as a premise for machine reasoning with the InFerence module. If someone enters "EVA IST EINE FRAU," the AI should make an InFerence and ask, "HAT EVA EIN KIND?"

Sat.26.JAN.2013 -- Verbs without Objects

From the English MindForth AI we import the "transcon" code for simple ideas of subject and verb but no object or predicate nominative. It turns out that the DeKi German AI is having difficulty in recognizing some words, so the new algorithm is basically functional but errors creep in when the AI makes erroneous recognitions. We may need to make sure that the AudRecog module is working in the same way for both English MindForth and German Wotan AI.

Wed.30.JAN.2013 -- Implementing German AI InFerence

Today we are porting the InFerence module from the English MindForth AI into the German Wotan AI. First we drag-and-drop the entire InFerence module from the 24jan13A.F MindForth into the Wotan DeKi source code immediately after the VerbPhrase module. Then we enter the current date at the end of the InFerence comments. Until we declare seven variables (prednom; seqverb; seqtqv; inft; subjnom; dobseq; becon), the German AI will not even "fload". Then it does not create a Psi-array record of any inference, because we have declared the seven variables but we have not yet used them where they belong outside of the InFerence module itself.

Thurs.31.JAN.2013 -- Troubleshooting the InFerence Module

Yesterday in the Wotan German AI we implemented the InFerence module from the English MindForth AI, but we need to continue troubleshooting the German AI functionality because the AI was creating silent inferences with only a subject and a verb but not yet the direct object of the verb.

Fri.1.FEB.2013 -- Asking Users to Confirm an Inference

The Wotan German AI seems to be inserting the wrong "tqv" retroactively across the boundary between sentences, when we type in "eva ist eine frau" in order to trigger an inference. A contributing factor is the code at the start of InStantiate which converts any zero "seqneed" to a 5=noun seqneed by default. It may be time to comment out that code. When we comment out the line setting "tqv" to 5=noun by default, suddenly the AI makes the correct silent InFerence, but we do not know if anything else has gone wrong that was depending on the line of code that has been commented out.

Then we discover that AskUser is not posing a question based on the silent inference because there is a left-over requirement for a plural noun-phrase number ("nphrnum"). When we comment out that requirement, as we did earlier in the English MindForth AI, we get not the ideal question of "HAT EVA EIN KIND?" but rather the faulty output of "HABEN EVA IRRTUM". This AskUser output is nevertheless gratifying and encouraging, because it reveals that a silent inference has been made, and that the German Wotan AI is trying to ask a yes-or-no question so that a human user will either confirm or refute the InFerence.

Sat.2.FEB.2013 -- Improving the AskUser Module

To begin a yes-or-no question in German, a form of the verb has to be generated either by a parameter-search or by VerbGen. We will first try the parameter-search using dba for person and nphrnum for number.

Tues.26.FEB.2013 -- Assigning Number to a New Noun

For learning a new noun in German, we need to use the OutBuffer in the process of assigning grammatical number to any new noun. We can use a previous article to suggest the number of a noun, and we may impose a default number which may be overruled first by indications obtained from OutBuffer-analysis and secondly by the continuation with a verb that reveals the number of its subject.

For OutBuffer-analysis, we may impose various rules, such as that a default presumption of singular number may be overruled by certain word-endings such as "-heiten" or "-ungen" which would rather clearly indicate a plural form. We may not so easily presume that endings in "-en" or "-e" indicate a plural, because a singular noun may have such an ending. An ensuing verb is a much better indicator of the perceived number of a noun than the ending of the noun is.

Although we may be tempted to detect the ensuing singular verb "ist" and use it to retroactively establish a noun-number as being singular, it may be simpler to use the OutBuffer to look for singular verbs that end in "-t", such as "ist" or "geht". Likewise, a verb ending in "-n" could indicate a plural subject. So should the default presumption for a German noun be singular or plural?
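
A simplified, hypothetical version of the verb-ending test; "b16" names the rightmost OutBuffer byte as in the actual code, but the rest is illustrative.

    VARIABLE b16                    \ rightmost character of the verb
    : NumHint ( -- n )              \ 1=singular, 2=plural, 0=no clue
      b16 @ [CHAR] T = IF 1 EXIT THEN  \ "IST", "GEHT": singular subject
      b16 @ [CHAR] N = IF 2 EXIT THEN  \ "HABEN", "GEHEN": plural subject
      0 ;                           \ the ending alone decides nothing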

Wed.27.FEB.2013 -- Assigning Plural Number by Default

In both German and English, we should probably make the default presumption be plural for new nouns being learned. Then we have a basic situation to be changed retroactively if a singular verb is detected. So let us examine the NewConcept module to see if we can set a plural value of "2" there on the "num" which will be imposed in the InStantiate module.

When we set a num default of "2" for plural in NewConcept and we run the German AI, the value of "2" shows up for a new noun in both the ".psi" report and the ".de" lexical report. Next we need to work on retroactively changing the default value on the basis of detecting a singular verb.

We have tried various ways to detect the "T" at the end of the input of the verb "IST". In the InStantiate module, we were able to test first for a pov of external input and then for the value of the OutBuffer rightmost "b16" value. Thus we were able to detect the ending "T" on the verb. Immediately we face the problem of how retroactively to change the default number of the subject noun from "2" for plural to "1" for singular.

Changing anything retroactively is no small matter in the Wotan German AI, because other words may have intervened between the alterand subject-noun and the determinant verb. We have previously worked on assigning tqv and seq values retroactively from a direct object back to a verb, so we do have some experience here.

Thurs.28.FEB.2013 -- Creating the RetroSet Module

Today we will try to create a RetroSet mind-module for retroactively setting parameters like the number of a new subject-noun which has been revealed to be singular in number because it was followed by a singular verb-form, such as "IST" or "HAT" in German. First we must figure out where to place the RetroSet module in the grand scheme of a Forth AI program. Since the "T" at the end of a German verb is discovered in the InStantiate module, we could either call RetroSet from InStantiate, or use a "statuscon" variable to set a flag that will call RetroSet from higher up in the Wotan AI program. Let us create a "numcon" flag that can be set to call RetroSet and then immediately be reset to zero. Since InStantiate is called from the DeParser module, we should perhaps let DeParser call RetroSet.

Now we have stubbed in the RetroSet AI mind-module just before the DeParser mind-module in the Wotan German artificial intelligence. RetroSet diagnostically displays the positive value of the numcon flag and then resets the flag to zero. In future coding, we will use the numcon flag not only to call RetroSet but also to change the default value of "2" for plural to "1" for singular in the case of a new German noun that the Wotan AI is learning for the first time.
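
The flag mechanism reduces to the following sketch, with a toy "num" column standing in for the real Psi array:

    VARIABLE numcon                 \ time-point of the noun to re-set
    CREATE num 256 CELLS ALLOT      \ toy num column, indexed by time
    : RetroSet ( -- )
      numcon @ ?DUP IF              \ act only on a nonzero flag
        CELLS num +  1 SWAP !       \ plural 2 becomes singular 1
        0 numcon !                  \ reset the flag immediately
      THEN ;
    : DeParser ( -- )
      \ ...InStantiate would set numcon on seeing a singular verb...
      RetroSet ;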

Fri.1.MAR.2013 -- Implementing RetroSet in the German AI

In the German Wotan potentially superintelligent AI, the AudListen module sets time-of-seqneed ("tsn") as a time-point for searches covering only current input from the keyboard into the AI Mind. In the new RetroSet module, we may use "tsn" as a parameter to restrict a search for a subject-noun to only the most recent input to the AI. However, "tsn" is apparently being reset for each new word of input, so we switch to using time-of-voice ("tov") and we get better results. We input "eva ist eine frau" and RetroSet retroactively changes the default plural on "EVA" from a two to a one for singular. Next we need to troubleshoot why we are not getting a better question from AskUser.

Sun.3.MAR.2013 -- Problems with AskUser

In our efforts to implement InFerence in the Wotan German AI, we have gotten the AI to stop asking "HABEN EVA KIND?" but now AskUser is outputting "HAT EVA DIE KIND" as if the German noun "Kind" for "child" were feminine instead of neuter. We should investigate to see if the DeArticle module has a problem.

Mon.4.MAR.2013 -- Problems with DeArticle

By the use of a diagnostic message, we have learned that the DeArticle module is finding the accusative plural "DIE" form without regard to what case is required. Now we need to coordinate DeArticle more with the AskUser module, so that when AskUser is seeking a direct object, so will DeArticle. There has already long been a "dirobj" flag, but it is perhaps time to use something more sophisticated, such as "dobcon" or even "acccon" for an accusative "statuscon". After a German preposition like "mit" or "bei" that requires the dative case, we may want to use a flag like "datcon" for a dative "statuscon". So perhaps now we should use "acccon" in preparation for using also "gencon" and "datcon" or maybe even "nomcon" for nominative.

Tues.5.MAR.2013 -- Coordinating AskUser and DeArticle

A better "statuscon" for coordinating AskUser and DeArticle is "dbacon", because it can serve for all four declensional cases in German. When we use "dbacon" and place the "LEAVE" statement immediately after the first instance of selecting an article with the correct "dbacon", we obtain "HAT EVA DAS KIND" as the question from AskUser after the input of "eva ist eine frau". We still need to take gender into account, so we may declare a variable "mfncon" to coordinate searches for words having the correct gender.
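The selection described above might be sketched as follows, where ART-CASE@ and ART-SAY are hypothetical stand-ins, stubbed out here, for fetching an article's case and speaking the article.

VARIABLE dbacon   ( case wanted: 1=nominative 2=genitive 3=dative 4=accusative )
VARIABLE t        ( end of the time-line )

: ART-CASE@ ( i -- case )  drop 0 ;   \ stub: fetch the case of the article at i
: ART-SAY   ( i -- )       drop ;     \ stub: speak the article found at i

: PICK-ARTICLE ( -- )   \ scan candidate articles on the time-line
  t @ 1+ 0 DO
    I ART-CASE@ dbacon @ = IF
      I ART-SAY   \ speak the first article in the right case
      LEAVE       \ and stop searching at once
    THEN
  LOOP ;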

Wed.6.MAR.2013 -- Problems with the WhatBe Module

As we implement InFerence in the Wotan German Supercomputer AI, the program tends to call the WhatBe module to ask a question about a previously unknown word. When we input "eva ist eine frau" to the AI, first Wotan makes an inference about Eva and asks if Eva has a child. Then the AI mistakenly says "WAS IRRTUM EVA" (roughly "WHAT ERROR EVA") when the correct output should be "WAS IST EVA" for "WHAT IS EVA". This problem affords us an opportunity to improve the German performance of the WhatBe module, which came into the German AI from the English MindForth AI.

First we need to determine which location in the AI source code is calling the WhatBe mind-module, so we insert some diagnostics. Knowing where the call comes from lets us work on the proper preparation of parameters from outside WhatBe to be used inside WhatBe.
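One simple way to locate the caller is to print a distinct diagnostic at each candidate call-site, for example:

: WHATBE-DIAGNOSTIC ( n -- )   \ n identifies the call-site
  CR ." WhatBe called from site # " . ;

\ at each candidate call-site, just before the call to WhatBe:
\   1 WHATBE-DIAGNOSTIC
\   2 WHATBE-DIAGNOSTIC   ( and so on )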

Thurs.7.MAR.2013 -- Dealing with Number in German

We are learning that we must handle grammatical number quite differently in the German AI than in the English AI. English generally uses the ending "-s" to indicate plural number, but German offers no single such simple clue. In German we have a plethora of clues about number, and we can use the OutBuffer to work with some of them, such as "-heit" indicating singular and "-heiten" indicating plural. In German we can also establish priority among rules, such as letting an "-e" ending in the OutBuffer suggest a plural noun, while letting the discovery of a singular verb overrule the suggestion that the noun is plural. The main point here is that in German we must get away from the simplistic English rules about number.
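The priority rule just mentioned could be sketched like this, with verbnum as an assumed variable recording the number revealed by the verb:

VARIABLE num      ( number of the noun: 1 = singular, 2 = plural )
VARIABLE verbnum  ( number shown by the verb; 0 = unknown )
VARIABLE b16      ( rightmost OutBuffer character )

: GUESS-NOUN-NUMBER ( -- )
  b16 @ [CHAR] E = IF  2 num !  THEN    \ an "-e" ending suggests plural
  verbnum @ 1 =   IF  1 num !  THEN ;   \ a singular verb overrules the guess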

Fri.8.MAR.2013 -- Removing Obsolete Influences

In NewConcept let us try changing the default expectation of number for a new noun from plural to singular. At first we notice no problem with a default singular. Then we notice that the InFerence module is using a default plural ("2") for the subject-noun of the silent inference. We tentatively change the default to singular ("1") until we can devise a more robust determinant of number in InFerence.

We are having a problem with the "ocn" variable for "old concept number". Just as with the obsolete "recnum", there is no longer any reason to use the "ocn" variable, so we comment out some code.

Sat.9.MAR.2013 -- Making Inferences in German

When the German Wotan AI uses the InFerence module to think rationally, the AI Mind creates a silent, conceptual inference and then calls the AskUser module to seek confirmation or refutation of the inference. While generating its output, the AskUser module calls the DeArticle module to insert a definite or indefinite article into the question being asked. The AI has been using the wrong article with "HAT EVA DAS KIND?" when it should be asking, "HAT EVA EIN KIND?" When we tweak the software to switch from the definite article to the indefinite article, the AI gets the gender wrong with "HAT EVA EINE KIND?"

Tues.12.MAR.2013 -- A Radical Departure

In the AskUser module, to put a German article before the direct object of the query, we may have to move the DeArticle call into the backwards search for the query-object (quobj), so that the gender of the query-object can be found and sent as a parameter into the DeArticle module.

It may seem like a radical departure to call DeArticle from inside the search-loop for a noun, but only one engram of the German noun will be retrieved, and so there should be no problem with inserting a German article at the same time. The necessary parameters are right there at the time-point from which the noun is being retrieved.
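A sketch of the idea follows, with QUOBJ-AT?, GENDER@, and NOUN-SAY as hypothetical stand-ins, stubbed out here, and with the gender passed to DeArticle through an assumed mfncon variable.

VARIABLE t        ( end of the time-line )
VARIABLE mfncon   ( gender parameter passed into DeArticle )

: QUOBJ-AT? ( i -- flag )  drop 0 ;   \ stub: is the query-object at time i?
: GENDER@   ( i -- mfn )   drop 0 ;   \ stub: fetch the gender stored at i
: NOUN-SAY  ( i -- )       drop ;     \ stub: speak the noun engram at i
: DEARTICLE ( -- )  ;                 \ stub standing in for the real module

: QUOBJ-SEARCH ( -- )   \ backward search for the query-object
  t @
  BEGIN  dup 0>  WHILE
    dup QUOBJ-AT? IF
      dup GENDER@ mfncon !   \ the gender is right there at the time-point
      DEARTICLE              \ so the article can be inserted at once
      dup NOUN-SAY
    THEN
    1-
  REPEAT  drop ;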

Wed.13.MAR.2013 -- Preventing False Parameters

When the OldConcept module recognizes a known German noun, normally the "mfn" gender of that noun is detected and stored once again as a fresh conceptual engram for that noun. However, today we have learned that in OldConcept we must store a zero value for the recognition of forms of "EIN" as the German indefinite article, because the word "EIN" has no intrinsic gender and only acquires the gender of its associated noun. When we insert the corrective code into the OldConcept module, finally we witness the German Wotan AI engaging in rational thought by means of inference when we input "eva ist eine frau", or "Eva is a woman." The German AI makes a silent inference about Eva and calls the AskUser module to ask us users, "HAT EVA EIN KIND", which means in English, "Does Eva have a child?" Next we must work on KbRetro to positively confirm or negatively adjust the knowledge base in accordance with the answer to the question.
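Inside OldConcept the corrective step amounts to zeroing the gender whenever the recognized word is a form of "EIN"; here is a sketch, with 101 as a stand-in concept number for "EIN" (an assumption for illustration).

VARIABLE mfn   ( gender: 1 = masculine, 2 = feminine, 3 = neuter )

: EIN-FIX ( concept# -- concept# )   \ called as a known word is re-stored
  dup 101 = IF  0 mfn !  THEN ;      \ "EIN" itself carries no gender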

Thurs.14.MAR.2013 -- Seeking Confirmation of Inference

In the German Wotan artificial intelligence with machine reasoning by inference, the AskUser module converts an otherwise silent inference into a yes-or-no question seeking confirmation of the inference with a yes-answer or refutation of the inference with a no-answer. Prior to confirmation or refutation, the conceptual engrams of the question are a mere proposition for consideration by the human user. When the user enters the answer, the KbRetro module must either establish associative tags from subject to verb to direct object in the case of a yes-answer, or disrupt the same tags with the insertion of a negational concept of "NICHT" for the idea known as "NOT" in English.
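In outline, KbRetro branches on the answer, either confirming the associative tags or negating the idea; here is a sketch with stand-in words, stubbed out for illustration.

VARIABLE answer   ( 1 = yes-answer, 2 = no-answer )

: CONFIRM-TAGS ( -- )  ;   \ stub: set the subject-verb-object tags
: NEGATE-VERB  ( -- )  ;   \ stub: attach the concept of "NICHT" to the verb

: KBRETRO-SKETCH ( -- )
  answer @ 1 = IF
    CONFIRM-TAGS   \ yes: the inference joins the knowledge base
  ELSE
    NEGATE-VERB    \ no: the idea is negated with "NICHT"
  THEN ;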

Fri.15.MAR.2013 -- Setting Parameters Properly

Although the AskUser module is asking the proper question, "HAT EVA EIN KIND" in German for "Does Eva have a child?", the concepts of the question are not being stored properly in the Psi conceptual array.

Sat.16.MAR.2013 -- Machine Learning by Inference

Now we have coordinated the operation of InFerence, AskUser and KbRetro. When we input, "eva ist eine frau" for "Eva is a woman," the German AI makes a silent inference that Eva may perhaps have a child. AskUser outputs the question, "HAT EVA EIN KIND" for "Does Eva have a child?" When we answer "nein" in German for English "no", the KbRetro module adjusts the knowledge base (KB) retroactively by negating the verb "HAT" and the German AI says, "EVA HAT NICHT EIN KIND", or "Eva does not have a child" in English.

Appendix B: German AI Source Code in Forth

( Ki130316.F -- modification of Ki130313.F German DeKi AI )

( May be named "DeKi.F" or any "Filename.F" you choose. )

( Rename any DeKi.F.txt as simply DeKi.F for Win32Forth. )

( Download and unzip W32FOR42_671.zip to run DeKi.F )