InFerence module for automated reasoning

in the English and Russian bilingual ghost.pl AI in Perl

and Mens Latina in Latin -- MindForth for Robots


1. Cognitive Architecture Diagram of the InFerence Mind-Module

 
  /^^^^^^^^^^^^  InFerence compares new with old  /^^^^^^^^^^^\
 / EYE & MEMORY\   CONCEPTS in SEMANTIC MEMORY   / EAR & MEMORY\
|               |   | | |         _________     |   ________    |
|   _______     |   |w| |        / "woman" \----|--/"women" \   |
|  /old    \    |   |o| |       / comparand \   | / "have a" \  |
| / image   \---|---|m| |       \  concept  /   | \ "child"  /  |
| \ recog   /   |   |e| |        \_________/    |  \________/   |
|  \_______/    |   |n| |          _______      |   _________   |
|               |  h| | |         / input \     |  /"Anna"   \  |
|               |  a| | |        / of "is" \    | / "is a"    \ |
|   visual      |  v| | |        \"a woman"/----|-\ "woman"   / |
|               |  e| | |         \_______/     |  \_________/  |
|   memory      |   | |c|         _________     |   _________   |
|               |   | |h|        / silent  \    |  /"Anna"   \  |
|   channel     |   | |i|       / inference \---|-/ "have"    \ |
|               |   | |l|       \  is made  /   | \ "child"   / |
|               |   | |d|        \_________/    |  \_________/  |
|   _______     |   | | |         __________    |   __________  |
|  /new    \    |   |_|_|_       / AskUser  \   |  /"Does"    \ |
| / percept \   |  / Psy  \     / requests   \--|-/ "Anna have"\|
| \ engram  /---|-/concepts\   / confirmation \ | \ "a child?" /|
|  \_______/    | \________/   \______________/ |  \__________/ |

The above ASCII diagram shows how the various AI Minds achieve automated reasoning with the InFerence mind-module. Some of the six AI Minds contain a basis for inference built into the MindBoot sequence, such as "Women have a child" or "Students read books." If the AI then receives input that a particular person is a woman or a student, the AI Mind may then make a not-yet-substantiated logical inference and seek confirmation by asking a question like "Does Anna have a child?" or "Does Johnny read books?"
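The premise-plus-identification pattern described above can be sketched in simplified Perl. The hash %premise and the subroutine make_inference() are illustrative names invented for this sketch, not code from the actual AI Minds:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical, simplified sketch of the inference pattern described
# above; the data structure is illustrative, not the @psy array.
my %premise = (
    woman   => [ 'have', 'a child' ],   # "Women have a child"
    student => [ 'read', 'books'   ],   # "Students read books"
);

sub make_inference {
    my ($subject, $class) = @_;         # e.g. from "Anna is a woman"
    return unless exists $premise{$class};
    my ($verb, $object) = @{ $premise{$class} };
    return "Does $subject $verb $object?";
}

print make_inference('Anna', 'woman'), "\n";   # Does Anna have a child?
```

The question produced is not yet knowledge; it is the request for confirmation that decides whether the inferred idea survives in memory.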


2. Purpose of the AI4U InFerence Module

The InFerence module performs automated reasoning: from a stored generalization and a new is-a statement about a particular subject, it draws a tentative conclusion and seeks confirmation from the human user.


3. Function of the AI4U InFerence AI Module

Into each AI Mind coded for inference, you may first enter a premise like "Boys play games" and then at any later time enter a statement like "John is a boy." The AI Mind should then make an inference pending confirmation and ask, "Does John play games?" There are four categories of human response: yes; no; maybe; or no response at all. If the answer is "no," the AI software negates the idea of the original inference and stores the confirmed knowledge, "John does not play games." In the cases of "maybe" or no response, the AI Mind simply abandons the inference as an unconfirmed and therefore forgettable idea. If the confirmation is "yes," the AI keeps the confirmed idea in the knowledge base (KB).
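That four-way branching can be sketched minimally in Perl; handle_confirmation() is a hypothetical name invented here, since in the real ghost.pl the work is spread across KbRetro() and related modules:

```perl
use strict;
use warnings;

# Hypothetical sketch of the four response categories described above.
sub handle_confirmation {
    my ($answer, $idea) = @_;            # $idea is e.g. "John play games"
    $answer = defined $answer ? lc $answer : '';
    return "keep in KB: $idea"   if $answer =~ /^y/;  # "Yes": idea retained
    return "negate in KB: $idea" if $answer =~ /^n/;  # "No": idea negated
    return "abandon inference";                       # "Maybe" or silence
}

print handle_confirmation('No', 'John play games'), "\n";
```

Only the "no" branch requires retroactive editing of memory; the other branches either keep or quietly drop the inferred idea.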

When you run the AI Mind in Diagnostic mode and you trigger a silent inference by inputting a sentence such as "anna is a woman", in the Diagnostic display you may see the inferred idea as a tiny cluster of several concepts bundled together at contiguous time-points with no spaces between them, because no words in auditory memory are attached to the individual concepts of the inferred idea. Only when the AI thinks about the inference do the concepts find expression as auditory words. You may observe that the tiny bundle of concepts holding the silent inference is changed by the KbRetro() module when you answer "No" to a challenge-question like "Does Anna have a child?" Suddenly the 250=NOT adverb appears as the number 250 in the flag-panel of the verb 810=HAVE, along with the various other associative tags.
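The retroactive edit can be pictured as follows. The row contents and the choice of slot $k[11] for the 250=NOT tag are assumptions made for illustration only, not the actual flag-panel layout of ghost.pl:

```perl
use strict;
use warnings;

# Illustrative only: a comma-joined @psy row for verb 810=HAVE with
# invented values. The negating module splits the row, inserts 250=NOT
# into one flag-panel slot (slot 11 is an arbitrary choice here),
# and rejoins the row in place.
my @psy;
$psy[3] = '0,810,1,40,0,0,8,0,0,0,576,0,533,4,0';   # hypothetical row
my @k = split ',', $psy[3];
$k[11] = 250;                                       # 250=NOT negates the verb
$psy[3] = join ',', @k;
print "$psy[3]\n";
```

The split-modify-join idiom shown here is the same one the section 4 listing uses to read and rewrite rows of the @psy conceptual array.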


4. Code of InFerence() from ghost294.pl First Working AGI in Perl


sub InFerence() {  # http://ai.neocities.org/InFerence.html
  $moot = 1;  # 2018-06-26: prevent interference via tag-forming;
  if ($prednom > 0) {  # 2018-06-26: positive predicate nominative? 
    for (my $i=$tpu; $i>$midway; $i--) {  # 2018-06-26: search KB to infer facts.
      my @k=split(',',$psy[$i]);  # 2018-06-26: examine @psy array;
      if ($k[1] == $prednom && $k[8] == 2) {  # 2018-06-26: plural KB data?
        if ($k[7] == 1) {  # 2018-06-26: dba nominative for subject?
          if ($seqverb == 0) {  # 2018-06-26: only once
            $seqverb = $k[12];  # 2018-06-26: hold to be the verb of an inference.
            $quverb  = $k[12];  # 2018-06-26: query-verb for AskUser(); 
            $seqtkb =  $k[13];  # 2018-06-26: hold for fetching same direct object; 
            $ynverb = 0;        # 2018-06-26: since lacking, use $quverb for AskUser()
            $seq = $seqverb;    # 2018-06-26: to be inferred as applying to subject. 
          }  # 2018-06-26: end of test that no "seqverb" has yet been declared.
        }  # 2018-06-26: end of test that "prednom" does occur in nominative.
      }  # 2018-06-26: end of test for a fact about the "prednom" as plural subject. 
    }  # 2018-06-26: end of backwards loop in search of inferrable knowledge. 
  }  # 2018-06-26: end of test for positive predicate nominative. 
  if ($seqverb > 0) {  # 2018-06-26:  verb available for inference? 
    $inft = $t;  # 2018-06-26: for AskUser to find auditory engrams.
    $qusnum = 1;  # 2018-10-08: assumption based on Is-A status.
    $t = ($t + 1);  # 2018-06-26: increment time "$t" by one for a gap; 
    $t = ($t + 1);  # 2018-06-26: increment time to create an inference; 
    my @k=split(',',$psy[$t]);  # 2018-06-26: expose row where no values are present;
    $k[13] = ($t + 1);  # 2018-06-26: k13=tkb, which is one unit later.
    $seq = $seqverb;  # 2018-06-26: prevent override? 
    $psy[$t]="$k[0],$subjnom,1,64,$k[4],$k[5],5,"   # 2018-09-27
    . "$k[7],$qusnum,$k[9],$k[10],$k[11],$seqverb,$k[13],$k[14]";  # 2018-06-26
    $tkbn = $t;  # 2018-06-26: conceptual array-time for subject-noun of AskUser query. 
    $t = ($t + 1);  # 2018-06-26: increment $t for storage of inference-verb; 
    @k=split(',',$psy[$seqtkb]);  # 2018-06-26: obtain $dobseq from Psy array; 
    $dobseq = $k[12];  # 2018-06-26: to serve as k12 "$seq" in next insertion; 
    $quobj = $dobseq;  # 2018-06-26: so AskUser() will use the particular dir.obj. 
    @k=split(',',$psy[$t]);  # 2018-06-26: expose values for change of some; 
    $k[13] = ($t +1);  # 2018-06-26: $tkb is the next time-point in silent inference.
    $psy[$t]="$k[0],$seqverb,1,40,$k[4],$k[5],8,"   # 2018-09-27
    . "$k[7],$k[8],$k[9],$subjnom,$k[11],$dobseq,$k[13],$k[14]";  # 2018-06-26
    $tkbv = $t;  # 2018-06-26: conceptual array-time for verb of AskUser() query. 
    $t = ($t + 1);  # 2018-06-26: increment time to store direct object; 
    $tkbo = $t;  # 2018-06-26: conceptual array-time for object-noun of AskUser query. 
    @k=split(',',$psy[$seqtkb]);  # 2018-06-26: obtain seqdob from Psy array; 
    $seqdob = $k[12];  # 2018-06-26: to serve as "k[1]" in next insertion;
    $seqrvx = $k[14];  # 2018-06-26: to serve as "k14" in next insertion; 
    @k=split(',',$psy[$t]);  # 2018-06-26: expose values for change of some; 
    $psy[$t]="$k[0],$seqdob,1,8,$k[4],$k[5],5,"  # 2018-09-27: lower activation?
    . "$k[7],$k[8],$k[9],$k[10],$k[11],$k[12],$k[13],$k[14]";  # 2018-06-26
    $quobj = $dobseq;  # 2018-06-26: for transfer to AskUser() 
    $t = ($t+1);  # 2018-06-26: increment time "$t" for an ending gap; 
    $yncon = 1;   # 2018-06-26: for AskUser() to ask yes-or-no question;
    $qusub = $subjnom;  # 2018-06-26: for transfer to AskUser() 
  }  # 2018-06-26: end of test for a verb to be part of inference.
  $becon = 0;    # 2018-06-26: reset after use; 
  $dobseq = 0;   # 2018-06-26: reset after use; 
  $moot = 0;     # 2018-06-26: reset after use; 
  $prednom = 0;  # 2018-06-26: reset after use; 
  $seqdob = 0;   # 2018-06-26: reset after use; 
  $seqrvx = 0;   # 2018-06-26: reset after use; 
  $seqtkb = 0;   # 2018-06-26: reset after use; 
  $seqverb = 0;  # 2018-06-26: reset after use; 
  $subjnom = 0;  # 2018-06-26: reset after use;  
}  # 2018-06-26: InFerence() returns to EnThink() or RuThink().


5. Variables of the InFerence Module for Automated Reasoning

$becon -- be-conditional -- carries a flag from the OldConcept recognition module indicating that new input includes a verb of being which may permit the InFerence module to make a general inference about the subject of the input so that the InFerence module can look for prior knowledge as the basis of an InFerence.

$dobseq -- direct-object-subSEQuent word -- is used only within the InFerence module to hold onto the psi identifier of a noun or pronoun that was the direct object of a verb retrieved from memory as part of an inference that will require the same direct object to be associated with the verb if the inference is to be confirmed or refuted as a valid InFerence.

$inft -- inference-time -- holds the current time at the start of the InFerence module so that the AskUser module may ask a question based upon pre-existing knowledge before the formation of a silent InFerence in the AI memory.

$moot -- as in legally moot -- is a flag to prevent the formation of associative tags during mental operations which are not truly a part of cognition, such as the processing of an input query, the formation of a silent InFerence, or the creation of an output query.

$prednom -- predicate nominative -- holds the concept number of a predicate nominative (e.g., "woman" in "Anna is a woman") so that the InFerence module can search the knowledge base for stored facts about that class of entities as the basis for an InFerence.

$quobj -- query object -- holds onto the psi identifier of a word chosen by the InFerence module to be the direct object of a query created by the AskUser module.

$quverb -- query verb -- is set in the InFerence module with the identifier of a verb concept serving as part of an InFerence being made about user input. Then the AskUser module transforms the quverb identifier into the yes-or-no-verb identifier ynverb so that AskUser can use the query-verb to ask a question expecting a yes-or-no answer.

$seqdob -- subSEQuent direct object -- briefly holds the psi identifier of a direct object fetched from old knowledge so that it can be transferred into the silent inference being assembled within the InFerence module.

$seqtkb -- subSEQuent Time-in-Knowledge-Base -- is used only within the InFerence module to latch onto the specific time-point in memory of a verb which was linked in the past to a concept now occurring within user input as a predicate nominative which identifies a class of entities from which an InFerence can be drawn and assigned to the subject of the user input. For instance, if the AI knows "Boys play games" and the user inputs "John is a boy," the old verb "play" can now be used to infer, "John plays games," because John is a boy.

$seqverb -- subSEQuent-concept VERB -- is an interstitial carrier of a verb-identifier in the InFerence module, permitting a verb which was used in old knowledge to be used as part of an inference of new knowledge and as part of a question by AskUser seeking confirmation or refutation of an InFerence.

$subjnom -- subject nominative -- is a concept identified in the OldConcept recognition module as the subject of an input causing the AI to make an InFerence.

$tkbn -- time-in-knowledge-base-noun -- is set in the InFerence module as the concept-array time-point of the subject-noun in a yes-or-no question to be asked of the human user by AskUser to confirm or negate an inference being made by the logical InFerence module.

$tkbo -- time-in-knowledge-base-object -- is set in the InFerence module as the concept-array time-point of the direct-object-noun in a silent inference to be confirmed or negated retroactively by the KbRetro module after a human user has responded to a question posed by the AskUser module. May be similar to "tdo" in the EnParser module, but "tkbo" is the last of three consecutive time-points in the silent inference created by the InFerence module for automated reasoning.

$tkbv -- time-in-knowledge-base-verb -- is the time-point of a verb-concept in an AskUser question for a human user to confirm or deny the truth of a logical InFerence made by the AI.
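Taken together, $tkbn, $tkbv and $tkbo mark three consecutive time-points, matching the time-incrementing in the section 4 listing: two increments for a gap, then one row each for subject-noun, verb and direct object. A minimal arithmetic sketch, with $t = 100 as an arbitrary starting time:

```perl
use strict;
use warnings;

my $t = 100;          # arbitrary current time before the inference
$t = $t + 1;          # first increment, for a gap
$t = $t + 1;          # second increment: row for the subject-noun
my $tkbn = $t;        # subject-noun of the AskUser query
$t = $t + 1;
my $tkbv = $t;        # verb of the silent inference
$t = $t + 1;
my $tkbo = $t;        # direct object, last of the three time-points
print "$tkbn $tkbv $tkbo\n";   # prints "102 103 104"
```

The contiguity of these three rows is why the silent inference appears in Diagnostic mode as a tiny bundle of concepts with no gaps between them.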


6. Troubleshooting and Debugging for AI Mind Maintainers

6.1.a. Symptom: AI asks confirmation question lacking a verb.
6.1.b. Solution: The AI Mind needs to have the query-verb in the MindBoot sequence as an infinitive form with $dba=0, indicating that the verb is not in the first, second, or third person.

6.2.a. Symptom: (Something goes wrong.)
6.2.b. Solution: (AI Mind Maintainer devises solution.)


7. Future Development

The InFerence module needs to be developed further and improved in several ways. The tendency of an AI Mind to make a silent logical inference upon user input is not a problem, but the use of the AskUser module to automatically ask for validation or rebuttal of the silent inference is indeed a problem. The pertinent code should be changed to engender only the possibility of asking for validation, not the certainty of interrupting a dialogue with an inference-related question.

For purposes of inference, the AI software should distinguish between inferences based on all members of a set and inferences based on only some members. Since an inference from a premise about all members of a set is presumed valid, AskUser should refrain from seeking validation of it; AskUser should seek validation only for an inference drawn from knowledge about merely some members of a set.
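One way to sketch the proposed distinction in Perl; the quant field on each stored premise is a hypothetical addition for illustration, not present in ghost294.pl:

```perl
use strict;
use warnings;

# Hypothetical quantifier flag on each stored premise.
my %premise = (
    woman => { quant => 'some', verb => 'have', obj => 'a child' },  # some women
    human => { quant => 'all',  verb => 'are',  obj => 'mortal'  },  # all humans
);

sub needs_confirmation {
    my ($class) = @_;
    return 0 if $premise{$class}{quant} eq 'all';   # presumed valid; do not ask
    return 1;                                       # "some": AskUser may ask
}

print needs_confirmation('woman') ? "ask user\n" : "accept silently\n";
```

Under this scheme, "Anna is a human" would be accepted without a question, while "Anna is a woman" would still trigger the AskUser confirmation described in section 3.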

Since a server-based Perl AI may be enhanced with the ability to surf the Web and absorb vast stores of information, the InFerence module and similar reasoning modules should be honed and specialized to engage in complex logical thinking far beyond the fleeting situations of user dialogue.

  • Roadmap to Artificial Intelligence


    8. Resources for the AI4U Textbook InFerence Module


    9. Spread the News on TikTok and Other Venues

    Are you on TikTok? Are you eager to be a ThoughtLeader and Influencer?
    Create a TikTok video in the following easy steps.

    I. Capture a screenshot of https://ai.neocities.org/InFerence.html
    for the background of your viral TikTok video.

    II. In a corner of the screenshot show yourself talking about the InFerence module.

    III. Upload the video to TikTok with a caption including all-important hash-tags which will force governments and corporations to evaluate your work because of FOMO -- Fear Of Missing Out:
    #AI #ИИ #brain #мозг #ArtificialIntelligence #ИскусственныйИнтеллект #consciousness #сознание #Dushka #Душка #psychology #психология #subconscious #подсознание
    #AGI #AiMind #Alexa #ChatAGI #chatbot #ChatGPT #cognition #cyborg #Eureka #evolution #FOMO #FreeWill #futurism #GOFAI #HAL #immortality #JAIC #JavaScript #linguistics #metempsychosis #Mentifex #mindmaker #mindgrid #ML #neuroscience #NLP #NLU #OpenAI #OpenCog #philosophy #robotics #Singularity #Siri #Skynet #StrongAI #transhumanism #Turing #TuringTest #volition

    A sample video is at
    https://www.tiktok.com/@sullenjoy/video/7206322230974926126


    10. AiTree of Mind-Modules for Natural Language Understanding


    Nota Bene: This webpage is subject to change without notice. Any Netizen may copy, host or monetize this webpage to earn a stream of income by means of an affiliate program where the links to Amazon or other booksellers have code embedded which generates a payment to the person whose link brings a paying customer to the website of the bookseller.

    This page was created by an independent scholar in artificial intelligence who was a contemporary of H.G. Wells and who created the following True AI Minds with sentience and with limited consciousness.

  • http://ai.neocities.org/mindforth.txt -- MindForth Robot AI in English.

  • http://ai.neocities.org/DeKi.txt -- Forth Robot AI in German.

  • http://ai.neocities.org/perlmind.txt -- ghost.pl Robot AI thinks in English and in Russian.

  • http://ai.neocities.org/Ghost.html -- JavaScript Robot AI Mind thinks in English.

  • http://ai.neocities.org/mens.html -- JavaScript Robot AI Mind thinks in Latin.

  • http://ai.neocities.org/Dushka.html -- JavaScript Robot AI Mind thinks in Russian.

    The following books describe the free, open-source True AI Minds.

    AI4U -- https://www.iuniverse.com/BookStore/BookDetails/137162-AI4U

    AI4U (paperback) -- http://www.amazon.com/dp/0595259227

    AI4U (hardbound) -- http://www.amazon.com/dp/0595654371

    The Art of the Meme (Kindle eBook) -- http://www.amazon.com/dp/B007ZI66FS

    Artificial Intelligence in Ancient Latin (paperback) -- https://www.amazon.com/dp/B08NRQ3HVW

    Artificial Intelligence in Ancient Latin (Kindle eBook) -- https://www.amazon.com/dp/B08NGMK3PN

  • https://redditfavorites.com/products/artificial-intelligence-in-ancient-latin

    Artificial Intelligence in German (Kindle eBook) -- http://www.amazon.com/dp/B00GX2B8F0

    InFerence at Amazon USA (Kindle eBook) -- http://www.amazon.com/dp/B00FKJY1WY

    563 Mentifex Autograph Postcards were mailed in 2022 primarily to autograph collector customers at used bookstores to press the issue of whether or not the Mentifex oeuvre and therefore the autograph is valuable. These artwork-postcards with collectible stamps may be bought and sold at various on-line venues.

  • https://www.ebay.com
  • https://galaxycon.com/search?q=Mentifex

    See AI 101 AI 102 AI 103 year-long community college course curriculum for AI in English.
    See Classics Course in Latin AI for Community Colleges or Universities.
    See College Course in Russian AI for Community Colleges or Universities.
    Collect one signed Mentifex Autograph Postcard from 563 in circulation.
    See Attn: Autograph Collectors about collecting Mentifex Autograph Postcards.


