1. Cognitive architecture
Diagram of the EnParser English Parser Module
[Flattened ASCII diagram: sensory input from the EYE (visual memory channel) and the EAR (auditory recognition of old and new percept engrams) converges on old and new concepts in the MINDCORE, where EnParser tags each word of a Subject-Verb-Object input as noun, verb, adjective, adverb, preposition or conjunction and passes it via the Psi conceptual array to InStantiate.]
EnParser serves not only to identify the part of speech of a word, such as a noun, preposition or verb, but also to comprehend that part of speech in context by helping to assign associative tags among concepts in the Psy conceptual array. Thus EnParser and its Russian counterpart RuParser serve the purpose of Natural Language Understanding (NLU).
3. Algorithm of the EnParser AI Mind-Module
Since the main feature of the concept-based AI Minds is their demonstration of solving the AI-hard problem of natural language understanding (NLU), the EnParser module for parsing English is the main instrument for achieving the NLU goal.
The ghost.pl AI Mind in Strawberry Perl Five is becoming a conversational agent that may be installed to run in either the background or the foreground on any host computer.
The JavaScript AI Mind requires no download of either source code or programming language. Simply clicking on the link brings the tutorial AI Mind into your MSIE browser, where you may use the AI to teach students or AI coders.
If you want to issue verbal instructions to your autonomous humanoid robot, MindForth as the robot brain enables you and the robot to discuss what work the robot should be doing and how the work should be done, engaging in back-and-forth communication in English so that you may clarify your instructions and the robot may report to you its completion of tasks.
3.A. Words included in the MindBoot sequence lighten the load of EnParser.
In populating the MindBoot sequence with English words, the AI Mind maintainer tries to include the most frequent English nouns and verbs, all the conjunctions, all the prepositions, and all the pronouns. Except for disambiguation, the embedded words need no further parsing as to their parts of speech, since each one is stored with its part of speech already known, as in the sketch below.
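A minimal sketch of the idea, and not the actual MindBoot code: the %lexicon hash below is a hypothetical stand-in for the innate vocabulary, pairing each embedded word with the $pos part-of-speech code listed in section 5.

#!/usr/bin/perl
use strict; use warnings;
my %lexicon = (        # hypothetical stand-in for MindBoot entries
  'THE'   => 1,        # article, filed here under 1=adj
  'NOT'   => 2,        # 2=adv -- the negating adverb
  'AND'   => 3,        # 3=conj
  'ROBOT' => 5,        # 5=noun
  'IN'    => 6,        # 6=prep
  'YOU'   => 7,        # 7=pron
  'KNOW'  => 8,        # 8=verb
);
my $pos = $lexicon{'ROBOT'} // 0;   # 0 would mean: new word, must be parsed
print "ROBOT has innate pos=$pos\n";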
3.B. The first AI Minds could only parse SVO sentences into nouns and verbs.
The subject-verb-object (SVO) format of admissible inputs made it simple for the earliest AI Minds to classify English words as nouns or verbs. The parsing module could skip over the known English articles and over an important adverb like "NOT" used to negate an English sentence.
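The following is a simplified sketch, not the historical code, of that early strategy: skip the articles and "NOT", then classify each remaining word purely by the alternating noun-verb expectation held in a $bias variable.

#!/usr/bin/perl
use strict; use warnings;
my $bias = 5;                       # 5=noun: expect a noun first
for my $word (qw(THE ROBOT KNOWS THE HUMAN)) {
    next if $word =~ /^(THE|A|AN|NOT)$/;     # skip-words
    my $pos = $bias;                # classify by expectation alone
    print "$word parsed as ", ($pos == 5 ? 'noun' : 'verb'), "\n";
    $bias = ($pos == 5) ? 8 : 5;    # after a noun expect a verb, and vice versa
}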
3.C. The ghost112.pl AI became able to parse prepositional phrases.
Since the MindBoot sequence innately contains all the English prepositions identified as such, the AI Mind easily detects the input of a known English preposition, sets the $prepcon flag to one, and loads the $tpp time-of-preposition flag with the ending time-point at which the input preposition is being instantiated as a node in conceptual memory. Then $tpp is used to zero in on the associative-tag flag-panel of the preposition, filling in the next noun or pronoun as the $seq of the preposition. Then the $tvb time-of-verb flag is used to insert the concept-number of the preposition as a $seq of the verb.
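Condensed into a runnable sketch, the logic looks roughly like this; the %seq hash is a simplification of the flag-panel bookkeeping, and the input "KNOW IN DARKNESS" is a made-up fragment.

#!/usr/bin/perl
use strict; use warnings;
my ($t, $prepcon, $tpp, $tvb, $prep_word) = (0, 0, 0, 0, '');
my %seq;   # simplification: time-point => concept tagged as its $seq
for my $item (['KNOW', 8], ['IN', 6], ['DARKNESS', 5]) {
    my ($word, $pos) = @$item;
    $t++;                                    # word instantiated at time $t
    $tvb = $t if $pos == 8;                  # remember the time of the verb
    if ($pos == 6) { $prepcon = 1; $tpp = $t; $prep_word = $word; }
    elsif (($pos == 5 || $pos == 7) && $prepcon) {
        $seq{$tpp} = $word;      # next noun becomes the $seq of the preposition
        $seq{$tvb} = $prep_word; # the preposition becomes a $seq of the verb
        $prepcon = 0;            # reset to prevent carry-over
    }
}
print "seq of verb at t=$tvb: $seq{$tvb}\n";   # IN
print "seq of prep at t=$tpp: $seq{$tpp}\n";   # DARKNESS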
3.D. The ghost114.pl AI began to parse the indirect objects of verbs.
To parse and comprehend an indirect object, EnParser tentatively fills the time-of-indirect-object $tio flag with the input-time of the first noun being input after a verb. The $tio flag is set only once by requiring that it be at zero for it to be set. Simultaneously, the time-of-direct-object flag $tdo is filled with the same value as the $tio flag for an indirect object, because it is not yet known whether one noun or two nouns are being entered subsequent to the input of a transitive verb. If and when a second noun comes in, the value for the time-of-direct-object $tdo flag, originally filled with the same value as the indirect-object $tio flag, is replaced or overwritten with the new time of the second post-verb noun in the input stream.
If only one noun comes in after the verb, the identifier of the direct object is set only once. If two post-verb nouns come in, the first noun becomes the indirect object and the second noun becomes the direct object.
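A runnable sketch of just this fill-and-overwrite logic; the time-points 4 and 6 for "boy" and "robot" are made up for the example.

#!/usr/bin/perl
use strict; use warnings;
my ($tio, $tdo) = (0, 0);          # both flags start at zero
for my $tult (4, 6) {              # post-verb nouns: "boy" at t=4, "robot" at t=6
    if ($tio == 0) {               # no post-verb noun seen yet...
        $tdo = $tult;              # tentatively the direct object
        $tio = $tult;              # tentatively the indirect object; set only once
    } else {
        $tdo = $tult;              # a second noun overwrites time-of-direct-object
    }
}
print "indirect object at t=$tio; direct object at t=$tdo\n";   # t=4; t=6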
3.E. The ghost309.pl AI became able not only to parse the input of a prepositional phrase but also to think with a prepositional phrase.
4. Code of EnParser() from ghost327.pl AI source code in Perl
# 2017-09-03: Requirements of English Parser module:
# 2017-09-13: EnParser shall at first assume that a new word is a noun.
sub EnParser() {  # http://ai.neocities.org/EnParser.html
  if ($pos == 5 || $pos == 7) {  # 2019-10-19: if noun or pronoun...
    if ($tsj == 0) { $dba = 1; }  # 2019-10-19: if no subject has been declared...
    if ($tult == $tdo) { $dba = 4; }  # 2019-10-19: acc. dir. obj.
  }  # 2019-10-19: end of test for noun or pronoun.
  $act = 48;  # 2016apr27: an arbitrary activation for InStantiate()
  $bias = 5;  # 2015jun04: Expect a noun until overruled.
  $tkb = 0;  # 2019-10-22: BUGFIX
  if ($fyi > 2) {  # 2016feb08: if mode is Diagnostic
    # 2018-07-01: Diagnostic or tutorial message.
  }  # 2016feb07: end of test for Diagnostic or Tutorial mode.
  InStantiate();  # 2019-08-08: first instantiation during parsing Stage One
  if ($pos == 5) { $bias = 8 }  # 2016feb10: after noun, expect verb.
  if ($pos == 7) { $bias = 8 }  # 2016feb10: after pronoun, expect verb.
  if ($pos == 8) { $bias = 5 }  # 2016feb10: after verb, expect noun
  if ($prepcon == 0) {  # 2017-09-13: if not handling a preposition...
    if ($pos == 5 || $pos == 7) {  # 2016mar21: expanding then-clause
#     $tsj = ($t - 1);  # 2017-09-13: subject?
#     $tsj = ($t - 1);  # 2019-10-19: declare subject-time elsewhere?
      if ($verbcon == 1) {  # 2017-09-13: if a verb has come in...
        if ($tvb > 0) { $tdo = $tult; }  # 2019-10-19: a default value.
        if ($tio == 0) { $tdo = $tult }  # 2017-09-13: set once or twice
        if ($tio == 0) { $tio = $tult }  # 2017-09-13: set only once
        if ($tio > 0) {  # 2017-09-13: if $tio previously set...
          $tdo = $tult;  # 2017-09-13: second noun sets time of dir.obj.
        }  # 2017-09-13: end of test to make 2nd noun the direct object.
        $tkb = $tdo;  # 2019-10-18: let verb have a "nounlock" to direct object.
        my @k = split(',', $psy[$tvb]);  # 2017-09-13: expose flag-panel of verb
        $pre = $k[1];  # 2017-09-13: verb psi will be $pre of direct object
        @k = split(',', $psy[$tio]);  # 2017-09-13: expose flag-panel of indir.obj.
        $iob = $k[1];  # 2019-08-01: excerpt indirect-object concept for k[13]
        $psy[$tio] = "$k[0],$k[1],$k[2],$k[3],$k[4],$k[5],$k[6],"
          . "3,$k[8],$k[9],$k[10],$k[11],$k[12],$k[13],"
          . "$k[14],$k[15],$k[16],$k[17],$k[18],$k[19],$k[20]";  # 2019-08-01: panel
        # 2017-09-13: Above lines insert k7 dba=3 for dative-case indirect object.
        @k = split(',', $psy[$tdo]);  # 2017-09-13: expose flag-panel of dir.obj.
        $psy[$tdo] = "$k[0],$k[1],$k[2],$k[3],$k[4],$k[5],$k[6],"
          . "4,$k[8],$k[9],$k[10],0,$k[12],$k[13],"
          . "0,$k[15],$k[16],$k[17],$k[18],$k[19],$k[20]";  # 2019-08-02: no $tkb for dir.obj.
        # 2017-09-13: Above lines insert verb-$psi as k10 $pre of direct object.
        @k = split(',', $psy[$tvb]);  # 2017-09-13: expose flag-panel of main verb.
        if ($k[5] == 250) { $k[7] = 0; $k[8] = 0; }  # 2019-10-23: negated verb is dba=0 infinitive.
        $tdo = $t - 1;  # 2017-09-13: insert time-of-direct-object for nounlock;
        $tkb = $tdo;  # 2017-09-13: TEST
        # 2017-09-13: Next two lines create psy-array row for a verb.
        $psy[$tvb] = "$k[0],$k[1],$k[2],$k[3],$k[4],$k[5],$k[6],"
          . "$k[7],$k[8],$k[9],$k[10],$psi,$k[12],$iob,"
          . "$tkb,$k[15],$k[16],$k[17],$k[18],$k[19],$k[20]";  # 2019-08-01: panel.
        @k = split(',', $psy[$tdo]);  # 2019-10-19: expose flag-panel of direct object
        $psy[$tdo] = "$k[0],$k[1],$k[2],$k[3],$k[4],$k[5],$k[6],"
          . "4,$k[8],$k[9],$k[10],$k[11],$k[12],$k[13],"  # 2019-10-19: acc. dba=4 dir.obj.
          . "$k[14],$k[15],$k[16],$k[17],$k[18],$k[19],$k[20]";  # 2019-10-19: tdo-panel
      }  # 2017-09-13: end of test for a condition following a verb.
    }  # 2017-09-13: end of test for noun not object of a preposition.
  }  # 2017-09-13: end of test for a non-prepositional condition.
  if ($pos == 6) { $prepcon = 1 }  # 2019-09-24: prepare for noun.
  if ($pos == 5 || $pos == 7) {  # 2017-09-13:
    if ($prepcon == 1) {  # 2017-09-13:
      my @k = split(',', $psy[$tpr]);  # 2019-08-06: expose flag-panel of preposition;
      $pre = $k[1];  # 2017-09-13: Let $pre briefly be the preposition.
      $prep = $k[1];  # 2018-11-08: identify prep. to be the $seq of the verb.
      $k[14] = $tult;  # 2019-08-01: establish $tkb between preposition and its object.
      @k = split(',', $psy[$tult]);  # 2017-09-13: expose flag-panel of obj of prep.
      $psy[$tult] = "$k[0],$k[1],$k[2],$k[3],$k[4],$k[5],$k[6],"
        . "4,$k[8],$k[9],$pre,0,$k[12],$k[13],"
        . "$k[14],$k[15],$k[16],$k[17],$k[18],$k[19],$k[20]";  # 2019-08-01: panel.
      $prep = 0;  # 2018-11-08: Reset to prevent carry-over.
      $prepcon = 0;  # 2017-09-13: Reset to prevent carry-over.
    }  # 2017-09-13: end of test for a positive $prepcon.
  }  # 2017-09-13: end of test for a noun or pronoun.
  if ($pos == 8 && $psi != 800 && $psi != 818) {  # 2019-08-06: not "BE" or "DO"
    $tvb = ($t - 1);  # 2017-09-13: hold onto time-of-verb for flag-insertions.
    $verbcon = 1;  # 2017-09-13: verb-condition is "on" for ind. & dir. objects.
    my @k = split(',', $psy[$tsj]);  # 2017-09-13: expose flag-panel of subject noun
    $subjpre = $k[1];  # 2017-09-13: Hold onto $subjpre for the pos=8 verb
    $psy[$tsj] = "$k[0],$k[1],$k[2],$k[3],$k[4],$k[5],$k[6],"
      . "1,$k[8],$k[9],$k[10],$psi,$k[12],$k[13],"
      . "$tvb,$k[15],$k[16],$k[17],$k[18],$k[19],$k[20]";  # 2019-08-01
    @k = split(',', $psy[$tult]);  # 2017-09-13: expose flag-panel of verb.
    $psy[$tult] = "$k[0],$k[1],$k[2],$k[3],$k[4],$k[5],$k[6],"
      . "$k[7],$k[8],$k[9],$subjpre,0,$k[12],$k[13],"
      . "$k[14],$k[15],$k[16],$k[17],$k[18],$k[19],$k[20]";  # 2019-08-01
    $subjpre = 0;  # 2017-09-13: Reset for safety.
  }  # 2017-09-13: end of test for a pos=8 verb.
# if ($pos == 8 && $psi == 800) {  # 2019-10-18: if 800=BE verb...
  if ($pos == 8) {  # 2019-10-19: any verb...
    $tvb = ($t - 1);  # 2019-10-18: hold onto time-of-verb for flag-insertions.
    my @k = split(',', $psy[$tsj]);  # 2019-10-18: expose flag-panel of subject noun
    $psy[$tsj] = "$k[0],$k[1],$k[2],$k[3],$k[4],$k[5],$k[6],"
      . "$k[7],$k[8],$k[9],$k[10],$psi,$k[12],$k[13],"  # 2019-10-18: verb is "seq" of noun.
      . "$tvb,$k[15],$k[16],$k[17],$k[18],$k[19],$k[20]";  # 2019-10-18: verb "tkb" of noun.
  }  # 2019-10-18: end of test for 800=BE verb.
# if ($tpr > 0 && $pos == 5) {  # 2019-08-11: if a noun follows a preposition...
  if ($tpr > $vault && $pos == 5) {  # 2019-10-22: if a noun follows a preposition...
    my @k = split(',', $psy[$tpr]);  # 2019-08-10: expose flag-panel at time-of-prep.
    $tkb = ($t - 1);  # 2019-08-11: time of object of preposition
    $psy[$tpr] = "$k[0],$k[1],$k[2],$k[3],$k[4],$k[5],$k[6],"
      . "$k[7],$k[8],$k[9],$k[10],$k[11],$k[12],$k[13],"
      . "$tkb,$k[15],$k[16],$k[17],$k[18],$k[19],$k[20]";  # 2019-08-11
  }  # 2019-08-11: end of test for a noun after a preposition.
  if ($pos == 6 && $mri > 0) {  # 2019-09-24: removing restriction to input-only.
    $tpr = ($t - 1);  # 2019-08-11: time-of-preposition for back-tag insertion.
    my @k = split(',', $psy[$mri]);  # 2019-08-10: expose flag-panel of most-recent word
    $psy[$mri] = "$k[0],$k[1],$k[2],$k[3],$k[4],$k[5],$k[6],"
      . "$k[7],$k[8],$k[9],$k[10],$k[11],$k[12],$k[13],"
      . "$k[14],$k[15],$k[16],$k[17],$k[18],$tpr,$k[20]";  # 2019-08-11
    $etc = 0;  # 2019-09-26: thwart compounding of thought at first after a preposition.
    $tpr = 0;  # 2019-10-22: prevent carry-over. TEST; TRUNCATE
  }  # 2019-09-24: end of test for preposition part-of-speech and positive $mri value.
  if ($pos == 8) {  # 2019-10-19: if part-of-speech is 8=verb...
    if ($psi != 818) {  # 2019-10-19: if verb other than auxiliary 818=DO...
      $verbcon = 1;  # 2019-10-19: verb condition is "on" for indir. and dir. objects.
    }  # 2019-10-19: end of test for not auxiliary 818=DO.
    my @k = split(',', $psy[$tsj]);  # 2019-10-19: expose flag-panel at time-of-subject
    $psy[$tsj] = "$k[0],$k[1],$k[2],$k[3],$k[4],$k[5],$k[6],"
      . "$k[7],$k[8],$k[9],$k[10],$psi,$k[12],$k[13],"  # 2019-10-19: seq = verb.
      . "$k[14],$k[15],$k[16],$k[17],$k[18],$k[19],$k[20]";  # 2019-10-20
  }  # 2019-10-19: end of test for a pos=8 verb.
  if ($pos == 5 || $pos == 7) {  # 2019-10-22: noun or pronoun as direct object
    my @k = split(',', $psy[$tdo]);  # 2019-10-19: expose flag-panel of direct object
    $psy[$tdo] = "$k[0],$k[1],$k[2],$k[3],$k[4],$k[5],$k[6],"
      . "4,$k[8],$k[9],$svo2,0,$k[12],$k[13],"  # 2019-10-22: dba=4 pre=verb seq=0
      . "$k[14],$k[15],$k[16],$k[17],$k[18],$k[19],$k[20]";  # 2019-10-22: tdo-panel
  }  # 2019-10-22: end of test for a noun or pronoun.
  if ($pos == 1) {  # 2019-10-22: if adjective
    $tult = ($t - 1);  # 2019-10-22: certify
    my @k = split(',', $psy[$tult]);  # 2019-10-22: expose flag-panel
    $psy[$tult] = "$k[0],$k[1],$k[2],$k[3],$k[4],$k[5],$k[6],"
      . "$k[7],$k[8],$k[9],$k[10],$k[11],$k[12],$k[13],"  # 2019-10-22
      . "$k[14],$k[15],$k[16],$k[17],$k[18],0,$k[20]";  # 2019-10-22: no tpr for adjective.
  }  # 2019-10-22: end of test for pos=1 adjective
  if ($pos == 6) {  # 2019-10-22: if preposition
    $tult = ($t - 1);  # 2019-10-22: certify
    my @k = split(',', $psy[$tult]);  # 2019-10-22: expose flag-panel
    $psy[$tult] = "$k[0],$k[1],$k[2],$k[3],$k[4],$k[5],$k[6],"
      . "$k[7],$k[8],$k[9],$k[10],$k[11],$k[12],$k[13],"  # 2019-10-22
      . "$k[14],$k[15],$k[16],$k[17],$k[18],0,$k[20]";  # 2019-10-22: no tpr for prep.
  }  # 2019-10-22: end of test for pos=6 preposition.
}  # 2019-10-19: EnParser() returns to OldConcept() or NewConcept().
5. Variables for EnParser Module with NLU
$act -- quasi-neuronal activation-level
$bias -- the part of speech (POS) expected next, used in EnParser() and NewConcept().
$fyi -- (for your information) display mode, which the user selects by pressing the Tab key and toggling through the modes: 1 = Normal; 2 = Transcript; 3 = Tutorial mode; 4 = Diagnostic mode
$mri -- most-recent-instantiation: a time-point which may be used for the insertion of an associative tag into the conceptual flag-panel of a previous concept, in order to establish a link between a newly input concept being instantiated and the previously instantiated concept. The designation of the time-point of the $mri concept may be qualified or restricted by code that requires the $mri concept to be a specific part of speech ("pos"), such as a noun or a verb. Thus a preposition being instantiated may be back-linked to a noun or a verb as a mental point of departure leading to the preposition within a chain of thought. For example, if we say "John writes books for money", the preposition "for" could be linked backwards to a $tpr (time-of-preposition) tag in the flag-panel of the verb "writes" or in the flag-panel of the noun "books", depending on whether the basic idea is "writes for money" or "books for money" -- especially if "John writes books for money and poems for fun."
$pos -- (part of speech) 1=adj 2=adv 3=conj 4=interj 5=noun 6=prep 7=pron 8=verb
$pre -- pre(vious) associated @psy concept, such as the subject of a verb, or such as the verb of a direct object.
$prep -- a preposition used in the EnPrep mind-module for English prepositions.
$prepcon -- prepositional condition-flag for parsing.
$subjpre -- subject-$pre to be held for verb in parsing.
$tdo -- time-of-direct-object for a parser module.
$tio -- time-of-indirect-object for parser module.
$tkb -- time-in-knowledge-base of an idea. The EnParser() module stores time-of-verb $tvb in the k[13] $tkb slot of the subject of a sentence being stored at the time-of-subject $tsj time-point in the Psy conceptual array. The EnParser module stores $tkb as equal to $tdo (time of the direct object) in the Psy array row of the verb at time-of-verb $tvb. Thus for subjects $tkb means "verb" and for verbs $tkb means "direct object". No $tkb is stored for a direct object, because the associative tags go from subject to verb to object (SVO) but no further.
$tpr -- time-of-preposition for parsing.
$tsj -- conceptual flag-panel tag for time-of-subject.
$tult -- t penultimate, or time-minus-one.
$tvb -- time-of-verb for parsing. The EnParser module sets "t - 1" as the tvb if the word being parsed is a verb by part-of-speech. InStantiate uses the time-of-verb tvb to fill the tkb-slot in a verb-panel with the time-location of the direct object of the verb. EnParser uses the time-of-verb tvb to find the numeric verb-concept that will be the pre of a direct object being parsed. EnParser also uses the time-of-verb tvb to store the time-of-direct-object tdo as the tkb in the flag-panel of the verb. Therefore the tvb-flag needs to be reset to zero not during but after AudInput, when program-flow has returned to the Sensorium module.
$verbcon -- verb-condition for seeking (in)direct objects. $verbcon is set to a positive unitary one in the EnParser module when a verb comes in during input and remains at one while any indirect or direct object comes in. After an input sentence, $verbcon is reset to zero.
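The EnParser() code in section 4 treats each row of the @psy conceptual array as a comma-separated flag-panel of twenty-one slots. The slot meanings sketched below are inferred from the splits and rebuilds in that code; they are an assumption drawn from the comments, not an official specification.

use strict; use warnings;
my @psy; my $tvb = 1;
$psy[$tvb] = join(',', (0) x 21);   # placeholder row of 21 slots
my @k = split(',', $psy[$tvb]);     # expose the flag-panel, as EnParser() does
# $k[1]  -- $psi concept number of the word stored at this time-point
# $k[5]  -- tested against 250, apparently the concept of the negator "NOT"
# $k[7]  -- $dba case: 1=nominative subject, 3=dative ind.obj., 4=accusative dir.obj.
# $k[10] -- $pre: previous associated concept, e.g. the subject of a verb
# $k[11] -- $seq: subsequent associated concept, e.g. the verb of a subject
# $k[13] -- $iob: indirect-object concept stored in the panel of a verb
# $k[14] -- $tkb: nounlock; the verb for a subject, the direct object for a verb
# $k[19] -- $tpr: time-of-preposition back-tag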
6. Troubleshooting and Debugging for AI Mind Maintainers
6.1.a. Symptom: (Something goes wrong.)
6.1.b. Solution: (AI Mind Maintainer devises solution.)
To debug the function of assigning indirect and direct objects, the AI Mind Maintainer enters a typical sentence such as "I give the boy a robot" that contains both an indirect object and a direct object, and presses Escape to halt the AI after the input. Then the Maintainer examines the display of the conceptual array to see if $iob and $seq have been properly assigned. The AI coder may also insert diagnostic "print" messages into InStantiate() and EnArticle() so as to observe the process of assigning a tag during the input and comprehension of a sentence.
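One way to observe those flags, offered here as a suggestion rather than as shipped code, is a temporary diagnostic print inside EnParser(), gated on the Diagnostic display mode:

if ($fyi == 4) {   # 4 = Diagnostic mode, per the $fyi entry in section 5
    print "EnParser: pos=$pos tio=$tio tdo=$tdo tvb=$tvb tsj=$tsj \n";
}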
7. Future Development of EnParser Module with NLU
Considerations:
- For future development, EnPronoun could be used to replace a noun with a relative pronoun and to introduce a subordinate clause, such as "I know a man who loves music." EnParser must deal with the NLU of such a sentence, using the incoming relative pronoun as a trigger to treat the ensuing verb as referring back to the relative pronoun.
8. Resources for EnParser Module with NLU
9. Spread the News on TikTok and Other Venues
Are you on TikTok? Are you eager to be a ThoughtLeader and Influencer?
Create a
TikTok video in the following easy steps.
I. Capture a screenshot of
https://ai.neocities.org/EnParser.html
for the background of your viral TikTok video.
II. In a corner of the screenshot show yourself talking about the EnParser module.
III. Upload the video to TikTok with a caption including all-important hash-tags
which will force governments and corporations to evaluate your work
because of FOMO -- Fear Of Missing Out:
#AI
#ИИ
#brain
#мозг
#ArtificialIntelligence
#ИскусственныйИнтеллект
#consciousness
#сознание
#Dushka
#Душка
#psychology
#психология
#subconscious
#подсознание
#AGI #AiMind #Alexa #ChatAGI #chatbot #ChatGPT #cognition #cyborg #Eureka #evolution
#FOMO #FreeWill #futurism #GOFAI #HAL #immortality #JAIC
#JavaScript #linguistics #metempsychosis #Mentifex #mindmaker #mindgrid
#ML #neuroscience #NLP #NLU #OpenAI #OpenCog #philosophy #robotics #Singularity #Siri #Skynet
#StrongAI #transhumanism #Turing #TuringTest #volition
A sample video is at
https://www.tiktok.com/@sullenjoy/video/7230082904812981546
10. AiTree of Mind-Modules for Natural Language Understanding
Nota Bene: This webpage is subject to change without notice. Any Netizen may copy, host or monetize this webpage to earn a stream of income by means of an affiliate program, in which links to Amazon or other booksellers have embedded code that generates a payment to the person whose link brings a paying customer to the bookseller's website.
This page was created by an independent scholar in artificial intelligence who created the following True AI Minds with sentience and with limited consciousness.
The following books describe the free, open-source True AI Minds.
AI4U -- https://www.iuniverse.com/BookStore/BookDetails/137162-AI4U
AI4U (paperback) -- http://www.amazon.com/dp/0595259227
AI4U (hardbound) -- http://www.amazon.com/dp/0595654371
The Art of the Meme (Kindle eBook) -- http://www.amazon.com/dp/B007ZI66FS
Artificial Intelligence in Ancient Latin (paperback) -- https://www.amazon.com/dp/B08NRQ3HVW
Artificial Intelligence in Ancient Latin (Kindle eBook) -- https://www.amazon.com/dp/B08NGMK3PN
Artificial Intelligence in Ancient Latin at Reddit Favorites -- https://redditfavorites.com/products/artificial-intelligence-in-ancient-latin
Artificial Intelligence in German (Kindle eBook) -- http://www.amazon.com/dp/B00GX2B8F0
InFerence at Amazon USA (Kindle eBook) -- http://www.amazon.com/dp/B00FKJY1WY
563 Mentifex Autograph Postcards were mailed in 2022, primarily to autograph-collector customers at used bookstores, to press the issue of whether or not the Mentifex oeuvre and therefore the autograph is valuable. These artwork-postcards with collectible stamps may be traded at various on-line venues.
See the AI 101, AI 102, AI 103 year-long community college course curriculum for AI in English.
See the Classics Course in Latin AI for Community Colleges or Universities.
See the College Course in Russian AI for Community Colleges or Universities.
Collect one signed Mentifex Autograph Postcard from the 563 in circulation.
See Attn: Autograph Collectors about collecting Mentifex Autograph Postcards.
The collectible AI4U book belongs in every AI Library as an early main publication of Mentifex AI.