1. Cognitive architecture
Diagram of the RuThink() Russian-Thinking Mind-Module
[The ASCII cognitive-architecture diagram is not reproduced here. It depicts the EYE and EAR sensory channels, the MainLoop, the Sensorium, and the auditory memory channel where English and Russian words are stored; the Volition module calling the EnThink and RuThink thinking modules; EnThink linked to InFerence, and RuThink calling RuIndicative, which calls RuNounPhrase and RuVerbPhrase; the reentry of remembered images into the Psy conceptual array; and the Russian output СТУДЕНТЫ ЧИТАЮТ КНИГИ ("the students read books").]
2. Purpose of the RuThink AI Mind Module
Although the purpose of the RuThink module in the Perl AI is simply to call the subordinate modules which generate a thought in Russian, here we discuss Russian-language AI in general, because RuThink is the top module in the hierarchy of Russian AI mind-modules. Since the ghost.pl Perl AI is meant as a primitive proof-of-concept example of artificial intelligence, the Russian-language features of the AI are initially rather simple and rudimentary, designed to show that AI is possible in Russian and to leave room for vast expansion by Russian-speaking Perl coders.
A. Calling RuThink
The Volition() module calls RuThink if the human-language code $hlc is set to "ru" for Russian. Thinking in Russian or English is a component of decision-making in free will or volition. If a human user addresses the ghost.pl AI in Russian, the software detects the entry of Russian Cyrillic characters and automatically sets the $hlc code to "ru" for Russian. Versions of the ghost.pl AI may be set to think in Russian initially, so as to demonstrate to all users that the Russian capability is there in the AI Mind. After a user types with Roman characters into the AI, the $hlc code switches to "en" for English.
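The automatic switching described above can be pictured with a minimal sketch, assuming the user's input line is available as one decoded string. The helper detect_language() and its $msg argument are illustrative inventions, not the actual ghost.pl AudInput() code, which inspects characters during auditory input:

  use utf8;                          # allow Cyrillic literals in this sketch
  our $hlc = "en";                   # human-language code, English by default

  sub detect_language {              # hypothetical helper, not in ghost.pl
    my ($msg) = @_;                  # one line of user input, assumed decoded to Unicode
    if ($msg =~ /\p{Cyrillic}/) {    # any Cyrillic character present?
      $hlc = "ru";                   # switch the AI into Russian thinking
    } elsif ($msg =~ /[A-Za-z]/) {   # Roman characters present?
      $hlc = "en";                   # switch back into English thinking
    }                                # otherwise leave $hlc unchanged
    return $hlc;
  }

  detect_language("СТУДЕНТЫ ЧИТАЮТ КНИГИ");   # sets $hlc to "ru"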
B. RuThink calls the subordinate module RuIndicative.
Concepts in the ghost.pl AI are expressed as Russian or English words, with a special $mtx tag attached to them for "machine-translation transfer" of activation from a Russian word expressing a concept to an English word expressing, as closely as algorithmically possible, the same concept.
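To make the idea concrete, here is a rough sketch of machine-translation transfer over a toy lexicon. The hash layout is an assumption made only for clarity; ghost.pl actually keeps such tags in the flag-panels of its conceptual and auditory memory arrays:

  use utf8;                                      # Cyrillic hash keys below
  my %lexicon = (                                # hypothetical lexical entries
    'КНИГА' => { lang => 'ru', act => 0, mtx => 'BOOK'  },
    'BOOK'  => { lang => 'en', act => 0, mtx => 'КНИГА' },
  );

  sub mtx_transfer {                             # pass activation across languages
    my ($word, $amount) = @_;
    my $twin = $lexicon{$word}{mtx} or return;   # no transfer tag? do nothing
    $lexicon{$twin}{act} += $amount;             # activate the closest counterpart
  }

  mtx_transfer('КНИГА', 32);   # activating the Russian word for "book" also primes BOOK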
4. Uses
5. Code of RuThink() from ghost294.pl First Working AGI in Perl
sub RuThink() { # http://ai.neocities.org/RuThink.html
  $output = ""; # 2016apr21: Show output between array-display and aud-input.
  $pov = 1; # 2016apr01: thinking occurs in self or "I" mode.
  $tvb = 0; # 2017-06-17: reset time-of-verb for safety before thinking.
  print "\nGhost: "; # 2016apr01: Listen to the ghost in the machine.
  RuIndicative(); # 2018-09-26: Preparing also for RuImperative().
  $hlc = 3; # 2016feb22: Think in the particular human language.
  $idea = " "; # 2016apr23: reset for safety.
  $nounlock = 0; # 2016apr23: reset for safety.
  $svo1 = 0; # 2017-06-17: reset subject-verb-object values for safety.
  $svo2 = 0; # 2017-06-17: reset subject-verb-object values for safety.
  $svo3 = 0; # 2017-06-17: reset subject-verb-object values for safety.
  $svo4 = 0; # 2017-06-17: reset subject-verb-object values for safety.
  $tdo = 0; # 2016apr28: reset time-of-direct-object for safety.
  $tkb = 0; # 2017-06-30: reset time-in-knowledge base for safety.
  $tvb = 0; # 2016apr28: reset time-of-verb for safety.
  $verblock = 0; # 2016apr28: reset for safety.
  PsiDecay(); # 2016apr28: Reduce activation after each thought.
  $pov = 2; # 2017-06-19: give human user a "pause" for input...
} # 2017-04-02: RuThink() returns to the FreeWill Volition() module.
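For orientation, the call from Volition() can be sketched as follows, using the numeric $hlc codes listed in section 6 below (1=en; 2=de; 3=ru). This is a simplified assumption, not the actual ghost294.pl Volition() code, which contains additional free-will logic:

  sub Volition {                      # simplified, hypothetical dispatcher
    if ($hlc == 3) { RuThink(); }     # 3 = ru: think a sentence in Russian
    else           { EnThink(); }     # otherwise default to English thinking
  }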
6. Variables for the RuThink Russian AI Mind Module
$hlc -- human-language code; 1=en; 2=de; 3=ru.
$idea -- for re-entry of AI thought back into the AI.
$nounlock -- for a verb to lock onto a seq-noun.
$output -- output string as in JavaScript FirstWorkingAGI.html or ghost.pl AGI.
$pov -- point-of-view: 1=self; 2=dual; 3=alien. When pov=1, the word "you" is somebody in the external world. When pov=2, the word "you" refers to the self-concept "I" in the AI. When pov=3, the word "you" is interpreted as part of a conversation by a third party or as a word in a text, not referring to the self-concept of either the AI or of someone talking to the AI.
$snu -- subject-number as parameter for verb-selection.
$svo1 -- subject -- item #1 in subject-verb-object.
$svo2 -- verb -- item #2 in subject-verb-object.
$svo3 -- indirect object -- item #3 in subject-verb-object.
$svo4 -- direct object -- item #4 in subject-verb-object.
$tdo -- time-of-direct-object for a parser module.
$tkb -- time-in-knowledge-base of an idea. The EnParser() module stores time-of-verb $tvb in the k[13] $tkb slot of the subject of a sentence being stored at the time-of-subject $tsj time-point in the Psy conceptual array. The EnParser module stores $tkb as equal to $tdo (time of the direct object) in the Psy array row of the verb at time-of-verb $tvb. Thus for subjects $tkb means "verb" and for verbs $tkb means "direct object". No $tkb is stored for a direct object, because the associative tags go from subject to verb to object (SVO) but no further.
$tvb -- time-of-verb for parsing. The EnParser module sets "t - 1" as the tvb if the word being parsed is a verb by part-of-speech. InStantiate uses the time-of-verb tvb to fill the tkb-slot in a verb-panel with the time-location of the direct object of the verb. EnParser uses the time-of-verb tvb to find the numeric verb-concept that will be the pre of a direct object being parsed. EnParser also uses the time-of-verb tvb to store the time-of-direct-object tdo as the tkb in the flag-panel of the verb. Therefore the tvb-flag needs to be reset to zero not during but after AudInput, when program-flow has returned to the Sensorium module.
$verblock -- for subject-noun to lock onto seq-verb. The $verblock is a time-point in conceptual @psy memory where the $seq of a subject-noun is located, so that any crucial consideration such as the negation of a verb will be found when the verb-phrase module fetches the verb-concept from memory. If the thinking process finds and activates a particular subject-noun in memory, the $tkb of the subject noun becomes the $verblock so that a particular verb at a particular time will be selected by the English or Russian verb-phrase module.
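The interplay of $tkb and $verblock described above can be summarized in a short sketch: if the chosen subject-noun supplied a $tkb, that time-point becomes $verblock and the verb stored there is fetched directly; otherwise memory must be searched. The @psy row layout, the part-of-speech code, and the fallback search are illustrative assumptions here; the real ghost.pl flag-panel has many more slots in a fixed order:

  our @psy;              # conceptual memory: one flag-panel per time-point
  our $verblock = 0;     # time-point of the verb locked in by the chosen subject-noun

  sub choose_verb {                             # hypothetical verb-selection sketch
    if ($verblock > 0) {                        # subject-noun supplied its $tkb?
      return $psy[$verblock];                   # fetch the verb at that exact time-point
    }
    my ($best_t, $best_act) = (0, -1);          # otherwise search all of memory
    for my $t (1 .. $#psy) {
      next unless ref $psy[$t];                 # skip empty time-points
      my ($concept, $act, $pos) = @{ $psy[$t] }[0, 1, 2];   # assumed slot order
      next unless defined $pos && $pos == 8;    # assume 8 = verb part-of-speech
      ($best_t, $best_act) = ($t, $act) if $act > $best_act;
    }
    return $psy[$best_t];                       # most active verb found in memory
  }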
7. Troubleshooting and Debugging for AI Mind Maintainers
7.1.a. Symptom: (Something goes wrong.)
7.1.b. Solution: (AI Mind Maintainer devises solution.)
10. Spread the News on TikTok and Other Venues
Are you on TikTok? Are you eager to be a ThoughtLeader and Influencer?
Create a TikTok video in the following easy steps.
I. Capture a screenshot of
https://ai.neocities.org/RuThink.html
for the background of your viral TikTok video.
II. In a corner of the screenshot show yourself talking about the RuThink module.
III. Upload the video to TikTok with a caption including all-important hash-tags
which will force governments and corporations to evaluate your work
because of FOMO -- Fear Of Missing Out:
#AI
#ИИ
#brain
#мозг
#ArtificialIntelligence
#ИскусственныйИнтеллект
#consciousness
#сознание
#Dushka
#Душка
#psychology
#психология
#subconscious
#подсознание
#AGI #AiMind #Alexa #ChatAGI #chatbot #ChatGPT #cognition #cyborg #Eureka #evolution
#FOMO #FreeWill #futurism #GOFAI #HAL #immortality #JAIC
#JavaScript #linguistics #metempsychosis #Mentifex #mindmaker #mindgrid
#ML #neuroscience #NLP #NLU #OpenAI #OpenCog #philosophy #robotics #Singularity #Siri #Skynet
#StrongAI #transhumanism #Turing #TuringTest #volition
A sample video is at
https://www.tiktok.com/@sullenjoy/video/7203367394805435690
11. AiTree of Mind-Modules for Natural Language Understanding
Nota Bene: This webpage is subject to change without notice. Any Netizen may copy, host or monetize this webpage to earn a stream of income by means of an affiliate program where the links to Amazon or other booksellers have code embedded which generates a payment to the person whose link brings a paying customer to the website of the bookseller.
This page was created by an independent scholar in artificial intelligence who was a contemporary of H.G. Wells and who created the following True AI Minds with sentience and with limited consciousness.
The following books describe the free, open-source True AI Minds.
AI4U -- https://www.iuniverse.com/BookStore/BookDetails/137162-AI4U
AI4U (paperback) -- http://www.amazon.com/dp/0595259227
AI4U (hardbound) -- http://www.amazon.com/dp/0595654371
The Art of the Meme (Kindle eBook) -- http://www.amazon.com/dp/B007ZI66FS
Artificial Intelligence in Ancient Latin (paperback) --
https://www.amazon.com/dp/B08NRQ3HVW
Artificial Intelligence in Ancient Latin (Kindle eBook) --
https://www.amazon.com/dp/B08NGMK3PN
Artificial Intelligence in German (Kindle eBook) -- http://www.amazon.com/dp/B00GX2B8F0
InFerence at Amazon USA (Kindle eBook) -- http://www.amazon.com/dp/B00FKJY1WY
563 Mentifex Autograph Postcards were mailed in 2022, primarily to autograph-collector customers at used bookstores, to press the issue of whether or not the Mentifex oeuvre and therefore the autograph is valuable. These artwork-postcards with collectible stamps may be bought and sold at various on-line venues.
See AI 101, AI 102 and AI 103, a year-long community college course curriculum for AI in English.
See Classics Course in Latin AI for Community Colleges or Universities.
See College Course in Russian AI for Community Colleges or Universities.
Collect one signed Mentifex Autograph Postcard from 563 in circulation.
See Attn: Autograph Collectors about collecting Mentifex Autograph Postcards.
The collectible AI4U book belongs in every AI Library as an early main publication of Mentifex AI.