AGI RoadMap Table of Variables in the English and Russian bilingual ghost.pl AGI in Perl

Mens Latina in Latin -- MindForth for Robots


$abc -- AudBuffer() transfer character

$act -- quasi-neuronal activation-level

$actbase -- discrimination activation base used as a local "my" variable in AudRecog().

$actnext -- holds an activation value (e.g., "8") to be imposed on the "next" character of a word being recognized in AudRecog()

$actpsi -- a low-priority psi concept from which the SpreadAct module may spread activation, unless a higher-priority flag in SpreadAct is focusing attention upon a complex input-query, causing SpreadAct to ignore the low-priority actpsi concept.

adjw -- adjective weight -- a proposed variable for the LangLearn (language-learning) module to hold the high or low weight that determines whether an adjectival mind-module is habituated into a particular position in a syntactic chain of linguistic mind-modules. For adjectives, a typical position might be before a noun or after a noun. The language-learning module is intended to be called when an AI Mind fails to comprehend an input with unfamiliar syntax, such as "The more, the merrier" or even a sentence in the passive voice such as "Russian is spoken here," when the AI program does not yet include passive constructions. A rotating series of enumerated variables (e.g., pos1, pos2, ..., pos7) may constantly monitor and briefly keep track of the parts of speech of user input to the AI, so that a record will remain of any novel syntax. In the LangLearn module, there may be a sequence of part-of-speech nodes ready to be habituated into a learnable syntax. Each node may consist of available calls to eight part-of-speech subroutines, with only one part of speech actually being called when its pertinent weight is higher than that of the other subroutines bunched together at the syntactic node, as in the sketch below. For instance, the "adjw" weight may hold a high value for the passive participle "spoken", taken as an adjective in the passive syntax of the sentence "Russian is spoken here." The learning of such a syntax might consist of nodes weighted for calling a noun-phrase (Russian), a verb of being (is), an adjective (spoken), plus or minus an adverb (here). An AI able to learn its own syntax could break free from the necessity of human coding of linguistic mechanisms.
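
A minimal Perl sketch of that weighted-node idea, using purely hypothetical weights and stub subroutines (EnAdjective is a stand-in name invented here, not an existing ghost.pl module):

  #!/usr/bin/perl
  use strict;
  use warnings;

  # Hypothetical per-node weights; a real LangLearn module would raise or
  # lower these as novel input syntax is observed and habituated.
  my %weight = ( adjw => 8, nounw => 2, verbw => 1 );

  # Stub part-of-speech subroutines standing in for real mind-modules.
  sub EnAdjective  { print "calling an adjective module\n" }
  sub EnNounPhrase { print "calling a noun-phrase module\n" }
  sub EnVerbPhrase { print "calling a verb-phrase module\n" }

  my %call = ( adjw => \&EnAdjective, nounw => \&EnNounPhrase, verbw => \&EnVerbPhrase );

  # At this syntactic node only the most heavily weighted part of speech is
  # called, e.g. "spoken" treated as an adjective in "Russian is spoken here."
  my ($winner) = sort { $weight{$b} <=> $weight{$a} } keys %weight;
  $call{$winner}->();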

$age -- Temporary age for loop-counting.

$anset -- Sets "an" before a vowel at the start of a noun.

$apb -- (all points bulletin) -- is a JavaScript variable for displaying on-screen messages about what is happening internally in the AI Mind.

$aud -- in the Speech() module is an associative tag leading to the typically most recent engram of a word stored in the @ear array of the auditory memory channel.

$audbase -- is the start of the @ear auditory engram of a word that needs to have its inflectional ending changed in a generative mind-module such as LaNounGen() or RuVerbGen(). $audbase is incremented by one successive unit as long as the target word in auditory memory continues, and each character of the target word from auditory memory is sent as the value of $abc into the AudBuffer() mind-module, which left-justifies the target word before sending it to be right-justified in the OutBuffer() mind-module prior to any manipulation of the inflectional ending of the target word.
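
A simplified Perl sketch (not the actual ghost.pl code) of the transfer that $audbase drives, walking a toy @ear engram one character at a time and handing each character as $abc to an AudBuffer() stand-in:

  #!/usr/bin/perl
  use strict;
  use warnings;

  # Toy auditory memory: one character engram per time-point; blanks end a word.
  my @ear = ('', 'T', 'E', 'A', 'C', 'H', '');
  my @buffer;                                # stands in for the AudBuffer() quasi-array

  sub AudBuffer {                            # left-justify each incoming $abc character
      my ($abc) = @_;
      push @buffer, $abc;
  }

  my $audbase = 1;                           # start of the target word in @ear
  while (defined $ear[$audbase] && $ear[$audbase] =~ /[A-Za-z]/) {
      my $abc = $ear[$audbase];              # AudBuffer() transfer character
      AudBuffer($abc);
      $audbase++;                            # advance while the target word continues
  }
  print 'left-justified word: ', join('', @buffer), "\n";   # TEACH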

auddata -- ( auditory data ) is a cumulative variable which the JavaScript Diagnostic() module uses to display a column of the flag-panels of the auditory engrams over a range of time-points in the Aud array of the auditory memory channel.

$audjuste -- carries the $aud auditory recall-tag of the $motjuste chosen by EnNounPhrase() to the Speech() module.

$audnew -- holds onset-tag while rest of word comes in.

audnum -- grammatical num(ber) variable potentially of use for locating an infinitive verb-form with zero as a grammatical num(ber).

$audpsi -- concept number of word in @ear auditory memory array

$audrec -- auditory recognition concept-number

$audrun -- a counter, incremented in the AudRecog() module, of how many times that module has been called. If the audrun count is at unitary one (that is, below two), the initial character at the start of a word is being processed. audrun is reset to one in the AudInput() and FileInput() modules.

$audrv -- auditory recall-vector for Speech mind-module.

$audstop -- flag to stop Speech module after one word. The Speech module lets a CR-13 carriage-return or a SPACE-32 change the audstop flag from its zero value at the start of Speech to a unitary one (1), so that audstop and spacegap may work together to send the final non-letter into the AudInput module.

$auxverb (auxiliary verb) -- such as 800=BE; 818=DO; or modal verb.

$b1 -- buffer variable 1 in the OutBuffer() quasi-array for the right-justifying of an English or Russian word of up to sixteen characters in length or longer. The buffer variable $b1 always contains the very last character in a word of any length. Since each word is right-justified in the OutBuffer(), a module like NounGen() or RuVerbGen() can perform tests on the contents of $b3 and $b2 and $b1 so as to detect one inflectional ending and to replace it with a different ending as necessary. In English, where a noun like "beach" or a verb like "teach" will sometimes need to have "-es" as an ending ("beaches", "teaches"), $b2 and $b1 can be used to detect the "-ch" at the end of the word, as an indicator that the "-es" ending may need to be added on.
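
A hedged Perl sketch of the ending-test described above, assuming a simplified sixteen-slot, right-justified buffer in which $b1 is always the final character:

  #!/usr/bin/perl
  use strict;
  use warnings;

  my $word = 'teach';

  # Right-justify the word in sixteen slots, as the OutBuffer() module does.
  my @buf = ((' ') x (16 - length $word), split //, $word);

  my $b1 = $buf[15];   # very last character of the word
  my $b2 = $buf[14];   # second-to-last character
  my $b3 = $buf[13];   # third-to-last character, available for longer endings

  # A word ending in "-ch" takes "-es" rather than a bare "-s".
  my $inflected = ($b2 . $b1 eq 'ch') ? $word . 'es' : $word . 's';
  print "$inflected\n";   # prints "teaches"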

$b2 -- buffer variable 2 counting backwards from the final OutBuffer character.

$b3 -- buffer variable 3 counting backwards from the final OutBuffer character.

$b01 -- obsolete buffer variable 01 in the OutBuffer() quasi-array

$b15 -- buffer variable 15 in the OutBuffer() quasi-array

$b16 -- obsolete buffer variable 16 in the OutBuffer() quasi-array for the right-justifying of an English or Russian word of up to sixteen characters in length. The buffer variable $b16 in older AI Minds would contain the last character in a word of any length. Since each word was right-justified in the OutBuffer(), a module like NounGen() or EnVerbGen() could perform tests on the contents of $b14 and $b15 and $b16 so as to detect one inflectional ending and to replace it with a different ending as necessary. In English, where a noun like "beach" or a verb like "teach" will sometimes need to have "-es" as an ending ("beaches", "teaches"), $b15 and $b16 could be used to detect the "-ch" at the end of the word, as an indicator that the "-es" ending might need to be added on.

$becon -- be-conditional -- carries a flag from the OldConcept() recognition module indicating that new input includes a verb of being which may permit the InFerence() module to make a general inference about the subject of the input so that the InFerence() module can look for prior knowledge as the basis of an inference.

$beep -- flag for Motorium() to create a beep.

$bias -- a variable holding the numeric value of the part of speech (POS) expected to be input next and parsed next on the basis of preceding parts of speech.

$binc -- OutBuffer() B-INCrement for a VerbGen() module. The $binc counter allows RuVerbGen() to examine and test all the right-justified characters of a Russian verb, especially the several characters of an inflectional ending which might need to be changed on the basis of grammatical number (singular or plural), person (first, second, third), or past-tense gender (male, female, neuter).

$birth -- holds the time of when the Ghost AI started running.

$c1 -- the first character of a left-justified word in the quasi-array of the AudBuffer(), where a word of input or of retrieval from memory is stored briefly in transit to the OutBuffer() where the same word is right-justified so that the inflectional endings of the word may easily be manipulated or changed by a module like NounGen() or EnVerbGen().

$c2 -- the second character of any word stored in the AudBuffer().

$c16 -- the sixteenth character if a word of that length is stored in the quasi-array of the auditory input AudBuffer().

$catdobj -- conCATenated (chained) DirectOBJect flag for syncopating sentences.

$catiobj -- conCATenated (chained) IndirectOBJect flag for syncopating sentences.

$catsubj -- conCATenated (chained) SUBJect flag for syncopating sentences.

$catverb -- conCATenated (chained) VERB flag for syncopating sentences.

$chaincon -- chain-of-thought condition-flag

$char -- for use with getc (get character) in FileInput()

$cns -- (adjustable) size of "central nervous system" memory in the Ghost Perl AI

$coda -- memory recycled in ReJuvenate()

$conj -- a dual-purpose variable serving both as an identifier of which particular conjunction is under consideration for use by the ConJoin module, and a control-flag for the decision of whether to call the Indicative module more than once.

$dba -- doing-business-as noun-case or verb-person.

$defact -- default activation for EnNounPhrase()

$dirobj -- flag indicates seeking for a direct object. Should perhaps be renamed "dirobjcon" so as to indicate a condition rather than a particular direct object.

dobmfn -- for InFerence to pass gender to AskUser

$dobseq -- direct-object-subSEQuent word – is used only within the InFerence() module to hold onto the psi identifier of a noun or pronoun that was the direct object of a verb retrieved from memory as part of an inference that will require the same direct object to be associated with the verb if the inference is to be confirmed or refuted as a valid inference.

$dunnocon -- is a flag used in SpreadAct() to permit the AI Mind to respond to a who+verb+object query with "I DO NOT KNOW" if no correct answer is found.

$engov -- possible replacement for too strict "hlc". In order for the AI Mind thinking in English to be able to mention a Russian word without switching to Russian, it may be necessary to use $engov and $rugov as competing language-determinants. Whichever governing determinant has the higher numeric value shall cause thinking to occur in its language, even if a word occurs in a different language. Only a preponderance of words in the other language shall cause a switch to the other language.
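
A minimal sketch of the proposed competition, with assumed values, where whichever of the two determinants holds the higher number governs the language of thought:

  #!/usr/bin/perl
  use strict;
  use warnings;

  my $engov = 5;   # accumulated weight of recent English words (assumed value)
  my $rugov = 2;   # accumulated weight of recent Russian words (assumed value)

  # The higher governing determinant keeps thought in its language, even if a
  # single word from the other language has just been mentioned.
  my $hlc = ($engov >= $rugov) ? 1 : 3;    # 1=en; 3=ru, per the $hlc codes below
  print "thinking in ", ($hlc == 1 ? 'English' : 'Russian'), "\n";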

$eot -- end-of-transmission for end of input

$etc -- from the abbreviation "etc." from the Latin phrase et cetera, meaning "and other things" -- a counting variable that is incremented in the EnNounPhrase() module when more than one fetchable idea is active in conceptual memory. The $etc count is used in the EnThink() module as a trigger for calling the ConJoin() module to insert a conjunction like "and" between two ideas in the generation of a sentence of thought.

$eureka -- something "found" in the aud-recog modules

$finlen -- fin(al) len(gth) or end-of-word for AudRecog auditory recognition module.

$finpsi -- carry-over end-psi during 32-SPACE or 13-CR (carriage return).

$foom -- a proposed new variable for a proposed new Spawn() module. The foom variable shall be set arbitrarily by the AI Mind Maintainer as the number of new concepts that must be learned before the AGI Mind spawns a new copy of itself -- going "FOOM" in the sense of self-modifying and gaining knowledge and intelligence so rapidly as to constitute a "FOOM!" explosion of hard-take-off AI.

$fyi -- (for your information) 1 = Normal; 2 = Transcript; 3 = Tutorial mode; 4 = Diagnostic mode

$gapcon -- status-con flag for AudInput() to count down gaps of no input, so that internal thinking may accelerate during periods of no input from external sources.

$gencon -- status-con flag set briefly to unitary one by EnVerbGen() as a condition-flag to prevent EnVerbPhrase() from making a call to Speech() after EnVerbGen() has already generated and spoken an inflected form of an English verb. The $gencon flag is reset to zero at the end of EnVerbPhrase() whether it has been used or not, so that EnVerbPhrase() will normally call the Speech() module to say a verb-form summoned from memory by the parameters of person and number.

$hlc -- human-language code; 1=en; 2=de; 3=ru. May eventually be subject to override by the $engov or $degov or $rugov flag, so that an AI thinking in one language may discuss a word in a different language without switching over to thinking in the different language.

holdnum -- transfers number from subject to verb.

$i -- index for cycling through a loop or an array.

$ictus -- testable activation-level to trigger use of a conjunction by the ConJoin() module in expressing a compound thought composed of two or more strongly activated ideas, each generated by a single calling of the Indicative() module.

$idea -- for re-entry of AI thought back into the AI.

$impetus -- an accumulating variable to hold the activation-level of a trigger to initiate action by the FreeWill Volition module.

infincon -- infinitive condition flag

$inft -- inference-time -- holds the current time at the start of the InFerence() module so that the AskUser() module may ask a question based upon pre-existing knowledge before the formation of a silent inference in the AI memory.

$inhibcon -- flag for neural inhibition

$iob -- indirect-object tag for @psy concept array

isolation -- counter of thought-cycles spent in "isolation" with no human input, as an arbitrary trigger for output by the AI Mind, or as a threshold for the AI to take certain actions if input suddenly appears.

ivcon -- Imputed-Verb CONdition flag that is set to a unitary one in the RuParser module if a Russian be-verb is imputed to be meant by the speaker and is not overridden by a normal Russian verb. Then the Russian-thinking RuThink module calls the InFerence module to create a silent inference based on both the knowledge about a class of subjects and the new knowledge that a particular entity is a member of the indicated class of subjects. For instance, if the AI Mind knows that students read books, and a user states that "Mark is a student," the AI may silently infer the possibility that Mark reads books.

$jrt -- ReJuvenate() "junior time" for memories moved

$jux -- jux(taposed) concept in @psy array, especially the word "NOT" to negate a verb.

$k -- k(knowledge) element from @psy concept array

kbcon -- knowledge-base-condition flag for KbRetro to wait for a yes-or-no answer.

kbdba -- knowledge-base doing-business-as -- in the Mens Latina Latin AI, reveals what case a Latin noun is in at the particular toc (time of old concept) when the Latin noun was recognized as representing a particular concept. Because the inflectional ending of a recognized Latin word may be ambiguous with respect to, say, nominative case or accusative case, the value of the dba tag at the time of recognition is not the final say or final interpretation of what case a noun is in as part of a new Latin input. The value obtained from the toc node may be treated as a temporary default value likely to be overridden by the grammatical tests conducted by the LaParser() module.

kbmfn -- knowledge-base masculine-feminine-neuter -- in the Mens Latina Latin AI, is the gender (masculine/feminine/neuter) of a Latin noun as revealed at the toc time of its recognition as a word and a concept. The gender-value obtained for the noun is not likely to be overruled. Although in English a "friend" can be a he or a she, in Latin the gender is more closely bound with each particular noun.

kbnum -- knowledge-base num(ber) -- in the Mens Latina Latin AI, is the grammatical number (singular or plural) of a particular conceptual engram of a Latin word at the toc time of its recognition as a word and a concept. The given value serves as a default for the first instantiation of the input noun, and is not likely to be overridden, because Latin inflectional noun-endings typically reveal the grammatical number of the noun.

kbzap -- ZAP the knowledge base -- is a variable that holds the oldpsi word "YES" or "NO" when the InFerence() module has caused the AskUser() module to seek confirmation or denial of a silent inference from a human user engaged in a conceptual conversation with the AI Mind. KbRetro() uses the YES-value kbzap to confirm an inference, or the NO-value of kbzap to negate an inference.

$krt -- knowledge representation time.

$lastpho -- device to avoid extra "S" on verbs.

$len -- length, for avoiding input non-words. The len variable increments with each additional character during AudInput, and is reset to zero at the end of InStantiate or OldConcept or NewConcept, so that the auditory engram recall vector rv may be set for the first character of a word when the word-length is a unitary one.

$mfn -- masculine-feminine-neuter gender flag.

$midway -- dynamically adjustable time-limit for searching backwards in the experiential memory of the AI. In early AI Minds, "midway" is set to zero so as to search the entire memory. As AI Minds mature and acquire tremendously long lifetime memories, "midway" may be dynamically adjusted so that a search goes backwards only to the mid-way mark. If an AI has the task of finding all possible recollections and knowledge about a topic where an initial search has yielded no results, "midway" may be adjusted for a secondary and more exhaustive search.
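
A hedged sketch of the backwards search that $midway bounds, using a toy @psy array indexed by time-points rather than the real ghost.pl memory:

  #!/usr/bin/perl
  use strict;
  use warnings;

  my @psy    = map { +{ psi => int(rand 900) } } 0 .. 100;   # toy conceptual memory
  my $t      = $#psy;    # current experiential time
  my $midway = 0;        # zero = search all of memory; raise it to shorten the search

  my $target = $psy[42]{psi};                    # concept being searched for
  for (my $i = $t; $i > $midway; $i--) {         # search backwards in time
      if ($psy[$i]{psi} == $target) {
          print "found concept $target at time-point $i\n";
          last;                                  # the most recent engram wins
      }
  }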

$mjact -- motjuste-activation for defaulting to 701=I

$monopsi -- for detecting single-character words in the AudRecog() module

$mood -- a selector to enable the English-thinking EnThink module to choose among grammatical moods of verbs and their enveloping sentences, such as Indicative; Subjunctive; Imperative, etc.

$moot -- as in legally moot -- is a flag to prevent the formation of associative tags during mental operations which are not truly a part of cognition, such as the processing of an input query, the formation of a silent inference, or the creation of an output query.

$morphpsi -- for AudRecog() recognition of morphemes

$motjuste -- (from French) best word for inclusion in a thought

$mri -- most-recent-instantiation. The mri is a time-point which may be used for the insertion of an associative tag into the conceptual flag-panel of a previous concept in order to establish a link between a newly input concept being instantiated and the previously instantiated concept. The designation of the time-point of the mri concept may be qualified or restricted by code that requires the mri concept to be a special ("pos") part-of-speech, such as either a noun or a verb. Thus a preposition being instantiated may be back-linked to a noun or a verb as a mental point of departure leading to the preposition within a chain of thought. For example, if we say "John writes books for money", the preposition "for" could be linked backwards to a tpr (time-of-preposition) tag in the flag-panel of the verb "writes" or in the flag-panel of the noun "books", depending on whether the basic idea is "writes for money" or "books for money" -- especially if "John writes books for money and poems for fun."

$msg -- for input as a "message" to the AI. The "msg" string is built up by the concatenation of each incoming "pho" character to the accumulating "msg" string which is then displayed as the human input within the user interface. The "msg" string is reset to emptiness in the Sensorium module so that a new message of human input may accumulate.

$mtx -- machine-translation xfer (transfer) variable for SpreadAct module to transfer an activation-swarm from a sentence in one language to the same concepts in a target language for machine translation (MT).

$negjux -- a flag to indicate that a verb stored in memory has been negated by the nearby storing of a 250=NOT adverb. When EnVerbPhrase() selects a verb from conceptual memory and the verb is negated with "NOT", the numeric value of 250=NOT goes into the negjux variable as a flag to require the thinking of a negated English verb to include a call to the EnAuxVerb module that will place a form of the auxiliary verb "DO" before the adverb "NOT" and the verb itself, as in "God does not play dice." During a verb-fetch in EnVerbPhrase, negjux is given one chance to modify either a be-verb or a non-be-verb, and in either case is reset to zero after the one-chance opportunity.
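
A toy Perl sketch of the one-chance negation step, with arbitrary values standing in for what EnVerbPhrase() and EnAuxVerb() would do in full:

  #!/usr/bin/perl
  use strict;
  use warnings;

  my $negjux = 250;     # 250=NOT was stored as the jux tag of the fetched verb
  my $verb   = 'play';  # a non-be-verb fetched from conceptual memory

  # For a negated non-be-verb, a form of auxiliary "DO" is placed
  # before the adverb "NOT" and before the verb itself.
  my $output = ($negjux == 250) ? "does not $verb" : $verb . 's';
  print "God $output dice.\n";

  $negjux = 0;          # the one-chance opportunity has been used; reset to zero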

newpsi -- for singular-nounstem assignments

$node -- "split" @psy array row into nodes

nonce -- roughly the time when the AI starts running -- for the Diagnostic display to show the recent contents of the conceptual and sensory memory arrays.

$nounlock -- time-point in conceptual memory for a verb to lock onto a seq-noun.

$nphrnum -- grammatical number of the noun-phrase selected by the EnNounPhrase module. nphrnum is set to zero near the start of EnNounPhrase so that no value will erroneously carry over from a previous noun to another noun being selected. EnArticle needs nphrnum as a basis for inserting the indefinite article "a" before a singular noun, and to avoid inserting "a" before a plural noun.

$nphrpos -- "noun-phrase part-of-speech" for testing during the EnThink() process.

$nucon -- new-condition – segregates SpreadAct() code to respond to the input of an unknown word by asking the user a question about the new concept being learned by the AI Mind.

$num -- number-flag for grammatical number. Verbs must agree in number with their subjects. Two or more singular subjects require a plural verb.

$numreq -- NUMber REQuired for agreement of subject and predicate nominative, with possibility of being overruled by circumstances.

numsubj -- for number of subject.

$nxr -- "next row in array" for AudRecog() dealing with next memory row

$nxt -- number incremented for each new concept

$objprep -- object of a preposition to be found and spoken by the EnPrep module.

$oldpsi -- used in OldConcept to de-globalize "psi"

$onset -- onset-tag for use as an auditory recall-vector.

$output -- output string as in a JavaScript AI Mind or the ghost.pl AI.

$PAL -- Permissive Action Link for AudBuffer and OutBuffer. This flag will normally be true with a unitary value of one, but it may be set to false with a value of zero so that a module like EnVerbGen may manipulate the buffer-characters in the OutBuffer without causing a feedback loop that dynamically interferes with the process of generating inflectional endings on a word.

$pho -- a "phoneme" or character of auditory input.

$phodex -- pho-index counter for AudBuffer(); phodex is reset to unitary one ("1") in the AudInput module and in the Speech module.

$pos -- (part of speech) with a numeric value (1=adj 2=adv 3=conj 4=interj 5=noun 6=prep 7=pron 8=verb) to be inserted into the conceptual flag-panel of a concept during the process of instantiation of a concept-node at a particular time-point. The part-of-speech parameter allows an AI Mind to restrict operations to one particular part of speech, such as verb, or to similar parts of speech, such as nouns and pronouns to be used as subjects or objects of a verb.
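
For reference, the numeric codes listed above can be held in a simple Perl lookup hash (a sketch for illustration, not the actual ghost.pl data structure):

  use strict;
  use warnings;

  my %pos_name = (
      1 => 'adjective',
      2 => 'adverb',
      3 => 'conjunction',
      4 => 'interjection',
      5 => 'noun',
      6 => 'preposition',
      7 => 'pronoun',
      8 => 'verb',
  );
  print "$pos_name{8}\n";   # prints "verb", the $pos stored for a concept like 800=BE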

$pov -- point-of-view: 1=self; 2=dual; 3=alien. When pov=1, the word "you" is somebody in the external world. When pov=2, the word "you" refers to the self-concept "I" in the AI. When pov=3, the word "you" is interpreted as part of a conversation by a third-party or as a word in a text, not referring to the self-concept of either the AI or of someone talking to the AI.

$prc -- Provisional ReCognition in the AudRecog() auditory recognition module.

$pre -- pre(vious) associated @psy concept.

$prednom -- predicate nominative for InFerence.

$prejux -- previous $jux to carry "NOT" to a verb. If the adverb "not" passes through the OldConcept module in the process of the negation of a verb, the "prejux" variable holds the value for negation and passes it as "jux" to the flag-panel of the verb being negated.

$prep -- a preposition used in the EnPrep mind-module for English prepositions.

$prepcon -- prepositional condition-flag for parsing.

$prepgen -- urgency to generate a prepositional phrase.

$prepsi -- is the identifier of a "pre" concept for the purpose of spreading activation sideways from a direct object to its governing verb and then further sideways to the subject of the verb. This process may be tested by an AI coder or by an AI Mind Maintainer by entering one noun into the running AI software to see if some other noun is output as part of a remembered idea involving the noun being tested for. Activation spreading sideways is an important feature for the evolution of AI Minds able to ruminate endlessly and able to think interminably about all possible subjects contained as an accumulation of concepts in the AI knowledge base.
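
A hedged Perl sketch of sideways spreading from a direct object through its $pre verb to the subject, using a toy @psy array of hash flag-panels; the concept numbers 820 and 540 and the activation increments are invented here for illustration:

  #!/usr/bin/perl
  use strict;
  use warnings;

  # Toy stored idea, subject-verb-object, linked by pre/seq tags.
  my @psy = (
      { psi => 701, act => 0, pre => 0,   seq => 820 },   # subject 701=I
      { psi => 820, act => 0, pre => 701, seq => 540 },   # verb (invented concept number)
      { psi => 540, act => 0, pre => 820, seq => 0   },   # direct object (invented)
  );

  my $prepsi = 540;      # identifier of the entered direct-object concept
  my ($object) = grep { $_->{psi} == $prepsi } @psy;
  if ($object) {
      $object->{act} += 16;                                     # activate the object itself
      my ($verb) = grep { $_->{psi} == $object->{pre} } @psy;   # sideways to the governing verb
      if ($verb) {
          $verb->{act} += 12;
          my ($subj) = grep { $_->{psi} == $verb->{pre} } @psy; # further sideways to the subject
          $subj->{act} += 8 if $subj;
      }
  }
  print join(' ', map { "$_->{psi}:$_->{act}" } @psy), "\n";    # 701:8 820:12 540:16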

$prevpsi -- a value-holder to be used by InStantiate to cancel an imputed Russian BeVerb which has been tentatively instantiated after the input of a noun or pronoun potentially serving as the subject of an imputed verb of being. For instance, if someone begins a sentence in Russian with a noun like "The teacher...", it is not yet known whether the intended verb will be a be-verb like "is" or some ordinary verb like "plays (chess)". The AI software must instantiate an imputed verb of being in case the next input word is something like a noun or an adjective, and not a normal verb. But if a normal verb is indeed the next word of input, the imputed be-verb must be discarded in favor of the real verb, and the "prevpsi" identifier will help the AI program to declare that the "seq" of the Russian subject is the real verb, and not the momentarily imputed verb of being.

$prevtag -- local pre(vious) tag for use in InStantiate(). After a noun or a verb has been instantiated, $prevtag holds its concept-number ready to be inserted as a $pre tag, if needed, during the instantiation of a succeeding concept. Thus a verb can have a $pre back to its subject, and a direct object can have a $pre back to a verb.

$prsn -- 1st, 2nd, 3rd person of verb-forms.

$psi -- variable for an element of the @psy conceptual array, with a numeric identifier serving as an mtx tag for the same concept in another language.

$psibase -- winning psibase with winning actbase

psidata -- ( Psi mindcore data ) is a cumulative variable that permits the JavaScript Diagnostic() module to display a column of the flag-panels of the engrams in the Psy mindcore array.

$putnum -- putative num(ber) for subject-verb agreement.

$px1 -- (obsolete) preposition-transfer carrier for NLP generation. This carrier-variable transfers the location of the object of a preposition from a generative mind-module such as EnNounPhrase to the English-preposition EnPrep module which fetches and outputs both the remembered preposition and the object of the preposition. Made obsolete by the "tnpr" flag and the "tvpr" flag.

$px2 -- (obsolete) second of ad libitum many preposition-transfer carriers for NLP generation.

$px3 -- (obsolete) third preposition-transfer carrier for NLP generation. If an idea stored in conceptual memory contains multiple prepositions as in the phrase "government of the people, by the people, for the people", the group of "px" variables may serve to retrieve each separate object of the multiple prepositions.

$qucon -- query-condition – segregates SpreadAct() code to respond to the input of who-queries.

$quiet -- a status variable set briefly to false upon user-input, so that the AI may output one thought responding to input and not following an internal chain of thought as conducted by the SpreadAct() module. As long as user input continues, the AI continues to think up responses. If user-input stops or even pauses, the AI returns to thinking its own thoughts.

$quobj -- query object – holds onto the psi identifier of a word chosen by the InFerence() module to be the direct object of a query created by the AskUser() module.

quobjaud -- auditory recall-tag for AskUser module

qusnum -- query-subject number – for the AskUser module to ask a question seeking yes-or-no confirmation of a logical inference made by the AI Mind.

$qusub -- query subject – is a transfer-vehicle of the subject-identifier from any module prompting a question into the specific module that will ask the question.

$quverb -- query verb – is set in the InFerence() module with the identifier of a verb concept serving as part of an InFerence being made about user input. Then the AskUser() module transforms the quverb identifier into the yes-or-no-verb identifier ynverb so that AskUser() can use the query-verb to ask a question expecting a yes-or-no answer.

$qv1psi -- concept for SpreadAct to seek as a subject, so that the AI will try to remember knowledge about the concept.

$qv2num -- num(ber) of a verb in a who+verb+dir.obj response.

$qv2psi -- concept for SpreadAct to seek as a verb, so that the AI may remember ideas containing the same verb.

$qv3psi -- concept for SpreadAct to seek as an indirect object, so that the AI may think of ideas connected in any conceivable way with the target concept.

$qv4psi -- concept for SpreadAct to seek as a direct object, so that the AI may use the target concept as a subject or as a direct object in the remembrance of ideas.

$qvdocon -- query-condition for who+verb+direct-object – segregates SpreadAct() code to respond to input queries similar in form to "Who makes robots?"

$qviocon -- query-condition for who+verb+indirect-object – segregates SpreadAct() code to respond to input queries in a form like "To whom does God give help?"

$recnum -- recognized number of a recognized word

$recon -- incentive for reconaissance by asking a question.

$rjc -- counter of rejuvenation cycles

rota (Latin for "wheel") -- rotation variable for treating input-(pro)nouns. In a simple sentence of Latin input to the Mens Latina Latin AI, a series of nouns may initially (Stage One) be instantiated in a bare-bones fashion without their case or subject-verb-object role being identified. They may then go through a Stage Two instantiation where the LaParser() Latin parser module performs a series of tests to determine or disambiguate the speaker-intended case of the noun, based not only on its perhaps ambiguous inflectional ending but also on the other Latin nouns being used in the sentence of input. The rota counter allows the InStantiate() module to cycle or rotate through the several nouns that are possibly included in the Latin input, so that the OutBuffer() detection of potentially ambiguous Latin noun-endings may load the initial instantiation time-point onto unknown "upshot" placeholder variables to be kept ready during a series of "Stage Two" LaParser() tests in advance of re-instantiating each of the several nouns as either the tsj subject, or the tio indirect object, or the tdo direct object of the Latin verb. If the upshot variable (ux1-ux5; uy1-uy5; uz1-uz5) wins selection, it already contains the Stage One time-point of instantiation which will now be transferred to the time-point of the subject or indirect object or direct object, so that each item may be re-instantiated with the properly tested and identified associative tags which play their essential role in the Natural Language Understanding of the input sentence.

$rsvp -- to hold an arbitrary value for a delay in thinking to wait for user input.

$rugov -- possible replacement for too strict "hlc".

$rv -- recall-vector for auditory memory. The time-point value of rv is first set in the AudInput module after an intervening space when the first character of an input-word is being passed into the AudMem module. The recall-vector rv is reset to zero in AudInput after a call to OldConcept or to NewConcept.

$seq -- subSEQuent @psy concept in a Subject-Verb-Object (SVO) idea. During user input, a subject-noun or a subject-pronoun is at first instantiated at the time-of-subject tsj without a "seq" tag in the conceptual flag-panel, because the "seq" can not be known in advance. Upon the input of a verb at the time-of-verb tvb point in the same clause or sentence, the English-parsing EnParser module uses the time-of-subject tsj flag to re-instantiate the subject-concept with the numeric psi concept-identifier of the verb inserted retroactively into the seq slot in the flag-panel of the subject-concept. Subsequently, if a direct object occurs in the input stream of the clause or sentence, EnParser uses the time-of-verb tvb flag to insert the psi of the direct object into the verb-panel as the seq of the verb. For the direct object itself, the time-of-direct-object tdo flag may be used to ensure that a seq of zero is assigned to the direct object as the final element in the Subject-Verb-Object (SVO) clause or idea.
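
A hedged Perl sketch of this retroactive tagging, using a toy @psy array of hash flag-panels rather than the real ghost.pl storage, with invented concept numbers for the verb and the object:

  #!/usr/bin/perl
  use strict;
  use warnings;

  my @psy;    # toy conceptual array: one flag-panel per time-point

  my $tsj = 1;                               # time-of-subject: seq not yet known
  $psy[$tsj] = { psi => 701, seq => 0 };     # 701=I as the subject

  my $tvb = 2;                               # time-of-verb
  $psy[$tvb] = { psi => 820, seq => 0 };     # invented verb concept
  $psy[$tsj]{seq} = $psy[$tvb]{psi};         # retroactively point the subject's seq at the verb

  my $tdo = 3;                               # time-of-direct-object
  $psy[$tdo] = { psi => 540, seq => 0 };     # invented object concept; its own seq stays zero
  $psy[$tvb]{seq} = $psy[$tdo]{psi};         # the verb's seq now points at the direct object

  printf "subject seq=%d, verb seq=%d, object seq=%d\n",
         map { $psy[$_]{seq} } $tsj, $tvb, $tdo;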

seqdob -- for direct object transfer within InFerence

$seqneed -- noun/pronoun or verb needed as a "seq"

$seqpsi -- concept to which activation should be spread in the SpreadAct module for spreading conceptual activation.

seqrvx -- for rvx transfer of auditory recall-vector within InFerence

$seqtkb -- subSEQuent Time-in-Knowledge-Base – is used only within the InFerence() module to latch onto the specific time-point in memory of a verb which was linked in the past to a concept now occurring within user input as a predicate nominative which identifies a class of entities from which an inference can be drawn and assigned to the subject of the user input. For instance, if the AI knows "Boys play games" and the user inputs "John is a boy," the old verb "play" can now be used to infer, "John plays games," because John is a boy.

$seqverb -- subSEQuent-concept VERB – is an interstitial carrier of a verb-identifier in the InFerence() module, permitting a verb which was used in old knowledge to be used as part of an inference of new knowledge and as part of a question seeking confirmation or refutation of an inference.

$snu -- subject-number as parameter for verb-selection.

$spacegap -- a gap to add one post-word space in the Speech module.

$spt -- blank space time before start of a word

$stemgap -- for avoiding false AudRecog() stems

$subject -- subject for parser module

$subjectflag -- initial default for NounPhrase()

$subjnom -- subject nominative – is a concept identified in the OldConcept() recognition module as the subject of an input causing the AI to make an inference.

$subjnum -- (subject number) for agreement in grammatical number between a subject and a predicate nominative noun.

$subjpre -- subject-$pre to be held for verb in parsing.

$subjpsi -- subject-concept parameter to govern person of verb-forms.

$sublen -- length of AudRecog() subpsi word-stem

$subpsi -- for AudRecog() of sub-component wordstems

$svo1 -- subject -- item #1 in subject-verb-object

$svo2 -- verb -- item #2 in subject-verb-object

$svo3 -- indirect object -- item #3 in subject-verb-object

$svo4 -- direct object -- item #4 in subject-verb-object

$t -- lifetime Ghost AI experiential time "$t"

$t2s -- auditory text-to-speech index

$tai -- time of artificial intelligence diagnostics

$tbev -- time of be-verb for use with negjux negation-flag. In the OldConcept module, tbev is set for any 800=BE verb and is used for inserting a jux of 250=NOT into the flag-panel of a negated be-verb. The tbev flag may be reset to zero in the Sensorium module after any input has been processed by the AudInput module.

tcj -- time-of-conjunction -- conceptual flag-panel tag for a conjunction.

tdj -- time-of-adjective -- conceptual flag-panel tag for an adjective.

tdv -- time-of-adverb -- conceptual flag-panel tag for an adverb.

$tdo -- time-of-direct-object for a parser module.

tdt -- time-of-dative -- conceptual flag-panel tag for instantiating a word in the dative case.

$text -- a general variable to hold text, especially for incrementing by concatenation.

tgn -- time-of-genitive -- for instantiating a word in the genitive case.

tia -- time-of-instrument -- for instantiating a word in the instrumental (Russian) or ablative (Latin) case.

tin -- MindForth time-of-input for interactive display. This variable lets the Forthmind show the contents of the Psy conceptual array and the Aud auditory array immediately prior to a display of current input and output. For purposes of troubleshooting, a temporary adjustment may be made to show recent memory engrams a little further back in time, such as when a silent inference has been made by the InFerence module and is being adjusted retroactively by the KbRetro module.

$tio -- time-of-indirect-object for parser module.

$tkb -- time-in-knowledge-base of an idea. The EnParser() module stores time-of-verb $tvb in the k[13] $tkb slot of the subject of a sentence being stored at the time-of-subject $tsj time-point in the Psy conceptual array. The EnParser module stores $tkb as equal to $tdo (time of the direct object) in the Psy array row of the verb at time-of-verb $tvb. Thus for subjects $tkb means "verb" and for verbs $tkb means "direct object". No $tkb is stored for a direct object, because the associative tags go from subject to verb to object (SVO) but no further.

tkbn -- time-in-knowledge-base-noun -- is set in the InFerence() module as the concept-array time-point of the subject-noun in a yes-or-no question to be asked of the human user by AskUser() to confirm or negate an InFerence being made by the logical InFerence module.

tkbo -- time-in-knowledge-base-object -- is set in the InFerence() module as the concept-array time-point of the direct-object-noun in a silent InFerence to be confirmed or negated retroactively by the KbRetro() module after a human user has responded to a question posed by the AskUser() module. May be similar to tdo in the EnParser() module, but tkbo is the last of three consecutive time-points in the silent InFerence created by the InFerence() module for automated reasoning.

$tkbprep -- time-in-knowledge-base of a preposition, necessary for the EnPrep English-preposition module because the basic tkb flag, although obtainable at the time of fetching a preposition, is subject to mutation when the output of a preposition results in the re-entry of the preposition and the assigning of a new tkb value.

tkbv -- time-in-knowledge-base-verb -- is the time-point of a verb-concept in an AskUser() question for a human user to confirm or deny the truth of a logical InFerence made by the AI.

tmg -- time-of-midgap for the Mens Latina artificial intelligence in Latin to insert a silent pronoun serving as the unspoken subject of a Latin verb lacking a stated subject.

$tnpr -- time-of-noun-preposition – is for use in the EnNounPhrase module to convert a value stored as a "tpr" flag into specifically a "tnpr" flag so that the EnPrep English-preposition module may fetch and output not only a preposition related to the noun but also the object of the preposition. The "tnpr" flag is meant to distinguish between prepositional phrases that refer only to a noun and prepositional phrases that refer to a verb-phrase possibly including a noun as an object of the verb. The "tnpr" flag allows the chaining together of multiple prepositional phrases such as "the man in the street with an umbrella in his hand".

toc -- time-of-old-concept -- in the Mens Latina Latin AI, is obtained from the index "i" in the AudRecog() auditory recognition module so that other modules in the AI Mind may zero in not on the auditory engram of the word but on the simultaneous conceptual node for the recognized word, in order to harvest or fetch the values in the flag-panel of quasi-neuronal associative tags connecting the one recalled node of the concept with other concepts.

$topic -- @psy topic for a question to be asked

$tpp -- time-of-preposition for parsing.

tpr -- conceptual flag-panel tag for time-of-preposition.

tpu -- time-penultimate before current I/O.

$trigger -- a trigger for Volition() to call Motorium()

$tru -- tru(th) variable serving as a flag-panel tag to hold dynamically the believed reliability or truth-value of an idea. For an inference validated by a human user, the truth-value may be set to a positive value arbitrarily chosen by the AI Mind Maintainer. Since a positive truth-value may be carried forward in recollections of the validated inference, $tru must be reset to zero at the end of a thought-generation module such as the Indicative module. AI coders may use the truth-value to establish modes of thought which disregard ideas in the knowledge base lacking a positive truth-value.

$tseln -- time of selection of noun.

$tselo -- time of selection of object of preposition -- can be used to insert associative tags linking the preposition itself and its object.

$tselp -- time of selection of preposition -- is used in the EnPrep English-preposition module to make sure by calculation that a verb being sought as used with a preposition is in close temporal proximity to the preposition, so that a query may be answered correctly with a response that mentions the query-subject qv1psi and the query-verb qv2psi and the selected preposition at the beginning of a prepositional phrase.

$tsels -- Time of SELection of Subject -- is used locally in EnNounPhrase() and in RuNounPhrase() to keep track of the experiential time $t at which a stored concept is selected as a subject, so that the $tsels can be used to revisit the engram at $tsels to change its activation-level in the course of neural inhibition.

$tselv -- time of selection of verb (for neural inhibition)

$tsj -- time-of-subject for parsing. The InStantiate module sets the tsj flag when a noun or a pronoun in the (dba=1) nominative case is passing through InStantiate and tsj has a prior value of zero. If the AI is receiving or re-entering a compound sentence of two main clauses joined by a conjunction, care must be taken to reset tsj to zero at the end of each indicative clause, so that the tsj of the second clause may properly be set. The tsj flag is necessary for the retroactive insertion of the seq flag and the tkb flag, and for extracting subject-data such as the num flag and the mfn flag, as for instance when a pronoun is about to replace a noun and must have the proper number and gender.

$tsn -- time of input as $seqneed time for InStantiate()

$tult -- t penultimate, or time-minus-one.

$tvb -- time-of-verb for parsing. The EnParser module sets "t - 1" as the tvb if the word being parsed is a verb by part-of-speech. InStantiate uses the time-of-verb tvb to fill the tkb-slot in a verb-panel with the time-location of the direct object of the verb. EnParser uses the time-of-verb tvb to find the numeric verb-concept that will be the pre of a direct object being parsed. EnParser also uses the time-of-verb tvb to store the time-of-direct-object tdo as the tkb in the flag-panel of the verb. Therefore the tvb-flag needs to be reset to zero not during but after AudInput, when program-flow has returned to the Sensorium module.

$tvpr -- time-of-verb-preposition – is used in the EnVerbPhrase module to convert a value stored as a "tpr" flag into specifically a "tvpr" flag so that the EnPrep English-preposition module may fetch and output not only a preposition related to the verb-phrase but also the object of the preposition. The "tvpr" flag is meant to identify a prepositional phrase that refers to a complete verb-phrase and not merely to a noun within the verb-phrase. For example, in the prepositional phrase at the end of "God does not play dice with the universe.", the preposition "with" is logically and semantically connected more to the verb-phrase "play dice" than to the noun "dice" all by itself.

$unk -- all-purpose $unk (unknown) for troubleshooting

$us1 -- the UpStream noun number one for EnArticle to keep track of;

$us2 -- the UpStream noun number two for EnArticle to keep track of;

$us3 -- the UpStream noun number three for EnArticle to keep track of;

$us4 -- the UpStream noun number four for EnArticle to keep track of;

$us5 -- the UpStream noun number five for EnArticle to keep track of;

$us6 -- the UpStream noun number six for EnArticle to keep track of;

$us7 -- the UpStream noun number seven for EnArticle to keep track of.

$usn -- an upstream number-variable to coordinate the rotation of the us1-us7 variables in their role as the holders of noun-concepts mentioned so recently that they warrant the insertion of the definite article "the" by the EnArticle() module.

$usx -- a transfer variable for InStantiate to transfer the concept number of an incoming noun to whichever us1-us7 upstream variable is up next in the rotation of up to seven recently mentioned noun-concepts, so that the EnArticle module may insert the definite article "the" before any noun currently under discussion in a conversation.
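
A simplified sketch of the rotation, replacing the seven scalars $us1 .. $us7 with one small array purely for brevity (an assumption for illustration, not the ghost.pl layout; the helper subroutine names are invented):

  #!/usr/bin/perl
  use strict;
  use warnings;

  my @upstream = (0) x 7;   # stands in for $us1 .. $us7
  my $usn      = 0;         # rotation index among the seven slots

  sub remember_noun {       # the role played by $usx in InStantiate()
      my ($usx) = @_;
      $upstream[$usn] = $usx;      # overwrite the oldest slot
      $usn = ($usn + 1) % 7;       # advance the rotation
  }

  sub needs_the {           # EnArticle()-style test: was this noun mentioned recently?
      my ($noun) = @_;
      return grep { $_ == $noun } @upstream;
  }

  remember_noun(540);       # invented concept number for a noun just mentioned
  print needs_the(540) ? "insert 'the'\n" : "no definite article\n";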

ux1 -- unknown Latin case potentially nominative.

ux2 -- unknown Latin case potentially genitive.

ux3 -- unknown Latin case potentially dative.

ux4 -- unknown Latin case potentially accusative.

ux5 -- unknown Latin case potentially ablative.

uy1 -- unknown Latin case potentially nominative.

uy2 -- unknown Latin case potentially genitive.

uy3 -- unknown Latin case potentially dative.

uy4 -- unknown Latin case potentially accusative.

uy5 -- unknown Latin case potentially ablative.

uz1 -- unknown Latin case potentially nominative.

uz2 -- unknown Latin case potentially genitive.

uz3 -- unknown Latin case potentially dative.

uz4 -- unknown Latin case potentially accusative.

uz5 -- unknown Latin case potentially ablative.

$vault -- size of MindBoot() sequence in time-points.

$verbcon -- verb-condition for seeking (in)direct objects. $verbcon is set to a positive unitary one in the EnParser module when a verb comes in during input and remains at one while any indirect or direct object comes in. After an input sentence, $verbcon is reset to zero.

$verblock -- for subject-noun to lock onto seq-verb. The $verblock is a time-point in conceptual @psy memory where the $seq of a subject-noun is located, so that any crucial consideration such as the negation of a verb will be found when the verb-phrase module fetches the verb-concept from memory. If the thinking process finds and activates a particular subject-noun in memory, the $tkb of the subject noun becomes the $verblock so that a particular verb at a particular time will be selected by the English or Russian verb-phrase module.

$verbprsn -- verb-person -- reverting to zero for infinitive forms.

$verbpsi -- $psi concept-number of verb in the @psy array

$vphraud -- holds aud-fetch of verb-form for Speech() module

wasvcon -- query-condition for what-AUXILIARY-SUBJECT-VERB queries, such as "What do robots need?", so that SpreadAct may activate an indicated subject like "robots" and an indicated verb like "need" to retrieve a memory that completes the thought "Robots need... " in response to a query.

$whatcon (what-condition) -- SpreadAct() flag for condition of answering a what-query.

$wherecon -- a flag for the condition of answering a where-query. The flag is set to a positive one ("1") if the word "where" passes through the OldConcept module, and serves as a trigger for the EnThink module to call the SpreadAct module.

$whocon -- flag for condition of answering a who-query. The InStantiate module sets whocon to a positive one ("1") when the input of the interrogative pronoun "who" is detected.

whoq -- flag to permit the AskUser module to ask a who-question such as "WHO ARE YOU".

$yncon -- yes-or-no condition – is a status flag that the InFerence module sets to a positive unitary one ("1") as a signal for the thinking module to call the AskUser module to ask a yes-or-no question seeking the validation or disavowal of a silent inference.

$ynverb -- yes-or-no-verb – identifier of a verb to be used in AskUser() for the asking of a question expecting a yes-or-no answer.


Resources

http://doc.perl6.org/language/variables

http://doc.perl6.org/language/variables#Sigilless_variables

http://mind.sourceforge.net/variable.html
