25 Cards in this Set
- Front
- Back
Word Frequency Effect
|
words that are more frequent are recognized faster (you're more likely to think someone said "baby" than "Babylon"), so Forster ordered the words in his lists by frequency (though context and grammar also make a difference)
|
|
Cohort model
|
Marslen-Wilson; parallel (many words are active at the same time). Example: "he packed the picnic in the ba__": /b/ cohort, /ba/ cohort, /bas/ cohort, /bask/ cohort. You build a cohort of candidate words around the sounds heard so far: all words beginning with /b/ are active until you hear /ba/, when "bike" and "box" drop out; "bashful" drops out because it's not a noun (so words also drop out when they don't match the semantic or syntactic context). Bottom-up; the process continues until only one word is left. Problems: see notes.
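The winnowing step above can be sketched in a few lines; this is only a phonological filter (the five-word lexicon is invented, and real cohorts are also pruned by syntax and semantics):

```python
# Minimal sketch of cohort-model winnowing. The lexicon is an
# illustrative assumption, not from the flashcards.
LEXICON = ["baby", "bashful", "basket", "bike", "box"]

def cohort(heard, lexicon=LEXICON):
    """Words still active given the phonemes heard so far."""
    return [w for w in lexicon if w.startswith(heard)]

print(cohort("b"))     # all /b/ words are active
print(cohort("ba"))    # "bike" and "box" drop out
print(cohort("bask"))  # recognition point: only "basket" remains
```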
|
|
Recognition Point
|
in the cohort model, the point at which only one word is left and lexical access occurs
|
|
TRACE model
|
parallel and connectionist; McClelland & Elman. Three layers (features, phonemes, words): phonemes send excitatory connections to the words that contain them, phonemes inhibit other phonemes, and words inhibit other words. As you hear a word, activation spreads through the layers; over time only one word is left active.
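The excite/inhibit dynamic can be sketched as a toy word layer; the three-word lexicon, phoneme strings, and weights here are invented for illustration, not McClelland & Elman's actual parameters:

```python
# Toy interactive activation in the spirit of TRACE: matching phonemes
# excite a word, and active words inhibit their rivals.
WORDS = {"basket": "bask", "bashful": "bash", "baby": "bab"}

def step(acts, heard, excite=0.2, inhibit=0.1):
    """One update of word activations given the phonemes heard so far."""
    new = {}
    for word, phonemes in WORDS.items():
        support = sum(1 for p, h in zip(phonemes, heard) if p == h)
        rivalry = sum(acts[w] for w in acts if w != word)
        new[word] = max(0.0, acts[word] + excite * support - inhibit * rivalry)
    return new

acts = {w: 0.1 for w in WORDS}
for _ in range(20):
    acts = step(acts, "bask")

winner = max(acts, key=acts.get)
print(winner)  # "basket" ends up dominating; its rivals are suppressed
```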
|
|
connectionism
|
see TRACE model (above)
|
|
node
|
a point of intersection, as in the TRACE model: phoneme nodes are linked together in connectionist and parallel processing
|
|
“look at that car” / “go on that tanker”
|
the segmentation problem (where does one word end and the next begin?); TRACE doesn't account for this
|
|
Shigarette
|
we know it's "cigarette" because no other word ends in -igarette
|
|
Lexicon
|
mental dictionary
|
|
Lexical Access
|
Stage of word recognition at which stored information about a word becomes available for further processing
|
|
Word monitoring
|
using the same target words in different spoken-sentence contexts shows that context affects how fast we register the target word; people push a button as soon as they hear the word they were shown at the beginning
|
|
Lexical Decision Task
|
people push a button to indicate whether a string is a real word or a nonsense word; measures the time it takes to recognize words
|
|
Morphological Representation
|
how a word is represented in terms of its morphemes, e.g. whether "walked" is stored whole or decomposed as walk + -ed
|
|
Novel Words
|
neologisms, new words like “meme”
|
|
fixed word meanings
|
the question of whether a word has one fixed meaning (core semantics; necessary & sufficient conditions); a way to figure out word meanings
|
|
Necessary and Sufficient conditions
|
fixed; e.g., defining "table" as flat with four legs says that anything flat with four legs is a table. E.g., for John to be a bachelor it is necessary that he be unmarried, male, and adult; conversely, it is sufficient to know that John is a bachelor to know that he is male.
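A necessary-and-sufficient definition behaves like a strict boolean conjunction; a minimal sketch using the bachelor example (the predicate name and arguments are illustrative):

```python
# Each condition is necessary; all three together are sufficient.
def is_bachelor(unmarried, male, adult):
    return unmarried and male and adult

print(is_bachelor(True, True, True))   # True: all conditions met
print(is_bachelor(False, True, True))  # False: "unmarried" is necessary
```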
|
|
Core Semantics
|
fixed; words have a basic core, a definition, but having those qualities doesn't mean a thing must be that word. Describes commonalities/core meanings, but it doesn't always work (e.g., a bird has feathers, wings, etc., but a plucked bird is still a bird); the core is difficult to define.
|
|
Fuzzy edges
|
words open to interpretation; e.g., whether a container is a vase, a cup, or a bowl can depend on what's in it
|
|
Family Resemblances
|
the example of games: trying to define "game," we see much overlap in similarities between types of games, but there is no clear-cut boundary
|
|
Prototype Theory
|
fuzzy; we have a common, generic image of what a cat is: more of a concept, a shared understanding. Feature-based, but the features are not necessary & sufficient, and we are faster at identifying things closest to our prototype. Copes with damaged examples (a plucked chicken is still a chicken), but how do you pin down the list of features? It goes by appearance only, and not everything has a single prototype.
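Prototype matching can be sketched as "pick the category whose prototype shares the most features"; the feature sets below are invented, not a claim about the theory's actual feature lists:

```python
# Toy prototype classifier: compare an item's features to one generic
# feature set per category.
PROTOTYPES = {
    "bird": {"feathers", "wings", "beak", "flies"},
    "fish": {"scales", "fins", "swims"},
}

def classify(features):
    """Category whose prototype overlaps the item's features the most."""
    return max(PROTOTYPES, key=lambda c: len(PROTOTYPES[c] & features))

print(classify({"feathers", "wings", "beak"}))  # "bird"
print(classify({"wings", "beak"}))              # still "bird": a plucked bird
print(classify({"scales", "fins"}))             # "fish"
```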
|
|
Exemplar Theory
|
fuzzy; we store multiple ideas of what a thing can look like, and our experience dictates the exemplars. Slower, but it addresses problems with prototype theory: function matters, and we compare an item to different exemplars for similarities and dissimilarities (e.g., an owl).
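The contrast with prototype theory is that the item is compared to stored examples rather than one averaged image; a nearest-neighbour sketch (the exemplars are made up):

```python
# Exemplar-style classification: label of the most similar stored example.
EXEMPLARS = [
    ({"feathers", "wings", "flies", "small"}, "bird"),
    ({"feathers", "wings", "hoots", "nocturnal"}, "bird"),  # an owl exemplar
    ({"scales", "fins", "swims"}, "fish"),
]

def classify(features):
    """1-nearest-neighbour: label of the exemplar with the most overlap."""
    _, label = max(EXEMPLARS, key=lambda ex: len(ex[0] & features))
    return label

print(classify({"hoots", "nocturnal", "wings"}))  # "bird": matches the owl
print(classify({"fins", "swims"}))                # "fish"
```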
|
|
Cognitive Economy
|
how meaning is stored: the brain uses the least effort possible and is not big enough to store every meaning separately, so shared properties are stored once rather than repeated for each word
|
|
Hierarchical Network Model
|
memory is organized hierarchically on different levels, and properties stored closer to an item are verified faster (we are quicker to verify "a canary is yellow" than "a canary eats" because "yellow" is stored at the canary level, while "eats" is stored higher up, with animals in general). See notes for advantages.
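The level-by-level lookup can be sketched as climbing a tiny made-up hierarchy; properties live at the highest node they apply to (cognitive economy), so each level climbed costs time:

```python
# Toy hierarchical network: each concept has a parent and its own properties.
NETWORK = {
    "animal": {"parent": None, "props": {"eats", "breathes"}},
    "bird":   {"parent": "animal", "props": {"has wings", "flies"}},
    "canary": {"parent": "bird", "props": {"is yellow", "sings"}},
}

def verify(concept, prop):
    """Levels climbed to find the property, or None if it is never found."""
    steps = 0
    while concept is not None:
        if prop in NETWORK[concept]["props"]:
            return steps
        concept = NETWORK[concept]["parent"]
        steps += 1
    return None

print(verify("canary", "is yellow"))  # 0 steps: stored at the canary node
print(verify("canary", "eats"))       # 2 steps: inherited from "animal"
```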
|
|
Semantic Net Model
|
words are organized based on the strength of past associations (poke, go, molasses: all point to "slow"). The faster people identify a word from its associates, the more strongly interconnected the words are; interrelations are tested with, e.g., the remote associates test. A feature-based model.
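The remote-associates idea can be sketched as summing association strengths from several cues; the link weights below are invented for illustration:

```python
# Toy association net: each cue word links to associates with a strength.
LINKS = {
    "poke": {"slow": 0.9, "stick": 0.4},
    "go": {"slow": 0.7, "stop": 0.8},
    "molasses": {"slow": 0.8, "sweet": 0.6},
}

def common_associate(cues):
    """Word whose summed association strength across all cues is highest."""
    totals = {}
    for cue in cues:
        for word, strength in LINKS[cue].items():
            totals[word] = totals.get(word, 0.0) + strength
    return max(totals, key=totals.get)

print(common_associate(["poke", "go", "molasses"]))  # "slow"
```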
|
|
Lexicalized
|
when thoughts are put into words
|