Do you remember your first day of school? Your seventh birthday party? The embarrassing outfits your parents dressed you in for family photos?
While some memories are better left buried deep in the burrows of our brain, never to resurface, memory formation and recollection are crucial concepts that heavily guide human experience.
When paying attention to our surroundings, we sift through piles of stimuli, process incoming information, and output a specific code. For instance, when learning about a specific molecule in chemistry class, you form a characteristic code or construct that may include its melting point, solubility, and bonding properties.
Each memory code has a unique shelf life that depends on the complexity of its encoding process. A code that is given semantic value, which involves a deeper level of processing, is more likely to persist and be recollected in the future. So is a code that is built on multiple associations.
Understanding the context in which a chemical compound is used creates a stronger memory code than merely memorizing the compound’s structure. Likewise, something learned in the presence of a particular smell is easier to remember later when that smell is present, because your mind forms an association between the two.
Memory storage
Once information is captured by our senses, it can enter short-term memory—more recently dubbed working memory—which acts as an interim storage facility. Information is housed in working memory for only a few seconds, but we can extend this by engaging in repetitive rehearsal. Nervously repeating an important number in your head secures its place in this short-term lodging.
George A. Miller, a pioneer in cognitive psychology, showed that working memory has an occupancy limit of seven items, plus or minus two.
Generally, we are able to store five to nine pieces of information in working memory. We’ve even adapted to this limitation by constructing numerical systems that satisfy this range. Rather than separating the individual digits of a phone number, we use dashes to group the digits into larger units.
In this way, only three items, as opposed to nine, occupy our working memory at once. This process is called chunking and can act as a great study tactic. You can group related ideas or break down information into smaller bits to facilitate learning and memory retrieval.
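The phone-number example above can be sketched in a few lines of code. This is a minimal illustration, not a cognitive model: the sample number and the 3-3-4 grouping are assumptions chosen to mirror the familiar dashed format.

```python
def chunk(digits, group_sizes):
    """Split a string of digits into groups of the given sizes,
    mimicking how we chunk a phone number into larger units."""
    chunks, start = [], 0
    for size in group_sizes:
        chunks.append(digits[start:start + size])
        start += size
    return chunks

number = "6135550199"              # ten separate items to hold in mind...
groups = chunk(number, [3, 3, 4])  # ...become just three chunks
print("-".join(groups))            # prints 613-555-0199
```

Ten digits would crowd working memory on their own; grouped this way, the same information occupies only three slots.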
The longer information stays in working memory, the more likely it will enter long-term memory, an unlimited storage facility where information is methodically categorized in a way that helps us remember.
We tend to cluster items that are closely related (i.e. apples and oranges), as well as organize information into networks that link together more loosely-related concepts (i.e. apples and the color red).
Long-term memory is divided into declarative and procedural memory. Facts about the world are declarative and require consciousness, whereas memories of how to do things—like riding a bike—are procedural and function subconsciously once they’re learned.
When new memories are created, corresponding neural pathways are formed in the brain. These pathways are reactivated when we remember past events, which then temporarily re-enter working memory, giving us a quasi-time-travel experience.
Recall and recognition are the main ways to retrieve stored memories. When remembering an abstract idea, we rely on recall, whereas recognition is used to remember events that involve physical objects. Depending on the content of a course and the style of testing, we may rely more on one method of retrieval than the other.
Why do we forget?
According to decay theory, memories become less prominent as time passes, eventually losing their neural blueprints. While this aligns with common sense, there is not enough research to support it.
Interference theory posits that forgetting is a consequence of competing memories—new memories can interfere with previously formed ones and vice versa. A 1989 study by Chandler showed that interference is likely to occur when students simultaneously enroll in related subjects.
This is important to consider, especially when taking a full course load. Adopting effective study techniques, including chunking, can help minimize some of this inevitable interference.
Other theories attribute forgetting to a failure of encoding or the absence of adequate cues during retrieval. Motivational theory even suggests that we are able—sometimes intentionally—to rid our minds of unwanted memories.
A more extreme example of forgetting is seen when damage occurs to the hippocampus, the brain’s main memory hub.
The case of patient H.M. is a famous example of the tragedies that strike when this structure is damaged. To treat his epilepsy, doctors removed parts of H.M.’s brain, including the hippocampus, leaving him unable to transfer information from working memory into long-term storage. While he was relieved of his epileptic seizures, he lost the ability to learn and remember subsequent events.
As the semester progresses and our plates get fuller, it can be a challenge to keep up and retain the mounds of information that each course presents. Luckily, gaining some insight into the operations of memory can help us take advantage of the biological backroads that exist and help us successfully navigate another term.
In the words of the legendary Dory: just keep swimming, just keep swimming.
Graphic by Sara Mizannojehdehi.