Every course charted its own unique journey to the final paper, project, or exam, but for the most part I could count on one aspect of each class unfolding in a predictable way - the distribution of the syllabus, an ancient tradition that repeated the same familiar pattern in classrooms and lecture halls throughout my four years at college. The ceremony would start with the professor dividing the neatly stacked papers into piles, then handing each one to a student in the front row (usually to those sitting at the far ends). They would then follow the implied command to take one and pass the rest along, the rustling action snaking up and down the rows like a slow wave tumbling toward me in the bleachers of Fenway Park. When I calculated which of the piles would reach me first, I would casually turn my head in the general direction of my neighbor in preparation for the coming exchange, my averted gaze acknowledging the possibility that this classmate, swiveling in my direction to discover me staring at his or her head, might suspect I'd been doing so for the past ten minutes. Even if I did avoid this silent scandal, there was no time for celebration because the next challenge was immediate - I would try to take the stack, by then looking more like fanned playing cards than the just-unwrapped box of copy paper it had started out as, receiving this messy pile while chancing brief eye contact as I muttered my thanks, a feat of dexterity that often did little more than prove the importance of "eye" in the expression "hand-eye coordination"; this is how to get a paper cut on the first day of class. These catastrophes were no match for the final indignity of turning in the other direction to find a student already in possession of my suddenly excess wares - the bubble having burst, I would wave to the professor from my seat like the lone inhabitant of a desert island trying to get the attention of a passing ship; by my senior year I learned it was expedient to simply bring the hot potato to the front of the classroom.
The ritual satisfied, I could finally turn my attention to the syllabus itself, which I would glance at for a few seconds before placing the bloody document into the folder I'd designated specifically for the class, where it would remain untouched for the remainder of the semester.
It's taken me just over ten years, but I think I've finally realized that these seemingly useless syllabuses have some value, which I've separated into two components. The first is that they created boundaries for the learning process by defining the relevant knowledge for reference throughout the course. The list of sources, lectures, and assignments in college syllabuses always felt like too much detail, but I wonder in hindsight if I tucked them out of sight upon receipt because I never colored outside the lines; I prepared for exams by studying from the assigned textbooks. The challenge I've discovered since college is how to draw up my own list of learning materials. If I've read one book on a topic, should I read another book? Should I listen to a podcast instead, or watch an informational program? Should I just go outside, or even online, and start talking to people about the topic? This is where I miss the second valuable feature of a syllabus - it established clear benchmarks for what counted as a successful student. The answer to the above questions could be condensed into one of those useless aphorisms favored by those who confuse irrefutability with wisdom - you should keep learning until you've learned enough. But how much is enough? I recently read a book by an author who claimed to go through over one hundred management books per year, so I know I'm not the only one who could benefit from a little clarity about how much learning is enough.
This brings me to the larger issue on my mind lately, which has to do with the challenge of continuing to learn once a student leaves the education system. The average person seems unable to create a self-directed learning process, a gap I feel is reflected in our reliance on external markers of learning such as degrees, certificates, or endorsements - we require proof because there is no confidence in the notion that people are continuously educating themselves. I don't blame anyone who feels this way because it's a logical consequence of an education system that leaves little to no time for teaching students how to teach themselves - if we didn't learn how to learn in school, where else would we learn it? The events of the past year seem to shine a spotlight on this problem, evidenced by a few conclusions on topics that I've had no choice but to learn about on my own thanks to their novelty - the necessity of prioritizing action over intent in the context of racial justice, the farce of allowing masks to substitute for distancing in outdoor settings, the facts of mail-in voting; I was expecting something close to universal agreement on these conclusions, but there is plenty of dispute. I'm open to changing my mind, but when I listen to someone with a different perspective I invariably discover nothing at the core of the opposition - no evidence of thinking, contemplation, or learning, which means their most significant failures are unintentional; it reminds me of how English-only speakers fake their way through Spanish by adding "-o" to the end of every noun, failing to grasp the subtle misogyny revealed by their linguistic shortcomings.
The problem isn't reaching the wrong conclusion; the problem is never trying to learn. A student who stops learning upon graduation soon loses not just the immediate knowledge of a specific topic, but also the broader sense of the obstacles common to the journey of discovery; the failure to turn on the light is not just darkness, but also the compounding error of stubbing a toe on the coffee table. But trying to discover is hard when we lack the skills for discovery - a college graduate knows how to learn within the context defined by a syllabus and how to measure progress against its benchmarks, but the skill of extracting the right knowledge from new territory goes untaught. It's like preparing for an Arctic expedition to study glaciers only to find upon arrival that all the ice is gone - the only thing left to do is see what was hidden for all those years, even if you weren't trained for it.
It seems foolish to hope that life is going to start handing us syllabuses anytime soon, so I think those interested in lifelong study must resign themselves to wandering beyond the safe borders that were once considered off-limits or out of reach - where unreliable sources sound convincing, opinions serve as facts, and key findings from popular studies cannot be replicated. Most significantly, the lifelong student must learn how to navigate the group inertia that refuses to acknowledge inconvenient truths, and there may be no more inconvenient truth than the fact that most of us have stopped learning. The consolation is that with such a long time horizon, far longer than any semester-length course, the questions of quality or quantity are less relevant than the question of energy - do you have a way to start learning so that you keep learning? Do you have the internal motivation to continue discovering even when those around you prefer to remain stuck in place? Do you have the confidence that you'll know when you've learned enough about a particular topic? There are no grades in this stage of the learning process, only progress - the report card measures whether you continue to learn, or not.