This is my processing of what I’ve experienced, having been a professional, full-time student until last year, when I finished my Ph.D. at 29. At this point, I’ve sat in classrooms far more than I’ve stood in front of them, and perhaps I can best illustrate the brokenness of the curriculum/semester/grading/course scheme through my experience taking Plant Taxonomy. “Plant tax” was an elective for me; I took various plant electives out of personal interest. I had found my passion in ornithology, and that gripped me through the semester and beyond. But I had an opportunity to learn about plants, and I took it. My retention, however, was embarrassingly low: I say that because at various points in my life since, I’ve tried to re-learn what I learned that semester. That, in and of itself, bothers me. Why didn’t it take the first time around? Did I not feel like I needed to know it at the time? (I still don’t “need to know” it now, but I choose to try to identify plants, hence going back over the skills taught in that class.) I do see it as a sort of brokenness in the academic system: I paid good money for a course whose skills I couldn’t remember when I actually wanted to use them. The course would have served me better once I had a reason, or at least a genuine desire, to learn about plants, as evidenced by my later attempts to teach myself and my realization of which skills I was missing.
I’m not sure how to fix this without thinking far outside the box. Perhaps there simply shouldn’t be the formality of four years and a degree packed into one block of time right after high school. Maybe you learn skills as you go, taking electives as part of job training. Maybe degrees run shorter (three years) before you take an internship or your first job in your field. That would be cheaper, and would help relieve the ballooning student debt burden currently in place. I know of some programs that already do something like this, and they seem to be among the most successful and acclaimed. Maybe undergraduate research becomes a requirement in year three in the science fields.
There could be many benefits to not being a full-time student for four years starting at age 17 (as I was). When I look back, I gawk at my immaturity, but I also see how life up to that point had trained me to be a full-time student (after all, it’s what we do until we reach college: we train to get good grades all day, five days a week, to get into the best school). I can honestly say I wasn’t so “ready to learn” the specialized skills of my field until later in life. I think I saw this phenomenon in physics majors who were returning students (and perhaps case in point: I was a physics major at one point in this journey). The returning students were focused and, quite frankly, brilliant. Was it because life experience had taught them what they wanted, giving them renewed direction in taking classes? Were they simply more mature than us? Did their more mature brains make them better students? Sometimes I feel like that last question gets at the truth: while my brain was certainly most ready for creativity, broad curiosity, and variety of information when I was younger, my older brain seems better at filing information into a classification system, and thus at deep learning. Or is it just that I know more, that I have a better “file cabinet” in my head? In any case, while I’ve lost the undergraduate discipline of working on homework for hours, I am tempted to revisit concepts I couldn’t place back then, to see how I would approach them now. It feels like I simply have more plain common sense, including about academic topics.
Herein lies the conundrum: I was privileged to go to college, with parents who were able to pay my full tuition. That meant I got to be a carefree undergrad far from home, and thus meet a whole new community of lifelong friends outside of everything I grew up around. It broadened my horizons and taught me new things about life. As many say, the most important things I learned in undergrad weren’t in the classroom, so I hesitate to advocate removing this structure when I loved it and found it so beneficial. It taught me self-reliance: how to solve problems, function in the real world, and decide who I wanted to be.
However, perhaps here the disconnect is obvious. While my undergrad experience was educational by definition, in terms of skills and knowledge it provided only the basics for what I do now in my career. Maybe we have to stop pretending that a four-year undergrad is some grand job-training center and admit it’s more of a life-training center, mostly for people who had similar (and relatively sheltered) life experiences. Some of that learning does happen in the classroom; my exposure to many different ideas and fields helped shape the way I think as a scientist, and that’s invaluable, if somewhat intangible. Nor can I dismiss what the freedom to be a computer scientist, then a physicist, and ultimately a biologist did for me. Maybe the point of undergrad really is to expose you to many ideas and make you a better, more well-rounded thinker (hence the required core areas outside your major, though given the cost in today’s student-debt world, I’m not sure I like that anymore). What it arguably didn’t do is train me to do my job, and that’s OK for me, because “I” had the money to spend and the time to figure it all out along the way. I’ll pause this tangent, though, before it devolves into a degree-inflation rant (which I’ll save for another post).
I know I’m not saying anything new here; experts have rehashed, and continue to rehash, these ideas all the time. I’m thinking ahead to continuing my career in academia: what I believe the institution I work for should be, and ultimately what my career as a professor should be, in order to best serve the students I teach. I feel I need to think in terms of ideals in order to steer things in the right direction.