Having justified the virtues of “The Course” in contrast to other methods of learning, I’d now like to make a similar case for the universal value of courses taught at the undergraduate level (the teaching level of most, if not all, MOOC classes).
Speaking broadly (and North Americanly, if that’s a word), let me first segment formal education into K-12, undergraduate, and graduate levels.
At the K-12 level, the student body is maturing but not quite mature, which means teachers working in these grades will always need to dedicate a certain amount of energy to civilizing their charges.
At the very earliest grades, this involves supervision (i.e., babysitting) coupled with gross behavior modification. But as students approach the end of their high-school years, teachers are focused on the more fine-tuned work of helping students develop the independent thinking skills required for college or work.
At the other end of this timeline (graduate school), the ability to think and work independently is assumed to be fully developed, which is why graduate students are charged with creating new knowledge using all of the abilities and techniques they’ve developed previously.
It is between K-12 and (for those who choose it) grad school, during the undergraduate years, that the ability to learn is firmed up and the ability to learn independently flowers (one hopes). This happens via undergraduate survey courses (which expose students to fields of knowledge they’ve not encountered before) and major requirements (which demonstrate how knowledge builds within a discipline).
And since college is (or should be) voluntary, teachers can assume students sitting in their classrooms are interested in the subject matter and ready to learn it, meaning teaching (not behavior modification) can be the primary task inside the classroom.
Now some people working today in higher ed might find my description a bit rosy, especially in such a vocation-oriented educational age. But even students in that large cohort of undergraduate business majors (the most popular undergraduate major today) have to stop and take some courses outside their major during their four years in school, courses which can have career- and life-transforming effects on them, whatever field they ultimately decide to pursue (including business).
To take one example, I happened to major in chemistry when I went to college in the 80s, but never set foot in a lab after graduating (having decided to pursue journalism as a career). And because of the scientific training I had received, I was able to land freelance jobs at newspapers that were launching new science and technology sections, papers that would otherwise never have given me (or anyone else without a shred of journalistic experience) the time of day.
Such transferable skills are really at the core of an undergraduate education, which is why I’m such a booster for a wide-ranging liberal arts college experience over a vocational one: you never know which knowledge and skills are going to be applicable during your post-undergraduate life.
Their wide applicability may be why undergraduate-level classes are the core of most educational outreach projects, including extension school programs, recorded lectures from sources such as Great Courses or iTunes U, and the newest member of this particular family: the MOOC.
And to digress a bit before this piece is over: if the teacher/professor is a surrogate parent in K-12 and a peer in graduate school, in an undergraduate college environment they need to fill the role of a sage (there, I said it).
And why should this be considered such a terrible thing? After all, my undergrad professors (including the ones I’m taking classes from currently) know far more than I do about the subjects they teach. And while I expect to be treated like an intelligent adult when I’m being taught by them, I’m much more interested in being exposed to their wisdom and knowledge than in being treated as their equal.
If I were enrolled in graduate studies, I wouldn’t tolerate the scale and distance of massive online classes for a minute. But for well-taught (often brilliantly taught) undergraduate-level courses (especially convenient and free ones), I’m ready to sit in the most gigantic of virtual classrooms, as long as the teachers in front of me continue to do a good job challenging me as they expose me to the wonders of the world.
Paul Morris says
This highlights one of the big differences between the English and American university systems. As an undergraduate at an English university you would be unlikely to be exposed to anything outside your specialist subject. It doesn’t even make sense to talk about a ‘major’ subject; a first degree is composed solely of courses relevant to the chosen subject. Speaking of which, that is another difference: the subject (i.e., the major in US terms) is chosen even before applying, and there is little chance to make a radical change in direction once admitted. This is, of course, a reason why English degrees are generally three years in length rather than four.
So long as US universities have a General Education requirement, it is not going to be the case that “teachers can assume students sitting in their classrooms are interested in the subject matter”; as is readily acknowledged, many students will do the bare minimum required to get through the Gen Ed courses. This shouldn’t be interpreted as an attack on the broader base of US degrees; in fact I think it is far more in line with the ideal of a liberal education espoused (but not practised) by English universities.