In my early days as an undergraduate student, I remember that a few of my professors had their own personal theory related to their discipline, and they included their own academic contribution in their classes, along with the most famous authors in the domain. During the semester, we saw many theories from a variety of international and renowned authors, compared the trends, followed the evolution of our field, and witnessed the differences between, say, European theoreticians, the ones in the United States, and those in Quebec. Then, at some point, some of our professors introduced us to their own, personal theory. Sometimes, it was an already existing theory to which they brought another point as a complement, or a nuance. In other cases, they brought a neologism they had created to describe an overlooked dimension or an emerging research question.
In the classroom, some of these “professors with a theory” even referred to themselves in the third person, using only their last name, as they did for other famous authors. They read aloud some excerpts from their own book or forthcoming article about “their” new theory. Others used their own book (not an anthology of texts or a reader, but their “book about their theory”) as the basis for an undergraduate course, encouraging students to read it, or even better, to buy a copy. Photocopies were not allowed.
I sometimes suspect that my “professor with a theory” imagined that one day he would certainly sit in Paradise with all the famous authors and theoreticians, discussing his own revolutionary theories while harmlessly drinking hemlock.
In the same way, many PhD students were not discouraged from creating neologisms – quite the opposite, they were even invited to include some neologisms in their thesis title, to announce their contribution.
One could think these twists occurred only among egocentric or mediocre scholars seeking posterity. But in his latest (and excellent) book on global warming (The Politics of Climate Change, Blackwell, 2010), celebrated sociologist Anthony Giddens uses a new concept to describe how some of the behaviours that cause global warming seem harmless because they do not have visible and immediate consequences; but instead of labeling this problem “the global warming paradox” or something similar, he coined it “the Giddens paradox.”
Another related problem is self-quotation. Citing oneself can have many uses and meanings for academics. One of them is to say (I paraphrase): “What I am bringing here should not be considered new, as it has already been said before (incidentally by me)”… It is another way of saying: “I don’t have space to explain every aspect here; if you have any objections, just look at the extensive analysis I wrote about this issue in a previous publication.” Another use is to plug one’s own book “coming out this month.” Self-quotation is honest when scholars express the same idea in many publications and want to acknowledge that they are not trying to make many papers out of a single piece of research. But the more questionable use is self-promotion.
I remember professor Fernand Dumont, himself a celebrated theoretician, often being quoted by other scholars. He was reluctant to refer to his own works during our weekly doctoral seminar, even though he didn’t refuse to mention his own research when students asked questions about his views and books. Some professors from France, during a lecture or speech, apologized before referring to an example taken from one of their own publications.
I do not want to mock or discourage scholars from forging their personal theories in their field: that is what they are being paid for! But conferences like ACFAS and Congress remain, in my view, the best places for academics to present their own work, not in classes where undergraduates are not in a position to discuss and won’t dare to disagree with the professor.
Yves Laberge is a sociologist and scholar living in Québec City; he has served on various boards and commissions for 25 years.
I completely agree with this author’s point of view. It often irks me to see colleagues limit the exposure of the field in favour of disseminating their own views. I reserve my last class, which is not mandatory, to share my personal theory of the field I both teach and research with students who are interested. I have found, though, that students who have a relationship with me as their teacher are very interested in my personal theories. It is these theories, for better or worse, that shape how I disseminate the knowledge of others in my field. In fact, a lot of these personal theories are worked out, at least for me, in the classroom as I discuss the work of others in the field.
I agree that professors who use classes as a platform to promote their own research rather than provide a broader survey of the field can be a problem (except in graduate courses where a professor’s research is, in many fields, supposed to be the focus of a course). But I find the use of the term “personal” here really objectionable: if it is published in a refereed, academic forum then it is an academic theory or scholarly position, not a “personal” one. Other scholars have deemed it worthy of dissemination for scholars in the field (undergraduate, graduate, and post-PhD), and so there’s no logical reason to exclude it from the classroom: it is a published (same etymological root as “public”) theory, not a “personal” one. The term “personal” here implies mere opinion, and that does not belong in any classroom, from students or from instructors. The term “personal” also misses the larger problem: professors who teach the material most familiar to them (whether because it’s their own or just their favourite scholarly work) instead of challenging themselves to include less familiar material that would be useful to students, especially more up-to-date scholarship.
I don’t accept this point of view. There is always a question of balance and I am more familiar with teaching in science and engineering. Here it is much more engaging and beneficial to the whole educational enterprise (students and professors) if the professor is able to relate his own research to the topics being discussed and this may indeed occupy a large fraction of the lectures given. I think this is the hallmark of what distinguishes undergraduate lecturing from high school teaching. I feel there is already too much pressure on university professors to teach like high school teachers. Good students can read the textbook for additional background.
It depends on what you think university or college studies are all about. If you think they are about indoctrination and training (perhaps the training for a specific occupation), then the broadest and most basic materials should be presented (probably not your own theories unless you are a leader in the field). However, if you think the job of higher education is to foster critical thinking, then your primary role, as a professor, is to serve as a model of that process. If doing so involves presenting your own theories, that’s acceptable, even desirable. But you should not merely present your ideas. In front of your students, you should go through the critical thought process that led you from existing theories to your own formulation.
To the extent that you are serving as a model, if you merely regurgitate accepted theories, facts and methods in class, the students will do the same on examinations and that’s what they will take away from the course. (In that case, one wonders why a professor is needed at all; a computer program would probably do a more thorough job.)
But if you offer yourself – an actual walking and talking human being, standing right there in front of them – as a model of critical thinking, then some of your students will catch on, and it will help them in their later education, their working lives, and their functioning as the thoughtful citizens required in a democratic society.
It’s not so much that I disagree with the author; I don’t understand his argument. A professor with a theory? Perhaps the greatest single virtue of the North American university system is that it promotes both intellectual entrepreneurship and research-led teaching in tandem. This nexus is not found everywhere in the world. By research-led teaching, I mean that even the lowliest academic in Canada, on a limited-term contract say, normally has the freedom to design a course and its syllabus, its sequence of pedagogical experiences, the associated readings and types of classroom/online encounters, and the appropriate system of evaluation, as well as grading the participants. In an area such as mine (ancient history) this means that, of the hundreds of ways in which a course might be taught, it is up to this unique scholar-teacher to articulate the driving questions, define the scope, and create the categories. If the instructor is a senior scholar who has contributed to the redefining of her field at the level of question, category, and method, it would be not only futile but positively self-defeating for the university to expect that she somehow detach herself and pretend that she were someone else, looking down from above. Of course, if she has a controversial hypothesis about some particular issue or other, the promotion of that hypothesis should not play a role in undergraduate teaching — but who ever imagined that it would? The *theory* of the subject, by which I understand the basic framework, logic, method, and classification (e.g., historical method) should be largely shared across the field. Each scholar’s unique integration and understanding of that theory is in any case inalienable, and it will define the course. This is surely something to be celebrated.
Without that realization, we would be left (at least in my fields) with some unsustainable model of professors as ‘delivery mechanisms’ of free-standing, neutral information, data, or knowledge falsely so-called. What would be the point of that?