Throughout most of recorded history, information has been a scarce commodity. And, those who controlled access to information were held in high esteem.
Information scarcity is the foundation on which today’s educational institutions are modeled. The scarcity model is based on the premise that information is expensive to produce, store and transmit. That may have been the case in the past, but just in case you missed it, it isn’t the case any longer.
When information was scarce, it made sense to concentrate resources and to bring together those who held knowledge and controlled access to information. Those who wanted information knew where to find it – where it was gathered. Transmitting information between people – teaching – was best done where the information was held.
Traditionally, this has been the role of schools and universities. With the advent of the factory in the early nineteenth century, bringing students together to learn in an efficient, one-to-many setting came to be seen as an ideal way to produce a learned workforce. Even the physical structures standing at the center of teaching in universities are still designed this way – large rooms where many students can assemble together to hear what a teacher has to say – lecture theatres. The method is employed to deliver 89% of university teaching today. Although there are many who will say that the trend is headed in the right direction (fewer lectures), the data say otherwise, presumably because of the pressure to engage in the research process.
One of the civilizing drives over the past millennium has been to make information more abundant – a drive to move from information scarcity to information abundance. Below are a few of the key milestones in that drive, from the ancient libraries where copies were laboriously hand-made to the social media world of information overload we inhabit today.
Over the years, information has become easier and less expensive to both acquire and transmit. From a few ancient libraries to the wireless access to the internet available to the 14.02 billion mobile devices owned and operated by over 45% of the world’s population, the transition from information scarcity to information abundance has been relentless. Although not completely free, information is close to free (or could be) and available to pretty well half the people on the planet.
The internet today has made information cheap and ubiquitous – information abundance.
When viewed against the backdrop of recorded history, the digital age of information abundance is still very new, but we know that the genie is not going to go back into the bottle in order to support an information scarcity society.
Why not embrace the transition from information scarcity to information abundance as the culmination of centuries of striving?
This is a question that I have been asking for years and years. Why is education, especially higher education, still based on an information scarcity model?
Information scarcity requires us to bring students to a central place of learning. This has been interrupted by COVID-19, but not really – Zoom and YouTube have simply become the lecture halls currently used to gather students together virtually. Information scarcity requires knowledge guardians to be gathered together physically. Information scarcity requires the publication of (text)books to encapsulate a field of study – a block of information frozen in both space and time. Information scarcity means that students must memorize reams of information because, well, because it is scarce and hard to come by.
None of these things are needed any longer in order to learn. They are needed for education, but not for learning. So why is education still based on a scarcity model? Five of the primary stakeholders demand that the model be upheld.
Why would universities, as institutions, want to work under an information scarcity model of learning and resist moving to an information abundance model of learning?
I believe that there are several reasons, such as research focus, efficiency of teaching, a teaching rather than learning focus, and real – rather than stated (marketed) – purposes.
The definition of a university is a higher education institution that has research as its primary and central activity (the term has traditionally been used to designate research institutions – Wikipedia). Some would argue that research interests come second to teaching; however, the reality is that research is the only game in town. The status and prestige of the institution are based on research and have nothing to do with teaching. Academic hiring and promotional opportunities pay lip service to teaching but are focussed on research activities, with the few teaching-only posts seen (and treated) as second-class academics – even in universities that are called “teaching focussed”.
The very idea that teaching release is given for excellence in research clearly states where an institution's values lie. There are even teaching-only, two-year colleges that will give teaching release for what can only be called a pale imitation of real research. Clearly, research activity is what is valued in higher education.
In many institutions, students are actually referred to (in the backrooms and corridors) as cash cows to support the real activities of the institution. Teaching is seen as a necessary chore that is made as efficient as possible, meaning doing it with as small a resource commitment as possible.
The research publishing game is grounded firmly in information scarcity. Why? Money…
Try to follow this explanation if you can. In the end, you will shake your head at the absurdity of it all.
First, the majority of research in universities is done by faculty members paid by the university. Their full-time salaries are almost exclusively paid for by the institution. Most of these institutions rely heavily on public money, and so we, the taxpayers are paying for academic research to be done.
Second, as a part of the research process, researchers (the same ones paid by the institutions) write extensive reports about their findings to be published so that other researchers (and anyone interested) can refer to the research and evaluate research findings for themselves.
The third step is for these reports to be submitted to an editor for peer review. The editor of a journal or publication is a faculty member at a university somewhere, paid by the institution they work in. The reason a person is willing to become the editor of a journal is the prestige the unpaid role brings. The editor makes the final decision about whether a report is published, based on the next step in the process.
The fourth step is for the reports to be sent out to other researchers, who work in the same field of study, to be evaluated for publication. There are usually between three and five reviewers for each report. Oh, and did I mention that all of the peers who are reviewing the reports are faculty members of various universities, paid for by the universities.
So far, so good. Universities (or research grants) pay for the researchers. Universities pay for the writing of the reports. Universities pay for the editors of the publications that the reports are submitted to. And, the peers who review the research reports are paid for by the universities.
So far, everything is paid for by the universities with much of the funding for these activities coming from taxpayers.
At this stage, if a report is accepted by the editor for publication, the report then becomes the property of the publisher, usually a private company. The authors of scholarly work are required to sign over copyright to private companies with no compensation from the private companies whatsoever. The private publishers are given all the rights to research findings that have been completely paid for by universities.
And, then it gets worse. This is where information scarcity comes in. Other researchers and individuals must pay for the right to read these findings that were paid for entirely by universities in the first place. The information becomes a commodity that must be paid for. A commodity that is, by design, made scarce so that it has value for the publishing company.
So what? A private company makes a bit of money in the process. They have to cover the cost of printing the reports and making them available online.
So, after paying all of the costs for producing, editing, and reviewing the research reports, universities then have to pay to access these reports.
Not a big deal until you are told just how much the universities have to pay to gain access to the information that they have already paid for to be produced.
The average amount paid for access to research journals in 2011 by 34 research universities in North America was $2,584,400, ranging from $1,201,200 (Colorado State) to $4,504,500 (University of Toronto). This doesn’t count the books that libraries bought. These are just the serials (journals) that research reports are published in (Fei She et al., 2012). And these prices reflect discount prices paid for journal bundles.
Information scarcity by design.
Another reason that higher education institutions embrace the information scarcity model is teaching efficiency. Large lectures with as many students as possible packed into the allocated space are efficient. In one hour, learning units for 400, 800, or even thousands of students can be checked off an administrator's list, which then brings in the allocated income from whatever the source is.
In a world obsessed with efficiency of delivery and accountability of resources in virtually every aspect of our lives, maximizing efficiency in teaching is viewed as a positive aspect of a successful institution. To advertise inefficiency would be seen as a negative aspect of the university administration. As a result, in most institutions, minimum class sizes are specified. If fewer than twenty students sign up for classes in the first two years of study, the class is canceled because it is inefficient to run such a small class.
Obsession with efficiency leads to larger and larger lecture theatres, in spite of the fact that “no alternative has ever been discovered that is less effective (for learning) than lecturing” (Bligh, 2000).
Teaching rather than Learning Focus
If you closely examine teaching activities in education, from the earliest preschool through to higher education, education is focussed on teaching. In the training of a teacher, the time spent learning how people learn is almost non-existent. I was talking to a recent graduate of a leading education program, and he told me that he could remember the class he took that taught him about how people learn. I was very excited and asked him what resources were used to teach it. He said, “I think you misunderstand me. I mean that I took a class – a single, three-hour class – about how people learn. Not a semester-long class.” Needless to say, I was astounded. I knew that the emphasis on learning was minimal, but didn’t realize that it was that minimal.
It doesn’t change as students get older. I have studied the science that underlies how people learn and how that understanding can be applied to formal learning situations for decades now, and am considered a curiosity in education – not to be taken seriously. You would think that a higher education institution would want to have a learning expert somewhere in the institution. Instead, they rely almost exclusively on teaching experts. There is not even lip service paid to learning. It is all about teaching.
Many institutions have, as a part of their mission statements (or whatever they call them) words like leadership, excellence, reaching potential, innovation, and on and on. Although these buzzwords are officially a part of the heart of the institution, actions speak louder than words. Ever larger lecture theatres, relegating teaching to a second class activity without even a mention of learning, rewarding, and focussing almost exclusively on research – this is why institutions work under an information scarcity model. Maybe someone somewhere can enlighten me as to how sitting through 12, three-hour lectures (standard single semester class) can help a learner reach his or her potential.
Institutional Information Scarcity
Institutional information scarcity means that information is hard to find and is housed in information repositories (universities). Learners must go to a place of learning in order to access information. Learners must gather together by the hundreds and thousands to hear the words of a scholar and engage in real learning events (lectures). Access to significant information (journals, books, etc.) must be carefully controlled, and this access is sold to learners. Institutionally, an information abundant model would threaten their very existence as the guardian of information. Scarcity adds value. Without institutional control, there would be no scarcity and the cash cow (undergraduate) would disappear and the real activities of a university would cease because of the lack of funding just as newspapers and investigative journalism disappeared with the disappearance of classified ads and the associated income stream.
Institutions embrace an information scarcity model because as long as information is scarce, it is worth something. That doesn’t explain why faculty members who teach continue to support an information scarcity model. I believe it is because of several factors: the reinforcement system, inertia, ease, and lack of interest.
Reinforcement is a big factor. As I outlined above, the university and college system has as its primary focus things other than learning. In universities, the focus is research. As such, the reinforcement system in place is for research. Few, if any, faculty members receive monetary rewards (raises), real recognition, significant promotions, or peer approval for their prowess in teaching. If a person receives tangible rewards for research-related activities (publications, grant capture, Ph.D. students, lab space, and assistants), those are the activities that will receive their time. I have worked in a research-intensive institution and know faculty members who teach students a few hours a year (as low as 3.5 in one case) so that teaching activities do not interfere with their research. As part of my role there, I supported the training of Ph.D. students. I am guilty of telling students not to spend too much of their time on teaching-related activities because this will hamper career development.
When there is explicit teaching relief (you get to teach fewer classes) for excellence in research, the institutional and peer expectations direct your activities. Although excellence in teaching is paid lip service, the real driver for promotion is research activity. With promotions come monetary rewards. Although money isn’t a factor for academics (at least that is what they say) I know too many academics (pretty well all) who respond favorably to financial incentives to actually believe what they say about their altruistic motives. Money talks in the academy just like anywhere else.
Lecturing is what academics do when they are teaching. Everyone knows this and we know that conformity is a basic driving force for behavior. Academics are people, and if everyone else is doing it (especially peers), and if this is what everyone has always done, there is little impetus to alter long-standing traditions. And so we see that lecturing accounts for over 89% of all teaching events in universities today.
Academics are not trained to teach, and there are few people who go through the grueling training that it takes to obtain a Ph.D. who want to teach. They are trained as researchers, and that is what they want to do.
This is one of the main drivers for individual faculty members to keep the information scarcity model of education in place. Dealing with students takes time. This is especially true when using non-standard teaching methods that actually foster learning. It takes about three years to get a series of lectures to the point where a lecturer is satisfied, unless a lecturer uses the pre-prepared, canned lectures that accompany virtually all textbooks. For self-prepared lectures, the first year is real work. In the second year, most of the material is revised to make it better. In the third year, some tweaks are made to get it just right, and from then on you just deliver the same-old, same-old. When you can do this with hundreds of students at a time and test their progress (memorization) with a multiple-choice test (or go all out with an essay exam), you can satisfy the teaching requirements of your job in a few hours a week. Why change a method that works for everyone except those who actually want to learn?
Lack of Interest
This is the real reason why the information scarcity model reigns supreme. Faculty members are not interested in how students learn. The few who are interested in their students focus on how they teach. Teaching is the only thing that matters. I have found very few teachers (or other faculty members of any kind) who want to know anything at all about how people learn. When I have been approached as an expert in the field, practitioners don’t want to know anything about how people learn. All they want to know is some teaching tip that will make their students happier. When institutions talk at all about students today, it has nothing to do with learning. The institutional focus on students is called “the student experience”, which can be interpreted as “keep the students happy”. How can anyone be interested in how people learn when doing so leaves them in complete isolation?
Although faculty and the institutions strive to keep the information scarcity model of information alive and kicking for their own benefit, students are just as resistant to change as the other two. This seems paradoxical as students would be the primary beneficiaries of a move to an information abundance model.
We know that students gain little or no benefit, when it comes to learning, from the way education happens today (lecturing). A study done in 1980 showed that students, one year after the end of their semester-long class, scored only 20% higher than students who didn’t take the course. Seven years later that difference had dropped to 10%. Another study showed that the drop in performance in a test taken at the end of a one-hour class and a test taken on the same material a week later saw the raw scores drop an average of 42% during that time.
And yet, students demand lectures. They believe that a lecture is the proper way to learn in university. Every time I have supported a lecturer in trying something that will actually facilitate learning (and even experienced this myself), the majority of the students erupt with fury at the idea that someone is doing something different from a lecture. Thanks to social media, I have read for myself the kinds of things they say. In one case, what my colleague received from the students was “You should lose your job. Your job is to tell us what we need to memorize in order to pass the test and you aren’t doing it.” This comment was followed by almost 200 others echoing the same sentiment (there were 350 students enrolled in the class).
I don’t use lectures in my teaching and base the entire experience for the students on all that I know about how people learn. I know that, even though my reputation precedes me and students sign up for my class expecting something very different, there is a repeated enrolment pattern every year. I ask the students to pre-read the exact requirements for my classes and before the first day of classes, 25% will drop the class. After the first day of class, the only time I stand up front and talk to the students (so I can explain how the class works), a crowd of students flock to me with horror in their eyes asking exactly what I expect of them right now - after which 25% of the enrolled students drop. Because of my reputation as a teacher, there is always a large waiting list of students trying to get into my classes and eventually they end up being almost full - between 50 and 55 students in a class with a 60 student cap.
A few years ago, I was teaching at a local college and was hauled onto the carpet three times during one semester and chastised for using methods other than the “read them the PowerPoint slides” kind of teaching that is so prevalent today. This dressing-down was instigated by a group of students who demanded a more traditional approach. When I told the chair of the department that I was doing this so that the students would actually learn, I was told that their learning was not my concern and my primary responsibility was to keep them happy.
I believe that the primary driver for this resistance to change comes from the reasons that students enroll in higher education in the first place. The 2016 Gallup Purdue Index (GPI) found that 86% of students want a higher education degree so they can get a better job (up from the 73% average in 2009). If 86% of the students are there for a piece of paper that says they were there, then learning is an obstacle rather than an opportunity.
Lectures (and the wait to regurgitate stuff for an exam) are the traditional way to do it and fully subscribe to the information scarcity model of learning. Besides, attending lectures and cramming for a test is relatively easy - and it works. When all you want is a degree so you can get a better job, why do any more than you have to?
There are very few students in today’s mass education system who are there to learn. As a senior colleague at my former place of work said, “If we don’t ask too much of them, they won’t ask too much of us, and we’ll all be happy.”
Businesses expect university or college qualifications. Do they really care about how well students do in their studies? According to the Chronicle of Higher Ed, on a scale of 1 to 100, businesses rate the importance of GPA at about an 8. In addition, according to the WEF (see Figure 1), businesses don’t care much about the content students learn either.
Figure 1: From the WEF Future of Jobs survey (2016), showing the “Share of jobs requiring skills family as part of their skill set, %”, with content knowledge required as a core component by only 10%.
All that they really want is a qualification. And, even though there are constant murmurings about how worthless these qualifications are, few companies are dropping the qualification from their job requirements.
What difference does it make if the qualification is gained through an information scarcity model or not?
Along with students, parents are resistant to changing the model of higher education. If it was good enough for them, it is good enough for their children.
What difference does it make that the world has changed? Almost every person who graduated from higher education got a good education. With that sample size in mind (n=1 - themselves), they turned out ok, so how could the fundamental model be wrong?
When parents tour a college or university with their children, what are they shown? What is it that they ooh and ahh about? When it comes to learning facilities, the most impressive things that they see are modern, up-to-date lecture theatres and the vast library collections - both relics from the information scarcity model of learning. It isn’t that libraries are a bad thing. I think that libraries are brilliant, but the contents of libraries should be completely available online - not just the catalogs.
From a parent's perspective, the information scarcity model of education is just what education is. It is what education has always been. It is what education will always be.
Besides, don’t kids spend too much time online already?
The first thing that we can do is think about it. Think about the effects of an information scarcity model of education on learning. Think about how much better our society would be if a significant part of education, from beginning to end, was about information literacy. How to judge the value of information. How to look for reputable sources of information. How to use these skills for the betterment of society.
Instead, formal education clings to a scarcity model while social media has embraced an information abundance model of information. A model that none of us are formally trained to work with and so we have conspiracy theories, rigged elections, fake news, and real radicalization occurring all around us.
When the Province of Alberta (where I live) is still using some curriculum developed in the mid-1990s, before the internet was part of everyday life, how can we expect a model of information abundance in education to be in place? And the majority of educational jurisdictions aren’t that much better.
At the very least, we should be lobbying for a curriculum based firmly in this century and a foundational model of information grounded in the information age, even though we are mid-way through the transition into the next industrial age.
We have a lot to do. Don’t ignore information abundance, embrace it. This age has been the dream of those who wanted to learn through virtually all of history, and lives have been lost in the struggle to make information more available.
We have it now. Let’s use it.
Jesse is a world leader in the integration of the science of learning into formal teaching settings. He is an Adjunct Associate Professor at the University of Lethbridge and Director at The Academy for the Scholarship of Learning. Huge advocate of the science of learning, he provides people with ideas about how they can use it in their classrooms. Jesse holds a PhD in Psychology from the University of Wales, Bangor.