In January 1997, after serving for 19 years as President of the American Association for Higher Education (AAHE), I began a new assignment as Director of Education for the Pew Charitable Trusts—a Philadelphia-based national foundation that makes grants in six major program areas. My immediate predecessor had focused the Education Program's $35 million annual budget largely on K-12 issues. My mandate was to bring forward a new grant program aimed at the improvement of higher education. My first major task—a rite of passage for new Pew program directors—was to write a "white paper" for Pew's Board of Directors that set forth my view of what the focus of this grant program should be.
For the first six months of my new job, I lived a double life. During the day I would go to my new office, meet constituents, read proposals, and do the myriad chores that foundation program directors do. Evenings I would return to my temporary apartment at the Korman Suites, stare out at the Philadelphia skyline, and stew for hours about how to make the kaleidoscope of issues that were swirling in my head into a coherent statement about what’s really important and why. This paper is what finally came out.
The Pew Board of Directors approved the paper in September, and in doing so encouraged me to share it broadly with colleagues and stakeholders outside the foundation. In the fall of 1997, three colleagues—Lee Shulman, President of the Carnegie Foundation for the Advancement of Teaching, Stan Ikenberry, President of the American Council on Education, and Richard Chait, Professor of Higher Education at Harvard University—all graciously hosted seminars that brought leading educators in their regions together to ponder the argument and its implications. From these occasions the word spread that the white paper was available, and the Pew Charitable Trusts began routinely fulfilling requests for complimentary copies.
Many of those who requested the paper did so in order to get a bead on what sort of proposals the Education Program might support. But numerous colleagues also reported that, fund-raising aside, the synthesis and argument presented in the white paper helped give direction to their own thinking. Soon we learned that the paper was being used as a background paper for planning retreats, seminars and training programs in higher education, and other such occasions. It seemed to have value beyond its original purpose of laying out a case for Pew’s grant-making agenda.
In January 2000, I left the Pew Charitable Trusts to become Director of the Pew Forum on Undergraduate Learning—an initiative that serves as an umbrella for the higher education grantees that Pew supported in the 1997-2000 period and as an incubator of further ideas about responsibility and accountability for undergraduate learning. Given the purposes of the Forum, I agreed that the Forum would take over from the Pew Charitable Trusts the task of disseminating the white paper to those who were interested in it.
Needless to say, the white paper no longer serves as a guide to the grant-making priorities of the Pew Charitable Trusts. Readers interested in Pew's current priorities should consult the Pew Charitable Trusts' own website: www.pewtrusts.com.
At this point in its life, I believe the paper might serve two purposes. Many of the institutions that received Pew support in the 1997-2000 period are now members of the Pew Forum and continue to work toward the larger goals outlined in the white paper. So some readers might find the paper useful as a statement of the larger "common cause" that our separate initiatives are pursuing from a number of different angles.
Beyond this, readers may continue to find the paper a helpful statement of the key problems that confront higher education and the directions that reform should take.
If I were writing it today, I would make more of the new forces that are reshaping the entire enterprise. Instead of focusing on conditions of "the sobering 1990s," I would call attention to the new 21st century landscape—trends such as the rising influence of the marketplace, the growing interest in what students can actually do with the knowledge they have acquired, the growing tendency of students to assemble courses from multiple providers, and the implications of new technologies. But having done that, I would still conclude that the problems the white paper identifies—cost, quality, and connection to the public agenda—are the key problems; and I would still sound the trumpet for higher expectations about what colleges and universities can contribute to student learning.
Director, Pew Forum on Undergraduate Learning
Washington DC, March 6, 2001
Chapter I—A Three-Minute History of Higher Education
Chapter II—The Fall from the Pedestal
Chapter III—The Challenge of Costs
Chapter IV—The Challenge of Quality
Chapter V—The Challenge of Quality II: Inadequate Incentives
Chapter VI—The Challenge of Connection
Chapter VII—A Higher Education Agenda for PCT
Appendix I: Summary of Goals and Objectives
Appendix II: Notes on Sources
Chapter I—A Three-Minute History of Higher Education
Higher education in America is now a sprawling enterprise of nearly 3,600 institutions serving 14.3 million students. The word "college" often summons a picture of fresh-faced young students strolling around a park-like setting, often paying exorbitant tuitions for the privilege of doing so. But the reality is quite different. Some 11.1 million of the 14.3 million students, nearly 80 percent of the total, attend public institutions. About 5.3 million of these students, close to 40 percent of all students, attend two-year public colleges where the annual tuition averages $1,387. Private liberal arts colleges—still our billboard image of what college is—enroll fewer than 5 percent of all students. The character of students, too, has changed. More than half of all undergraduates are age 22 or older; almost a quarter are 30 or older. And 40 percent of the total student body is attending college part-time.
It is useful to keep this snapshot in mind as we talk about the abstraction "higher education." But it is also important to see more than a snapshot. Higher education is an unfolding story, part of the epic tale of America's own evolution from an agricultural society into an industrial society and now from an industrial society into something new. Here is a brief version of this tale.
In the beginning there was Harvard. After erecting shelter, a house of worship and a framework for government, one of the first things the Massachusetts colonists did (in 1636) was found a college modeled on Cambridge and Oxford, where many of the colonists had studied. Their aims were twofold: to pass on their religious values and to recreate a bit of old England in the new land. British officials in the colonial service routinely dressed for dinner in the jungles of Africa. The early Puritans, in the same spirit, founded themselves a college.
Before the colonies had joined to create a national government, there were nine colonial colleges. And in the 1800s, as the settlers moved west, the founding of colleges was undertaken in the same spirit as canal building, farming and gold mining. The Dartmouth College court case of 1819 legalized the existence of a growing private sector in American higher education. The states, which retained the power to charter degree-granting institutions, handed out charters liberally. Religious denominations, competing with one another, established "hilltop" colleges in almost every major new settlement. All in all, by the Civil War, America had 250 colleges, of which 182 still survive. England at the time, with a population of 23 million, was managing nicely with four universities. Ohio, with a population of 3 million, boasted 37 colleges!
The colonial college of the pre-Civil War days was more like what we would call a prep school than a modern-day college. An upright clergyman always served as president. Faculty members were typically young bachelors who themselves aspired to be clergymen. There was little specialization: the faculty taught everything. All students studied a prescribed curriculum. The fundamental disciplines were Latin (the language of the law, the church and medicine) and Greek (the language of Renaissance learning).
But in the end, the colonial college was ill-suited to serve the needs of a nation that, after the Civil War, was caught up in an industrial revolution. The scientific enterprise began evolving into many new fields and disciplines. As Americans moved from the farm to the factory, work became increasingly specialized. A new middle class grew up conscious of the need to acquire specialized knowledge and skills. Members of this emerging class began to see their futures in terms of the tasks they would perform in the industrial economy rather than their reputations in their local towns. Tracks of achievement developed, leading into various occupations and professions, and people began to have a sense of careers and professions that had never existed before.
All this produced a good deal of dissatisfaction with the traditional, classics-oriented, liberal arts college. In 1850, the president of Brown noted that the nation had several hundred colleges, 47 law schools and 42 theological seminaries, and yet not a single institution "designed to furnish the agriculturalist, the manufacturer, the mechanic, or the merchant with the education that will prepare him for the profession to which his life is to be devoted." With the passage of the Morrill Act, signed by Lincoln in the middle of the Civil War, the federal government began providing resources for precisely this kind of practical education.
Against this background arose the phenomenon historians call "the university movement." Since the Confederation period, the federal government had granted public lands that could be sold for the purpose of endowing state universities. Many states chartered public universities, and in 1825, Thomas Jefferson's University of Virginia became the nation's first university. But it was only after the Civil War that the university movement really flowered.
"Flowered" is the right word, for just as there were many varieties of colonial colleges, so there arose many varieties of universities. Some were established from scratch. Others came into being through the metamorphosis of the colonial college into a university. In 1871, Harvard president Charles Eliot introduced the principle of elective courses into the Harvard curriculum. With electives to be taught, Eliot could hire specialist faculty to teach them and advertise their courses to prospective students. In one brilliant stroke, he thus engineered the transformation of Harvard College into Harvard University. Other universities quickly followed suit.
The new universities embodied various ideals. Cornell's benefactor, Ezra Cornell, stated "I would found an institution in which any person can find instruction in any study." Thus, Cornell came to embody the ideal of an all-purpose curriculum. Ithaca became a place, in the sour words of one historian, "where Greek, physical chemistry, bridge-building, the diseases of the cow, and military drill were all equal." In Baltimore, Daniel Gilman, the founding president of Johns Hopkins, looked abroad to the German universities, the essence of which was the disinterested pursuit of truth through original investigation. Gilman thus focused his efforts on recruiting an eminent, German-trained faculty who brought with them instructional techniques--the seminar, the specialist's lecture, the laboratory, the monographic study—that were associated with pushing back the frontiers of knowledge. The University of Wisconsin, embodying still a third ideal, aspired to become a place where the liberal arts tradition, applied science and creative research would all be put together in the service of the people of the whole state. The word at Madison was that "the boundaries of the university were coterminous with the boundaries of the state." And the university took this word seriously. By 1910, more than 5,000 Wisconsin citizens were taking the university's correspondence courses.
All these institutions began with distinctive missions and conceptions of what they were all about. They were not made from the same cookie cutter. And yet, in terms of their internal organization and practices, they all quickly conformed to a standardized, common pattern. Here is what one of the most prominent historians of the period, Laurence Veysey, has to say on this subject:
Looking back, it could be seen that the decade of the 1890's witnessed the firm development of the American academic model in every crucial respect... Before 1890, there had been room for ... academic programs that differed markedly from one another. Harvard, Johns Hopkins, Cornell, and in their own way Yale and Princeton, had stood for distinct educational alternatives. During the 1890's, in a very real sense the American academic establishment lost its freedom. To succeed in building a major university, one now had to conform to the standard structural pattern in all basic respects... A competitive market for money, students, faculty, and prestige dictated the avoidance of pronounced eccentricities... Consider the inconceivability of an American university without a board of trustees ... the lure of a well-defined system of faculty rank ... department chairmen, an athletic stadium, transcripts of student grades, formal registration procedures, or a department of geology.
This remarkable period from about 1880 to 1900 set the agenda for the 20th century. From England, we had borrowed the idea of a broad liberal education in a residential setting aimed at developing not only the mind but character as well. This model still guides our view of undergraduate education. From Germany, we borrowed the idea of a faculty dedicated to scientific investigation and the training of future scholars. This model still guides our conception of graduate education. Between 1880 and 1900, we grafted the German university on top of the English colonial college and invented the modern American university.
In terms of the types of institutions that constitute the family we call "higher education," several important chapters of our story come in this century. One chapter is about how those colonial colleges that did not develop into universities evolved into our modern liberal arts colleges. Another chapter is about how normal schools, originally established to train elementary school teachers, turned into teachers colleges; and then, in the era of post-World War II (WWII) expansion, how these teachers colleges turned into comprehensive state colleges and universities.
The final chapter is the fascinating story of the two-year, "junior" or "community" college. William Rainey Harper, president of the University of Chicago, strongly believed that high schools should develop a 13th and 14th year of schooling. With Harper's encouragement, a high school in Joliet, Illinois, was parent to the first such junior college. By 1920, there were 52 junior colleges. By WWII there were 450. In 1947, a famous presidential commission on Higher Education for American Democracy declared, "The time has come to make education through the fourteenth grade available in the same way that high school education is available." This set the stage for the massive expansion of the junior and community college movement after WWII. In the 1960s, community colleges were opening at the rate of one a week.
WWII spurred two great developments that had enormous impact on higher education. First, the Manhattan Project dramatically demonstrated how closely the security of the country depended on the nation's pool of scientific talent. In 1945, Vannevar Bush, science advisor to President Roosevelt, authored a famous report titled "Science: The Endless Frontier." This paved the way for a massive, continuing investment of federal dollars in research.
The second development was the Servicemen's Readjustment Act of 1944, commonly known as the "GI Bill." No one had the slightest idea how significant it would be. The higher education establishment opposed the bill, fearing it would lower standards. Happily for the country, the bill passed. By 1947, one of every two students in higher education was financed by the GI Bill. And, contrary to the fears of the education establishment, they turned out to be motivated, excellent students. By the mid-1950s, there was enough experience with the GI Bill students to suggest that an investment in higher education for ordinary Americans benefited the nation in concrete and specific ways, from increased productivity to increased tax returns. And this commonsense view of the public was bolstered by complex arguments from economists, showing that the return on the investment in higher education was at least as great as the return on the investment in the oil business. At the same time, the country was accepting a large role for government in maintaining a degree of economic prosperity. The rest, as they say, is history.
So here we are, on the verge of a new century—and a new era. People have tried to pin a label on this new era, calling it the "postindustrial society," "information society" or "computer age." But, in truth, no single label can capture the complex changes now taking place in America. The industrial revolution was revolutionary, not simply because of the introduction of new machines like the steam engine, the cotton gin and the power loom. What was revolutionary was that these machines, in interaction with other forces, catalyzed profound changes in the ways people lived and worked. Similar changes are happening in America today.
What will be the role of higher education in this next century? Commenting on the impact of new technologies on higher education, Peter Drucker said recently in an interview carried in Forbes, "Thirty years from now the big university campuses will be relics. Universities won't survive. It's as large a change as when we first got the printed book." Drucker has been very right and very wrong many times before, and I suspect this latest casual prediction of his will be wrong. But as we think about the future of the place-bound, industrial-era university, it is useful to recall what happened to the colonial college during the last great social transition in America. At the very least, we need to keep this epic tale in mind as we dive into the problems facing higher education today.
Chapter II—The Fall from the Pedestal
With the previous chapter as backdrop, let us now look at the contemporary scene. I believe that the problems and issues that confront higher education today can be best understood as an effort by American society to revise the social contract—the rather lopsided bargain—that society made with higher education in the 1960s. Senior faculty and administrators in higher education today sometimes wistfully look back on the 1960s as the golden age, the way things are supposed to be. But, in fact, those were the abnormal times. We built our higher education system on a tidal wave of expectations that has now passed on.
The Amazing 1960s

America's expectations for higher education in the 1960s were shaped by three extraordinary events. The first event was the launching of Sputnik in 1957, which both symbolized and spurred on the spectacular scientific and technological race with the USSR. America's political, military, cultural and economic influence dominated the "free world." As a training ground for the best and the brightest, higher education soared to new levels of public esteem. Between 1953 and 1962, for example, in Gallup polls assessing the suitability or attractiveness of nine leading professions, the academic profession rose from seventh to third place.
The second event was the civil rights movement. The GI Bill had demonstrated that helping ordinary Americans go to college was a good investment. In the early 1960s, this lesson was joined to society's gathering commitment to social justice to create a powerful political rationale—equality of opportunity—for extending college opportunities to those who had been excluded from the mainstream of American life.
The third event was the baby boom. In 1964, the surge of new babies that began arriving nine months after V-J Day and continued coming for 14 years hit college age. They took over higher education like an invading army.
Any one of these events would have thrust higher education onto the center of the national stage. The convergence of all three events, at one historical moment, in the context of a booming economy that was lifting all boats, created an unprecedented, spectacular set of expectations for the role higher education should play in American life.
So what did America want from higher education in the 1960s? Two things. First, a rapid expansion of higher education's scientific research and training capability—with few questions asked about what particular kinds of research and training related to what particular kinds of public needs. Second, a rapid expansion of buildings and faculty to meet the surge in demand for access to college—with few questions asked about the kind or quality of education that was going on inside those buildings.
In the midst of a gold rush, people do not stop to ask questions about cost, quality and accountability. And these were gold rush days. To cite a personal example, when I left graduate school and became a member of the political science faculty at the University of Wisconsin in 1965, I was one of five new assistant professors hired that year. Each year for the next three years, my department again hired five new professors. In the course of four years, my department expanded from 20 to 40 professors. This same story was taking place all over America. And we all thought this was what normal, professional life was like!
In the context of this boom period, colleges and universities were free to follow their own stars, to pursue their own internal visions of what kind of higher education society should have. Understandably, most chose to set their compass course on the same star—the ideal of institutional excellence and professional life that was set by our most prestigious universities. And reinforcing these aspirations (what some describe as "research envy") was the fact that the research universities were training the faculty who were taking up faculty positions in other types of institutions, establishing research-oriented "colonies" in regional and state universities, liberal arts colleges and even community colleges.
By the end of the l960s, what authors David Riesman and Christopher Jencks called "The Academic Revolution" had come to completion. A professionalized faculty was firmly in power, setting the standards for not only graduate but undergraduate education as well. In the arranged marriage between the liberal arts college and the German graduate school, the graduate school had emerged as the dominant and dominating partner.
The Sobering 1990s

Fast-forward a generation to 1997. America is a very different place.
The cold war is over. In its place we have a national agenda of troubling problems that can perhaps be summarized into two major challenges. The first is how to earn our national living in an increasingly interdependent, global economy. The second is nation building: how to renew our social, political and cultural life in the face of unprecedented change and a growing accumulation of unsolved domestic problems, including family disintegration, loss of jobs, crime and drugs.
The baby boom, the civil rights movement and a growing economy that lifts all boats have all passed on. In their place, representing new demands for higher learning, we have a "baby echo" (a second population bulge from the children of the baby boomers) and growing needs for adult education. But we also have a host of new conditions—rising concerns about costs, quality and accountability, new competitors for public resources, flagging commitments to civil rights and public investments—that limit the capacity of higher education to respond to these demands. All in all, higher education in the 1990s confronts at least six new realities.
New enrollment demands. Twenty-five years after the baby-boom generation set a national record for school enrollment, the record is about to be broken by the children of the baby boomers. But unlike the last tidal wave, this will be a longer, slower rise in enrollments. By the year 2002, the number of high school graduates will increase by 14 percent; by the year 2006, the figure will be 17 percent.
National data are misleading, however, for most of this growth will occur in the far West and Southwest, with California being the epicenter of new enrollment pressures. In the 10-year period from 1996 to 2006, California will see an 18.3 percent increase in school enrollments. After looking at nine sets of enrollment forecasts for higher education, an expert panel convened by the California Higher Education Policy Center concluded that higher education should plan for an increase of 488,000 students over the next decade. On the other hand, states such as Louisiana, Wyoming, North Dakota and Maine will actually experience declines in high school graduates and, therefore, declines in potential college enrollees.
"Nontraditional" adult students represent another potential surge of demand. According to the U.S. Census Bureau, the number of adults (aged 25 and older) enrolled in college jumped 28 percent between 1987 and 1994 to a total of 6.1 million students. Without going into all the complex changes taking place in the workplace, it seems clear that adult workers' demands for various forms of higher learning will continue to grow. Whether colleges and universities will be the institutions that meet this demand is another question. Just as traditional banks have lost market share to a host of other kinds of financial institutions, so, I suspect, will colleges and universities lose market share to a growing array of providers of adult learning services.
Public shock over rising college costs. In 1990, Daniel Yankelovich noticed poll results that leaped out at him as registering something newly important. Fully 88 percent of the American public reported that a high school diploma was no longer enough to qualify for a well-paying job. At the same time, 87 percent reported that college costs were rising at a rate that would put college out of the reach of most people. As Yankelovich later reported to a national higher education meeting, this was the same kind of political material out of which the national debate over health care had emerged. A service that the public regarded as indispensable to a decent life was becoming ever less affordable.
In truth, the trends that lie behind these attitudes had been at work for years. Since 1974, the median family income of most Americans had been steadily losing ground—while the costs of going to college rose by leaps and bounds. Between 1980 and 1995, the median family income rose by 5 percent; financial aid per student rose by 37 percent. In contrast, tuition at private four-year institutions rose by 89 percent and at public four-year institutions by 98 percent. But somehow, during the Reagan years, no one noticed. Then, with the 1990 recession, the public suddenly awoke with shock to what was happening. The "story" of rising college costs has been building ever since.
Fueling all this is evidence that the public is right in its perception of the economic value of a college degree. In 1979, full-time male workers aged 25 and over with at least a bachelor's degree earned 49 percent more per year than did comparable workers with only a high school degree. By 1993, the difference had nearly doubled to 89 percent. According to the Census Bureau, people with bachelor's degrees will, on average, now earn $600,000 more over their lifetimes than high school graduates. Add a professional or graduate degree, and the gap widens further.
New competitors for public funds. Today colleges and universities are encountering increasing resistance as they attempt to pass on their costs to students and parents. They are also running into similar troubles with the state and federal governments. A recent report on state expenditures in the 1990s from the Center for the Study of the States identified the major shifts in state expenditures that had occurred between 1990 and 1994. The big loser in the battle for state resources was higher education, falling from 14 to 12.5 percent of the total. Federal mandates in health care, new welfare policies and pressures on states to spend more on prisons, highways and K-12 education are all cited as factors in this shift.
At the federal level, the same thing is true. For many years before the 1990s, the federal share of total higher education expenditures ran to roughly 20 percent. In the early 1990s, it dropped to 15 percent. The culprit: the declining portion of the total federal budget going to discretionary domestic spending.
As I write, however, the federal picture is rapidly changing. President Clinton has proposed to add $35 billion to higher education over the next five years in a package of tax credits and tax deductions. Congress is now deliberating this proposal, and the Republican leadership has basically agreed to give financing for college a major piece of whatever tax cuts are going to be made. Although there are serious issues of equity at stake in using the tax system to finance access to college, it is possible that a major chunk of new resources will be available.
Weakening of other props for access. In the 1960s, as we have seen, a constellation of public beliefs all gave momentum to the waves of access to higher education. One was a belief that an investment in higher education reaped benefits not only for individuals but for the nation at large. Another was a belief that government had a major role to play in our national life. A third was a gathering commitment to civil rights.
Today, all three of these beliefs are on the defensive. The political discourse about investments in higher education has shifted to emphasize the benefits to individuals. Both political parties are trying to reduce the role of government. And affirmative action is now under assault, both in the courts and in the larger political arena.
In the courts, a series of decisions, most notably the Supreme Court's 5-4 ruling in the case of Adarand v. Pena, have narrowed the scope of permissible affirmative action. For higher education, the most important decision has been the ruling of the Fifth Circuit Court of Appeals in Hopwood v. Texas, banning the use of race—even as a "plus factor" among equally qualified applicants—at the University of Texas Law School. By refusing to hear an appeal of Hopwood, the Supreme Court has left standing a ruling that reverses Bakke, a ruling that had given colleges and universities legal sanction to practice race-conscious admissions.
Politically, the landscape has been transformed dramatically over the past two years. The pivotal event was the passage of Proposition 209 in California, eliminating race- and gender- conscious affirmative action programs in public education, employment and contracting. Proposition 209 marked the first time that the voters in a state were afforded the opportunity to express their opposition to what Proposition 209's sponsors consistently referred to as "racial preferences." Buoyed by Proposition 209's success, opponents of affirmative action are now pressing for its elimination in other states as well. At the federal level, President Clinton's position to "mend not end" affirmative action remains under attack.
Growing concerns for quality. In the mid-1980s, concerns about the quality of higher education began surfacing in the national press. In the wake of the 1983 report, A Nation at Risk, which helped ignite public interest in school reform, the press started treating curricular debates and popular critiques like Profscam as national news. But all this was more titillating than real until the shock of the 1990 recession. When mixed with the rising concerns about cost, press stories that had real bite started appearing. In June 1992, the Chicago Tribune carried a series of scathing stories called "Degrees of Neglect." In September 1992, a House select committee held hearings on "College Education: Paying More and Getting Less." In March 1993, CBS's 60 Minutes aired a critical segment on professors and tenure. By 1995, when the Educational Testing Service released a study (funded by PCT) of adult literacy, headlines such as USA Today's "College-level Literacy Less Than Impressive" appeared around the country. By the mid-1990s, the quality of higher education had become an "issue" for the media and remains one to this day.
From center stage to the sidelines. In 1992, Harvard president Derek Bok addressed the annual meeting of the American Association for Higher Education on the topic "Reclaiming the Public Trust." After listing all the negative news about higher education, he pointed out that the defects in university education that the media is criticizing today were at least as bad—and probably worse—20 or 30 years ago. Yet the criticism today is far more intense. So why, he asked, was there not more criticism then, and why is there so much now? Why are the most intemperate polemics treated with seriousness by the New York Times?
The crucial difference between then and now, Bok pointed out, is that higher education in the 1960s was actively and visibly engaged in two great ventures that had the enthusiastic support of the people and the government and were perceived as central to the progress of the nation—the tasks of beating the Russians and providing equality of opportunity. These ventures brought higher education into alliance with governmental, business and foundation leaders in pursuit of goals that everyone perceived as important.
Now, Bok concluded, Americans are focused on a national agenda of post-cold war problems to which colleges and universities are no longer clearly connected. In retrospect, beating the Russians in a technology race turns out to be easier than beating crime, welfare dependency, the drug culture and other problems on our national agenda. The solutions are complex, and the contributions that colleges and universities might make to these solutions are harder to figure out and articulate. It may be that a powerful case can be made that higher education holds the keys to economic development and civic renewal. But this case has not yet been made—at least in a way that has captured the imagination of the larger public.
I think Bok is onto something important here. When governors begin talking about problems in their states, few think of colleges and universities as resources to help address these problems. Colleges and universities may, in fact, be doing a lot more than the governors realize, but they are not perceived as being actively and visibly engaged. On the contrary, they are more likely to be seen as fiddling while Rome is burning.