Friday, March 20, 2015

In the Age of Information, Specializing to Survive


Jonathan Haber majored in philosophy at Harvard University. And Yale. And Stanford. He explored Kant’s “The Critique of Pure Reason” with an Oxford don and Kierkegaard’s insights into “Subjectivity, Irony and the Crisis of Modernity” with a leading light from the University of Copenhagen. In his quest to meet all the standard requirements for a bachelor of arts degree in a single year, the 52-year-old from Lexington, Mass., also took courses in English common law, Shakespeare’s late plays and the science of cooking, which overlapped with the degree in chemistry he earned from Wesleyan in 1985.


Here’s the brilliant part: Mr. Haber didn’t spend a dime on tuition or fees. Instead, he gorged on the smorgasbord of free courses offered by top universities. He documented the project on his website, degreeoffreedom.org, and in a new book exploring the wider phenomenon of massive open online courses, or MOOCs. He didn’t earn a degree — the knowledge may be free but the sheepskin costs dearly — but he was satisfied.

“I wouldn’t call myself a philosopher,” he said, “but I learned as much as most undergraduates.”
Mr. Haber’s project embodies a modern miracle: the ease with which anyone can learn almost anything. Our ancient ancestors built the towering Library of Alexandria to gather all of the world’s knowledge, but today, smartphones turn every palm into a knowledge palace.
And yet, even as the highbrow holy grail — the acquisition of complete knowledge — seems tantalizingly close, almost nobody speaks about the rebirth of the Renaissance man or woman. The genius label may be applied with reckless abandon, even to chefs, basketball players and hair stylists, but the true polymaths such as Leonardo da Vinci and Benjamin Franklin seem like mythic figures of a bygone age.

They don’t make geniuses like they used to.
Perhaps we need another Franklin to explain why. Thanks to the power of technology and the brute force of demographics, the modern world should be teeming with people of wide accomplishment. In Franklin’s era, the world’s population was about 800 million; today it’s seven billion people, many of whom enjoy the brain-building blessings of good nutrition and access to education. Indeed, the researcher James R. Flynn has found that I.Q. scores have been rising around the world for decades. This rise, known as the “Flynn effect,” is especially pronounced in developed nations such as the United States, where average scores have increased by about three points per decade since the early 1900s.




Nevertheless, it is much easier to feel like Sisyphus than Leonardo nowadays, because one thing that has grown even faster than I.Q. scores is the amount of information the brain must process. Google estimated in 2010 that there were 300 exabytes (that’s 300 followed by 18 zeros) of human-created information in the world, and that more information was created every two days than had existed in the entire world from the dawn of time to 2003.
No doubt those numbers have increased vastly since then. But does it really matter? Like the physicists’ observation that the known universe has a diameter of 92 billion light years, these numbers are so large that they defy human comprehension; they are meaningless truths to just about everybody not named Stephen Hawking. When it comes to aggregate information, we blew our minds long ago.
Of course, not all information is equal. Those exabytes do include a few great novels, stirring films and groundbreaking scientific discoveries. Most are flotsam wrapped in jetsam: insipid blog posts and text messages, YouTube videos of cuddly cats and pornographic acts, ignorance that poses as knowledge.
“We are overloaded with junk,” said Daniel Levitin, a professor of psychology and behavioral neuroscience at McGill University whose books include “The Organized Mind.” “It’s becoming harder and harder to separate the wheat from the digital chaff. The problem with the Internet is anyone can post, so it’s hard to know whether you are looking at a fact or pseudofact, science or pseudoscience.”
That problem seems quintessentially modern; Alvin Toffler didn’t popularize the term “information overload” until 1970. But in the relative realm of human experience, it is as constant and nettlesome as death and taxes. At least since the heyday of ancient Greece and Rome, each generation has confronted the overwhelming struggle to search, sift and sort growing piles of information to make what is known useful. “Papyrus, print or petabyte — the history of feeling overwhelmed by information always seems to go back further than the latest technology,” said Seth Rudy, a professor of English literature at Rhodes College who explores this phenomenon in his new book, “Literature and Encyclopedism in Enlightenment Britain: The Pursuit of Complete Knowledge.” “The sense that there is too much to know has been felt for hundreds, even thousands, of years.”
In response, figures of expert erudition and taste — such as the Roman Gaius Petronius Arbiter, whose impeccable taste made his name a byword for discernment, and the 19th-century critic Matthew Arnold, who defined culture as “the best that has been thought and known” — have helped distinguish the dross from the gold.




Primitive search engines developed in the Middle Ages are still with us, including indexes, concordances and tables of contents, while the dictionary and the florilegium (a compilation of quotations and excerpts from other writings) enabled busy people to sample the world’s wisdom. This remains a thriving business; a sales pitch of modern journalism is that reporters and critics do the work (read the book, see the play, try the recipe, interview experts) so you don’t have to.




Encyclopedias flourished during the Enlightenment. Tellingly, Mr. Rudy said, most early works were created by one person and aimed to synthesize all knowledge into a single, coherent body. Soon, they became collections of discrete articles written by a team of experts. By the 20th century, the storehouse of useful knowledge had grown at such a thrillingly alarming rate that the possibility of mastering just one area of study, such as physics, literature or art — much less becoming a Renaissance man who could make important contributions to various fields — became less an aspiration than a delusion.
Julianne Moore’s character captured this sense in the Oscar-winning movie “Still Alice” when she joked about “the great academic tradition of knowing more and more about less and less until we know everything about nothing.”
That barb suggests a profound response to the explosion of information that has transformed modern scholarship and innovation: the rise of intense specialization and teamwork. “Once upon a time you could be a biologist,” said Benjamin F. Jones, an economist at the Kellogg School of Management at Northwestern University. “Now the accumulation of knowledge is such that biologists, for example, must specialize in an array of microdisciplines like evolutionary biology, genetics and cell functions.”
“At the turn of the 20th century,” he added, “the Wright brothers invented the airplane; today the design of the jet engine calls upon 30 different disciplines requiring a vast array of specialized teams.”
If the information age makes knowledge seem like a straitjacket, David Galenson, a professor of economics at the University of Chicago, notes that progress often hinges on those rare individuals who have escaped its bonds. Artists from Picasso to Bob Dylan and entrepreneurs including Bill Gates and Steve Jobs changed the world by finding “radically new ways of looking at old problems,” Mr. Galenson said. “They cut through all the accumulated stuff — forget what’s been done — to see something special, something new.”

It is why, Mr. Galenson added, the historian and physicist Stanley Goldberg said of Einstein, “It was almost as if he were wearing special glasses to make all that was irrelevant invisible.”
For many who don’t share that kind of vision, the response to information overload is simple: Just search and forget (repeat as necessary). Even more ambitious absorbers of knowledge like Jonathan Haber will most likely find that the key to lifelong learning is a human mediator, someone who has engaged in the ancient task of searching and sorting through knowledge.
Until, of course, a modern-day Leonardo invents a machine that can do that too.


Saturday, March 7, 2015

Here’s What Will Truly Change Higher Education: Online Degrees That Are Seen as Official

Credit: Edmon de Haro
Three years ago, technology was going to transform higher education. What happened?
Over the course of a few months in early 2012, leading scientists from Harvard, Stanford and M.I.T. started three companies to provide Massive Open Online Courses, or MOOCs, to anyone in the world with an Internet connection. The courses were free. Millions of students signed up. Pundits called it a revolution.
But today, enrollment in traditional colleges remains robust, and undergraduates are paying higher tuition and taking out larger loans than ever before. Universities do not seem poised to join travel agents and video stores on the ash heap of history — at least, not yet.
The failure of MOOCs to disrupt higher education has nothing to do with the quality of the courses themselves, many of which are quite good and getting better. Colleges are holding technology at bay because the only thing MOOCs provide is access to world-class professors at an unbeatable price. What they don’t offer are official college degrees, the kind that can get you a job. And that, it turns out, is mostly what college students are paying for.
Now information technology is poised to transform college degrees. When that happens, the economic foundations beneath the academy will truly begin to tremble.
Traditional college degrees represent several different kinds of information. Elite universities run admissions tournaments as a way of identifying the best and the brightest. That, in itself, is valuable data. It’s why “Harvard dropout” and “Harvard graduate” tell the job market almost exactly the same thing: “This person was good enough to get into Harvard.”
Degrees give meaning and structure to collections of college courses. A bachelor’s degree signifies more than just 120 college credits. To graduate, students need a certain number of upper- and lower-division credits, a major and perhaps a sprinkling of courses in the sciences and humanities.
College degrees are also required to get graduate degrees. It wasn’t always that way. Back in the 19th century, people interested in practicing law could enroll directly in law school. When Charles Eliot became president of Harvard in 1869, he set to work making bachelor’s degrees a prerequisite for admission to Harvard’s graduate and professional schools. Other colleges followed suit, and by the turn of the century a large and captive market for their educational services had been created.
Most important, traditional college degrees are deeply embedded in government regulation and standard human resources practice. It doesn’t matter how good a teacher you are — if you don’t have a bachelor’s degree, it’s illegal for a public school to hire you. Private-sector employers often use college degrees as a cheap and easy way to select for certain basic attributes, mostly the discipline and wherewithal necessary to earn 120 college credits.
Free online courses won’t revolutionize education until there is a parallel system of free or low-fee credentials, not controlled by traditional colleges, that leads to jobs. Now technological innovators are working on that, too.
The Mozilla Foundation, which brought the world the Firefox web browser, has spent the last few years creating what it calls the Open Badges project. Badges are electronic credentials that any organization, collegiate or otherwise, can issue. Badges indicate specific skills and knowledge, backed by links to electronic evidence of how and why, exactly, the badge was earned.
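To make that concrete, here is a minimal sketch in Python of the kind of record a badge issuer might publish: who earned it, what skill it certifies, who issued it, and links to the evidence behind it. The field names and the issue_badge helper are illustrative assumptions, not the actual Open Badges schema.

import json
from datetime import datetime, timezone

def issue_badge(recipient_email, skill, issuer, evidence_urls):
    # Build a simple badge record. Illustrative only; the real Open
    # Badges specification defines its own fields and verification rules.
    return {
        "recipient": recipient_email,
        "skill": skill,                      # the specific skill or knowledge certified
        "issuer": issuer,                    # any organization, collegiate or otherwise
        "evidence": list(evidence_urls),     # links to how and why the badge was earned
        "issued_on": datetime.now(timezone.utc).isoformat(),
    }

badge = issue_badge(
    "learner@example.com",
    "Introductory Genetics",
    "Example Science Museum",                # hypothetical issuer
    ["https://example.org/evidence/final-exam"],
)
print(json.dumps(badge, indent=2))

Because the record is ordinary structured data published at a link, anyone — a person or a machine — can read, verify and index it.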
Traditional institutions, including Michigan State and the University of Illinois at Urbana-Champaign, are experimenting with issuing badges. But so are organizations like the National Oceanic and Atmospheric Administration, 4-H, the Smithsonian, the Dallas Museum of Art and the Y.M.C.A. of Greater New York.
The most important thing about badges is that they aren’t limited to what people learn in college. Nor are they controlled by colleges exclusively. People learn throughout their lives, at work, at home, in church, among their communities. The fact that colleges currently have a near-monopoly on degrees that lead to jobs goes a long way toward explaining how they can continue raising prices every year.
The MOOC providers themselves are also moving in this direction. They’ve always offered credentials. In 2013, I completed a semester-long M.I.T. course in genetics through a nonprofit organization run by Harvard and M.I.T., called edX. You can see the proof of my credentials here and here.
Coursera, a for-profit MOOC platform, offers sequences of courses akin to college majors, followed by a so-called capstone project in which students demonstrate their skills and receive a verified certificate, for a fee of $470. The Coursera Data Science sequence is taught by Johns Hopkins University and includes nine four-week courses on topics like exploratory data analysis, regression models and machine learning. The capstone project requires students to build a data model and create visualizations to communicate their analysis. The certificate is officially endorsed by both Coursera and Johns Hopkins. EdX has similar programs.
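For readers curious what that kind of capstone work looks like, here is a toy Python sketch of the workflow described above: fit a simple model to data and produce a visualization that communicates the result. The data set, numbers and file name are invented for illustration; this is not material from the actual Johns Hopkins sequence.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(seed=0)

# Synthetic data: hours of study vs. exam score, with noise.
hours = rng.uniform(0, 10, size=50)
scores = 55 + 4.0 * hours + rng.normal(0, 5, size=50)

# "Build a data model": an ordinary least-squares line fit.
slope, intercept = np.polyfit(hours, scores, deg=1)

# "Create visualizations to communicate the analysis."
plt.scatter(hours, scores, label="observations")
xs = np.linspace(0, 10, 100)
plt.plot(xs, intercept + slope * xs, color="red",
         label=f"fit: score = {intercept:.1f} + {slope:.1f} x hours")
plt.xlabel("Hours of study")
plt.ylabel("Exam score")
plt.legend()
plt.savefig("capstone_fit.png")   # write the figure to disk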
Inevitably, there will be a lag between the creation of such new credentials and their widespread acceptance by employers and government regulators. H.R. departments know what a bachelor’s degree is. “Verified certificates” are something new. But employers have a powerful incentive to move in this direction: Traditional college degrees are deeply inadequate tools for communicating information.
The standard diploma has roughly the same amount of information that prisoners of war are required to divulge under the Geneva Conventions. College transcripts are a nightmare of departmental abbreviations, course numbers of indeterminate meaning, and grades whose value has been steadily eroded by their inflation.
This has the effect of reinforcing class biases that are already built into college admissions. A large and relatively open-access traditional public university might graduate the same overall number of great job candidates as a small, exclusive, private university — say, 200 each. But the public 200 may graduate alongside 3,000 other students, while the private 200 may have only 300 peers. Because diplomas and transcripts provide few means of reliably distinguishing the great from the rest, employers give a leg up to private college graduates who probably had some legs up to begin with.
The new digital credentials can solve this problem by providing exponentially more information. Think about all the work you did in college. Unless you’re a recent college graduate, how much of it was saved and archived in a way that you can access now? What about the skills you acquired in various jobs? Digital learning environments can save and organize almost everything. Here, in the “unlabeled” folder, are all of my notes, tests, homework, syllabus and grades from the edX genetics course. My “real” college courses, by contrast, are lost to history, with only an inscrutable abbreviation on a paper transcript suggesting that they ever happened at all.
Open credentialing systems allow people to control information about themselves — what they learned in college, and what they learned everywhere else — and present that data directly to employers. In a world where people increasingly interact over distances, electronically, the ability to control your online educational identity is crucial.
This does present a new challenge for employers, who will have to sift through all this additional information. College degrees, for all of their faults, are quick and easy to digest. Of course, processing large amounts of information is exactly what computers are good for. Scientists at Carnegie Mellon University are designing open badges that are “machine discoverable,” meaning that they are designed to be found by employers using search algorithms to locate people with specific skills.
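As a rough illustration of what “machine discoverable” could mean in practice, the Python sketch below takes a pool of published badge records and finds the candidates whose badges cover a required set of skills. The data and the matching rule are invented for this example; it is not Carnegie Mellon’s actual design.

from collections import defaultdict

badges = [
    {"recipient": "ada@example.com",  "skill": "regression models"},
    {"recipient": "ada@example.com",  "skill": "machine learning"},
    {"recipient": "finn@example.com", "skill": "exploratory data analysis"},
]

def find_candidates(badge_records, required_skills):
    # Return recipients whose badges include every required skill.
    skills_by_person = defaultdict(set)
    for record in badge_records:
        skills_by_person[record["recipient"]].add(record["skill"].lower())
    wanted = {skill.lower() for skill in required_skills}
    return [person for person, skills in skills_by_person.items()
            if wanted <= skills]

print(find_candidates(badges, ["regression models", "machine learning"]))
# -> ['ada@example.com']

A real system would match richer skill taxonomies and check each badge’s evidence, but the basic operation is an index lookup by a machine rather than a human reading stacks of transcripts.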
Protecting private, personal information is a big part of navigating the digital era. But people want certain kinds of information to be as public as possible — for example, that they are very good at specific jobs and would like to find an employer looking for such people. Companies such as LinkedIn are steadily building new tools for people to describe their employable selves. College degrees, by contrast, say little and never change.
In the long run, MOOCs will most likely be seen as a crucial step forward in the reformation of higher education. But their true impact won’t be felt until students and learners of all kinds have access to digital credentials that are also built for the modern world. Then they’ll be able to acquire skills and get jobs for a fraction of what colleges cost today.
