Learning to Think Like a Computer

By LAURA PAPPANO

In “The Beauty and Joy of Computing,” the course he helped conceive for nonmajors at the University of California, Berkeley, Daniel Garcia explains an all-important concept in computer science — abstraction — in terms of milkshakes.

“There is a reason when you go to the ‘Joy of Cooking’ and you want to make a strawberry milkshake, you don’t look under ‘strawberry milkshake,’ ” he said. Rather, there is a recipe for milkshakes that instructs you to add ice cream, milk and fruit of your choice. While earlier cookbooks may have had separate recipes for strawberry milkshakes, raspberry milkshakes and boysenberry milkshakes, eventually, he imagines, someone said, “Why don’t we collapse that into one milkshake recipe?”

“The idea of abstraction,” he said, “is to hide the details.” It requires recognizing patterns and distilling complexity into a precise, clear summary. It’s like the countdown to a space launch that runs through a checklist — life support, fuel, payload — in which each check represents perhaps 100 checks that have been performed.

Concealing layers of information makes it possible to get at the intersections of things, improving aspects of a complicated system without having to understand or grapple with each part. Abstraction allows advances without redesigning from scratch.
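
In code, the same move looks something like the sketch below, a hypothetical illustration rather than anything from the Berkeley course: three flavor-specific recipes collapse into one function, and the flavor becomes a parameter whose details the recipe no longer cares about.

```python
def make_milkshake(fruit):
    """One general recipe instead of one recipe per flavor:
    the steps stay fixed, and the fruit is a parameter."""
    ingredients = ["ice cream", "milk", fruit]
    return "blend: " + ", ".join(ingredients)

# Callers supply the detail; the recipe hides everything else.
print(make_milkshake("strawberry"))   # blend: ice cream, milk, strawberry
print(make_milkshake("boysenberry"))  # blend: ice cream, milk, boysenberry
```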


It is a cool and useful idea that, along with other cool and useful computer science ideas, has people itching to know more. It’s obvious that computers have become indispensable problem-solving partners, not to mention personal companions. But it’s suddenly not enough to be a fluent user of software interfaces. Understanding what lies behind the computer’s seeming magic now seems crucial. In particular, “computational thinking” is captivating educators, from kindergarten teachers to college professors, offering a new language and orientation to tackle problems in other areas of life.

This promise — as well as a job market hungry for coding — has fed enrollments in classes like the one at Berkeley, taken by 500 students a year. Since 2011, the number of computer science majors has more than doubled, according to the Computing Research Association. At Stanford, Princeton and Tufts, computer science is now the most popular major. More striking, though, is the appeal among nonmajors. Between 2005 and 2015, enrollment of nonmajors in introductory, mid- and upper-level computer science courses grew by 177 percent, 251 percent and 143 percent, respectively.

In the fall, the College Board introduced a new Advanced Placement course, Computer Science Principles, focused not on learning to code but on using code to solve problems. And WGBH, the PBS station in Boston, is using National Science Foundation money to help develop a program for 3- to 5-year-olds in which four cartoon monkeys get into scrapes and then “get out of the messes by applying computational thinking,” said Marisa Wolsky, executive producer of children’s media. “We see it as a groundbreaking curriculum that is not being done yet.”

Computational thinking is not new. Seymour Papert, a pioneer in artificial intelligence and an M.I.T. professor, used the term in 1980 to envision how children could use computers to learn. But Jeannette M. Wing, in charge of basic research at Microsoft and former professor at Carnegie Mellon, gets credit for making it fashionable. In 2006, on the heels of the dot-com bust and plunging computer science enrollments, Dr. Wing wrote a trade journal piece, “Computational Thinking.” It was intended as a salve for a struggling field.

“Things were so bad that some universities were thinking of closing down computer science departments,” she recalled. Some now consider her article a manifesto for embracing a computing mind-set.

Like any big idea, computational thinking invites disagreement, over its broad usefulness as well as what fits in the circle. Skills typically include recognizing patterns and sequences, creating algorithms, devising tests for finding and fixing errors, reducing the general to the precise and expanding the precise to the general.
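
A hypothetical example of those last skills, again mine rather than the article's, might look like this: a short algorithm that scans for the first repeated item, followed by tests designed to flush out the errors such code commonly hides.

```python
def first_repeat(items):
    """Return the first value that appears twice, or None.
    The algorithm: scan once, remembering everything seen so far."""
    seen = set()
    for item in items:
        if item in seen:
            return item
        seen.add(item)
    return None

# Devising tests for finding and fixing errors: each case targets
# a mistake a buggy version might make.
assert first_repeat([3, 1, 4, 1, 5]) == 1  # repeat in the middle
assert first_repeat([2, 2]) == 2           # repeat right away
assert first_repeat([1, 2, 3]) is None     # no repeat at all
assert first_repeat([]) is None            # empty input
```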

April 19, 2017