- “Computers are incredibly fast, accurate, and stupid. Human beings are incredibly slow, inaccurate, and brilliant. Together they are powerful beyond imagination.” (This quote is often mistakenly attributed to Albert Einstein; most likely the correct attribution is Leo Cherne at the Discover America Meeting, Brussels, June 27, 1968.)
The statement quoted above captures the essence of computational thinking. Computational thinking involves using the capabilities of one's (human) brain and the capabilities of computer (brains) to represent and solve problems and accomplish tasks. Education for computational thinking involves learning to make effective use of these two types of brains.
Here is a more recent description of computational thinking:
- Computational thinking is a way of solving problems, designing systems, and understanding human behavior that draws on concepts fundamental to computer science. Computational thinking is thinking in terms of abstractions, invariably multiple layers of abstraction at once. Computational thinking is about the automation of these abstractions. The automaton could be an algorithm, a Turing machine, a tangible device, a software system—or the human brain. (Carnegie Mellon, n.d.)
Human brains get better through informal and formal education and through regular use. Computer brains get better through the combined research and development of many thousands of people, and they are improving at a rapid pace. Thus, all students and all teachers need to learn and teach about the capabilities and limitations of the combination of human and computer brains.
Each type of brain has unique capabilities and limitations. Together they are incredibly powerful. Read more about this idea in the document Two Brains Are Better Than One.
Computational and Procedural Thinking
Many adjectives describe modes of thinking: Abstract, analytic, conceptual, concrete, convergent, creative, critical, deductive, divergent, strategic, synthetic, tactical…and computational and procedural.
Computational and procedural thinking are fundamental ideas in the discipline of computer and information science. A computer is a machine that can automatically, rapidly, and accurately carry out the steps in certain types of procedures. Computer programmers think in terms of solving problems and accomplishing tasks through the use of procedures. The procedures may be algorithmic or heuristic, or a combination of these two approaches.
- An algorithm is a step-by-step set of directions guaranteed to achieve a task, which may be to solve a particular problem, in a finite number of steps. You probably have memorized algorithms for adding a column of positive integers and for multiplying a pair of integers.
- A heuristic is like an algorithm except that accomplishment of a specific task or solution of a specific problem is not guaranteed. Many heuristics are called “rules of thumb,” simple-sounding guides that often conceal complexities.
For example, a heuristic for solving a complex problem is to break the problem into smaller, more manageable problems. Solve each of the smaller problems, put the results together, and the larger problem is solved.
Brainstorming is a group process heuristic for addressing a complex problem. In brainstorming, people suggest ideas and these are collected without comment by the person facilitating the brainstorming. Later, the group analyzes the brainstormed ideas, deciding on which ones are worthy of further study.
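The algorithm/heuristic distinction can be sketched in a few lines of code. This is a hypothetical illustration, not from the original text: one procedure is guaranteed to produce the right answer, while the greedy "rule of thumb" usually works but can fail.

```python
# An algorithm: guaranteed to achieve its task in a finite number of steps.
def column_sum(numbers):
    """Add a column of integers, one step at a time."""
    total = 0
    for n in numbers:
        total += n
    return total

# A heuristic: a greedy "rule of thumb" for making change with few coins.
# It works well for some coin systems but is NOT guaranteed to be optimal.
def greedy_change(amount, coins=(25, 10, 1)):
    result = []
    for c in sorted(coins, reverse=True):
        while amount >= c:
            amount -= c
            result.append(c)
    return result

# With coins (25, 10, 1), greedy gives 30 -> [25, 1, 1, 1, 1, 1] (6 coins),
# even though [10, 10, 10] (3 coins) is better: the heuristic can fail.
```

The greedy procedure illustrates why heuristics that sound simple often conceal complexities: whether it is optimal depends on the particular coin system.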
Developing algorithms and heuristics can be very mentally challenging. Here is an algorithm for looking up a definition for a word in a dictionary.
- Start at the first word defined in the dictionary and compare it with the word you are looking up. If they are not the same, go to the next word that is defined in the dictionary. Continue until you find the word, or until you have looked at every word defined in the dictionary. In the latter case, you know the word is not defined in the particular dictionary you are using.
This is, of course, a poor algorithm. Think about how you go about looking up a word in a dictionary. Try to write this process down so that someone else (such as a third grader) can follow your set of directions. You can see that it is often quite difficult to figure out how to write down an algorithm, and it is sometimes quite difficult to learn to use an algorithm.
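As a sketch of the contrast (assuming, purely for illustration, that the dictionary is stored as an alphabetized list of (word, definition) pairs), here is the scan-every-entry algorithm described above next to the narrowing-down strategy people actually use:

```python
# The "poor algorithm" from the text: start at the first entry and compare
# every word until a match is found or the dictionary is exhausted.
def linear_lookup(dictionary, word):
    for entry, definition in dictionary:
        if entry == word:
            return definition
    return None  # the word is not defined in this dictionary

# Closer to what people actually do: open near the middle and narrow down,
# exploiting the fact that the entries are in alphabetical order.
def binary_lookup(dictionary, word):
    low, high = 0, len(dictionary) - 1
    while low <= high:
        mid = (low + high) // 2
        entry, definition = dictionary[mid]
        if entry == word:
            return definition
        elif entry < word:
            low = mid + 1   # look in the later half
        else:
            high = mid - 1  # look in the earlier half
    return None
```

Both procedures are algorithms, but the second one cuts the remaining search space in half at each step, which is why it is so much faster on a large dictionary.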
Now, here is another approach.
- In the Google search engine, enter the term define: followed by the word that you want to look up. Very quickly, the search engine will provide you with some definitions or tell you that the word you want to find a definition for is not in its dictionary.
In this search engine approach, you (using your brain) decide about what word you want to find a definition for. Presumably you have some purpose in mind. Your brain directs your fingers to key the appropriate search information to the Google search engine. You then read the results, thinking about which of the definitions best fits your need.
This provides a good example of computational thinking. You use your physical and mental capabilities, and your sense of purpose, to work with the capabilities of the Web and the Google search engine to solve a problem. Note that it is quite a bit easier for a young student to learn to use the Google search engine approach than it is for the student to learn to look up a word in a paper dictionary—and it is a lot faster.
Computational thinking is an idea important to all people, not just computer scientists. That is, one need not be a computer professional to take advantage of the complementary capabilities and qualities of human and computer.
A Tidbit of Computer History
When electronic digital computers were first being developed starting in the late 1930s, people mainly thought of them as aids to doing arithmetic computations. Indeed, during World War II, large numbers of people spent their workdays running calculators, doing the calculations needed to support the war efforts. These people were called Computers.
The first full scale electronic digital computers built during and shortly after WWII were quite good at arithmetic computation. One electronic digital computer could do the work of several hundred human "Computers." Now, relatively inexpensive desktop computers are more than a million times as fast and a billion times as cost effective as the first commercially available electronic computers built in the early 1950s.
Although computational thinking is a relatively new term, the next two subsections make it clear that this is not a new idea.
1960 Historical Quote
In 1960, J.C.R. Licklider published his seminal paper Man-Computer Symbiosis. Notice the computational thinking ideas in the following quote, in which Licklider summarizes a personal analysis of how he spent his working time:
- About 85 per cent of my "thinking" time was spent getting into a position to think, to make a decision, to learn something I needed to know. Much more time went into finding or obtaining information than into digesting it. Hours went into the plotting of graphs, and other hours into instructing an assistant how to plot. When the graphs were finished, the relations were obvious at once, but the plotting had to be done in order to make them so. At one point, it was necessary to compare six experimental determinations of a function relating speech-intelligibility to speech-to-noise ratio. No two experimenters had used the same definition or measure of speech-to-noise ratio. Several hours of calculating were required to get the data into comparable form. When they were in comparable form, it took only a few seconds to determine what I needed to know.
- Throughout the period I examined, in short, my "thinking" time was devoted mainly to activities that were essentially clerical or mechanical: searching, calculating, plotting, transforming, determining the logical or dynamic consequences of a set of assumptions or hypotheses, preparing the way for a decision or an insight. Moreover, my choices of what to attempt and what not to attempt were determined to an embarrassingly great extent by considerations of clerical feasibility, not intellectual capability.
1962 Historical Quote
The following quote is from John Pfeiffer's 1962 book "The Thinking Machine":
- But above all it is important to remember that doing arithmetic, solving mathematical equations by sheer bulldozing power, is not the most significant of the machines' accomplishments. Computers are thinking aids of enormous potentialities. Merely having them around is enough to change the way we think, to force investigators in all fields to think through their problems along new lines. We are at the beginning of a trend that is certain to bring machines which not only learn, but which will accelerate the rate at which we ourselves learn. The revolution to come is difficult to appreciate fully. We only know that science, government, and industry will change swiftly and radically in the years ahead.
- Computers are too important to overrate or underrate. There is no real point in sensationalizing or exaggerating activities which are striking enough without embellishment. There is no point in belittling either. It is hardly an insult to existing computers that they fall considerably short of the human brain and are not creative. The difference simply emphasizes with new force the complexity and capabilities of the nervous system, and challenges us to study it as well as our machines more deeply. The more we learn about computers, the better we shall understand and appreciate the nature of thought - and the better we shall use our brains. (Pfeiffer, pp. 86-87)
Three BIG Ideas
The history of electronic digital computers includes the development of three important ideas:
- Computers could process both numeric and non-numeric symbols, and thus could be used for many tasks other than just arithmetic computation.
- Computer Science (CS) or Computer and Information Science (CIS) is an academic discipline and a significant area of study, research, and development.
- Information and Communication Technology (ICT) provides powerful aids to solving problems in every academic discipline. It is a major change agent in human societies throughout the world.
Many of the early CIS Departments in colleges and universities were formed by faculty who split off from the Mathematics Department, the Business School, or the College of Engineering. Initially, many computer scientists were interdisciplinary scholars, studying both CIS and deep applications of this new discipline in other disciplines.
Eventually, CIS grew in both breadth and depth, and it became an important discipline in its own right. Subdisciplines developed, such as analysis of algorithms, artificial intelligence, computability, databases, networking, and so on.
Moreover, computers became more and more cost effective, and the whole field of Information and Communication Technology (ICT) blossomed. Now a wide range of ICT products and services are routine, everyday parts of our lives. The widespread use of cell phones with a built-in digital camera provides a good example. Nowadays, many of these cell phones are also used to listen to recorded music, to view television, and to play games.
You know, of course, that reading and writing are both academic disciplines in their own right, and are also important aspects (components) of the other academic disciplines. A strong parallel exists between reading/writing and the overall computer field. We have the discipline of computer and information science, and we have computers becoming an important component of every other academic discipline.
A discipline is far more than a collection of isolated pieces. Learning a discipline and learning to use a discipline at a high level are far more than learning isolated facts, tools, and ideas. As a discipline grows and matures, its leaders give considerable thought to identifying unifying themes. Computational thinking is a unifying theme in the computer field and in uses of computers in every discipline.
Using a computer system involves telling the system what you want it to do. Of course, you need to think about what you want the computer to do. You need to think about the overall problem-solving task you are trying to accomplish and how the use of a computer might contribute to accomplishing it.
As mentioned, computational thinking refers to people and computers working together to solve problems and accomplish tasks. As Jeannette Wing, a highly respected computer scientist, stated:
- Computational thinking builds on the power and limits of computing processes, whether they are executed by a human or by a machine. Computational methods and models give us the courage to solve problems and design systems that no one of us would be capable of tackling alone. Computational thinking confronts the riddle of machine intelligence: What can humans do better than computers, and what can computers do better than humans? Most fundamentally it addresses the question: What is computable? Today, we know only parts of the answer to such questions.
(We should always remember that part of “better” lies within the essence of what it is to be a human and what a computer is. Humans have intrinsic purposes; a computer as such cannot.)
Jeannette Wing coined the term computational thinking while she was head of the Computer Science Department at Carnegie Mellon. Quoting from the home page of Carnegie Mellon's School of Computer Science:
- At Carnegie Mellon, computational thinking pervades our culture. In our research, computer science interacts with almost every other discipline on campus. Computational biology, computational chemistry, computational design, computational finance, computational linguistics, computational logic, computational mechanics, computational neuroscience, computational physics, and computational and statistical learning are just a few examples of such interdisciplinary fields of study. In our education, our undergraduate computer science curriculum and our outreach programs teach students how to think like a computer scientist. Our message is that computer science is not just about programming, but about thinking. Our long-term vision is to make computational thinking commonplace for everyone, not just computer scientists.
A computer program is a detailed step-by-step set of instructions that can be interpreted and carried out by a computer. A computer is a machine that can quickly and accurately follow (carry out, execute) the detailed step-by-step set of instructions in a computer program. Computer programmers design, write, and test computer programs—so they are deeply involved in doing computational thinking.
However, all computer users are involved in computational thinking at some level, as they interact with a computer and tell it what they want done. This is true whether you are playing a computer game, retrieving information from the Web, or using a word processor.
Modeling and Simulation
The underlying idea in computational thinking is developing models and simulations of problems that one is trying to study and solve. We are all familiar with the idea of developing mental models—we form mental representations of a problem and often we "play the mental images" in our heads, doing a mental simulation.
We are also all aware of the value of being able to develop a mathematical representation of a problem. For example, this is what is being taught through the use of word problems in a math course. The value of math modeling lies in the huge accumulation of knowledge about solving a wide range of different math problems. If a problem can be represented mathematically (that is, if a math model can be developed for a problem) then this might well prove to be a powerful aid to solving the problem.
Suppose, for example, I am looking at a crowd in a sports stadium. I see just nine empty seats in a section that has 36 seats per row and 17 rows. How many people are seated in this section? That's easy enough. A math model for this problem is 36 x 17 - 9. Notice that the model is a pure math model. It does not say anything about people, what sport is being played, how comfortable the seats are, and so on. Notice also that this may be an incorrect math model. For example, it does not take into consideration the possibility of adults holding children on their laps or of an extremely large person filling two seats.
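The pure math model can be handed directly to a computer to evaluate. This trivial sketch just makes the division of labor concrete: the human brain builds the model, and the computer brain carries out the arithmetic.

```python
seats_per_row = 36
rows = 17
empty_seats = 9

# The human supplies the model; the computer carries out the computation.
people_seated = seats_per_row * rows - empty_seats
print(people_seated)  # 603
```

Notice that the program, like the math model it encodes, says nothing about people, the sport, or lap-held children; those limitations live in the model, not in the machine.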
A more complex example: A person standing on the earth throws a stone (a baseball player throws a baseball; an artillery piece shoots a projectile). What pathway will the stone (baseball, projectile) follow and how far horizontally will it go? This is a relatively complex question because an answer depends on characteristics of the projectile, initial velocity, height above the ground the stone is released, initial angle, air resistance, and gravity.
This problem has been formally studied extensively for hundreds of years by physicists and mathematicians. (Hominids have been studying the problem less formally for two million plus years.) If there is no air resistance and the gravity is a constant, then the pathway is a parabola (a quadratic function). The horizontal distance traveled can be calculated by finding roots of a quadratic equation and doing some simple arithmetic.
A math model of the "real world" problem is far more complex. A computer program to solve this problem can be thought of as a computer model (a physics model, a math model) for this problem.
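A minimal sketch of such a computer model follows. The drag coefficient, time step, and launch values below are illustrative assumptions, not from the text: the idea is to step the motion forward in small time increments with a crude air-resistance term and compare the result with the closed-form no-resistance parabola.

```python
import math

def range_no_drag(v0, angle_deg, g=9.81):
    """Horizontal range of a projectile launched from ground level,
    ignoring air resistance: the classic parabola result."""
    theta = math.radians(angle_deg)
    return v0 ** 2 * math.sin(2 * theta) / g

def range_with_drag(v0, angle_deg, g=9.81, k=0.02, dt=0.001):
    """Step the motion forward numerically with a simple drag force
    proportional to speed. Deliberately crude: k and dt are
    illustrative assumptions, chosen only to show the modeling idea."""
    theta = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(theta), v0 * math.sin(theta)
    while y >= 0.0:                    # stop when the projectile lands
        speed = math.hypot(vx, vy)
        vx -= k * speed * vx * dt      # drag opposes horizontal motion
        vy -= (g + k * speed * vy) * dt  # gravity plus vertical drag
        x += vx * dt
        y += vy * dt
    return x

# Air resistance shortens the range relative to the ideal parabola.
```

Running `range_no_drag(30, 45)` against `range_with_drag(30, 45)` shows the drag model landing well short of the ideal parabola, which is exactly the kind of effect the interactive program mentioned below lets students see graphically.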
An excellent, easy-to-use computer program to solve this problem is available at http://csip.cornell.edu/curriculum_resources/CSIP/Bernier&Fogleman/projectile.html. The graphical representation of solutions to this problem makes it easy to see the effects of air resistance and the effects of shooting a projectile at different angles.
Each discipline has its own special vocabulary and notational system for modeling the types of problems it addresses. Learning to think in the vocabulary and notation of a discipline is a key aspect of developing a high level of expertise in the discipline.
Computer models have some of the characteristics of mental modeling as well as some of the characteristics of math modeling and the types of modeling done in other disciplines. If a problem lends itself to computer modeling, then the computer may well be able to carry out the steps (procedures, symbol manipulations) needed to solve the problem.
Thus, computational thinking, integrating human thinking with the capabilities of computers, provides a powerful new way to solve problems. The computer aspects of computational thinking require one to know capabilities and limitations of computers and how one communicates with (interacts with) a computer system. From an educational point of view, a key aspect of studying any discipline now includes:
- Learning some of the capabilities and limitations of computers as an aid to representing and solving the problems of the discipline.
- Learning how to actually make use of these computer capabilities.
- Learning how to think about problems in the discipline both from a traditional point of view and from a point of view of possible uses of computers to help solve the problem.
Effects in Different Disciplines
Computers have had a much bigger impact in some disciplines than in others. For example, you are well aware of the use of music synthesizers and the digital representation of music. The music industry has been significantly changed by computers. Nowadays, computers and computational thinking are key parts of the overall discipline of music. However, this does not mean that older components of the discipline of music have gone away. The discipline of music has done a relatively good job of merging the old with the new.
Similarly, you are familiar with digital still and video cameras, and the computer editing of pictures and video. Computer graphics has greatly changed the movie and graphic arts industries. As with music, the graphic arts have done a good job of merging the old with the new. Nowadays, you can easily see this in many of the videos you see on television and at movie theaters.
Probably you write using a word processor. Writing is a challenging mental task. For thousands of years, people have learned to write using relatively simple tools. The invention of the typewriter proved useful to many people, as it increased the legibility of their writing and the speed of getting their thoughts down on paper.
Now we have word processing and desktop publication. As you use a keyboard or voice input to your computer system, you are creating a computer model of the writing task you are attempting to accomplish. This might be done in conjunction with adding pictures taken with a digital camera and graphics developed using computer graphics software. As you write, your word processor may be doing a search for possible misspelled words and possible errors in grammar. You may be making use of a computerized dictionary or thesaurus. You may be using the Web to find materials. In addition, you probably take advantage of the editing features of a word processor, such as moving sentences and paragraphs. Finally, you may use the desktop publication capabilities of a computer system to produce a nicely laid out and formatted final document, which may or may not appear as ink on paper.
As a final example, consider math. The development of writing allowed for the development of mathematical notation and eventually the development of "paper and pencil" computational algorithms. For thousands of years, math education has included a strong emphasis on learning the words and symbols in the language of mathematics, learning to represent problems in this mathematics language, and learning to do the types of mental and paper and pencil computations needed in solving commonly occurring types of problems. Note that the terms "computation" and "algorithm" apply to arithmetic, but they also apply to algebra, calculus, statistics, and so on.
The search for aids to this symbol manipulation (the computations based on algorithmic and heuristic procedures) has been going on for many thousands of years. The abacus was a very important early success—an aid many use today. Math tables, logarithms, the slide rule, and mechanical calculators were all quite important aids to arithmetic computation.
Now, we have electronic digital calculators and computers. We have quite inexpensive solar battery-powered scientific calculators that are superb improvements over earlier aids to numerical computations. We have computers that can do the types of symbolic manipulation needed in many different algebraic, calculus, and statistics computations.
These computer systems can graph functions and statistical data, but they can also do graphical displays of the types of models used in many other disciplines. For example, an architect can develop a computer model of a planned building. The computer can do computations to test the structural integrity, fire and storm resistance, heating and cooling requirements, and so on. A computer system can provide users with a three dimensional structured walk-through of a planned building before physical work begins. These are all humongous computational tasks, based on accumulated knowledge in math, physics, and other disciplines.
Prensky, Marc (February/March 2009). H. Sapiens Digital: From Digital Immigrants and Digital Natives to Digital Wisdom. Innovate: Journal of Online Education. Retrieved 6/109: http://www.innovateonline.info/index.php?view=article&id=705&action=article. (Free registration.)
The five-point cognitive development scale given below is sometimes called the Clarke scale, after Arthur C. Clarke (see the quote given above).
Many people use the term information to stand for the data, information, knowledge part of this scale. Others use it for the data, information, knowledge, wisdom part of the scale. Thus, information as in Information Age has different meanings to different people.
The following quotes are from the Prensky article referenced above:
- In 2001, I published "Digital Natives, Digital Immigrants," a two-part article that explained these terms as a way of understanding the deep differences between the young people of today and many of their elders (Prensky 2001a, 2001b). Although many have found the terms useful, as we move further into the 21st century when all will have grown up in the era of digital technology, the distinction between digital natives and digital immigrants will become less relevant. Clearly, as we work to create and improve the future, we need to imagine a new set of distinctions. I suggest we think in terms of digital wisdom.
- Digital technology, I believe, can be used to make us not just smarter but truly wiser. Digital wisdom is a twofold concept, referring both to wisdom arising from the use of digital technology to access cognitive power beyond our innate capacity and to wisdom in the prudent use of technology to enhance our capabilities. Because of technology, wisdom seekers in the future will benefit from unprecedented, instant access to ongoing worldwide discussions, all of recorded history, everything ever written, massive libraries of case studies and collected data, and highly realistic simulated experiences equivalent to years or even centuries of actual experience. How and how much they make use of these resources, how they filter through them to find what they need, and how technology aids them will certainly play an important role in determining the wisdom of their decisions and judgments. Technology alone will not replace intuition, good judgment, problem-solving abilities, and a clear moral compass. But in an unimaginably complex future, the digitally unenhanced person, however wise, will not be able to access the tools of wisdom that will be available to even the least wise digitally enhanced human.
Humans and computer systems, working and "thinking" together, are now routine approaches to problem solving in many different academic disciplines. In the examples provided above, the power of computers has substantially changed large parts of various disciplines.
Computer hardware capabilities are continuing a rapid pace of improvement (see Moore's Law: http://iae-pedia.org/What_the_Future_is_Bringing_Us#Moore.27s_Law). A steadily increasing range of problems can be solved through computer modeling and making effective use of the capabilities of computers. A steadily increasing amount of the accumulated knowledge of humans is being put into a form in which a computer system can do part of the work of making use of the information. (Note, however, that software capabilities have increased much more slowly than hardware capabilities, in part because software capabilities are often qualitative.)
Thus, there are strong arguments for helping students to learn to think both in terms of the capabilities of their own brains and also in terms of the capabilities of computers. Computational thinking is a key component of a modern education. The rapid pace of increase in capabilities and decrease in cost of ICT facilities has proven to be a major challenge to our formal K-12 and higher education systems. Nowadays, every teacher, at every level and in every academic discipline, is faced by the challenges of how computers are affecting the content, instructional processes, and assessment in the subjects they teach.
Apple iPhoto (n.d.). Retrieved 2/10/09: http://www.apple.com/welcomescreen/ilife09/iphoto/play/.
- Watch a video that is essentially an ad for Apple's iPhoto. As you watch this, think about computational thinking, the nature of the formal education needed to make effective use of iPhoto, the problems it can solve for you, and so on.
- Software like iPhoto is an integral component of now and the future. From the developer's point of view, the idea is to integrate a relatively complete "solution" for a certain category of problems and tasks that a person might have—and to do so in a manner that requires very little time and effort to learn how to use at a personally satisfying level.
Carnegie Mellon (n.d.). Center for Computational Thinking. Retrieved 3/16/08: http://www.cs.cmu.edu/~CompThink/.
Columbia University (7/35/08). Scientists open Columbia's new Computational Biology lab. Retrieved 8/8/08: http://www.columbia.edu/cu/news/research/compbio.html. Quoting from the article:
- In early May scientists at Columbia University gathered in room 607 of the Sherman Fairchild building on Morningside campus to celebrate the new Pe'er/Bussemaker Lab for Systems Biology—the first of its kind at the University. The goal of the lab is to develop and apply complex tools that can probe and derive meaning from mountains of data now being created in the rapidly expanding field of systems and computational biology.
- Systems and computational biology is the meeting point between modern molecular biology and new research techniques emerging from the engineering, computer science, chemistry, mathematics, statistics and physics fields. It has the potential to allow scientists to pose limitless questions about how our cells work and issues related to general human health: the study of gene networks, analysis of protein shapes, prediction of biological function and understanding how a cell processes signals.
Computer Science Unplugged (n.d.). Retrieved 9/13/07: http://csunplugged.com/. This material is primarily aimed at precollege students and their teachers. Quoting from the Website:
- Computer Science Unplugged is a collection of activities designed to teach the fundamentals of computer science without requiring a computer. Because they're independent of any particular hardware or software, Unplugged activities can be used anywhere, and the ideas they contain will never go out of date. Unplugged activities have been trialled and refined over 15 years in classrooms and out-of-school programmes around the world.
Concord Consortium Blog (11/4/2007). Posted by Paul Horwitz. Evolution: a Powerful Model but a Fragile One. Retrieved 1/25/2009: http://blog.concord.org/archives/19-Evolution-a-Powerful-Model-but-a-Fragile-One.html.
Quoting from the first part of the document:
- The word "model" means a lot of different things to different people. A model airplane looks like a real airplane, only smaller; a paper airplane flies like a real airplane, only not as far or as fast. Both are models, neither is the kind of model I have in mind.
- For the purposes of this discussion I'm defining a model as a description of a phenomenon in terms of things that can't be seen, felt, or heard, but that explain what's going on. Models may involve things that are too small to be seen, or too big; processes that take place too slowly or too fast. The plate tectonics model, for instance, informs us that the Himalayas are being formed, even as we speak, by the earth crumpling like a car fender, as India crashes (rather slowly, to be sure) into Asia.
- Science is all about models of this kind, and an important goal of science education -- and of the Concord Consortium -- is to give students some examples of models and show them how to use those models to make predictions, to guide experimentation, and generally to make sense out of their own and other people's observations and experiments. Scientific models are constantly subject to revision as new experiments are performed, new data collected, and new interpretations advanced to explain existing data. It is important, therefore, that we teach our students about this process as well, giving them the sense that science is perpetually a work in progress, rather than a set of unchanging "facts."
Moursund, David (2006). Computational thinking and math maturity: Improving math education in K-8 schools. Eugene, OR: Information Age Education. Access at http://i-a-e.org/downloads/doc_download/3-computational-thinking-and-math-maturity-improving-math-education-in-k-8-schools.html.
This book addresses the problem that our K-8 school math education system is not as successful as many people would like it to be, and not as successful as it could be. It is designed as supplementary material for use in a Math Methods course for preservice K-8 teachers. However, it can also be used by inservice K-8 teachers and by students enrolled in Math for Elementary and Middle School Teachers courses.
NSF (9/9/08). Cyber-Enabled Discovery and Innovation (CDI). Retrieved 10/7/08: http://www.nsf.gov/pubs/2008/nsf08604/nsf08604.htm?govDel=USNSF_25. Quoting from the NSF program announcement Website:
- Synopsis of Program:
- Cyber-Enabled Discovery and Innovation (CDI) is NSF’s bold five-year initiative to create revolutionary science and engineering research outcomes made possible by innovations and advances in computational thinking. Computational thinking is defined comprehensively to encompass computational concepts, methods, models, algorithms, and tools. Applied in challenging science and engineering research and education contexts, computational thinking promises a profound impact on the Nation’s ability to generate and apply new knowledge. Collectively, CDI research outcomes are expected to produce paradigm shifts in our understanding of a wide range of science and engineering phenomena and socio-technical innovations that create new wealth and enhance the national quality of life.
- CDI seeks ambitious, transformative, multidisciplinary research proposals within or across the following three thematic areas:
- From Data to Knowledge: enhancing human cognition and generating new knowledge from a wealth of heterogeneous digital data;
- Understanding Complexity in Natural, Built, and Social Systems: deriving fundamental insights on systems comprising multiple interacting elements; and
- Building Virtual Organizations: enhancing discovery and innovation by bringing people and resources together across institutional, geographical and cultural boundaries.
This CDI initiative is an effort to use Federal funds (via the NSF) to help invent the future of science and technology.
NSF (9/3/2008). NSF Funds New Center to Bring Together Biologists, Mathematicians. Power of mathematics and modeling to be applied to large-scale questions in biology. Retrieved 9/3/2008: http://www.nsf.gov/news/news_summ.jsp?cntn_id=112167&govDel=USNSF_51.
NSF (1/14/09). CISE Pathways to Revitalized Undergraduate Computing Education (CPATH). Retrieved 1/14/09: http://www.nsf.gov/pubs/2009/nsf09528/nsf09528.html?govDel=USNSF_25. Quoting from the RFP:
- Computing has permeated and transformed almost all aspects of modern life. As computing becomes more important in all sectors of society, so does the preparation of a globally competitive U.S. workforce able to apply core computing concepts, methods, technologies, and tools - referred to here as Computational Thinking (CT) - to a broad range of societal challenges and opportunities.
- CT capitalizes on concepts, methods, technologies, and tools fundamental to the fields of computing, i.e. computer and information science and engineering. For example, computing concepts and methods equip us to reason at multiple levels of abstraction simultaneously, to think algorithmically and apply foundational mathematical concepts to solve complex problems, and to understand the dimensions and consequences of scale. However, it is only when computing concepts and methods are combined with the power of automation afforded by contemporary computing technologies and tools that the full potential of CT is unleashed. Drawing deeply on computational concepts, methods, technologies and tools, CT serves as a powerful strategy to more effectively design, understand and solve problems associated with complex systems in many aspects of modern life.
Pfeiffer, John (1962). The Thinking Machine. Columbia Broadcasting System.
Pinker, Steven (2008). The computational theory of mind (11 minute video). Retrieved 4/13/08: http://www.youtube.com/watch?v=LVrb5ClvDho.
Wing, Jeannette M. (March 2006). Computational thinking. Communications of the Association for Computing Machinery. Retrieved 9/11/07: http://www.cs.cmu.edu/afs/cs/usr/wing/www/publications/Wing06.pdf. See also (retrieved 9/11/07): http://www.post-gazette.com/pg/07086/772791-96.stm.
Links to Other IAE Resources
This is a collection of IAE publications related to the IAE document you are currently reading. It is not updated very often, so important recent IAE documents may be missing from the list.
This component of the IAE-pedia is a work in progress. If there are few entries in the next four subsections, that is because the links have not yet been added.
IAE-pedia (IAE's Wiki)
Popular IAE Wiki Pages. Readers who enjoy this document may enjoy:
I-A-E Books and Miscellaneous Other
This page was initially developed by David Moursund and Dick Ricketts.