- 1 Introduction
- 2 Computational and Procedural Thinking
- 3 Computational Thinking
- 4 Some Aspects of Computational Thinking
- 5 Effects in Different Disciplines
- 6 Wisdom and Digital Wisdom
- 7 Summary
- 8 References and Added Resources
- 9 Links to Other IAE Resources
- 10 Authors
- Computers are incredibly fast, accurate, and stupid. Human beings are incredibly slow, inaccurate, and brilliant. Together they are powerful beyond imagination. (This quote is often mistakenly attributed to Albert Einstein; most likely the correct attribution is Leo Cherne at the Discover America Meeting, Brussels, June 27, 1968.) Ben Shomate's 2008 article, Einstein Never Said That, explores this question.
Special Announcement: As of July 1, 2015 the Google Computational Thinking for Educators online course is now available. Information about it appears in the Google for Education blog: http://googleforeducation.blogspot.com/2015/07/computational-thinking-for-educators.html. Quoting from the website:
- Our new online course, Computational Thinking for Educators, is free and is intended for educators working with students between the ages of 13 and 18 who are interested in enhancing their teaching with creative thinking and problem solving. We’ll demonstrate how incorporating computational thinking into your classroom simply enhances what you already do, enriching your lessons and student exploration, even without access to technology. Another benefit to computational thinking is that it may help boost students’ confidence and is especially useful when dealing with ambiguous, complex or open-ended problems.
The statement by Leo Cherne quoted above captures the essence of computational thinking. Computational thinking involves using the capabilities of one's (human) brain and the capabilities of computer (brains) to represent and solve problems and accomplish tasks. Education for computational thinking involves learning to make effective use of these two types of brains.
Here is a more detailed description of computational thinking:
- Computational thinking is a way of solving problems, designing systems, and understanding human behavior that draws on concepts fundamental to computer science. Computational thinking is thinking in terms of abstractions, invariably multiple layers of abstraction at once. Computational thinking is about the automation of these abstractions. The automaton could be an algorithm, a Turing machine, a tangible device, a software system—or the human brain. (Carnegie Mellon, n.d.)
Human brains get better through informal and formal education and through regular use. Computer brains get better through the combined research and development of many thousands of people. Computer brains are getting better at a rapid pace. Thus, all students and all teachers need to learn about the capabilities and limitations of the combination of human and computer brains.
Each type of brain has unique capabilities and limitations. Together they are incredibly powerful. Read more about this idea in the document Two Brains Are Better Than One.
Computational and Procedural Thinking
Many adjectives describe modes of thinking: abstract, analytic, conceptual, concrete, convergent, creative, critical, deductive, divergent, strategic, synthetic, tactical, and also computational and procedural.
Computational and procedural thinking are fundamental ideas in the discipline of computer and information science. A computer is a machine that automatically, rapidly, and accurately carries out the steps in certain types of procedures. Computer programmers think in terms of solving problems and accomplishing tasks through the use of procedures. The procedures may be algorithmic or heuristic, or a combination of these two approaches.
- An algorithm is a step-by-step set of directions guaranteed to achieve a task, which may be to solve a particular problem in a finite number of steps. You probably have memorized algorithms for adding a column of positive integers and for multiplying a pair of integers.
- A heuristic is like an algorithm except that the accomplishment of a specific task or solution of a specific problem is not guaranteed. Many heuristics are called “rules of thumb,” simple-sounding guides that often conceal complexities.
For example, one heuristic for solving a complex problem is to break the problem into smaller, more manageable problems. Solve each of the smaller problems, put the results together, and the larger problem is solved. But, there is no guarantee that one will be able to solve all of the smaller problems, and there is no guarantee that one can figure out how to break the large problem into appropriate pieces.
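When the decomposition is guaranteed to work, breaking a problem into pieces becomes an algorithm rather than a heuristic. The following sketch (in Python, chosen here only for readability) sums a long list of numbers by splitting the list in half, solving each half, and combining the results:

```python
def divide_and_conquer_sum(numbers):
    """Sum a list by breaking it into two smaller problems."""
    if len(numbers) == 0:       # smallest problem: nothing to add
        return 0
    if len(numbers) == 1:       # small enough to solve directly
        return numbers[0]
    middle = len(numbers) // 2
    # Solve the two smaller problems, then combine the results.
    left_total = divide_and_conquer_sum(numbers[:middle])
    right_total = divide_and_conquer_sum(numbers[middle:])
    return left_total + right_total

print(divide_and_conquer_sum([3, 1, 4, 1, 5, 9, 2, 6]))  # prints 31
```

For summing, every smaller problem is guaranteed to be solvable, which is what makes this an algorithm. For most real-world problems, no such guarantee exists, and break-it-into-pieces remains a heuristic.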
Brainstorming is a group process heuristic for addressing a complex problem. In brainstorming, people suggest ideas and these are collected without comment by the person facilitating the brainstorming. Later, the group analyzes the brainstormed ideas, deciding on which ones are worthy of further study. Brainstorming is often a useful process (heuristic), but there is no guarantee that it will lead to a good solution to the problem under consideration.
Developing algorithms and heuristics can be very mentally challenging. Here is an algorithm for looking up a definition for a word in a dictionary.
- Start at the first word defined in the dictionary and compare it with the word you are looking up. If they are not the same, go to the next word that is defined in the dictionary. Continue until you find the word, or until you have looked at every word defined in the dictionary. In the latter case, you know the word is not defined in the particular dictionary you are using.
This is, of course, a poor algorithm. Think about how you go about looking up a word in a dictionary. Try to write this process down so that someone else (such as a third grader) can follow your set of directions. You can see that it is often quite difficult to figure out how to write an algorithm clearly, and it is sometimes quite difficult to learn to use an algorithm.
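The poor algorithm above is what computer scientists call a linear search. The way people actually use a paper dictionary is closer to a binary search: open near the middle, decide which half must contain the word, and repeat. A minimal sketch in Python (the word list is a made-up example):

```python
def linear_search(words, target):
    """The 'poor' algorithm: check every entry from the beginning."""
    for word in words:
        if word == target:
            return True
    return False

def binary_search(words, target):
    """Closer to how people use a dictionary: repeatedly halve the
    search range. Requires the words to be in sorted (dictionary) order."""
    low, high = 0, len(words) - 1
    while low <= high:
        middle = (low + high) // 2
        if words[middle] == target:
            return True
        elif words[middle] < target:
            low = middle + 1      # target must be in the later half
        else:
            high = middle - 1     # target must be in the earlier half
    return False

words = ["apple", "banana", "cherry", "grape", "mango", "peach"]
print(linear_search(words, "mango"))   # True
print(binary_search(words, "mango"))   # True
print(binary_search(words, "kiwi"))    # False
```

For a dictionary of n entries, the linear search may examine all n words, while the binary search needs only about log2(n) probes—roughly 17 probes for 100,000 entries.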
Now, here is another approach.
- In the Google search engine, enter the term define: followed by the word that you want to look up. Very quickly, the search engine will provide you with some definitions or tell you that the word you want to find a definition for is not in its dictionary.
In this search engine approach, you (using your brain) decide on the word you want to find a definition for. Presumably you have some purpose in mind. Your brain directs your fingers to key the appropriate search information into the Google search engine. You then read the results, thinking about which definition best fits your need.
This provides a good example of computational thinking. You use your physical and mental capabilities, and your sense of purpose, to work with the capabilities of the Web and the Google search engine to solve a problem. Note that it is quite a bit easier for a young student to learn to use the Google search engine approach than it is for the student to learn to look up a word in a paper dictionary—and it is a lot faster.
Computational thinking is an idea important to all people, not just computer scientists. That is, one need not be a computer professional to take advantage of the complementary capabilities and qualities of human and computer brains.
A Tidbit of Computer History
When electronic digital computers were first being developed starting in the late 1930s, people mainly thought of them as aids to doing arithmetic computations. Indeed, during World War II, large numbers of people spent their workdays running calculators, doing the calculations needed to support the war efforts. These people were called Computers.
The first full-scale electronic digital computers built during and shortly after WWII were quite good at arithmetic computation. One electronic digital computer could do the work of several hundred human "Computers." Today, relatively inexpensive desktop, laptop, and tablet computers are more than a million times as fast and a billion times as cost effective as the first commercially available electronic computers built in the early 1950s.
Although computational thinking is a relatively new term, the next two subsections make it clear that this is not a new idea.
1960 Historical Quote
In 1960, J.C.R. Licklider published his seminal paper Man-Computer Symbiosis. Notice the computational thinking ideas in the following quote, in which Licklider summarizes a personal analysis of how he spent his working time:
- About 85 per cent of my "thinking" time was spent getting into a position to think, to make a decision, to learn something I needed to know. Much more time went into finding or obtaining information than into digesting it. Hours went into the plotting of graphs, and other hours into instructing an assistant how to plot. When the graphs were finished, the relations were obvious at once, but the plotting had to be done in order to make them so. At one point, it was necessary to compare six experimental determinations of a function relating speech-intelligibility to speech-to-noise ratio. No two experimenters had used the same definition or measure of speech-to-noise ratio. Several hours of calculating were required to get the data into comparable form. When they were in comparable form, it took only a few seconds to determine what I needed to know.
- Throughout the period I examined, in short, my "thinking" time was devoted mainly to activities that were essentially clerical or mechanical: searching, calculating, plotting, transforming, determining the logical or dynamic consequences of a set of assumptions or hypotheses, preparing the way for a decision or an insight. Moreover, my choices of what to attempt and what not to attempt were determined to an embarrassingly great extent by considerations of clerical feasibility, not intellectual capability.
1962 Historical Quote
The following quote is from John Pfeiffer's 1962 book The Thinking Machine:
- But above all it is important to remember that doing arithmetic, solving mathematical equations by sheer bulldozing power, is not the most significant of the machines' accomplishments. Computers are thinking aids of enormous potentialities. Merely having them around is enough to change the way we think, to force investigators in all fields to think through their problems along new lines. We are at the beginning of a trend that is certain to bring machines which not only learn, but which will accelerate the rate at which we ourselves learn. The revolution to come is difficult to appreciate fully. We only know that science, government, and industry will change swiftly and radically in the years ahead.
- Computers are too important to overrate or underrate. There is no real point in sensationalizing or exaggerating activities which are striking enough without embellishment. There is no point in belittling either. It is hardly an insult to existing computers that they fall considerably short of the human brain and are not creative. The difference simply emphasizes with new force the complexity and capabilities of the nervous system, and challenges us to study it as well as our machines more deeply. The more we learn about computers, the better we shall understand and appreciate the nature of thought - and the better we shall use our brains. (Pfeiffer, pp. 86-87)
Notice that the first paragraph mentions the idea of machine learning. Machine learning has grown to be an important component of the field of Artificial Intelligence. See http://en.wikipedia.org/wiki/Machine_learning. This article includes the information: "In 1959, Arthur Samuel defined machine learning as a 'Field of study that gives computers the ability to learn without being explicitly programmed'."
Three Big Ideas
The history of electronic digital computers includes the development of three important ideas:
- Computers could process both numeric and non-numeric symbols, and thus could be used for many tasks other than just arithmetic computation.
- Computer Science (CS) or Computer and Information Science (CIS) is an academic discipline and a significant area of study, research, and development.
- Information and Communication Technology (ICT) provides powerful aids to solving problems in every academic discipline. It is a major change agent in human societies throughout the world.
Many of the early CIS Departments in colleges and universities were formed by faculty who split off from the Mathematics Department, the Business School, or the College of Engineering. Initially, many computer scientists were interdisciplinary scholars, studying both CIS and deep applications of this new discipline in other disciplines.
Eventually, CIS grew in both breadth and depth, and it became an important discipline in its own right. Sub-disciplines were developed such as analysis of algorithms, artificial intelligence, computability, databases, networking, and so on.
Moreover, computers became more and more cost effective, and the whole field of Information and Communication Technology (ICT) blossomed. Now a wide range of ICT products and services are routine, everyday parts of our lives. The widespread use of cell phones with a built-in digital camera provides a good example. Nowadays, many of these "smart" cell phones are also used to communicate via email or texting, to search the Web, to listen to recorded music, to view television, to play games, and so on.
You know, of course, that reading and writing are both academic disciplines in their own right, and are also important aspects (components) of the other academic disciplines. A strong parallel exists between reading/writing and the overall computer field. We have the discipline of computer and information science, and we have computers becoming an important component of every other academic discipline.
A discipline is far more than a collection of isolated pieces. Learning a discipline and learning to use or apply a discipline at a high level are far more than learning isolated facts, tools, and ideas. As a discipline grows and matures, its leaders give considerable thought to identifying unifying themes. Computational thinking is a unifying theme in the computer field and in the uses of computers in every discipline.
Using a computer system involves telling the system what you want it to do. So, to get started you first need to think about what you are trying to accomplish and what parts of the task the computer can help with. You need to understand the capabilities and limitations of the computer system that will be relevant to addressing the problem that you have in mind.
As mentioned, computational thinking refers to people and computers working together to solve problems and accomplish tasks. As Jeannette Wing, a highly respected computer scientist, stated:
- Computational thinking builds on the power and limits of computing processes, whether they are executed by a human or by a machine. Computational methods and models give us the courage to solve problems and design systems that no one of us would be capable of tackling alone. Computational thinking confronts the riddle of machine intelligence: What can humans do better than computers, and what can computers do better than humans? Most fundamentally it addresses the question: What is computable? Today, we know only parts of the answer to such questions (Wing, 2006).
(We should always remember that part of “better” mentioned above lies within the essence of what it is to be a human and what a computer is. Humans have intrinsic purposes; a computer as such cannot.)
- Learn more about Jeannette Wing in a 23-minute 2013 video available at http://www.youtube.com/watch?v=NqqXmFsPkZw.
Jeannette Wing coined the term computational thinking while she was head of the Computer Science Department at Carnegie Mellon. Quoting from the home page of Carnegie Mellon's School of Computer Science:
- At Carnegie Mellon, computational thinking pervades our culture. In our research, computer science interacts with almost every other discipline on campus. Computational biology, computational chemistry, computational design, computational finance, computational linguistics, computational logic, computational mechanics, computational neuroscience, computational physics, and computational and statistical learning are just a few examples of such interdisciplinary fields of study. In our education, our undergraduate computer science curriculum and our outreach programs teach students how to think like a computer scientist. Our message is that computer science is not just about programming, but about thinking. Our long-term vision is to make computational thinking commonplace for everyone, not just computer scientists.
Some Aspects of Computational Thinking
A computer program is a detailed step-by-step set of instructions that can be interpreted and carried out by a computer. A computer is a machine that can quickly and accurately follow (carry out, execute) the detailed step-by-step set of instructions in a computer program. Computer programmers design, write, and test computer programs—so they are deeply involved in doing computational thinking.
However, all computer users are involved in computational thinking at some level, as they interact with a computer and tell it what they want done. This is true whether you are playing a computer game, retrieving information from the Web, or using a word processor.
Modeling and Simulation
The underlying idea in computational thinking is developing models and simulations of problems that one is trying to study and solve. We are all familiar with the idea of developing mental models—we form mental representations of a problem and often we "play the mental images" in our heads, doing a mental simulation.
We are also all aware of the value of being able to develop a mathematical representation of a problem. For example, this is what is being taught through the use of word problems in a math course. The value of math modeling lies in the huge accumulation of knowledge about solving a wide range of different math problems. If a problem can be represented mathematically (that is, if a math model can be developed for a problem) then this might well prove to be a powerful aid to solving the problem.
Suppose, for example, I am looking at a crowd in a sports stadium. I see just nine empty seats in a section that has 36 seats per row and 17 rows. How many people are seated in this section? That's easy enough. A math model for this problem is 36 x 17 - 9. Notice that the model is a pure math model. It does not say anything about people, what sport is being played, how comfortable the seats are, and so on. Notice also that this may be an incorrect math model. For example, it does not take into consideration the possibility of adults holding children on their laps or of an extremely large person filling two seats.
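The stadium model is simple enough to state as a one-line computation. This Python sketch encodes the pure math model; notice that, like the model itself, it says nothing about laps, oversized occupants, or any other real-world complication:

```python
seats_per_row = 36
rows = 17
empty_seats = 9

# The pure math model: total seats minus empty seats.
people_seated = seats_per_row * rows - empty_seats
print(people_seated)  # prints 603
```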
A more complex example: A person standing on the earth throws a stone; a baseball player throws a baseball; an artillery piece shoots a projectile. What pathway will the stone, baseball, or projectile follow and how far horizontally will it go? This is a relatively complex question because an answer depends on characteristics of the projectile, initial velocity, height above the ground the stone/baseball/projectile is released, initial angle, air resistance, and gravity.
This problem has been formally studied extensively for hundreds of years by physicists and mathematicians. (Hominids have been studying the problem less formally for two million plus years.) If there is no air resistance and the gravity is a constant, then the pathway is a parabola (a quadratic function). The horizontal distance traveled can be calculated by finding roots of a quadratic equation and doing some simple arithmetic.
A math model of the "real world" problem is far more complex. A computer program to solve this problem can be thought of as a computer model (a physics model, a math model) for this problem.
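A computer model for the projectile problem can be sketched with a simple numerical method. The following Python sketch uses Euler time-stepping with a drag force proportional to the square of the speed; the drag coefficient, speeds, and time step are illustrative values, not taken from the text:

```python
import math

def projectile_range(speed, angle_degrees, height=0.0, drag=0.0, dt=0.001):
    """Approximate horizontal distance traveled by a projectile.

    speed         -- initial speed in m/s
    angle_degrees -- launch angle above the horizontal
    height        -- release height above the ground, in meters
    drag          -- air-resistance coefficient (0 means no air resistance)
    dt            -- time step for the Euler method, in seconds
    """
    g = 9.81
    angle = math.radians(angle_degrees)
    x, y = 0.0, height
    vx, vy = speed * math.cos(angle), speed * math.sin(angle)
    while y >= 0.0:
        v = math.hypot(vx, vy)
        # Acceleration: gravity downward plus drag opposing the motion.
        ax = -drag * v * vx
        ay = -g - drag * v * vy
        x += vx * dt
        y += vy * dt
        vx += ax * dt
        vy += ay * dt
    return x

# With no air resistance the path is a parabola; drag shortens the range.
print(round(projectile_range(30, 45), 1))             # vacuum
print(round(projectile_range(30, 45, drag=0.01), 1))  # with air drag
```

In the vacuum case the result can be checked against the closed-form answer from the quadratic analysis; with drag there is no simple formula, which is exactly where the computer model earns its keep.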
An excellent, easy to use computer program to solve this projectile problem is available at http://csip.cornell.edu/curriculum_resources/CSIP/Bernier&Fogleman/projectile.html. The graphical representation of solutions to this problem makes it easy to see the effects of air resistance and the effects of shooting a projectile at different angles.
Each discipline has its own special vocabulary and notational system for modeling the types of problems it addresses. Learning to think in the vocabulary and notation of a discipline is a key aspect of developing a high level of expertise in the discipline.
Computer models have some of the characteristics of mental modeling as well as some of the characteristics of math modeling and the types of modeling done in other disciplines. If a problem lends itself to computer modeling, then the computer may well be able to carry out the steps (procedures, symbol manipulations) needed to solve the problem.
Thus, computational thinking, integrating human thinking with the capabilities of computers, provides a powerful new way to solve problems. The computer aspects of computational thinking require one to know the capabilities and limitations of computers and how one communicates with (interacts with) a computer system. From an educational point of view, a key aspect of studying any discipline now includes:
- Learning some of the capabilities and limitations of computers as an aid to representing and solving the problems of the discipline.
- Learning how to actually make use of these computer capabilities.
- Learning how to think about problems in the discipline both from a traditional point of view and from a point of view of the possible uses of computers to help solve the problems.
Effects in Different Disciplines
Computers have had a much larger impact in some disciplines than in others. For example, you are well aware of the use of music synthesizers and the digital representation of music. The music industry has been significantly changed by computers. Nowadays, computers and computational thinking are key parts of the overall discipline of music. However, this does not mean that older components of the discipline of music have gone away. The discipline of music has done a relatively good job of merging the old with the new.
Similarly, you are familiar with graphic arts, digital still and video cameras, and the computer editing of pictures and video. Computer graphics has greatly changed the film and graphic arts industries. As with music, the graphic arts have done a good job of merging the old with the new. Nowadays, you can easily see this in many of the videos on television and at movie theaters.
Probably you write using a word processor. Writing is a challenging mental task. For thousands of years, people learned to write using relatively simple tools. The invention of the typewriter proved useful to many people, as it increased both the legibility of their writing and the speed of getting their thoughts down on paper.
Now we have word processing and desktop publication. As you use a keyboard or voice input to your computer system, you are creating a computer model of the writing task you are attempting to accomplish. This might be done in conjunction with adding pictures taken with a digital camera and graphics developed using computer graphics software. As you write, your word processor may be doing a search for possible misspelled words and possible errors in grammar. You may be making use of a computerized dictionary or thesaurus. You may be using the Web to locate information and other materials. In addition, you probably take advantage of the editing features of a word processor, such as moving sentences and paragraphs. Finally, you may use the desktop publication capabilities of a computer system to produce a nicely laid out and formatted final document, which may or may not appear as ink on paper.
As a final example, consider math. The development of writing allowed for the development of mathematical notation and eventually the development of "paper and pencil" computational algorithms. For thousands of years, math education has included a strong emphasis on learning the words and symbols in the language of mathematics, learning to represent problems in this mathematics language, and learning to do the types of mental and paper and pencil computations needed in solving commonly occurring types of problems. Note that the terms "computation" and "algorithm" apply to arithmetic, but they also apply to algebra, calculus, statistics, and so on. Learn more about the language of math at http://iae-pedia.org/Communicating_in_the_Language_of_Mathematics.
The search for aids to this symbol manipulation (the computations based on algorithmic and heuristic procedures) has been going on for many thousands of years. The abacus was a very important early success—an aid many people still use today. Math tables, logarithms, the slide rule, and mechanical calculators were all quite important aids to arithmetic computation.
Now, we have electronic digital calculators and computers. We have quite inexpensive solar battery-powered scientific calculators that are superb improvements over earlier aids to numerical computations. We have computers that can do the types of symbolic manipulation needed in many different algebraic, calculus, and statistics computations. Wolfram Alpha, available online at http://www.wolframalpha.com, provides an excellent example of such computer capabilities.
These computer systems can graph functions and statistical data, but they can also do graphical displays of the types of models used in many other disciplines. For example, an architect can develop a computer model of a planned building. The computer can do computations to test the structural integrity, fire and storm resistance, heating and cooling requirements, and so on. A computer system can provide users with a three-dimensional simulated walk-through of a planned building before physical work begins. These are all humongous computational tasks, based on accumulated knowledge in math, physics, and other disciplines.
Wisdom and Digital Wisdom
Consider the five-point cognitive development scale that runs from data to information to knowledge to wisdom and beyond. I have named it the Arthur C. Clarke Cognitive Understanding Scale, after Arthur C. Clarke, one of my favorite authors.
Many people use the term information to stand for the data, information, knowledge part of this scale. Others use it for the data, information, knowledge, wisdom part of the scale. Thus, information as in Information Age has different meanings to different people.
The following quote is from Marc Prensky (February/March 2009):
- Digital technology, I believe, can be used to make us not just smarter but truly wiser. Digital wisdom is a twofold concept, referring both to wisdom arising from the use of digital technology to access cognitive power beyond our innate capacity and to wisdom in the prudent use of technology to enhance our capabilities. Because of technology, wisdom seekers in the future will benefit from unprecedented, instant access to ongoing worldwide discussions, all of recorded history, everything ever written, massive libraries of case studies and collected data, and highly realistic simulated experiences equivalent to years or even centuries of actual experience. How and how much they make use of these resources, how they filter through them to find what they need, and how technology aids them will certainly play an important role in determining the wisdom of their decisions and judgments. Technology alone will not replace intuition, good judgment, problem-solving abilities, and a clear moral compass. But in an unimaginably complex future, the digitally unenhanced person, however wise, will not be able to access the tools of wisdom that will be available to even the least wise digitally enhanced human.
Humans and computer systems, working and "thinking" together, are now routine approaches to problem solving in many different academic disciplines. In the examples provided above, the power of computers has substantially changed large parts of various disciplines.
Computer hardware capabilities are continuing a rapid pace of improvement. See Moore’s Law. A steadily increasing range of problems can be solved through computer modeling and making effective use of the capabilities of computers. A steadily increasing amount of the accumulated knowledge of humankind is being put into a digital form so a computer system can do part of the work of retrieving and making use of the information.
Thus, there are strong arguments for helping students to learn to think both in terms of the capabilities of their own brains and also in terms of the capabilities of computers. Computational thinking is a key component of a modern education.
The rapid pace of increase in capabilities and decrease in cost of ICT facilities has proven to be a major challenge to our formal K-12 and higher educational systems. Nowadays, every teacher, at every level and in every academic discipline, is faced by the challenges of how computers are affecting the content, instructional processes, and assessment in the subjects they teach.
References and Added Resources
Apple iPhoto (2009). Retrieved 8/30/2013 from http://www.apple.com/welcomescreen/ilife09/iphoto/play/.
- Watch a video that is essentially an ad for Apple's iPhoto. As you watch this, think about computational thinking, the nature of the formal education needed to make effective use of iPhoto, the problems it can solve for you, and so on.
- Software like iPhoto is an integral component of life today and of the future. From the developer's point of view, the idea is to integrate a relatively complete "solution" for a certain category of problems and tasks that a person might have—and to do so in a manner that requires very little time and effort to learn how to use at a personally satisfying level.
Carnegie Mellon (n.d.). Center for Computational Thinking. Retrieved 8/30/2013 from http://www.cs.cmu.edu/~CompThink/.
Columbia University (2008). Scientists open Columbia's new Computational Biology lab. Retrieved 8/30/2013 from http://www.columbia.edu/cu/news/research/compbio.html. Quoting from the article:
- In early May scientists at Columbia University gathered in room 607 of the Sherman Fairchild building on Morningside campus to celebrate the new Pe'er/Bussemaker Lab for Systems Biology—the first of its kind at the University. The goal of the lab is to develop and apply complex tools that can probe and derive meaning from mountains of data now being created in the rapidly expanding field of systems and computational biology.
- Systems and computational biology is the meeting point between modern molecular biology and new research techniques emerging from the engineering, computer science, chemistry, mathematics, statistics and physics fields. It has the potential to allow scientists to pose limitless questions about how our cells work and issues related to general human health: the study of gene networks, analysis of protein shapes, prediction of biological function and understanding how a cell processes signals.
Computer Science Unplugged (n.d.). Retrieved 8/30/2013 from http://csunplugged.com/. This material is primarily aimed at precollege students and their teachers. Quoting from the website:
- Computer Science Unplugged is a collection of activities designed to teach the fundamentals of computer science without requiring a computer. Because they're independent of any particular hardware or software, Unplugged activities can be used anywhere, and the ideas they contain will never go out of date. Unplugged activities have been trialled and refined over 15 years in classrooms and out-of-school programmes around the world.
Moursund, David (2006). Computational thinking and math maturity: Improving math education in K-8 schools. Eugene, OR: Information Age Education. Access at http://i-a-e.org/downloads/doc_download/3-computational-thinking-and-math-maturity-improving-math-education-in-k-8-schools.html.
This book addresses the problem that our K-8 school math education system is neither as successful as many people would like nor as successful as it could be. It is designed as supplementary material for use in a Math Methods course for preservice K-8 teachers. However, it can also be used by inservice K-8 teachers and by students enrolled in Math for Elementary and Middle School Teachers courses.
NSF (9/9/2008). Cyber-Enabled Discovery and Innovation (CDI). Retrieved 8/30/2013 from http://www.nsf.gov/pubs/2008/nsf08604/nsf08604.htm?govDel=USNSF_25. Quoting from the NSF program announcement website:
- Synopsis of Program:
- Cyber-Enabled Discovery and Innovation (CDI) is NSF’s bold five-year initiative to create revolutionary science and engineering research outcomes made possible by innovations and advances in computational thinking. Computational thinking is defined comprehensively to encompass computational concepts, methods, models, algorithms, and tools. Applied in challenging science and engineering research and education contexts, computational thinking promises a profound impact on the Nation’s ability to generate and apply new knowledge. Collectively, CDI research outcomes are expected to produce paradigm shifts in our understanding of a wide range of science and engineering phenomena and socio-technical innovations that create new wealth and enhance the national quality of life.
- CDI seeks ambitious, transformative, multidisciplinary research proposals within or across the following three thematic areas:
- From Data to Knowledge: enhancing human cognition and generating new knowledge from a wealth of heterogeneous digital data;
- Understanding Complexity in Natural, Built, and Social Systems: deriving fundamental insights on systems comprising multiple interacting elements; and
- Building Virtual Organizations: enhancing discovery and innovation by bringing people and resources together across institutional, geographical and cultural boundaries.
This CDI initiative is an effort to use Federal funds (via the NSF) to help invent the future of science and technology.
NSF (9/3/2008). NSF Funds New Center to Bring Together Biologists, Mathematicians. Power of mathematics and modeling to be applied to large-scale questions in biology. Retrieved 8/30/2013 from http://www.nsf.gov/news/news_summ.jsp?cntn_id=112167&govDel=USNSF_51.
NSF (1/14/2009). CISE Pathways to Revitalized Undergraduate Computing Education (CPATH). Retrieved 8/30/2013 from http://www.nsf.gov/pubs/2009/nsf09528/nsf09528.html?govDel=USNSF_25. Quoting from the RFP:
- Computing has permeated and transformed almost all aspects of modern life. As computing becomes more important in all sectors of society, so does the preparation of a globally competitive U.S. workforce able to apply core computing concepts, methods, technologies, and tools - referred to here as Computational Thinking (CT) - to a broad range of societal challenges and opportunities.
- CT capitalizes on concepts, methods, technologies, and tools fundamental to the fields of computing, i.e. computer and information science and engineering. For example, computing concepts and methods equip us to reason at multiple levels of abstraction simultaneously, to think algorithmically and apply foundational mathematical concepts to solve complex problems, and to understand the dimensions and consequences of scale. However, it is only when computing concepts and methods are combined with the power of automation afforded by contemporary computing technologies and tools that the full potential of CT is unleashed. Drawing deeply on computational concepts, methods, technologies and tools, CT serves as a powerful strategy to more effectively design, understand and solve problems associated with complex systems in many aspects of modern life.
Paul, Annie Murphy (6/17/2013). "Rules for Thinking in a Digital World." The Brilliant Blog. Retrieved 6/17/2013 from http://anniemurphypaul.com/2013/06/rules-for-thinking-in-a-digital-world/. Quoting from the blog entry:
- My own take: technology can make us smarter or stupider, and we need to develop a set of principles to guide our everyday behavior, making sure that tech is improving and not impeding our mental processes. Today I want to propose one such principle, in response to the important question: What kind of information do we need to have stored in our heads, and what kind can we leave “in the cloud,” to be accessed as necessary?
- The answer will determine what we teach our students, what we expect our employees to know, and how we manage our own mental resources.
Pfeiffer, John (1962). The Thinking Machine. Columbia Broadcasting System. Retrieved 8/30/2013 from http://archive.org/details/thinkingmachine00pfei.
Pinker, Steven (2008). The computational theory of mind (11-minute video). Retrieved 8/30/2013 from http://www.youtube.com/watch?v=LVrb5ClvDho.
Prensky, Marc (February/March 2009). H. Sapiens Digital: From Digital Immigrants and Digital Natives to Digital Wisdom. Innovate: Journal of Online Education. Retrieved 8/30/2013 from http://www.innovateonline.info/index.php?view=article&id=705&action=article.
Wing, Jeannette M. (March 2006). Computational thinking. Communications of the Association for Computing Machinery. Retrieved 8/30/2013 from http://www.cs.cmu.edu/afs/cs/usr/wing/www/publications/Wing06.pdf.
Links to Other IAE Resources
This is a collection of IAE publications related to the IAE document you are currently reading. It is not updated very often, so important recent IAE documents may be missing from the list.
This component of IAE-pedia documents is a work in progress. If there are few entries in the subsections that follow, that is because the links have not yet been added.
IAE-pedia (IAE's Wiki)
Popular IAE Wiki Pages. Readers who enjoy this document may enjoy:
I-A-E Books and Miscellaneous Other
This page was initially developed by David Moursund and Dick Ricketts.