Two Brains Are Better Than One


“Computers are incredibly fast, accurate, and stupid. Human beings are incredibly slow, inaccurate, and brilliant. Together they are powerful beyond imagination.” (This quote is often mistakenly attributed to Albert Einstein; most likely the correct attribution is Leo Cherne at the Discover America Meeting, Brussels, June 27, 1968.)


Introduction

In the early days of electronic digital computers, such machines were often referred to as "brains" or "electronic brains." A much more accurate description for such early computers is "automated calculating machines." These early computers were designed to rapidly and accurately carry out a specified sequence of arithmetic calculations. One such computer could do the work of more than a hundred people equipped with the best calculators of that time.

Since mass production of computers first began in the very early 1950s, they have become about 10 billion times as cost effective as they were initially. Large numbers of computer programs have been written that solve a wide range of math and non-math problems. Artificial Intelligence (Machine Intelligence) has become a productive component of the field of Computer and Information Science.

Moreover, electronic digital computers have become commonplace. They are now everyday objects of use by hundreds of millions of people. For example, a cell telephone contains a more powerful computer than the typical desktop or laptop microcomputer produced ten years ago. Worldwide production of such cell telephones is now close to a billion per year. Many of these cell telephones now include a Web browser, email, audio and video players, a digital camera—as well as a telephone.

This proliferation and steadily increasing capability of computers makes it important for people to understand the capabilities of human brains (human intelligence) in comparison with computer "brains" (computer intelligence; artificial intelligence; machine intelligence).

Three Brains are Better than Two

On August 4, 2009 I (David Moursund) made a presentation at the Oregon Math Leaders Conference on the Two Brains are Better than One topic. While putting that talk together, it occurred to me that the talk should be on "Three Brains are Better than One or Two Brains."

The talk I gave focused on the role of three types of brains in representing and solving math problems:

  1. Human brain (a "meat" brain).
  2. Paper & pencil (reading and writing) brain. The external storage medium is static.
  3. Information and Communication Technology (ICT) brain. The external storage can be static or dynamic. It can do things on its own, and it can interact with the human brain.

It was a huge leap forward (a major paradigm shift) when people developed reading and writing as an aid to their meat brain. We are now involved in another huge step forward (a major paradigm shift) as we develop and learn to make effective use of an ICT brain.

Our educational system is certainly making some progress in helping students to make effective use of the ICT brain and to integrate use of all three types of brains. However, progress in thoroughly integrating use of the ICT brain with use of one's meat brain and one's paper & pencil brain has been modest. Meanwhile, the ICT brain's capabilities have continued to grow very rapidly.

Background Reading

You use your brain and aids to your brain to help solve the problems and accomplish the tasks that interest you. This article on Two Brains Are Better Than One assumes you have some insights into problem solving, expertise, and computational thinking.

Problem Solving

As background reading, you might want to learn more about problem solving and roles of computers in problem solving. Problem solving includes:

  • Question situations: recognizing, posing, clarifying, and answering questions.
  • Problem situations: recognizing, posing, clarifying, and then solving problems.
  • Task situations: recognizing, posing, clarifying, and accomplishing tasks.
  • Decision situations: recognizing, posing, clarifying, and making good decisions.
  • Using higher-order critical, creative, wise, and foresightful thinking to do all of the above. Often the results are shared, demonstrated, or used as a product, performance, or presentation.

A key to getting better at problem solving is reflection (reflective thinking) during the process of solving problems and after one has solved a problem. View every problem-solving situation as an opportunity to learn to become better at problem solving.

Expertise

Expertise is another important background topic. Through informal and formal education, you want to increase your level of expertise in a variety of different areas. Expertise is demonstrated by solving problems (referring to the general definition given above). In problem solving, humans use aids to their physical capabilities and aids to their mental capabilities. Computers now play important roles in both types of aids.

Computational Thinking

Computational Thinking is the type of thinking a person does when making use of human intelligence and computer intelligence to solve problems and accomplish tasks.

Computational thinking is a way of solving problems, designing systems, and understanding human behavior that draws on concepts fundamental to computer science. Computational thinking is thinking in terms of abstractions, invariably multiple layers of abstraction at once. Computational thinking is about the automation of these abstractions. The automaton could be an algorithm, a Turing machine, a tangible device, a software system—or the human brain. (Carnegie Mellon, n.d.) [Bold added to emphasize use of one's brain.]

Computer and Human Intelligence

An electronic digital computer is a machine designed for the input, storage, processing, and retrieval of data, information, and anything else that can be digitized. A computer can rapidly and accurately follow a step-by-step set of instructions that has been coded in an appropriate format and stored in its memory. Such a set of instructions is called a computer program or a computer procedure.

Computer programmers design and develop computer procedures to solve problems and accomplish tasks. Computer capabilities are improved through the development of faster and more reliable computers with increased storage capability, and through the development of computer programs that are better at solving old problems and/or designed to solve new problems.

A human brain can accept input from a person's senses. It can store, process, and output information. A person is born with a number of built-in procedures that the brain knows how to carry out. For example, such procedures keep your heart beating and your lungs breathing at appropriate rates. Your brain can learn new procedures, such as procedures for directing your body in walking, talking, and carrying out reading and writing activities.

In summary, both a computer brain and a human brain:

  • Can input, store, process, and output information.
  • Have certain "wired in" procedures.
  • Can learn new procedures.

However, there are major differences between a human brain (wetware) and a computer brain (hardware). Two of these differences are understanding and speed.

For example, a computer system can memorize (rote memory, with no understanding) a book in a few seconds. During this time a typical human can read a sentence—and understand what the sentence means.

You are aware that voice input systems to computers are getting better. A person talks to a computer. The computer receives this input and translates it into words that can be displayed on a computer output screen. Modern versions of such voice-to-text systems get better through their human users correcting the errors the computer system makes.

For example, suppose I am using such a voice-to-text system to enter text into a word processor or an email system. I say a sentence or several sentences, and the words are displayed on my screen. I see an error that the computer system has made, and I enter a correct word in place of an incorrect word. The computer system "learns" from this error correction process. This is somewhat akin to the way a child learns to speak correctly, through receiving relatively immediate feedback on incorrect pronunciation.

Here is another important aspect of learning by humans versus learning by computers. If you own and routinely use a computer, you will notice that from time to time the computer will tell you that it needs to install an update to a piece of its software. You indicate that you agree to this, and the update process occurs automatically. Compare this with "updating" some data or procedures in your brain!

Also, compare this with the situation of students doing much of their school work using printed (hard copy) books that are on a six or seven year replacement cycle. Some of the information in the books is terribly out of date.

Human and Computer Memory and Processing

A typical human brain contains about 100 billion neurons and a still larger number of other cells. In very rough terms, think of a brain as a three pound collection of about a trillion cells that has the consistency of soft butter. A human brain is constantly changing as it retrieves, stores, processes, and outputs information.

A computer's "brain" consists of a combination of processing units and storage (memory) units. A processing unit can carry out various instructions, such as to add a pair of numbers, compare two numbers to see which is larger, retrieve data from a memory space, and store data in a memory space. (Note that an inexpensive solar powered handheld calculator can do all of these things. However, its brain is rather feeble as compared to that of a modern microcomputer.)

It is now possible to manufacture a chip that contains both one or more processing units and a substantial amount of storage. If still more processing power and storage are needed, many such units can be interconnected, and more memory units can be added. Supercomputers are now being built that contain many tens of thousands of processing units and trillions of bytes (characters) of memory.

Computer Memory and Processors

Computer memory is much simpler and easier to understand than human brain memory. Computer memory stores binary digits (0s and 1s). Often a computer memory is divided into chunks that are eight bits in length. Each chunk can store an alphabetic letter, a numeric digit, or a punctuation mark. Such an 8-bit chunk is called a byte. The word basketball is ten characters in length. It takes ten bytes of computer storage (80 bits) to store this word in a computer memory unit.
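If you have access to a programming language, this byte arithmetic is easy to check for yourself. Here is a minimal sketch, written in Python purely for illustration (the particular language does not matter), that encodes the word basketball and counts its characters, bytes, and bits.

  # Count the characters, bytes, and bits needed to store one word.
  word = "basketball"
  encoded = word.encode("ascii")    # each character becomes one 8-bit byte
  print(len(word))                  # 10 characters
  print(len(encoded))               # 10 bytes
  print(len(encoded) * 8)           # 80 bits
  print(format(encoded[0], "08b"))  # the letter 'b' stored as the bit pattern 01100010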

A medium-length novel is about a million bytes in length. The hard drive on a relatively modern microcomputer can store several hundred thousand books.

Binary coded data can be stored and retrieved with very little chance of error. For example, consider a long book containing about 200 thousand words of text. That much text occupies roughly 10 million bits, or a bit more than a million bytes.

For another example, think of a very high quality color picture taken with a high quality digital camera. This picture consists of millions of pixels (picture elements; think of a pixel as a colored dot). A very high quality digitized representation of one picture might take as much storage space in a computer memory as a couple of dozen long books of text. In terms of computer storage, the statement "A picture is worth a thousand words" is incorrect. It is more accurate to say that a picture is worth a million words.
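The arithmetic behind this comparison is easy to sketch. In the rough calculation below, the particular numbers (200 thousand words per book, six characters per word, a 12-megapixel camera, three bytes per pixel) are illustrative assumptions rather than fixed facts, but they show why an uncompressed picture dwarfs a book of plain text.

  # Back-of-the-envelope storage comparison: one long book versus one photo.
  # All of the specific numbers here are illustrative assumptions.
  words_per_book = 200_000
  chars_per_word = 6                     # average word length plus a space
  book_bytes = words_per_book * chars_per_word                # about 1.2 million bytes

  megapixels = 12
  bytes_per_pixel = 3                    # one byte each for red, green, and blue
  picture_bytes = megapixels * 1_000_000 * bytes_per_pixel    # about 36 million bytes

  print(book_bytes, picture_bytes)
  print("One uncompressed picture is roughly", picture_bytes // book_bytes, "books of text.")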

The memory of a computer's brain is easily expanded. One can add various versions of "fast" internal memory, disk storage, flash memory, holographic memory, laser disk memory, tape memory, and so on. All such memory storage units are subject to failure. Thus, it is common to maintain a backup copy of the contents of a computer's memory.

A processing unit (often called a central processing unit, or a CPU) in a computer brain is designed to be able to follow a step by step set of instructions that manipulate bits. A typical CPU may be designed to carry out a hundred or more different instructions. That is not very impressive, until one thinks about the fact that even an inexpensive microcomputer can carry out a billion or more of these instructions in one second!

You may find it helpful to think in terms of a handheld calculator that can add, subtract, multiply, and divide. The CPU in this calculator can carry out the instructions add, subtract, multiply, and divide. However, it can also do other things. For example, it can accept input from a keyboard, and it can display an answer in the display unit.

An inexpensive calculator often contains a square root key. This means that a set of instructions for calculating square roots has been built into the calculator—it is stored in the calculator's permanent memory. The CPU of the calculator automatically follows this set of instructions when the square root key is pressed.
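To make the idea of a stored procedure concrete, here is a minimal sketch of the kind of instructions a square root key might trigger. Real calculators may use different internal methods; this sketch uses Newton's (Heron's) iteration, written in Python only to illustrate "a set of instructions built into permanent memory."

  # A sketch of a stored square root procedure (Newton's/Heron's iteration).
  # Actual calculator firmware may use a different method; this is illustrative.
  def square_root(x, tolerance=1e-12):
      if x < 0:
          raise ValueError("square root of a negative number is not real")
      if x == 0:
          return 0.0
      guess = x if x >= 1 else 1.0          # any positive starting guess works
      while abs(guess * guess - x) > tolerance * x:
          guess = (guess + x / guess) / 2   # average the guess with x / guess
      return guess

  print(square_root(2))    # about 1.4142135...
  print(square_root(81))   # about 9.0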

In summary, in an electronic digital computer, processing is carried out by processing units, and storage is carried out by storage units. The capabilities of a computer's brain are improved by building better (faster, more reliable) processing units and by building better (larger, more reliable, faster) memory units. Today's computers are about 10 billion times as cost effective as computers manufactured in the early 1950s. We will likely see improvements by another factor of well over a hundred during the next decade.

And, of course, today's software is much better than what was available in the early days of computers. Not only are there many thousands of computer programmers adding to the general collection of computer programs, there are also many people working in the specific area of artificial intelligence (machine intelligence). Computers are steadily becoming more capable. During this time, the innate capabilities of human brains will not change.

Human Memory and Processing

As indicated earlier, a human brain consists of about a trillion cells, with about 100 billion of these being neurons. Neurons are both storage and processing units. Thus, in some sense a human brain is somewhat like a computer that contains 100 billion CPUs.

Neurons communicate with each other through their axons and dendrites. A typical neuron may form about 5,000 to 10,000 connections with other neurons, so it can directly send a signal to thousands of different neurons.

Each neuron aids in both storage of information and processing of information. The storage and processing depend on the "strength" of the connections among the various neurons involved in storage of some particular data and processing of this data.

A collection of neurons stores a piece of data or information. Thus, it is possible that a neuron might be damaged or die without any loss of data that is stored in the brain. This is a type of built-in redundancy.

The learning process consists of strengthening and weakening of electrochemical connections. Quoting from http://faculty.washington.edu/chudler/ap.html:

Neurons send messages electrochemically. This means that chemicals cause an electrical signal. Chemicals in the body are "electrically-charged" — when they have an electrical charge, they are called ions. The important ions in the nervous system are sodium and potassium (both have 1 positive charge, +), calcium (has 2 positive charges, ++) and chloride (has a negative charge, -). There are also some negatively charged protein molecules. It is also important to remember that nerve cells are surrounded by a membrane that allows some ions to pass through and blocks the passage of other ions. This type of membrane is called semi-permeable.

Rather than continue this technical discussion, perhaps it suffices to say that a human brain is not digital either in terms of storage (memory) or in terms of processing. A human brain is able to do tasks relatively rapidly by simultaneously making use of millions of different neurons. Each neuron is very slow relative to a computer's CPU, but this parallel processing allows a lot of processing to be done in a short period of time.

Nowadays, people build faster computers both by building faster individual CPUs, and also by building computers that contain lots of CPUs. Today's supercomputers contain tens of thousands of CPUs, all operating at the same time doing parallel processing.

Human Intelligence

Research on the general human intelligence factor (g) has led to a nature-and-nurture theory that divides human intelligence into fluid intelligence (the nature component) and crystallized intelligence (the nurture component). McArdle et al. (2002) describes these concepts this way:

The theory of fluid and crystallized intelligence … proposes that primary abilities are structured into two principal dimensions, namely, fluid (Gf) and crystallized (Gc) intelligence. The first common factor, Gf, represents a measurable outcome of the influence of biological factors on intellectual development (i.e., heredity, injury to the central nervous system), whereas the second common factor, Gc, is considered the main manifestation of influence from education, experience, and acculturation. Gf-Gc theory disputes the notion of a unitary structure, or general intelligence.

The human brain grows considerably during a person’s childhood, with full maturity being reached in the mid to late 20s for most people. Both Gf and Gc increase during this time. While a person’s level of fluid intelligence tends to peak in the mid to late 20s, growth in crystallized intelligence may continue well into the 50s.

Since the rate of decline in fluid intelligence over the years tends to be relatively slow, a person’s total cognitive capabilities can remain high over a long lifetime. Current research strongly supports the idea of “use it or lose it” for the brain or mind, as well as the rest of one’s body (McArdle et al., 2002). Elkhonon Goldberg has written extensively in this area.

A brain has great plasticity. This means that it has great ability to change. Learning is a process that changes a person's brain. Quoting from the Wikipedia:

Neuroplasticity (variously referred to as brain plasticity or cortical plasticity or cortical re-mapping) refers to the changes that occur in the organization of the brain as a result of experience. A surprising consequence of neuroplasticity is that the brain activity associated with a given function can move to a different location as a consequence of normal experience or brain damage/recovery.
Decades of research have now shown that substantial changes occur in the lowest neocortical processing areas, and that these changes can profoundly alter the pattern of neuronal activation in response to experience. According to the theory of neuroplasticity, thinking, learning, and acting actually change the brain's functional anatomy from top to bottom, if not also its physical anatomy. A proper reconciliation of critical period studies, which demonstrate some functional and anatomical aspects of the neocortex are largely immutable after development, with the new findings on neuroplasticity, which demonstrate some functional aspects are highly mutable, are an active area of current research.

A Brain Comparison Example

Suppose that you are walking down the street and you decide that you need to know what time it is. You first seek a response from your internal clock and related senses. It might provide you with an estimate of the time of day to the nearest hour or half hour. It might tell you that it is about time to eat. It might tell you that it is time to get off your feet and rest your tired legs. It might tell you that the sun feels particularly hot and it is time to put on some more sunscreen or seek some shade.

After some additional thought, your brain might tell you that it is Friday and that it is only a short walk to one of your favorite eateries that serves really excellent clam chowder on Fridays.

As an alternative, you might glance at your electronic digital watch. Its output display shows numbers and letters that represent the time of the day, the day of the week, the day of the month, and so on. The time of day data may appear to be accurate to the nearest second, but probably your watch is off by quite a few seconds.

You can learn a lot about human brain versus computer brain by reflecting on this simple example. Probably you notice that your internal timekeeping system is quite human oriented. It is not designed to give very precise numerical results. If your internal clock says "it is probably time to eat," that is a lot different than your digital watch displaying 12:15 pm.

Moreover, even though computers are supposed to be very fast and very accurate, a digital watch is more than just a computer. It is a time-counting device that starts with some given initial data and then counts off (measures) time increments. If the initial data is wrong, then the computed answers will likely be wrong. Also, the accuracy in measuring time increments tends to vary with the quality of the watch. And, of course, the battery might die or the circuitry might "break."

It is possible that you are into high tech gadgets, and that your watch is an "atomic" watch. Such a watch contains a radio receiver that receives signals from an atomic clock that is very accurate. You might have a Global Positioning System (GPS) watch or handheld device. It can tell the time and where you are located on the surface of the earth.

One of the purposes of this example is to get you to think about your human brain versus easily portable computer brains. An easily portable piece of Computer and Information Technology can solve some problems and accomplish some tasks quite quickly and accurately.

Real Time & Non-Real Time Problems

You face many problems throughout the day that require "real time" (that is, relatively rapid) problem solving. As an example, consider carrying on a conversation with a friend. Your friend says something, and waits for you to respond. You need to make a quick decision about what to say. You think of an idea, and your brain forms a sentence and directs your speech system to say the sentence. Your friend says a sentence, and your brain quickly processes the sound waves coming into your ears. Likely it also incorporates visual information, such as gestures and facial expressions.

Here is another example. You are standing at the side of a moderately busy street, and you want to get to the other side. You watch the traffic, looking for an opening. At each instant, you are making a "go" or "no go" decision.

One goal in informal and formal education is to help a student gain the knowledge and skills to make rapid, good quality decisions in situations that require real time problem solving and decision making.

However, throughout the day, you encounter many problem-solving situations that do not require immediate action. In these settings, you have the time to consider a number of alternatives and the possible consequences of these alternatives. You have the time to seek more information. For example, you might telephone a friend or do a Web search.

Think about the real time and non-real time situations. Some Information and Communication Technology tools (such as a digital watch or a GPS) are easily portable and very fast. They can solve some challenging problems in real time. However, to use these aids, you need to make use of the education that helps you understand the meaning of the output from a watch or from a GPS.

The lens implants I had for cataracts provide a different type of example. I can see better than before I got the implants, and they function in real time. I already knew how to see and process visual information before the cataract surgery, and there was complete and automatic transfer of this learning to seeing with the implants.

These few examples provide a lot of food for thought. The Web is the world's largest library, and it is growing quite rapidly. You cannot use the Web without being able to read and understand what you are reading. This is merely one more argument to support a strong emphasis on children learning to read with understanding. It suggests that there needs to be more emphasis on reading to retrieve information to help solve non-real time problems.

I can keyboard faster than I can write by hand. The results are much more legible than my handwriting, and the results have other advantages in editing, saving and sharing the results, and so on. I rely heavily on a large list of words I have memorized how to spell. However, I also rely heavily on my spelling checker along with my word processor's electronic dictionary and electronic thesaurus. (Sometimes I use a voice input system. As such systems get better and better, they add a new dimension to the handwriting versus keyboarding argument.)

Some Problems Faced by Our Brains

The totality of collected data, information, knowledge, and wisdom is growing quite rapidly. This problem situation is often referred to as information overload.

A somewhat different way of thinking about this situation is that most of us face the problem situation of problem overload.

"We are drowning in information but starved for knowledge." (John Naisbitt; American author, speaker; born in 1929.)

People organize and save information as knowledge that will help themselves and others deal with the myriad of problem situations that they encounter on a daily basis. They learn a variety of ways to cope with the huge number of problem situations they routinely encounter.

Humans have a very long history of developing aids (tools) to help them solve the problems they routinely encounter. Such tools can be thought of as embodiments of some of the data, information, knowledge, and wisdom of the tool inventor and designer. Thus, when you learn to make use of such tools, you are building on the previous work of other people. You are learning to make use of tools that aid your physical body and your mind.

As humans continually develop more and better tools, our educational system is faced by a continuing challenge of what to teach. Our current educational system is struggling with the question: What should students be learning about problems that computers can solve or make major contributions to solving?

This is a very hard question. Moreover, possible answers change as computer systems become more and more capable.

Word Processing as a Simple Example

Think about the overall problem of students learning to write and to communicate in writing. This problem has existed since the first formal schools to teach reading and writing were developed more than 5,000 years ago.

There have been many changes in how this problem is addressed in schools. For example, students no longer learn how to use a stylus to write on a clay tablet that they have molded and will later bake in an oven or dry in the sun. Students no longer learn how to select a quill and cut it to make a quill pen. Students now have pencils (with erasers), ballpoint pens, and paper.

Also, many students now have word processors that solve the "legibility" problem, and that also contain spelling and (possibly) grammar checkers. These new aids to writing facilitate writing in a non-linear manner (jumping around to different parts of the document one is writing), writing material designed to be read in an interactive (non-linear) fashion, and writing that includes pictures, graphics, charts, tables, and so on. Now, students are also faced with the writing challenge of desktop publishing.

Our schools have long stressed the importance of students learning cursive writing. We are now seeing a major change in this area. Why spend school time teaching an art form? The art form of calligraphy is still alive and well, but it is not mainstream in the curriculum. Cursive handwriting appears to be headed down the same path.

A word processor is a powerful aid to the "revise, revise, revise" aspect of high quality writing. So far, computers have not been readily enough available so that all students learn to compose at a computer and integrate this revision approach into their standard writing activities. However, this seems to be the direction we are headed.

A great many people now routinely use a spelling checker as they write. At the current time, it seems clear that it is still quite important to be able to spell the words that one routinely uses when writing. Currently, however, students are being expected to memorize the spelling of many words that they do not routinely use in writing. For many students, such rote memorization contributes little to their overall education.

Moreover, voice input to computers is steadily improving. What do we want to teach about spelling when all students have routine access to relatively good quality voice input systems as they write? For example, might we need to place still greater emphasis on words that sound the same but are spelled differently and have different meanings? Deciding among such words requires understanding.

Math as an Example

Handheld, solar powered calculators have long been relatively inexpensive and rugged. Thus, our math education system has had many years to consider the issue of what math-related things a student should learn to do mentally, should learn to do with the aid of paper and pencil, and should learn to do using a calculator. In recent years, the decision process has become more difficult because of graphing, equation solving, and programmable calculators and the rapid proliferation of microcomputers.

Adults have accepted the use of calculators for their own calculation needs. However, many adults and many teachers have resisted this approach for students. Thus, we have some testing situations in which students are allowed to use calculators, and others in which they are not. We have some elementary school teachers who integrate use of calculators into the curriculum, and others who strongly resist this approach.

The calculator issue is merely the tip of the iceberg. Computers can now solve a wide range of problems that students are learning to solve mentally and/or with the aid of pencil and paper. Each year, such computer capabilities improve significantly.

The essence of this problem situation is the question of how to best help students gain arithmetic literacy (numeracy) of a sort that will serve them well in their futures. There tends to be considerable agreement that it is very helpful for a student to educate his or her human brain so that it can quickly and accurately solve one-digit computational problems and do other calculations that can be done mentally and quickly. There is considerable agreement that students need to have understanding of numbers, the number line, making use of math to represent problems, and making use of math to solve problems.

Much of this math education can be accomplished by the end of the fourth or fifth grade. Starting in about the fourth grade, the math curriculum begins to have an increasing amount of content that students find little use for in their everyday lives. They are told "you will need it someday" or "you will need it in the next math unit."

It is approximately at these same grade levels (grade 4 or 5) that many students begin to lose their interest in math and perhaps begin to develop a dislike of math. For many students, their success in math education gradually decreases. Moreover, what they cover in math courses is soon forgotten. A majority of adults in our country function at about the sixth grade level in their use of and understanding of math.

What inexpensive calculators can do in grade school, more expensive calculators and computers can do in secondary school and in the first two years of college math. That is, computer brains can carry out the procedures to solve the "pure math" problems that students are learning to solve in school. They can solve equations, draw graphs of data, do statistical computations, and so on.

The underlying issue here is that of understanding capabilities, limitations, and uses of math versus memorizing math procedures and developing speed and accuracy at carrying them out mentally or using pencil and paper.

Here is an example to illustrate this situation. What do you know about square root? In the "good old days," in algebra courses students were taught a paper and pencil algorithm for computing square root. Now, this paper and pencil calculation topic has disappeared from the math curriculum. If one needs to calculate a square root, he or she uses a calculator that has a square root key. This calculator capability provides a clear separation between understanding meaning and use of square root versus knowing how to calculate square root.

Two Brains in Each Academic Discipline

Nowadays, every academic discipline is being impacted by the capabilities of computers. The impact is certainly stronger in some disciplines than in others. Thus, in each discipline, students need to be learning how to make use of their own brains and computer brains to represent and solve the problems of the discipline.

This learning does not automatically occur merely because younger students are good at playing computer games, using cell telephones, doing instant messaging, using music storage and playback devices, and so on. There is a large difference between being able to "Google" something on the Web and being able to retrieve, understand, and use information in a manner that contributes to one's level of expertise in a discipline.

In some disciplines, the value of a computer brain has become so great that computational thinking has become an integral component of the discipline. If you doubt this, just do a Web search on terms such as "computational XYZ", where the XYZ is replaced by words such as biology, chemistry, economics, or math. Many disciplines use other vocabulary. Thus, "computational music" returns few hits, while "digital music" returns many millions of hits.

The growing importance of computational thinking and computer intelligence presents an interesting challenge to our educational testing system. One component of assessment needs to focus on how well students can solve problems and accomplish tasks in a computer hands-on environment. This component also needs to include a focus on understanding what problems within the discipline being assessed can be solved by computers, and what problems might best be solved by human and computer brains working together.

More Than Two Brains

Many people work and play in team environments. A team of people may get together to brainstorm on a problem. Here, the expectation is that more than one human brain is better than just one human brain.

Nowadays, it is common for such a brainstorming group of people to be facilitated by computers and by telecommunication equipment. Moreover, each person in the group may well be making use of versions of computer brains that fit their specific needs.

More generally, a team of people working on a problem nowadays often consists of a number of people and a number of computer brains. The results of their thinking may well be implemented by a number of other people and a number of other computer brains. Some of these other computer brains are parts of automated data gathering and production equipment.

Thus, we need an education system that helps prepare students to work in team environments that involve a number of human brains and a number of computer brains. This is a key idea in project-based learning.

Summary and Conclusions

In everyday life, a person encounters many problem situations that require relatively rapid (real time) decision-making and action-taking.

In some of these situations, a computerized tool (computer brain) can provide more accurate and more rapid solution-finding and appropriate action-taking than a human brain. We see examples of this in the antilock brakes, airbag deployment, and other safety features being built into modern cars.

There are a substantial number of routine real time problem situations where there is great need for a person to have a healthy and appropriately educated brain. It is quite helpful to be able to remember some arithmetic facts and how to spell the words one routinely uses in writing. It is very helpful to be able to recognize the face and the voice of a friend.

In addition, there are many situations in which either a healthy human brain or a computer brain can quickly solve the problem or accomplish the task. The number of such situations is growing as computer brains become more capable. Such situations present the possibility of spending less time in school on rote memorization activities that involve little or no understanding, and more time on understanding and higher-order thinking activities.

In everyday life, a person encounters many other problems that do not require real time decision making and action taking. In these situations, there is time to reflect and time to draw upon sources of information such as other people, the Web, books, and so on.

In many of these non-real time problem solving situations, computer systems are able to solve or greatly help in solving the problem that is under consideration.

So, in summary, we need an education system that prepares students to work well in a two-brain environment. The nature of the education needed varies with the capabilities and interests of the student and his or her human brain. It also varies with the current cost and effectiveness of computer brains, and how these are changing over time.

References

Carnegie Mellon (n.d.). Center for Computational Thinking. Retrieved 3/16/08: http://www.cs.cmu.edu/~CompThink/.

Devitt, James (4/14/08). New Book Concludes That The Human Mind Is The Product Of A Chaotic Evolutionary Path. Medical News Today. Retrieved 4/15/08: http://www.medicalnewstoday.com/articles/103826.php. Quoting from the article:

The human mind, far from being a highly efficient computer, is in fact the product of a bumpy evolutionary path, serving as a marvelous storage facility but operating as a shaky retrieval system, concludes New York University's Gary Marcus in his new book Kluge: The Haphazard Construction of the Human Mind (Houghton Mifflin).
"Kluge," a term popularized by computer pioneer Jackson Granholm, is "an ill-assorted collection of poorly matching parts, forming a distressing whole."
The fundamental difference between computers and the human mind is in the basic organization of memory, Marcus observes. While computers organize everything they store according to physical or logical locations, the human brain stores millions of memories, but has no idea where they are located - information is retrieved not by knowing where it is, but by using cues or clues that hint at what we are looking for.

McArdle, John et al. (2002). Comparative longitudinal structural analyses of the growth and decline of multiple intellectual abilities over the life span. Developmental Psychology Vol. 38, No. 1, 115–142. Retrieved 4/21/08: http://psychology.ucdavis.edu/labs/ferrer/pubs/dp_2002.pdf.

Moursund, David (2007). A College Student's Guide to Computers in Education. Access at http://iae-pedia.org/College_Student’s_Guide_to_Computers_in_Education.

Prensky, Marc (February/March 2009). H. Sapiens Digital: From Digital Immigrants and Digital Natives to Digital Wisdom. Innovate: Journal of Online Education. Retrieved 6/1/09: http://www.innovateonline.info/index.php?view=article&id=705&action=article. (Free, but requires registration.)

Quoting from the article:

In 2001, I published "Digital Natives, Digital Immigrants," a two-part article that explained these terms as a way of understanding the deep differences between the young people of today and many of their elders (Prensky 2001a, 2001b). Although many have found the terms useful, as we move further into the 21st century when all will have grown up in the era of digital technology, the distinction between digital natives and digital immigrants will become less relevant. Clearly, as we work to create and improve the future, we need to imagine a new set of distinctions. I suggest we think in terms of digital wisdom.
Digital technology, I believe, can be used to make us not just smarter but truly wiser. Digital wisdom is a twofold concept, referring both to wisdom arising from the use of digital technology to access cognitive power beyond our innate capacity and to wisdom in the prudent use of technology to enhance our capabilities. Because of technology, wisdom seekers in the future will benefit from unprecedented, instant access to ongoing worldwide discussions, all of recorded history, everything ever written, massive libraries of case studies and collected data, and highly realistic simulated experiences equivalent to years or even centuries of actual experience. How and how much they make use of these resources, how they filter through them to find what they need, and how technology aids them will certainly play an important role in determining the wisdom of their decisions and judgments. Technology alone will not replace intuition, good judgment, problem-solving abilities, and a clear moral compass. But in an unimaginably complex future, the digitally unenhanced person, however wise, will not be able to access the tools of wisdom that will be available to even the least wise digitally enhanced human.
Moreover, given that the brain is now generally understood to be highly plastic, continually adapting to the input it receives, it is possible that the brains of those who interact with technology frequently will be restructured by that interaction. The brains of wisdom seekers of the future will be fundamentally different, in organization and in structure, than our brains are today. Future wisdom seekers will be able to achieve today's level of wisdom without the cognitive enhancements offered by increasingly sophisticated digital technology, but that wisdom will not be sufficient, either in quality or in nature, to navigate a complex, technologically advanced world.

Reference.com (n.d.). Human Brain. AnAsk.com service. Retrieved 9/21/09: http://www.reference.com/browse/wiki/Human_brain. Quoting from the document:

Comparison of the brain and a computer
Much interest has been focused on comparing the brain with computers. A variety of obvious analogies exist: for example, individual neurons can be compared with a transistor (although a neuron's computing power is probably closer to a simple calculator than a transistor), and the specialized parts of the brain can be compared with graphics cards and other system components. However, such comparisons are fraught with difficulties. Perhaps the most fundamental difference between brains and computers is that today's computers operate by performing often sequential instructions from an input program, while no clear analogy of a program appears in human brains. The closest to the equivalent would be the idea of a logical process, but the nature and existence of such entities are subjects of philosophical debate. Given Turing's model of computation, the Turing machine, this may be a functional, not fundamental, distinction. However, Maass and Markram have recently argued that "in contrast to Turing machines, generic computations by neural circuits are not digital, and are not carried out on static inputs, but rather on functions of time" (the Turing machine computes computable functions). Ultimately, computers were not designed to be models of the brain, though constructs like neural networks attempt to abstract the behavior of the brain in a way that can be simulated computationally.
In addition to the technical differences, other key differences exist. The brain is massively parallel and interwoven, whereas programming of this kind is extremely difficult for computer software writers (most parallel systems run semi-independently, for example each working on a small separate 'chunk' of a problem). The human brain is also mediated by chemicals and analog processes, many of which are only understood at a basic level and others of which may not yet have been discovered, so that a full description is not yet available in science. Finally, and perhaps most significantly, the human brain appears hard-wired with certain abilities, such as the ability to learn language (cf. Broca's area), to interact with experience and unchosen emotions, and usually develops within a culture. This is different from a computer in that a computer needs software to perform many of its functions beyond its basic computational capabilities. The human brain is able to interpret and solve complex problems that are not formalized using its powers of pattern recognition and interpretation (strong AI), whereas the computer with current software and current hardware is only able to solve formalized problems (weak AI) due to more limited pattern recognition capability. A human can understand context in an arbitrary text, something even the most powerful and best software is not able to discern (as of 2008). A simple example is to attempt write a program that can determine if an arbitrary text contains humor or not.
There have been numerous attempts to quantify differences in capability between the human brain and computers. According to Hans Moravec, by extrapolating from known capabilities of the retina to process image inputs, a brain has a processing capacity of 0.1 quadrillion instructions per second (100 million MIPS). In comparison, the fastest supercomputer in the world, called Roadrunner and devised and built by engineers and scientists at I.B.M. and Los Alamos National Laboratory, is capable of handling 1.026 quadrillion calculations per second, and an average 4-function calculator is capable of handling 10 instructions per second. It is possible the brain may be surpassed by normal personal computers (in terms of Instructions Per Second, at least) by 2030.
The computational power of the human brain is difficult to ascertain, as the human brain is not easily paralleled to the binary number processing of today's computers. For instance, multiplying two large numbers can be accomplished in a fraction of a second with a typical calculator or desktop computer, while the average human may require a pen-and-paper approach to keep track of each stage of the calculation over a period of five or more seconds. Yet, while the human brain is calculating a math problem in an attentive state, it is subconsciously processing data from millions of nerve cells that handle the visual input of the paper and surrounding area, the aural input from both ears, and the sensory input of millions of cells throughout the body. The brain is regulating the heartbeat, monitoring oxygen levels, hunger and thirst requirements, breathing patterns and hundreds of other essential factors throughout the body. It is simultaneously comparing data from the eyes and the sensory cells in the arms and hands to keep track of the position of the pen and paper as the calculation is being performed. It quickly traverses a vast, interconnected network of cells for relevant information on how to solve the problem it is presented, what symbols to write and what their functions are, as it graphs their shape and communicates to the hand how to make accurate and controlled strokes to draw recognizable shapes and numbers onto a page.

Yong, Ed (4/2/09). Enter Adam, the robot scientist. Retrieved 12/28/09 from http://scienceblogs.com/notrocketscience/2009/04/enter_adam_the_robot_scientist.php. Quoting from the Website:

In a laboratory at Aberystwyth University, Wales, a scientist called Adam is doing some experiments. He is trying to find the genes responsible for producing some important enzymes in yeast, and he is going about it in a very familiar way. Based on existing knowledge, Adam is coming up with new hypotheses and designing experiments to test them. He carries them out, records and evaluates the results, and comes up with new questions. All of this is part and parcel of a typical scientist's life but there is one important difference that sets Adam apart - he's a robot.
In a space the size of a small van, Adam contains a library of yeast strains in a freezer, two incubators, three pipettes for transferring liquid (one of which can manage 96 channels at once), three robot arms, a washer, a centrifuge, several cameras and sensors, and no less than four computers controlling the whole lot. All of this kit allows Adam to carry out his own research and to do it tirelessly - carrying out over 1000 experiments and making over 200,000 observations every day. All a technician needs to do is to keep Adam stocked up with fresh ingredients, take away waste and run the occasional clean.

Links to Other IAE Resources

This is a collection of IAE publications related to the IAE document you are currently reading. It is not updated very often, so important recent IAE documents may be missing from the list.

This component of the IAE-pedia documents is a work in progress. If there are few entries in the next four subsections, that is because the links have not yet been added.

IAE Blog

All IAE Blog Entries.

Garbage in, garbage out—for computer and human brains.

Five brains are better than one.

IAE Newsletter

All IAE Newsletters.

IAE-pedia (IAE's Wiki)

Home Page of the IAE Wiki.

Popular IAE Wiki Pages.

I-A-E Books and Miscellaneous Other

David Moursund's Free Books.

David Moursund's Learning and Leading with Technology Editorials


Author or Authors

The original version of this page was written by David Moursund.