DT2 Computational Thinking
"Computational thinking is a fundamental skill for everyone, not just for computer scientists. To reading, writing, and arithmetic, we should add computational thinking to every child’s analytical ability." - Jeannette Wing
Computational Thinking develops in students the capacity to view problems and opportunities as challenges that can be solved through the application of their understanding of digital technologies. Computational Thinking involves the development of a range of Digital Technology processes and skills that allow students to understand problems and create solutions that would not be possible without the perspective it provides. While Computational Thinking as a concept has existed for many decades, the widespread development of Computational Thinking in all students at all year levels is very new, and how best to develop such thinking and its associated skills will continue to evolve as the Technologies learning area is implemented in Australian classrooms.
Procedural Thinking
In the 1960s and 1970s, Seymour Papert introduced computers into the classroom for the purpose of developing student thinking in different ways, developing this into the concept of Procedural Thinking and incorporating it into the learning theory of constructivism. In the decades since, computer education veered off into learning about the technologies themselves - software, hardware and networks - but lost focus on developing student understanding of the world and technology's place within it. Papert focused on giving students agency over technology, being in control of technologies to serve their interests and curiosity, rather than being passive users of what others have created. Digital Technologies aims to re-engage students and teachers with understanding technologies so that they can manipulate and control technologies for their own purposes in creative and innovative ways.
Computational Thinking
Computational Thinking (CT) is a way of viewing problems and opportunities using computer science techniques that provide a perspective for understanding the underlying algorithms and computations involved. The term computational thinking was first used by Seymour Papert in 1996 and made popular by Jeannette Wing in 2006. Computational Thinking is a problem-solving process that includes the following characteristics:
- Analysing and logically organising data;
- Data modeling, data abstractions, and simulations;
- Formulating problems such that computers may assist;
- Identifying, testing, and implementing possible solutions;
- Automating solutions via algorithmic thinking; and
- Generalising and applying this process to other problems.
Computational thinking (CT) also involves a set of problem-solving skills and techniques that software engineers use to write the programs that underlie computer applications.
- Decomposition: When we taste an unfamiliar dish and identify several ingredients based on the flavor, we are decomposing that dish into its individual ingredients;
- Pattern Recognition: People look for patterns in stock prices to decide when to buy and sell;
- Pattern Generalisation and Abstraction: A daily planner uses abstraction to represent a week in terms of days and hours, helping us to organise our time; and
- Algorithm Design: When a chef writes a recipe for a dish, they are creating an algorithm that others can follow to replicate the dish.
Jeannette Wing (2006) reconceptualised Computational Thinking as the new fundamental thinking skill for all students and teachers. Where reading and writing became universally fundamental with the technological development of the printing press, Computational Thinking arises from the transformational technology of the computer and information networks. Computational thinking involves solving problems, designing systems, and understanding human behaviour using mental processes, tools and perspectives developed across the fields of computing.
Computational Thinking provides a new way of thinking about solving problems and developing opportunities. It involves reformulating seemingly difficult problems into ones we know how to solve, by reduction, embedding, transformation, or simulation. For those uneducated in Computational Thinking, the concepts and terminology can appear as complex and difficult as learning Mathematics or the Scientific Method for the first time. The skills and processes involved in Computational Thinking have to be applied and contextualised to reach the point at which it is possible to think computationally. This is primarily done through problem-solving activities that gradually build, enabling increasingly complex Computational Thinking concepts to be applied from a growing set of techniques and processes.
What is Computational Thinking
Computational Thinking is using abstraction and decomposition when attacking a large complex task or designing a large complex system. It is choosing an appropriate representation for a problem, or modelling it, to make the problem easier to understand and solve. It is having the confidence to safely use, modify, and influence large complex systems without understanding their every detail. Computational Thinking is thinking in terms of prevention, protection, and recovery from worst-case scenarios through redundancy, damage containment, and error correction. Computational thinking is using heuristic reasoning (experience-based techniques such as common sense) to discover a solution. It is planning, learning, and scheduling in the presence of uncertainty. Computational Thinking is using massive amounts of data to speed up computation, making trade-offs between time and space, between processing power and storage capacity.
When a student goes to school in the morning, they put in their backpack the things they need for the day; that uses the computational processes of prefetching and caching. When a student loses their hat and you suggest they retrace their steps, this is an example of the computational process of backtracking. At what point do you stop renting a DVD and buy yourself a copy? That is an example of an online algorithm. Which line do you stand in at the supermarket? That is an example of performance modelling. Why does your landline telephone still work during a power outage? That is an example of independence of failure and redundancy in design. How do Completely Automated Public Turing Tests to Tell Computers and Humans Apart, or CAPTCHAs, authenticate humans? They exploit the difficulty of solving hard Artificial Intelligence problems to foil computing agents. Computational thinking will have become ingrained in everyone’s lives when words such as algorithm and precondition are part of everyone’s vocabulary; when nondeterminism and garbage collection take on the meanings used by computer scientists; and when trees are drawn upside down. Ubiquitous computing, where everyone uses a computing device (desktops, laptops, tablets, mobiles) for every conceivable task, is now realised. Ubiquitous computing was yesterday’s dream that has become today’s reality; computational thinking is tomorrow’s reality.
Characteristics
Computational Thinking has the following characteristics:
- Conceptualising, not programming.
- Digital Technologies is not just about computer programming. Thinking computationally means more than being able to program a computer. It requires thinking at multiple levels of abstraction;
- Fundamental, not rote skill.
- A fundamental skill is something every human being must know to function in modern society. Rote means a mechanical routine;
- A way that humans, not computers, think.
Computational thinking is a way humans solve problems; it is not trying to get humans to think like computers. Computers can be dull and boring; humans are clever and imaginative. We humans make computers exciting. Equipped with computing devices, we use our cleverness to tackle problems we would not dare take on before the age of computing and build systems with functionality limited only by our imaginations;
- Complements and combines Mathematical and Design Thinking.
- Digital Technologies inherently draws on Mathematical Thinking, given that, like all sciences, its formal foundations rest on mathematics. Digital Technologies inherently draws on Design Thinking, given that students build systems that interact with the real world. The constraints of the underlying computing device force students to think computationally, not just mathematically; however, being free to build virtual worlds enables them to engineer systems beyond the physical world;
- Ideas, not artifacts.
- It is not just the software and hardware artifacts students produce that will be physically present everywhere and touch our lives all the time; it will be the computational concepts we use to approach and solve problems, manage our daily lives, and communicate and interact with other people; and
- For everyone, everywhere.
- Computational thinking will be a reality when it is so integral to human endeavours that it disappears as an explicit philosophy.
Some people equate Digital Technologies just with computer programming. Some parents see only a narrow range of job opportunities for their children who major in Digital Technologies. Many people think the fundamental research in Digital Technologies is complete and that only the engineering remains.
Computational Thinking is a grand vision to guide Digital Technologies educators, researchers, and practitioners as society’s image of the field changes, especially through the Technologies learning area, sending two main messages: - Intellectually challenging and engaging problems remain to be understood and solved. The problems and solutions are limited only by our own curiosity and creativity; and
- You can study Digital Technologies and do anything.
Students can study the English or Mathematics learning areas and go on to a multitude of different careers. The same is true for Digital Technologies. Students can study Digital Technologies and go on to careers in medicine, law, business, politics, any type of science or engineering, and the arts. The study of Digital Technologies should be seen as an intellectual adventure, exploring the joy, awe, and power of digital technologies; improving the world through innovations, problem solving, and creative works; achieved through the power of Computational Thinking.
Computational Thinking Practices
- Connecting Computing
- Identification of impacts of computing.
- Description of connections between people and computing.
- Explanation of connections between computing concepts.
- Developing computational artifacts
- Creation of an artifact with a practical, personal, or societal intent.
- Selection of appropriate techniques to develop a computational artifact.
- Use of appropriate algorithmic and information-management principles.
- Abstracting
- Explanation of how data, information, or knowledge are represented for computational use.
- Explanation of how abstractions are used in computation or modeling.
- Identification of abstractions.
- Description of modeling in a computational context.
- Analysing problems and artifacts
- Evaluation of a proposed solution to a problem.
- Location and correction of errors.
- Explanation of how an artifact functions.
- Justification of appropriateness and correctness.
- Communicating
- Explanation of the meaning of a result in context.
- Description using accurate and precise language, notation, or visualizations.
- Summary of purpose.
- Collaborating
- Collaboration of participants in solving a computational problem.
- Collaboration of participants in producing an artifact.
- Collaboration at a large scale.
Big Ideas, Key Concepts, and Supporting Concepts
Creativity: Computing is a creative activity.
- Computing fosters the creation of artifacts.
- Computing enables people to create digitally—including creating knowledge, tools, expressions of ideas, and solutions to problems.
- Computing enables people to translate intention into digital artifacts.
- Computing fosters creative expression.
- Computing extends traditional forms of human expression and experience.
- Computing fosters the creation of new forms of expression.
- Computing enables creative exploration that informs and inspires.
- Programming is a creative process.
- Some programs are developed to satisfy personal curiosity or for creative expression.
- Some programs are developed to solve problems, develop new knowledge, or help people, organizations, or society.
Abstraction: Abstraction reduces information and detail to facilitate focus on relevant concepts.
- A combination of abstractions built upon binary sequences can be used to represent all digital data.
- The interpretation of a binary sequence depends on how it is used (e.g., instruction, number, text, sound, or image).
- A finite representation is used to model the infinite mathematical concept of a number.
- Number bases, including binary and decimal, are abstractions used for reasoning about digital data.
- Multiple levels of abstraction are used in computation.
- Binary data is processed by physical layers of computing hardware, including gates, chips, and components.
- Programming languages, from low to high level, are used in developing software.
- Applications and systems are designed, developed, and analysed using levels of hardware, software, and conceptual abstractions.
- Models and simulations use abstraction to raise and answer questions.
- People use models and simulations to generate new understanding and knowledge.
- Models use different levels of abstraction to represent phenomena.
- Hypotheses can be formulated, refined, and tested using models and simulations.
- Simulations can facilitate extensive and rapid testing of models.
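The idea that a binary sequence’s meaning depends entirely on how it is interpreted can be sketched in a few lines of Python (an illustrative example, not part of the original source):

```python
# The same 8-bit sequence interpreted in different ways.
bits = "01000001"

as_number = int(bits, 2)    # interpreted as an unsigned integer: 65
as_text = chr(as_number)    # interpreted as a character code: 'A'
as_binary = bin(as_number)  # rendered back as a binary literal

print(as_number, as_text, as_binary)
```

One pattern of bits, three meanings: the interpretation, not the data itself, determines whether we see a number, a letter, or raw binary.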
Data: Data and information facilitate the creation of knowledge.
- People use computer programs to process information to gain insight and knowledge.
- Computers can be used to find patterns in, and test hypotheses about, digitally represented information.
- Insight and knowledge can result from translating and transforming digitally represented information.
- Computing facilitates exploration and the discovery of connections in information.
- Big Data (use of large datasets) provides new opportunities and new challenges for extracting information and knowledge.
- Scalability, of systems and analytical approaches, is an important consideration when datasets are large.
- Metadata can increase the effective use of data or a dataset by providing additional information about various aspects of that data.
- Computational manipulation of information requires consideration of representation, storage, security, and transmission.
- There are trade-offs involved in the many possible ways to represent digital and non-digital information as digital data.
- Data is stored in many formats depending on its characteristics—such as size and intended use—so that it can be manipulated computationally.
Algorithms: Algorithms are used to develop and express solutions to computational problems.
- An algorithm is a precise sequence of instructions for a process that can be executed by a computer.
- Sequencing, selection, iteration, and recursion are building blocks of algorithms.
- Algorithms can be combined to make new algorithms.
- Different algorithms can be developed to solve the same problem.
- Algorithms are expressed using languages.
- Languages for algorithms include natural language, pseudo-code, and visual and textual programming languages.
- The language used to express an algorithm can be different from the programming language used to implement the algorithm.
- Different languages are better suited for expressing different algorithms.
- The language used to express an algorithm can affect characteristics such as clarity or readability, but not whether an algorithmic solution exists.
- Algorithms can solve many, but not all, problems.
- Many problems can be solved in a reasonable time.
- Some problems can be solved, but heuristic approaches are necessary to solve them in a reasonable time.
- Some problems cannot be solved using any algorithm.
- Algorithms are evaluated analytically and empirically.
- Algorithms can be evaluated using many criteria (e.g., efficiency, correctness, and clarity).
- Different correct algorithms for the same problem can have different efficiencies.
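The last point can be made concrete with a short Python sketch (our own illustration, not from the source) of two correct algorithms for the same problem — finding an item in a list — with very different efficiencies:

```python
def linear_search(items, target):
    """O(n): examine each item in turn; works on any list."""
    for i, item in enumerate(items):
        if item == target:
            return i
    return -1

def binary_search(items, target):
    """O(log n): repeatedly halve the search range; requires a sorted list."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

data = list(range(0, 100, 2))   # sorted even numbers 0, 2, ..., 98
print(linear_search(data, 42))  # 21
print(binary_search(data, 42))  # 21
```

Both algorithms are correct; on a sorted list of a million items, binary search needs about 20 comparisons where linear search may need a million.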
Programming: Programming enables problem solving, human expression, and the creation of knowledge.
- Programs are written to execute algorithms.
- Programming requires an understanding of how instructions are processed.
- Programs are executed to automate processes.
- A single program can be run multiple times and on many machines.
- Executable programs increase the scale of problems that can be addressed.
- Programming is facilitated by appropriate abstractions.
- Functions are re-usable programming abstractions.
- Parameterisation can be used to generalize a specific solution.
- Data abstraction provides a means of separating behaviour from implementation.
- Application Program Interfaces (APIs) and libraries simplify complex programming tasks.
- Programs are developed and used by people.
- Programs are developed to solve problems.
- Developing programs is an iterative process.
- Finding and eliminating errors is an essential part of developing programs.
- Documentation is a necessary part of developing maintainable programs.
- Programs are evaluated for their correctness and style.
- Programming uses mathematical and logical concepts.
- Programming uses numerical concepts including real numbers and integers.
- Programming uses applications of logical concepts including Boolean algebra.
- Sets and collections are tools for solving computational problems.
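The ideas above — functions as reusable abstractions and data abstraction separating behaviour from implementation — can be illustrated with a short Python sketch (the class and names are our own, not from the source):

```python
class Stack:
    """Data abstraction: users rely on push/pop behaviour,
    not on the list used internally to implement it."""

    def __init__(self):
        self._items = []  # implementation detail, hidden behind the methods

    def push(self, item):
        self._items.append(item)

    def pop(self):
        return self._items.pop()

    def is_empty(self):
        return not self._items

s = Stack()
s.push(1)
s.push(2)
print(s.pop())       # 2
print(s.is_empty())  # False
```

The internal list could be swapped for another structure without changing any code that uses the stack — the behaviour is separated from the implementation.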
Internet: The Internet pervades modern computing.
- The Internet is a network of autonomous systems.
- The Internet connects devices and networks all over the world.
- The Internet and the systems built on it facilitate collaboration.
- The Internet is built on evolving standards including those for addresses and names.
- Characteristics of the Internet and the systems built on it influence their use.
- Hierarchy and redundancy help systems scale.
- Interfaces and protocols enable widespread use.
- The size and speed of systems affect their use.
- Cybersecurity is an important concern for the Internet and systems built on it.
- The trust model of the Internet involves tradeoffs.
- Cryptography is essential to many models of cybersecurity.
- Implementing cybersecurity has software, hardware, and human components.
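As a toy illustration of the cryptography concept (deliberately insecure, and purely our own example), a symmetric cipher can be as simple as XOR-ing each byte of a message with a key:

```python
def xor_cipher(message: bytes, key: int) -> bytes:
    """Toy symmetric cipher: XOR every byte with a one-byte key.
    Applying it twice with the same key recovers the original."""
    return bytes(b ^ key for b in message)

secret = xor_cipher(b"hello", 42)
print(secret)                  # scrambled bytes
print(xor_cipher(secret, 42))  # b'hello'
```

Real cryptography uses far stronger constructions, but the core idea is the same: a shared secret transforms data so that only the intended parties can read it.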
Impact: Computing has global impacts.
- Computing affects communication, interaction, and cognition.
- Computing enhances communication, fostering new ways to communicate and collaborate.
- Widespread access to information facilitates identification of problems, development of solutions, and dissemination of results.
- Computing enhances human capabilities (e.g., through the use of cyber-physical systems and assistive technologies).
- The Internet and the web have a profound impact on society.
- Computing enables innovation in nearly every field.
- Computational approaches and data analysis enable innovation.
- Computing enables innovation by providing access to, and sharing of, information.
- Computing has both beneficial and harmful effects.
- Innovations enabled by computing raise legal and ethical concerns.
- Privacy and security concerns arise in the development and use of computational systems and artifacts.
- Technology enables collection, use, and exploitation of information about, by, and for individuals, groups, and institutions.
- Widespread access to digitised information raises questions about intellectual property.
- Computing is situated within economic, social, and cultural contexts.
- Computing innovations both influence and are influenced by the contexts in which they are designed and the contexts in which they are used.
- The global distribution of computing resources raises issues of equity, access, and power.
Computational Thinking Skills
Decomposition
Decomposition is the ability to break down a task into minute details so that we can clearly explain a process to another person or to a computer, or even just to write notes for ourselves. Decomposing a problem frequently leads to pattern recognition and generalisation, and thus the ability to design an algorithm. Examples:
1. When we taste an unfamiliar dish and identify several ingredients based on the flavor, we are decomposing that dish into its individual ingredients.
2. When we give someone directions to our house, we are decomposing the process of getting from one place to another.
3. In mathematics, we can decompose a number such as 256.37 as follows: 2*10^2 + 5*10^1 + 6*10^0 + 3*10^-1 + 7*10^-2.
4. In mathematics, we can decompose numbers such as 20, 24, and 28 into their prime factors to help us find their least common multiple:
◦ 20: 2*2*5
◦ 24: 2*2*2*3
◦ 28: 2*2*7
◦ LCM: 2*2*2*3*5*7 = 840
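The prime-factorisation example above can itself be decomposed into two small steps and expressed in Python (an illustrative sketch, not part of the source material):

```python
from collections import Counter

def prime_factors(n):
    """Decompose n into its prime factors, e.g. 20 -> [2, 2, 5]."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

def lcm(numbers):
    """LCM via prime factorisation: take the highest power of each prime."""
    highest = Counter()
    for n in numbers:
        for prime, power in Counter(prime_factors(n)).items():
            highest[prime] = max(highest[prime], power)
    result = 1
    for prime, power in highest.items():
        result *= prime ** power
    return result

print(prime_factors(20))  # [2, 2, 5]
print(lcm([20, 24, 28]))  # 840
```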
Pattern Recognition
Pattern Recognition is the ability to notice similarities or common differences that will help us make predictions or lead us to shortcuts. Pattern recognition is frequently the basis for solving problems and designing algorithms. Examples:
- Children identify patterns in the reaction of their parents and teachers to their behaviour in order to determine what is right and what is wrong. They base their future behaviour on these patterns.
- People look for patterns in stock prices to decide when to buy and sell.
- In mathematics, we can follow a pattern to explain the logic behind why the product of two negative numbers is a positive number:
◦ (-3)(3) = -9
◦ (-3)(2) = -6
◦ (-3)(1) = -3
◦ (-3)(0) = 0
◦ (-3)(-1) = 3
◦ (-3)(-2) = 6
◦ (-3)(-3) = 9
In mathematics, when calculating the largest area possible for a rectangle of a given perimeter, we can guess and see patterns in the length, width, and area, such as:
- As the length and width approach each other in value, the area increases; and
- As the difference between the length and width increases, the area decreases.
These patterns lead us to the conclusion that the rectangle with the greatest area is a square.
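This guess-and-check pattern search can itself be automated with a few lines of Python (our own illustrative sketch):

```python
# For a fixed perimeter, tabulate length, width, and area to see
# that the area peaks when length equals width (a square).
perimeter = 20
for length in range(1, perimeter // 2):
    width = perimeter // 2 - length
    print(length, width, length * width)
```

Running this for a perimeter of 20 shows the area rising to 25 at a 5 × 5 square and falling symmetrically on either side.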
Pattern Generalisation and Abstraction
Pattern Generalisation and Abstraction is the ability to filter out information that is not necessary to solve a certain type of problem and generalise the information that is necessary. Pattern generalisation and abstraction allows us to represent an idea or a process in general terms (e.g., variables) so that we can use it to solve other problems that are similar in nature. Examples:
- A daily planner uses abstraction to represent a week in terms of days and hours, helping us to organise our time.
- A world map is an abstraction of the earth in terms of longitude and latitude, helping us describe the location and geography of a place.
- In mathematics, we write generalised formulas in terms of variables (such as the two shown below) instead of numbers so that we can use them to solve problems involving different values:
x = (−b ± √(b² − 4ac)) / 2a
(a + b)(a − b) = a² − b²
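The quadratic formula above can be generalised into a reusable Python function (an illustrative sketch, with our own function name):

```python
import math

def quadratic_roots(a, b, c):
    """Solve ax^2 + bx + c = 0 using the quadratic formula."""
    discriminant = b * b - 4 * a * c
    if discriminant < 0:
        return None  # no real roots
    root = math.sqrt(discriminant)
    return (-b + root) / (2 * a), (-b - root) / (2 * a)

print(quadratic_roots(1, -5, 6))  # (3.0, 2.0)
```

Because the formula is written in terms of variables, the same function solves every quadratic equation, not just one particular example.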
Algorithm Design
Algorithm Design is the ability to develop a step-by-step strategy for solving a problem. Algorithm design is often based on the decomposition of a problem and the identification of patterns that help to solve the problem. In computer science as well as in mathematics, algorithms are often written abstractly, utilising variables in place of specific numbers. Examples:
- When a chef writes a recipe for a dish, they are creating an algorithm that others can follow to replicate the dish.
- When a coach creates a play in football, he is designing a set of algorithms for his players to follow during the game.
- In mathematics, when we add and subtract fractions with different denominators, we follow an algorithm along the lines of:
- Find the least common multiple of all the denominators.
- Multiply the numerator and denominator of each fraction by whatever number yields the least common multiple identified in the previous step.
- Add (or subtract) the numerators and use the least common multiple found in the first step as the denominator.
- In mathematics, when we calculate the percent change between two numbers, we follow an algorithm along the lines of:
- If the original number is greater than the new number, use the following equation to calculate the percent change: percent decrease = 100*(original - new)/original.
- If the new number is greater than the original number, use the following equation to calculate the percent change: percent increase = 100*(new - original)/original.
- If neither is true, then the original and new numbers must equal each other and there is no percent change.
We can take this a step further and implement this algorithm in Python to have the computer calculate it for us:
original = float(input('Enter the original number: '))
new = float(input('Enter the new number: '))
if original > new:
    percent_decrease = 100 * (original - new) / original
    print('Percent decrease:', percent_decrease, '%')
elif new > original:
    percent_increase = 100 * (new - original) / original
    print('Percent increase:', percent_increase, '%')
else:
    print('There is no percent change.')
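The fraction-addition algorithm described earlier in this section can be sketched the same way (an illustrative example using our own function name):

```python
from math import gcd

def add_fractions(n1, d1, n2, d2):
    """Add n1/d1 + n2/d2 using the least common multiple of the denominators."""
    lcm = d1 * d2 // gcd(d1, d2)                     # step 1: find the LCM
    numerator = n1 * (lcm // d1) + n2 * (lcm // d2)  # step 2: scale and add numerators
    return numerator, lcm                            # step 3: LCM is the denominator

print(add_fractions(1, 4, 1, 6))  # (5, 12): 1/4 + 1/6 = 3/12 + 2/12
```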
COMPUTATIONAL THINKING ILLUSTRATED
Analysing effects of computation
Computation is everywhere. From search engines that help us find information, to cash registers in stores, to software used for designing bridges, we live in a world built on the effects of computation. Computation is not just another word for technology. For example, a cellular phone contains many different technologies: a radio transmitter and receiver, a processor, memory, and electro-mechanical parts like buttons and touch screens. When studying the effects of computation, we aren’t trying to learn how physics governs these technologies. Analysing the effects of computation means specifically looking at what happens when we collect, store, and process data.
The computation done by a cell phone involves recording your voice as data, compressing and transmitting that data, and interacting with a larger system that routes your call’s data to its destination. This same computational process is done in reverse so your conversation partner can talk back to you. That sounds like a lot of computational work for your cell phone to do, but that’s only part of what happens when you make a call. All the sending and receiving of data happens via radio waves. When the technology for radio communication was first developed by pioneers such as Nikola Tesla, it could only be used for mass communication in the form of broadcasts. It takes computation to transform that raw technological capacity into the more refined form we use today in our cell phones. One effect of computation is that radio can now be used for person-to-person communication, with many simultaneous conversations happening in the same physical area.
When we analyse the effects of computation, we take note of how data is transformed. We look at how information is processed and what is accomplished by that processing. We can think about what we might do if such computational power wasn’t available. That can also help us start to imagine new things we can strive to accomplish using computation.
A major part of the work in analysing the effects of computation is careful observation, as Blaze, Ada, Charles, Alan, and Grace are doing in this illustration. In their world, as in ours, computation is everywhere. By looking closely, we can start to see what computation — not just raw technology — does for us.
Creating computational artifacts
Creating computational artifacts is all about making things. Programming is one of the most visible ways we make computational artifacts. In that case, the artifacts are both the programs we made and their outputs. But the term computational artifact is not limited to just computer programs. It can refer to a whole range of things from microprocessors to bar codes to an airplane’s navigation system. In this illustration, the characters are building, testing, and exploring computational artifacts. The process of creating is not limited to only thinking of ideas, or just assembling parts. The machines you see in these cartoons are symbolic, designed to be open to interpretation and imagination. Here are some ways of looking at them to help you get started: Grace is building something new. At the moment, she’s using a wrench because it’s the right tool for the work she’s doing. She’s not just using a machine built by someone else; she’s actually making something new herself. Sometimes, creating things is a time-consuming and difficult process, but it gets easier with experience. Blaze is wearing a glove that controls a much larger and stronger hand. This hand can do many things, including lift up Blaze himself. Blaze’s glove is a metaphor for computational artifacts that allow us to harness the power of machines to carry out massive calculations. When we turn that power back upon itself as we do when we use recursion, higher-order functions, or write a compiler for a language in that same language, things can get very exciting. Alan is walking on the ceiling. He’s holding a Möbius strip, a topological surface with only one side. When twisted and attached back to itself, a regular flat rectangle can be transformed into a Möbius strip. Using computational thinking, we can change our perspective to solve a complex problem — like Alan, who is upside down! 
Many computational concepts, like the idea of the Möbius strip, can challenge our assumptions about what’s possible and reveal deeper truths about the properties of the systems we are using or creating. At first this can seem as difficult as walking on the ceiling, but after a while you’ll probably find it fun. Charles is holding an orb covered in what look like small radio dishes. Computational artifacts need not be designed to work in isolation. They can work together and communicate to accomplish a task, like we see in multi-core processors or parallel computing. Perhaps the radio dishes are helping Charles to hear things that other characters can not. Similarly, algorithms for pattern recognition, signal processing, error correction, and noise reduction enhance our ability to extract information from data. With the help of computational artifacts, we gain new powers.
Using abstractions and models
“All models are wrong, but some are useful.” – George E. P. Box
One meaning of the word model is: a smaller or simpler version of the original item. The model could be a physical object like the small robot in this illustration. Notice that Blaze isn’t trying to move the arms of the huge robot, nor trying to move the heavy blocks himself. Instead, he is working with a model robot small enough for him to literally put his own hands around. This simplifies the physical work he needs to do, just like a simplified model of an idea makes thinking easier. For example, classical mechanics is a model: it’s Newton’s easily-computed approximation of the more complex reality of motion. In computer science, we make a model every time we write a program. We must choose the information and level of detail represented in our program. Some details must be left out. If we tried to include everything in a model or program, we would end up simulating the whole world! In a complex system, we might use many different models and make them work together. We might not even care if one part of the system was switched out for something else that can accomplish the same goal. We could say we’ve abstracted that part of the system. Carefully selecting the qualities we care about and ignoring the rest of the details is the key to abstraction. When we deliberately separate our system into parts that can be individually understood, tested, reused, and substituted, then we are creating new abstractions.
Analysing problems and artifacts
“Analysis is the process of breaking a complex topic or substance into smaller parts to gain a better understanding of it.”
In this illustration, Ada is using a tool with many attachments, representing the idea that we often need to try multiple approaches and many different tools before we can “crack” a problem. Different tools and different approaches to a problem have different strengths.
Often, we can’t solve a problem until we try a number of different ways to break it down. That’s why it’s so valuable to have a variety of conceptual tools available when working on a problem. Over to the right, Alan is controlling a zoomed-in view of the cubes on the table. This allows him to see and understand not only how a cube looks and acts from the outside, but also how its internal workings contribute to its overall behavior. Programmers engage in this kind of analysis when they use a debugger; so do electrical engineers when they use an oscilloscope to visualize signals.
Communicating processes and results
Very rarely is a computational artifact self-explanatory. A CPU made of microscopic transistors on silicon or a compiled binary program of 1s and 0s are both quite difficult to understand. Their forms are optimised for computational performance, not human comprehension. The design plan for the CPU or the source code of the program are more easily understood, but even these precursors don’t necessarily explain how they were made or why they work. Computational thinking requires us to discuss processes for both people and machines to follow, and how these processes are intended to lead to specific results. For example, when a programmer is learning how to write programs, they need to be taught to debug by printing the value of a variable. When you discover a new mathematical technique for manipulating 3D shapes efficiently, you have to write it up so other people can understand and use it. Communication is the way we bring the things we know into the world. When we use computation to solve a problem, the answer we get isn’t automatically meaningful to others. We have to communicate this result in a way that reveals both its importance and its origin. In the illustration, Charles is capturing the sounds of a parrot in the wild and transmitting them to Grace at another location.
We can interpret this literally as communication of audio data, like someone’s voice on a phone call. However, to another parrot perhaps the parrot’s song represents a process (like a technique for finding fruits and seeds, or a plan for timing seasonal migration) or some important news (such as the winner of the annual parrot speechmaking contest). Communicating about processes and results allows us to benefit from insights gained by other computational thinkers.
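Debugging by printing the value of a variable, mentioned above, is often a beginner’s first communication between a program and its author. A minimal Python sketch (the function and numbers are invented for illustration):

```python
# Printing a variable as the program runs lets us watch its state change,
# which is often the first step in understanding (or fixing) a program.

def average(numbers):
    total = 0
    for n in numbers:
        total += n
        print("after adding", n, "the running total is", total)  # debug print
    return total / len(numbers)

print("average:", average([2, 4, 6]))
```

The debug prints communicate the process; the final print communicates the result.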
Working effectively in teams
The ability to work in a team can mean the difference between success and failure. Building any complex system, software or hardware, requires that more work be done in less time than any single person can accomplish. But adding more people doesn’t necessarily mean that the job will get done sooner. To make teamwork effective, individuals need interpersonal and communication skills as well as knowledge of different team methodologies and processes. As teams grow in size, the role of culture and management becomes increasingly important. Teamwork, like any other skill, takes practice. Various strategies for dividing up work have different strengths and weaknesses. Figuring out the best way to work together isn’t always easy, but it’s important for computational thinking. As multi-core processors and distributed computing become more common, we see computers themselves working in teams. Most websites that you visit are served from data centres, where hundreds or thousands of individual computers work together to accomplish amazing tasks. We humans can do the same!
Decomposition
In this illustration, Ada, Alan, and Grace are each taking apart some of the machines we have seen in other scenes. But decomposition isn’t only about disassembling objects. It’s also about pulling apart the steps of a process. Some things that we think of as a single action are actually a composite of many smaller actions. For example, we may say that we are going to make dinner. But when we apply decomposition, we find that making dinner actually means opening the refrigerator, getting out the broccoli, cutting up an onion, turning on the stove, and many other small steps.
A difficult computational problem can sometimes be solved by thinking of the overall task as being made up of many smaller, simpler tasks. Decomposition involves identifying those smaller tasks and how they fit together. The more times you do this, the easier it gets. Just ask Ada, who is taking apart an orb. Even though each of the orbs is a little different, she has a pretty good idea of what pieces she’s going to find when she takes one apart.
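The “make dinner” example above can be sketched directly in Python: the big task is just a sequence of smaller, named tasks. The step names and their return values are purely illustrative.

```python
# Decomposing "make dinner" into smaller, named steps.
# Each step could itself be decomposed further if we needed more detail.

def open_refrigerator():
    return "refrigerator open"

def get_broccoli():
    return "broccoli out"

def cut_onion():
    return "onion chopped"

def turn_on_stove():
    return "stove on"

def make_dinner():
    # The "big" task is just the smaller tasks performed in the right order.
    return [open_refrigerator(), get_broccoli(), cut_onion(), turn_on_stove()]

print(make_dinner())
```

Naming each step also makes the pieces individually testable and reusable, which is where decomposition starts to pay off.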
Pattern recognition
There’s something strange about the pattern of blocks, and Grace is pointing it out to Ada. Although they aren’t looking at the whole complicated machine that produces this pattern of blocks, they can still identify what is unusual. This doesn’t mean anything is wrong, but it tells them that there might be more going on than they first thought. Forming an idea of what you expect is one way to find patterns. The more you look, the more patterns you will find in nature, in computational artifacts, and in processes. When we recognize a pattern, we can use our other computational thinking skills to help us understand its importance.
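Forming an expectation and comparing it against what you observe can be sketched in a few lines of Python. The block colours below are invented for illustration, not taken from the scene.

```python
# Grace expects the blocks to alternate colours. Comparing the expected
# pattern with the observed one flags exactly where the pattern breaks.

expected = ["red", "blue", "red", "blue", "red", "blue"]
observed = ["red", "blue", "red", "red", "red", "blue"]

surprises = [i for i, (e, o) in enumerate(zip(expected, observed)) if e != o]
print("pattern breaks at positions:", surprises)
```

A break in the expected pattern doesn’t mean anything is wrong, but, as above, it points to where something more might be going on.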
Pattern generalisation and abstraction
After you’ve seen the same pattern a few times, you might start thinking of different ways to describe it. Alan is watching some blocks fall into place to form a picture. If the machines drop the same pattern of blocks again, they’ll make the same picture. There’s a lot for Alan to think about here, watching the blocks fall. There are lots of possible patterns; see if you can calculate the number. There are also a lot of ways to describe these patterns. If we wanted the machines to make a picture of a house with the door on the right side instead of the left side, the instructions would be almost the same. What if, instead of giving the machines new instructions every time, we simply told them what to change about some other instructions? We would need instructions that describe how to change other instructions. Thinking this way is some of the work we do when we try to generalise patterns. We look for what’s the same about a group of patterns and try to describe it in a way that’s both clear and efficient. If we can describe the group of patterns all at once, a pattern of patterns, then we have an abstraction.
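Telling the machines “what to change” instead of issuing a whole new set of instructions is exactly what a parameter does. A sketch in Python, where the “house” is just a few lines of text and all the names are illustrative:

```python
# One generalised instruction set for a house, with the door position as a
# parameter, instead of separate instructions for each variant.

def bottom_row(door_side):
    # The only part that differs between the two houses.
    if door_side == "left":
        return "|_|___|"
    else:
        return "|___|_|"

def draw_house(door_side):
    roof = "  /\\   "
    wall = " |   | "
    return "\n".join([roof, wall, bottom_row(door_side)])

print(draw_house("left"))
print(draw_house("right"))
```

The single function describes the whole group of house pictures at once; the parameter `door_side` is the “instruction that changes other instructions”.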
Algorithm Design
It’s a computational thinking dance party! The special dance floor in this illustration might be recording their steps, or it might be lighting up with dance instructions. But while Grace, Alan, and Ada are dancing away, Charles is actually designing a new dance. Like an algorithm, a dance is a set of steps that can be followed by others to get the same result. Sometimes we think of algorithms as being written down like a computer program, but an algorithm is more like an idea. The same algorithm can be written in many different computer languages. It’s the steps in the process that make an algorithm what it is. In order to design an algorithm, or a dance, you need to understand your goal. You also need to understand the constraints of the system. Humans only have two feet, so a dance designed for humans has to work with that limitation. Computational systems have different kinds of limitations, such as the speed of the processors, the size of the memory, or the amount of electricity they consume. Designing an algorithm that accomplishes a specific goal within the constraints of the system is like creating an elegant dance that everyone else wants to learn.
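The idea that an algorithm is “more like an idea” than any particular program can be illustrated by writing a dance as plain data and then having a small program follow it. The step names below are invented for the example.

```python
# The dance is written as data: a list of steps, independent of any
# programming language. The Python below merely follows the steps;
# the steps themselves are the algorithm.

dance = ["step left", "step right", "clap", "turn"]

def perform(steps, repeats):
    # Follow the same steps, in order, the requested number of times.
    performed = []
    for _ in range(repeats):
        for step in steps:
            performed.append(step)
    return performed

print(perform(dance, 2))
```

The same list of steps could just as easily be followed by a person, or by a program in another language, and the result would be the same dance.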
Computational Fairy Tales by experience level
Computational Fairy Tales were developed by Jeremy Kubica and formatted into two books: Computational Fairy Tales (Kubica, 2012) and Best Practices of Spell Design (Kubica, 2013). The stories are written for a variety of audiences, from readers with absolutely no programming experience to those with computer science backgrounds, but a general structure exists:
Beginner: Focuses on general concepts (e.g., very high-level algorithms) and simple programming concepts. Good for people without significant (or any) programming experience, including both people just learning to program and people with no interest in programming but who want to know the concepts.
Intermediate: Algorithms, data structures, and practical programming techniques. Concepts generally covered in the second half of an introductory CS course and in an introduction to algorithms course. Good for people who have some experience programming.
- Data Structures:
- Algorithms:
- Graphs:
- Strings:
- Object Oriented Programming:
- Practical programming:
Advanced: Advanced algorithms, pointers, and computational theory. These are topics more commonly found in algorithms courses. Good for people who have been programming for a while or have taken some computer science courses.
Really?: Completely random topics. The target audience here is people who have been studying computer science for a while.
Computational Thinking Stories by topic
- Algorithms:
- Data Structures:
- Graphs (Data Structures and Algorithms):
- Basic Programming:
- Memory / Pointers:
- Practical Programming:
- Computational Complexity and Big-O Notation:
- Object-Oriented Programming:
- Other Computer Science Concepts: