Seeing Chance

Bridging Informal and Formal Visualizations of Probability Experiments

Seed funding:
National Academy of Education/Spencer Foundation postdoctoral fellowship to Abrahamson, 2005-6 ($65k)

The Marbles Scooper (design: Dor Abrahamson; engineering & production: Paulo Blikstein). Random sampling of four marbles.

The Combinations Tower, the sample space of this probability experiment.

An outcome distribution from a computer-based simulation of the marbles-scooping experiment.

Overview

Consider this paradox. On the one hand, there is growing evidence of babies’ capacity to draw mathematically sound inferences from situations involving random events. On the other hand, school students manifest chronic challenges in learning probability concepts. What’s going on? Well, perhaps school is not capitalizing on students’ innate or early-developed capacity. In this project, we created learning materials — using both traditional media (marbles, cards, crayons) and computer-based modules (NetLogo simulations) — that, on the one hand, enable students to draw on the same intuitions that we know babies have, but on the other hand lend themselves to elaboration into mathematical models (that is, they are ‘bridging tools’). We thus sharpened the tension between inferences from intuitive perceptual judgments of naturalistic situations and inferences from analytical models of those same situations.

We worked with students in Grades 4-6 as well as with 7th graders and undergraduate and graduate students. Across the gamut, all students used the same form of perceptual judgment when looking at the situation in non-analytical ways but were challenged by the rationale of the formal analysis for the experiment. The challenge, it turned out, centered on how they were seeing specific outcomes in the sample space, what sense they made of these objects. Once students succeeded in viewing a specially arranged assembly of the sample space as holistically expressing their intuitive inference regarding the source situation, they were willing retroactively to accept the rationale of the combinatorial-analysis procedure. We named this sudden appropriation of the mathematical model a “semiotic leap,” because at those moments new signs were born in which disciplinary forms first bore the intuitive meanings.

Many of the publications coming from this project focus on how the students were guided to recognize and negotiate competing visualizations of one and the same object, and how that cognitive and social process was resolved as learning. (See the Histo-Blocks model; see the ProbLab suite of NetLogo models.)

Education as Negotiation: The Case of Basic Probability (on LUMEN)


Publications

Abrahamson, D. (2014). Rethinking probability education: Perceptual judgment as epistemic resource. In E. J. Chernoff & B. Sriraman (Eds.), Probabilistic thinking: Presenting plural perspectives (pp. 239-260). New York: Springer.

ABSTRACT: The mathematics subject matter of probability is notoriously challenging, and in particular the content of random compound events. When students analyze experiments, they often omit to discern variations as distinct outcomes, e.g., HT and TH in the case of flipping a pair of coins, and thus infer erroneous predictions. Educators have addressed this conceptual difficulty by engaging students in actual experiments whose outcomes contradict the erroneous predictions. Yet whereas empirical activities per se are crucial for any probability design, because they introduce the pivotal contents of randomness, variance, sample size, and relations among them, empirical activities may not be the unique or best means for students to accept the logic of combinatorial analysis. Instead, learners may avail of their own pre-analytic perceptual judgments of the random generator itself so as to arrive at predictions that agree rather than conflict with mathematical analysis. I support this view first by detailing its philosophical, theoretical, and pedagogical foundations and then presenting empirical findings from a design-based research project. Twenty-eight students aged 9–11 participated in tutorial, task-based clinical interviews that utilized an innovative random generator. Their predictions were mathematically correct even though initially they did not discern variations. Students were then led to recognize the formal event space as a semiotic means of objectifying these presymbolic notions. I elaborate on the thesis via micro-ethnographic analysis of key episodes from a paradigmatic case study. Along the way, I explain the design-based research methodology, highlighting how it enables researchers to spin thwarted predictions into new theory of learning.


Abrahamson, D. (2012). Rethinking intensive quantities via guided mediated abduction. Journal of the Learning Sciences, 21(4), 626-649.

ABSTRACT: Some intensive quantities, such as slope, velocity, or likelihood, are perceptually privileged in the sense that they are experienced as holistic, irreducible sensations. However, the formal expression of these quantities uses a/b analytic metrics; for example, the slope of a line is the quotient of its rise and run. Thus, whereas students’ sensation of an intensive quantity could serve as a powerful resource for grounding its formal expression, accepting the mathematical form requires students to align the sensation with a new way of reasoning about the phenomenon. I offer a case analysis of a middle school student who successfully came to understand the intensive quantity of likelihood. The analysis highlights a form of reasoning called abduction and suggests that sociocognitive processes can guide and mediate students’ abductive reasoning. Interpreting the child’s and tutor’s multimodal action through the lens of abductive inference, I demonstrate the emergence of a proportional concept as guided mediated objectification of tacit perception. This “gestalt first” process is contrasted with traditional “elements first” approaches to building proportional concepts, and I speculate on epistemic and cognitive implications of this contrast for the design and instruction of these important concepts. In particular, my approach highlights an important source of epistemic difficulty for students as they learn intensive quantities: the difficulty in shifting from intuitive perceptual conviction to mediated disciplinary analysis. My proposed conceptualization of learning can serve as an effective synthesis of traditional and reform-based mathematics instruction.


Abrahamson, D. (2012). Seeing chance: Perceptual reasoning as an epistemic resource for grounding compound event spaces. ZDM Mathematics Education, 44(7), 869–881.

ABSTRACT: The mathematics subject matter of probability is notoriously challenging, and in particular the content of random compound events. When students analyze experiments, they often omit to discern variations as distinct events, e.g., HT and TH in the case of flipping a pair of coins, and thus infer erroneous predictions. Educators have addressed this conceptual difficulty by engaging students in actual experiments whose outcomes contradict the erroneous predictions. Yet whereas empirical activities per se are crucial for any probability design, because they introduce the pivotal contents of randomness, variance, sample size, and relations among them, empirical activities may not be the unique or best means for students to accept the logic of combinatorial analysis. Instead, learners may avail of their own pre-analytic perceptual judgments of the random generator itself so as to arrive at predictions that agree rather than conflict with mathematical analysis. I support this view first by detailing its philosophical, theoretical, and didactical foundations and then by presenting empirical findings from a design-based research project. Twenty-eight students aged 9–11 participated in tutorial, task-based clinical interviews that utilized an innovative random generator. Their predictions were mathematically correct even though initially they did not discern variations. Students were then led to recognize the formal event space as a semiotic means of objectifying these presymbolic notions. I elaborate on the thesis via micro-ethnographic analysis of key episodes from a paradigmatic case study.


Abrahamson, D. (2012). Discovery reconceived: Product before process. For the Learning of Mathematics, 32(1), 8-15.

ABSTRACT: This article is motivated by a commitment to the ideas underlying discovery learning, namely the epistemological notion of grounded, meaningful, generative knowledge. It is also motivated by concern that these ideas have been implicitly misinterpreted in curriculum and instruction, ultimately to the detriment of students. Accordingly, I discuss an alternative, empirically based, theoretical articulation of discovery pedagogy that addresses the criticisms it has faced. The research question framing this alternative approach is, “What exactly about a mathematical concept should students discover via discovery learning?” I will pursue this question by reflecting on two case studies of children who participated in activities of my own design. Empirical data from these and other studies have served me over the past decade as contexts for inquiry into the cognition and instruction of mathematical concepts, an inquiry that, in turn, keeps feeding back into further design and articulation of design principles. In this essay, I will use these data to offer an empirically grounded “centrist” answer to the question of what students should discover, at least with respect to a particular class of mathematical concepts (intensive quantities) as embodied in a particular type of design (perception-based learning).


Students and tutor inventing idiosyncratic metaphors in making sense of mathematical ideas.

Jake and the snake

Li: splash vs. ripple

Sima and the robot

Razi and the magic pasta sieve

Abrahamson, D., Gutiérrez, J. F., & Baddorf, A. K. (2012). Try to see it my way: The discursive function of idiosyncratic mathematical metaphor. Mathematical Thinking and Learning, 14(1), 55-80.

ABSTRACT: What are the nature, forms, and roles of metaphors in mathematics instruction? We present and closely analyze three examples of idiosyncratic metaphors produced during one-to-one tutorial clinical interviews with 11-year-old participants as they attempted to use unfamiliar artifacts and procedures to reason about realistic probability problems. Our interpretations of these episodes suggest that metaphor is both spurred by and transformative of joint engagement in situated activities: metaphor serves individuals as semiotic means of objectifying and communicating their own evolving understanding of disciplinary representations and procedures, and its multimodal instantiation immediately modifies interlocutors’ attention to and interaction with the artifacts. Instructors steer this process toward normative mathematical views by initiating, modifying, or elaborating metaphorical constructions. We speculate on situation parameters affecting students’ utilization of idiosyncratic resources as well as how socio-mathematical license for metaphor may contribute to effective instructional discourse.


Abrahamson, D. (2011). Towards instructional design for grounded mathematics learning: The case of the binomial. In N. L. Stein & S. Raudenbush (Eds.), Developmental cognitive science goes to school (pp. 267-281). New York: Taylor & Francis – Routledge.

ABSTRACT: Consider a penny. It is flipped four times. Now consider two possible outcomes of this experiment:

(a) Heads, Heads, Heads, Heads

(b) Heads, Heads, Tails, Tails

Is one of these two outcomes more likely than the other, or are they equally likely?

This item targets basic knowledge of probability. Namely, it aims to evoke the phenomenon of a random generator (e.g., coins, dice, spinners, etc.) as a context for eliciting and gauging an understanding of randomness, independence, and distribution. Solving this item does not demand any numerical reasoning or arithmetical calculation—one need only grasp the logic of the situation so as to determine the appropriate response. We might thus hope that graduates of the U.S. school system, who have studied at least basic probability concepts, fare well on this simple item. But do they?
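The logic of this item can be made concrete by exhaustive enumeration. The following Python sketch is illustrative only (it was not part of the study materials): it lists all ordered outcomes of four flips and shows why the two outcomes above are equally likely as sequences, and why the classic error arises when sequences are confused with aggregate events.

```python
from itertools import product

# Enumerate all ordered outcomes of flipping a penny four times.
sequences = list(product("HT", repeat=4))
assert len(sequences) == 16

# As ordered sequences, HHHH and HHTT are each a single elemental
# outcome, so both have probability 1/16: they are equally likely.
p_hhhh = sequences.count(("H", "H", "H", "H")) / len(sequences)
p_hhtt = sequences.count(("H", "H", "T", "T")) / len(sequences)
print(p_hhhh, p_hhtt)  # 0.0625 0.0625

# The common error compares aggregate events instead: the event
# "two heads and two tails, in any order" spans six sequences,
# whereas "four heads" spans only one.
two_heads = [s for s in sequences if s.count("H") == 2]
print(len(two_heads))  # 6
```

In other words, respondents who judge (b) as more likely are answering a different, event-based question, for which their judgment is correct.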


Abrahamson, D. (2010). A tempest in a teapot is but a drop in the ocean: Action-objects in analogical mathematical reasoning. In K. Gomez, L. Lyons, & J. Radinsky (Eds.), Learning in the Disciplines: Proceedings of the 9th International Conference of the Learning Sciences (ICLS 2010) (Vol. 1 (Full Papers), pp. 492-499). Chicago, IL: International Society of the Learning Sciences.

ABSTRACT: We discuss a brief transcribed excerpt from a task-based interview with Li, an 11.5-year-old participant in a design-based research study of probabilistic cognition pertaining to the binomial. We investigate whether and how Li made sense of the behavior of an unfamiliar computer-based artifact—the diminishing proportional impact of successive random samples on the overall shape of a dynamically accumulating outcome distribution. Li constructed two informal analogical situations as multimodal discursive means of concretizing, elaborating, and communicating his emerging understanding of the artifact’s behavior. These non-routine utterances shifted the discourse to an explicitly embodied, imagistic space bearing unique affordances for negotiated epistemic syntheses of phenomenological and technological constructions of quantitative relations. Microgenetic analysis suggests that Li’s presymbolic notion was not a static magnitude but an intensive-quantity “action–object”; he subsequently unpacked this dynamical “a/b” qualia into its constitutive “a” and “b” elements. We reflect on implications of this counter-curricular sequence for educational design.


RG is reasoning about the relation between the color ratio in the marbles bin and the likelihood of particular samples. Note how she gestures as she speaks about the marbles being half green and half blue (or 50% green and 50% blue) — she “locates” the green on the left and the blue on the right, even though evidently the marbles are mixed. This gesture gives us clues on how students bridge from situations to mathematical forms.

Negotiating media constraints on image expressivity, Mary: (a) manipulates the on-screen histogram “hands on”; (b) manipulates the on-screen histogram “hands off”; (c) considers pen and paper, but declines; (d) remote-manipulates the on-screen histogram “hands on”; (e) shifts a card up to show the expected shape; but (f) returns the card because the shift violated constraints of the representational form.

Mark, a senior economics major, explaining why he expects a 2-green-and-2-blue sample as the central tendency of a hypothetical experiment with the marbles scooper: (a) using hands as event categories; (b) using columns of the combinations tower; and (c) elaborating on expected variance by squeezing columns toward each other.


If the chance of getting a green marble were .6 rather than .5 (that is, if 60% of the marbles in the bin were green), then the expected outcome distribution in an actual experiment would no longer be the same shape as the combinations tower. LB, a senior stats major, outlines how the shape would change.
Working with the NetLogo ProbLab model Histo-Blocks, LB reasons why under the .6 condition, the likelihood of getting a 2-green-2-blue sample is equal to the likelihood of getting a 3-green-1-blue sample. Thus Histo-Blocks enabled LB both to confirm her earlier inference and to analyze it quantitatively.
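LB’s inference can be checked against the binomial formula. The sketch below is an illustrative Python calculation, not the code of the Histo-Blocks model itself: with n = 4 marbles per scoop, the probability of k green marbles is C(4, k)·p^k·(1−p)^(4−k).

```python
from math import comb

# Binomial probability of k green marbles in a scoop of n = 4,
# when a fraction p of the marbles in the bin is green.
def p_sample(k, p, n=4):
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Under the .5 condition the distribution is symmetric, mirroring
# the 1-4-6-4-1 shape of the combinations tower.
print([round(p_sample(k, 0.5), 4) for k in range(5)])
# [0.0625, 0.25, 0.375, 0.25, 0.0625]

# LB's observation: under the .6 condition, the 2-green-2-blue and
# 3-green-1-blue samples are exactly equally likely (both 0.3456),
# so the distribution's peak sits between those two columns.
print(round(p_sample(2, 0.6), 4), round(p_sample(3, 0.6), 4))
# 0.3456 0.3456
```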

Abrahamson, D. (2009). Embodied design: Constructing means for constructing meaning. Educational Studies in Mathematics, 70(1), 27-47.

(run NetLogo web model Histo-Blocks)

ABSTRACT: Design-based research studies are conducted as iterative implementation-analysis-modification cycles, in which emerging theoretical models and pedagogically plausible activities are reciprocally tuned toward each other as a means of investigating conjectures pertaining to mechanisms underlying content teaching and learning. Yet this approach, even when resulting in empirically effective educational products, remains under-conceptualized as long as researchers cannot be explicit about their craft and specifically how data analyses inform design decisions. Consequentially, design decisions may appear arbitrary, design methodology is insufficiently documented for broad dissemination, and design practice is inadequately conversant with learning-sciences perspectives. One reason for this apparent under-theorizing, I propose, is that designers do not have appropriate constructs to formulate and reflect on their own intuitive responses to students’ observed interactions with the media under development. Recent socio-cultural explication of epistemic artifacts as semiotic means for mathematical learners to objectify presymbolic notions (e.g., Radford, Mathematical Thinking and Learning 5(1): 37–70, 2003) may offer design-based researchers intellectual perspectives and analytic tools for theorizing design improvements as responses to participants’ compromised attempts to build and communicate meaning with available media. By explaining these media as potential semiotic means for students to objectify their emerging understandings of mathematical ideas, designers, reciprocally, create semiotic means to objectify their own intuitive design decisions, as they build and improve these media. 
Examining three case studies of undergraduate students reasoning about a simple probability situation (binomial), I demonstrate how the semiotic approach illuminates the process and content of student reasoning and, so doing, explicates and possibly enhances design-based research methodology.


Li: back and forth

Tamar: back and forth

Students alternate their visualization of individual compound outcomes in the sample space: as combinations… as variations… and back again. A combinations view, which ignores the order of singleton outcomes, is the natural ratio-based perception that serves humans in making powerful heuristic inferences of likelihood (what Kahneman & Tversky called ‘representativeness’), but the variations view, which attends to the order, is necessary for adopting the mathematical method of combinatorial analysis so as to make accurate, calculation-based predictions. Students will adopt the variations view only once they accept that the relative number of variations in an event set (e.g., 6 cards in the 2-green-and-2-blue column as compared to 4 cards in the 3-green-and-1-blue column) determines the relative likelihood of these events. Only retroactively do they accept the rationale of the classicist procedure by which they themselves had built the sample space.
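The combinations and variations views can be juxtaposed computationally. Assuming (for illustration only) equal numbers of green and blue marbles in the bin, this Python sketch enumerates the 16 variations of a four-marble scoop and collapses them into the five combination columns of the tower:

```python
from itertools import product
from collections import Counter

# The variations view: enumerate all 16 ordered outcomes of
# scooping four marbles, each independently green (g) or blue (b).
variations = list(product("gb", repeat=4))

# The combinations view: collapse variations by green count,
# yielding the column heights of the combinations tower.
tower = Counter(v.count("g") for v in variations)
print(sorted(tower.items()))  # [(0, 1), (1, 4), (2, 6), (3, 4), (4, 1)]

# The 2-green-2-blue column holds 6 cards and the 3-green-1-blue
# column holds 4, so their likelihoods stand in a 6:4 ratio.
```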


Abrahamson, D. (2009). Orchestrating semiotic leaps from tacit to cultural quantitative reasoning—the case of anticipating experimental outcomes of a quasi-binomial random generator. Cognition and Instruction, 27(3), 175-224.

ABSTRACT: This article reports on a case study from a design-based research project that investigated how students make sense of the disciplinary tools they are taught to use, and specifically, what personal, interpersonal, and material resources support this process. The probability topic of binomial distribution was selected due to robust documentation of widespread student error in comparing likelihoods of possible events generated in random compound-event experiments, such as flipping a coin four times; for example, students erroneously evaluate HHHT as more likely than HHHH, whereas in fact these are 2 of 16 equiprobable elemental events in the sample space of this experiment. The study’s conjecture was that students’ intuitive reasoning underlying these canonical errors is nevertheless in accordance with mathematical theory: student intuition is couched in terms of an unexpanded sample space—that is, five heteroprobable aggregate events (no-H, 1H, 2H, 3H, 4H), and therefore students’ judgments should be understood accordingly as correct; for example, the combination “3H, 1T” is indeed more likely than “4H,” because “3H, 1T” can occur in four different orders (HHHT, HHTH, HTHH, THHH) but “4H” has only a single permutation (HHHH). The design problem was how to help students reconcile their mathematically correct 5 aggregate-event intuition with the expanded 16 elemental-event sample space. A sequence of activities was designed involving estimation of the outcome distribution in an urn-type quasi-binomial sampling experiment, followed by the construction and interpretation of its expanded sample space. Li, whose experiences were typical of a total of twenty-eight Grade 4–6 participants in individual semi-structured clinical interviews, successfully built on his population-to-sample expectation of likelihood in developing the notion of the expanded sample space. 
Drawing on cognitive-science, sociocultural, and cultural-semiotics theories of mathematical learning, I develop the construct semiotic leap to account for how Li appropriated as a warrant for his intuitive inference an artifact that had initially made no sense to him. More broadly, I conclude that students can ground mathematical procedures they are taught to operate even when they initially do not understand the rationale or objective of these cultural artifacts (i.e., students who are taught a procedure can still be guided to re-invent the procedure-as-instrument).


Abrahamson, D. (2009). A student’s synthesis of tacit and mathematical knowledge as a researcher’s lens on bridging learning theory. In M. Borovcnik & R. Kapadia (Eds.), Research and developments in probability education <Special Issue>. International Electronic Journal of Mathematics Education, 4(3), 195-226.

ABSTRACT: What instructional materials and practices will help students make sense of probability notions? Li (11 years) participated in an interview-based implementation of a design for the binomial. The design was centered around an innovative urn-like random generator, creating opportunities to reconcile two mental constructions of anticipated outcome distributions: (a) holistic perceptual judgments based in tacit knowledge of population-to-sample relations and implicitly couched in terms of the aggregate events with no attention to permutations on these combinations; and (b) classicist-probability analytic treatment of ratios between the subset of favorable to all elemental events with attention to the permutations. We argue that constructivist and sociocultural perspectives on mathematics learning can be reconciled by revealing interactions of intuitive and formal resources in individual development of deep conceptual understanding. Learning is the guided process of blending two constructions of problematized situations: the phenomenologically immediate and the semiotically mediated.


Abrahamson, D., Bryant, M. J., Gutiérrez, J. F., Mookerjee, A. V., Souchkova, D., & Thacker, I. E. (2009). Figuring it out: Mathematical learning as guided semiotic disambiguation of useful yet initially entangled intuitions. In S. L. Swars, D. W. Stinson, & S. Lemons-Smith (Eds.), Proceedings of the Thirty-First Annual Meeting of the North-American Chapter of the International Group for the Psychology of Mathematics Education (Vol. 5, pp. 662-670). Atlanta, GA: Georgia State University.

ABSTRACT: When participants in inquiry refer to an object, they may, unbeknown to them, construct the object differently. They thus tacitly attribute different idiosyncratic senses for their respective constructions and consequently draw different inferences regarding the phenomenon under investigation. A single person, too, may shift between alternative constructions of a mathematical object, assigning them different senses, thus arriving at apparently competing conclusions. Only upon acknowledging the different constructions can the person begin to explore whether and how the differing conclusions are in fact complementary. Building on empirical data of students engaged in interview-based tutorial activities targeting fundamental probability notions, we explicate breakdowns such false-contradiction introduces into learning processes yet suggest opportunities such ambiguity fosters.


Abrahamson, D. (2009). Coordinating phenomenologically immediate and semiotically mediated constructions of statistical distribution. In K. Makar (Ed.), The role of context and evidence in informal inferential reasoning. Proceedings of the Sixth International Research Forum on Statistical Reasoning, Thinking, and Literacy (SRTL-6). Brisbane, Australia: The University of Queensland.

ABSTRACT: I locate the drama of mathematical learning in students’ creative attempts to coordinate two different mental constructions of a situation designed to embody a targeted mathematical notion: A phenomenologically immediate construction of the source situation, and a construction that is semiotically mediated by a mathematical model of the same situation. To successfully synthesize these constructions, students must assimilate mathematical units of analysis into their tacit schematic structures. One strategy for accomplishing this assimilation is to invent realistic, hypothetical, or fantastic images that lend plausible coherence to the blending of these tacit and formal orientations. We focus on Razi, an 11.5-year-old student learning the binomial distribution. She conjures an imaginary mechanical system to concretize her intuitive sense of distribution.


Abrahamson, D. (2009, April). Appropriate tools: On grounding mathematical procedures in perceptual intuitions. Paper presented at the annual meeting of the American Educational Research Association, San Diego, CA, April 13-20.

ABSTRACT: I report on a design-based research case study in the area of middle-school probability that served as a context for investigating whether students can build meaning for the disciplinary tools they are taught to use, and if so, what personal, technological, and interpersonal resources may support this process. The topic of binomial distribution was selected due to robust literature documenting students’ apparent ‘misconceptions’ of expected likelihoods. Li successfully built upon his event-based intuitive sense of likelihood in developing the outcome-based notion of sample space. Utilizing cognitive-science, sociocultural, and cultural–semiotic theoretical models of mathematical learning, the construct ‘semiotic leap’ is developed herein to explain Li’s insight as appropriating an available artifact as a means of warranting his intuitive inference.



Abrahamson, D. (2008). The abduction of Peirce: The missing link between perceptual judgment and mathematical reasoning? Paper presented at the Townsend Working Group in Neuroscience and Philosophy (A. Rokem, J. Stazicker, & A. Noë, Organizers). UC Berkeley.


Abrahamson, D. (2008). Bridging theory: Activities designed to support the grounding of outcome-based combinatorial analysis in event-based intuitive judgment—A case study. In M. Borovcnik & D. Pratt (Eds.), Proceedings of the Probability Topic Study Group at the International Congress on Mathematical Education (ICME 11). Monterrey, Mexico: ICME.

ABSTRACT: Li, an 11-year-old boy, participated in the implementation of a mixed-media design for the binomial that combines activities pertaining to theoretical probability (combinatorial analysis) and empirical probability (simulated experiments). This design was engineered to accommodate, corroborate, yet elaborate on students’ heuristic inferences, and student reasoning was elicited through semi-structured clinical interviews. Applying a cultural–semiotic approach to the analysis of Li’s case study, I discuss a universal pedagogical tradeoff articulated as tension between constructivist and sociocultural perspectives on mathematics education. Li fluctuates between two interpretations of a sample space: event-based attention grounded in intuitive perceptual judgment of a random generator yet oblivious to permutations; and outcome-based attention supporting normative mathematization yet initially unsynthesized with intuition. These apparently vying perspectives are reconciled, if problematically, when Li notices that the entire sample space indexes an expected distribution qualitatively aligned with his perceptual intuition. At a theoretical level, I argue, constructivist and sociocultural perspectives, too, can be reconciled, if problematically, by accepting that mathematical phenomena are phenomenologically akin to scientific phenomena and thus mathematical learning is an inductive process of synthesizing (Schön, 1981) heuristic-based perceptual judgments and artifact-based mediated analytic procedures.


Abrahamson, D., & White, T. (2008). Artifacts and aberrations: On the volatility of design research and the serendipity of insight. In P. A. Kirschner, F. Prins, V. Jonker, & G. Kanselaar (Eds.), Proceedings of the Eighth International Conference of the Learning Sciences — International Perspectives in the Learning Sciences: Cre8ing a Learning World (ICLS2008) (Vol. 1, pp. 27-34). Utrecht, The Netherlands: ISLS.



Abrahamson, D. (2008, March). Fostering the emergence of an embodied cognitive artifact: The case of the number line in a design for probability. In D. Abrahamson (Chair), D. Earnest (Org.), & H. Bass (Discussant), The many values of the number line—An interdisciplinary forum. Symposium presented at the annual meeting of the American Educational Research Association, New York, March 24-28.

ABSTRACT: I trace the emergence of a mathematical instrument, the number line, in the context of student engagement with a situated-probability problem-solving interview task involving manipulatable objects. I argue that consequent semiotic–ontological ambiguity engendered struggle with generative conceptual confusion. Namely, as students conducted combinatorial analysis to create the sample space of a random generator, the objects they built to express the stochastic events served both as tickmarks on an emergent number line and as outcomes, members of those marked events. Negotiating the tickmark-vs.-member semiotic ambiguity challenged, then facilitated the dyad’s discourse over the event-vs.-outcome learning axis, which I have implicated as key to deep understanding of the binomial. Extrapolating from the data, I examine the phylogenesis-recapitulates-microgenesis conjecture (a reversal on Haeckel) by which the historical evolution of the number line may have proceeded from objects to inscription; I draw an explorative implication that mathematics instruction could follow suit, i.e., that students could reinvent the number line as an inscribed ordinal sequence of sorted objects.


Abrahamson, D. (2008). Writes of passage: From phenomenology to semiosis in mathematical learning. In T. Rikakis & A. Keliiher (Eds.), Proceedings of the CreativeIT 2008 Workshop—Success Factors in Fostering Creativity in IT Research and Education. Tempe, AZ: Arizona State University.

ABSTRACT: How can designers of instructional materials for mathematics learning best support students’ progress from intuition to inscription? This paper explains the embodied-design methodology for creating cognitively ergonomic learning tools. Two case studies are contrasted in which individual participants in a design-based research study of mathematical cognition engaged in problem-solving situations pertaining to the content of probability. I analyze the microgenesis of the reflexive, mediated interplay between students’ multimodal, intuitive, presymbolic notions and the available multimedia tools (cf. Radford, 2003). The first case demonstrates how the interviewer-student dyad coped with the ontological imperialism (Bamberger & diSessa, 2003) inherent to some forms of notation. Following a brief overview of how embodied mathematical reasoning, and particularly gesture studies, can contribute to a deeper and more nuanced understanding of students’ reasoning processes as they problem solve, I recount the second case, in which close listening to a student’s truncated expressivity directly informed the design of a computer-based module that enables electronic gesture through which students can elaborate their reasoning and interlink spatial and symbolical referents. I advocate a measured balance between streamlined and frictive learning tools, because it is in breakdown that creative reasoning and creative habits of mind are fostered.


Abrahamson, D. (2007). Both rhyme and reason: Toward design that goes beyond what meets the eye. In T. Lamberg & L. Wiest (Eds.), Proceedings of the Twenty Ninth Annual Meeting of the North American Chapter of the International Group for the Psychology of Mathematics Education (pp. 287-295). Stateline (Lake Tahoe), NV: University of Nevada, Reno.

ABSTRACT: Drawing on design-based studies where students worked with learning tools for proportionality, probability, and statistics, I appraise whether students had opportunities to construct conceptual understanding of the targeted mathematical content. I conclude that visualizations of perceptually privileged mathematical constructs support effective pedagogical activity only to the extent that they enable students to coordinate perceptual conviction with mathematical operations—intuiting that, and not how, two representations are related constitutes perceptually powerful yet conceptually weak situatedness. In constructivist learning, as in empirical research, regularity apprehended in observed phenomena is measured, expressed, and schematized. Students should articulate or corroborate visual thinking with step-by-step procedures, e.g., synoptic views of multiplicative constructs should include tools for distributed-addition handles.


Abrahamson, D. (2007). Handling problems: Embodied reasoning in situated mathematics. In T. Lamberg & L. Wiest (Eds.), Proceedings of the Twenty Ninth Annual Meeting of the North American Chapter of the International Group for the Psychology of Mathematics Education (pp. 219-226). Stateline (Lake Tahoe), NV: University of Nevada, Reno.

ABSTRACT: Drawing on design-based studies where students worked with learning tools for proportionality, probability, and statistics, I appraise whether students had opportunities to construct conceptual understanding of the targeted mathematical content. I conclude that visualizations of perceptually privileged mathematical constructs support effective pedagogical activity only to the extent that they enable students to coordinate perceptual conviction with mathematical operations—intuiting that, and not how, two representations are related constitutes perceptually powerful yet conceptually weak situatedness. In constructivist learning, as in empirical research, regularity apprehended in observed phenomena is measured, expressed, and schematized. Students should articulate or corroborate visual thinking with step-by-step procedures, e.g., synoptic views of multiplicative constructs should include tools for distributed-addition handles.


Abrahamson, D. (2007, April). The real world as a trick question: Undergraduate statistics majors’ construction-based modeling of probability. Paper presented at the annual meeting of the American Educational Research Association, Chicago, IL.

ABSTRACT: 24 undergraduate/graduate students enrolled in mathematical programs participated in one-to-one interviews as part of a design-based research study of the cognition of probability. The students were asked to estimate outcome distributions of a very simple randomness generator consisting of an exposed bin full of marbles, half green and half blue, and a scooper—a 2-by-2 array of concavities—for drawing out exactly four marbles from the mix. This array formation (4-block) featured also in combinatorial-analysis materials and computer-based simulations of the probability experiment. Central to the design is the combinations tower, an assembly of the 16 unique outcomes in the form of a 1:4:6:4:1 “picto-barchart,” i.e., with the outcomes themselves, not just stark columns as in regular histograms. All students said that the relatively most common experimental outcome should have 2 green and 2 blue marbles, but only 10 students initiated combinatorial analysis as a means of warranting their guess, of whom only 4 conducted it successfully. For all students, the combinations tower constituted a context for coordinating between the sample space of the stochastic device and distributions of actual outcomes in experiments with this device. I argue for the utility of guided, situated problem solving for the learning and consolidation of probability concepts.
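The 1:4:6:4:1 shape of the combinations tower can be recovered with a few lines of Python (a sketch, not one of the study's materials): enumerate all two-color 4-blocks and group them by green count.

```python
from itertools import product
from collections import Counter

# A sketch (in Python, not the study's materials) of the combinatorial
# analysis behind the combinations tower: enumerate all 2^4 = 16 possible
# four-marble outcomes, each concavity green (G) or blue (B), then group
# them by their number of green marbles.
outcomes = list(product("GB", repeat=4))
tower = Counter(o.count("G") for o in outcomes)

print(len(outcomes))                 # 16 unique outcomes
print([tower[k] for k in range(5)])  # [1, 4, 6, 4, 1]
```

Stacking the 16 outcomes themselves into these five columns yields the picto-barchart described in the abstract.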


Abrahamson, D., & Wilensky, U. (2007). Learning axes and bridging tools in a technology-based design for statistics. International Journal of Computers for Mathematical Learning, 12(1), 23-55.

ABSTRACT: We introduce a design-based research framework, learning axes and bridging tools, and demonstrate its application in the preparation and study of an implementation of a middle-school experimental computer-based unit on probability and statistics, ProbLab. ProbLab is a mixed-media unit, which utilizes traditional tools as well as the NetLogo agent-based modeling-and-simulation environment (Wilensky 1999) and HubNet, its technological extension for facilitating participatory simulation activities in networked classrooms (Wilensky and Stroup 1999a). We will focus on the statistics module of the unit, Statistics As Multi-Participant Learning-Environment Resource (S.A.M.P.L.E.R.). The framework shapes the design rationale toward creating and developing learning tools, activities, and facilitation guidelines. The framework then constitutes a data-analysis lens on implementation cases of student insight into the mathematical content. Working with this methodology, a designer begins by focusing on mathematical representations associated with a target concept—the designer problematizes and deconstructs each representation into a pair of historical/cognitive antecedents (idea elements), each lying at the poles of a learning axis. Next, the designer creates bridging tools, ambiguous artifacts bearing interaction properties of each of the idea elements, and develops activities with these learning tools that evoke cognitive conflict along the axis. Students reconcile the conflict by means of articulating strategies that embrace both idea elements, thus integrating them into the target concept.


Abrahamson, D. (2006). Learning chance: Lessons from a learning-axes and bridging-tools perspective. In A. Rossman & B. Chance (Eds.), Proceedings of the Seventh International Conference on Teaching Statistics. Salvador, Bahia, Brazil.

ABSTRACT: The paper builds on design-research studies in the domain of probability and statistics conducted in middle-school classrooms. The design, ProbLab (Probability Laboratory), which is part of Wilensky’s ‘Connected Probability’ project, integrates: constructionist projects in traditional media; individual work in the NetLogo modeling-and-simulation environment; and networked participatory simulations in HubNet. An emergent theoretical model, ‘learning axes and bridging tools,’ frames both the design and the data analysis. A learning axis is the space of potential learning residing between two subconstructs of a domain that the designer identifies as necessary, interdependent, and complementary, e.g., between ‘theoretical probability’ and ‘empirical probability.’ The subconstructs are embedded as competing “affordances” of a bridging tool. A bridging tool is an artifact, a “mathematical object,” designed to foster and sustain students’ dwelling in a learning axis and honing the tension between these coupled subconstructs toward coordinating them as a new mental construction. The model is explicated by discussing a sample episode, in which a student reinvents sampling by connecting ‘local’ and ‘global’ perceptions of a population.


Abrahamson, D., & Cendak, R. M. (2006). The odds of understanding the Law of Large Numbers: A design for grounding intuitive probability in combinatorial analysis. In J. Novotná, H. Moraová, M. Krátká, & N. Stehlíková (Eds.), Proceedings of the Thirtieth Conference of the International Group for the Psychology of Mathematics Education (Vol. 2, pp. 1-8). Charles University, Prague: PME.

ABSTRACT: Twenty-eight Grade 4 – 6 students participated in 1 hr. clinical interviews in a design-based study that investigated: (a) probability-related intuitions; (b) the affordances of a set of innovative mixed-media learning tools for articulating these intuitions; and (c) the utility of the learning-axes-and-bridging-tools framework supporting diagnosis, design, and data-analysis. Students intuited qualitative predictions of mean and variance, yet only through grounding computer-based simulations of probability experiments in discrete–scalar, non-uniform, multiplicative transformations on a special combinatorial space, the combinations tower, could students articulate their intuitions. We focus on a key learning axis, students’ confusing likelihoods of unique events with those of classes of events.


Abrahamson, D. (2006). The shape of things to come: The computational pictograph as a bridge from combinatorial space to outcome distribution. International Journal of Computers for Mathematical Learning, 11(1), 137-146.

ABSTRACT: This snapshot introduces the computational pictograph, an interactive computer-based artifact designed to help students learn mathematics. Focusing on the domain of probability, we will examine a computational pictograph that supports students’ exploration of the Law of Large Numbers. This law is explored as a coordination between the combinatorial space of a stochastic device – what it is possible to “get” with this device – and the outcome distribution of the device – what you actually “get” when you operate the device and how often you get it. To help students in this exploration, the design of the computational pictograph backgrounds the similarities between the constructs and, thus, foregrounds the critical differences between them. We will look at two implementations of the computational pictograph – the Dice Stalagmite and 9-Block Stalagmite models.


The “combinations tower” is the sample space of an x-block. Here we see the combinations tower of the 9-block, which has 512 permutations. The 6th-grade kids engineered and built this sample space, pasting all the permutations on a long poster, stacked in 10 columns. This video is a bit lame, because I was hoping to show how the ProbLab unit combines theoretical probability, experimental probability, and statistics, but just then the computer simulations were not running — we had to debug them between classes… :\  (The bearded guy is Matthew Berland, who is heroically debugging HubNet.)
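For readers who want to check the tower's column heights, a quick Python sketch (not part of the classroom materials) reproduces its shape:

```python
from math import comb

# Column heights of the 9-block combinations tower: the number of 3-by-3
# green/blue grids with exactly k green squares is "9 choose k".
columns = [comb(9, k) for k in range(10)]

print(columns)       # [1, 9, 36, 84, 126, 126, 84, 36, 9, 1]
print(sum(columns))  # 512 permutations across the 10 columns
```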

Two students remain after class to discuss combinatorial analysis of the 9-block. They use the terms “anchor” and “mover” that will become clear in the second video.

The next day, a third student from the team is explaining their strategy for the case of 9-choose-2. Note how they have figured out they must divide by two so as to avoid double counting.
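The students' anchor-and-mover strategy amounts to the standard correction for double counting, which a Python sketch (illustrative, not the students' work) makes explicit:

```python
from itertools import combinations, permutations

# "Anchor" a first green square in one of 9 cells, then slide the "mover"
# through the remaining 8 cells: 9 * 8 = 72 ordered pairs. Each unordered
# pair of cells is counted twice (anchor and mover roles swap), so divide
# by two to get 9-choose-2 = 36.
ordered = len(list(permutations(range(9), 2)))
unordered = ordered // 2

print(ordered, unordered)  # 72 36
print(unordered == len(list(combinations(range(9), 2))))  # True
```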

Abrahamson, D., Janusz, R., & Wilensky, U. (2006). There once was a 9-Block… — A middle-school design for probability and statistics. Journal of Statistics Education, 14(1).

ABSTRACT: ProbLab is a probability-and-statistics unit developed at the Center for Connected Learning and Computer-Based Modeling, Northwestern University. Students analyze the combinatorial space of the 9-block, a 3-by-3 grid of squares, in which each square can be either green or blue. All 512 possible 9-blocks are constructed and assembled in a “bar chart” poster according to the number of green squares in each, resulting in a narrow and very tall display. This combinations tower is the same shape as the normal distribution obtained when 9-blocks are generated randomly in computer-based simulated probability experiments. The resemblance between the display and the distribution is key to student insight into relations between theoretical and empirical probability and between determinism and randomness. The 9-block also functions as a sampling format in a computer-based statistics activity, where students sample from a “population” of squares and then input and pool their guesses as to the greenness of the population. We report on an implementation of the design in two Grade 6 classrooms, focusing on student inventions and learning as well as emergent classroom socio-mathematical behaviors in the combinations-tower activity. We propose an application of the 9-block framework that affords insight into the Central Limit Theorem in science.
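The resemblance between the poster and the simulated distribution can be demonstrated with a short Python sketch (a stand-in for the NetLogo simulations, not the actual unit code):

```python
import random
from math import comb

# A sketch of the computer experiment described above: generate random
# 9-blocks (each square green with probability 1/2) and compare the
# empirical distribution of green counts to the theoretical C(9, k) / 512.
random.seed(1)
trials = 50_000
counts = [0] * 10
for _ in range(trials):
    greens = sum(random.random() < 0.5 for _ in range(9))
    counts[greens] += 1

empirical = [c / trials for c in counts]
theoretical = [comb(9, k) / 512 for k in range(10)]
# With many trials the empirical shape settles onto the theoretical one.
print(max(abs(e - t) for e, t in zip(empirical, theoretical)))
```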


Abrahamson, D., & Wilensky, U. (2005). ProbLab goes to school: Design, teaching, and learning of probability with multi-agent interactive computer models. In M. Bosch (Ed.), Proceedings of the Fourth Congress of the European Society for Research in Mathematics Education (pp. 570-579). Universitat Ramon Llull: FUNDEMI IQS.

ABSTRACT: ProbLab, an experimental middle-school unit in probability and statistics, includes a suite of computer-based interactive models authored in NetLogo (Wilensky, 1999). We explain the rationale of two of the models, Stochastic Patchwork and Sample Stalagmite, and their potential as learning supports, e.g., the temporal–spatial metaphor: sequences of stochastic events (occurring over time) are grouped as arrays (laid out in space) that afford proportional judgment. We present classroom episodes that demonstrate how the Law of Large Numbers (many samples) can be mapped onto the classroom social space (many students) as a means of facilitating discussion and data sharing and contextualizing the content. We conclude that it is effective to embed the Law of Large Social Numbers into designs for collaborative learning of probability and statistics.


Abrahamson, D., & Wilensky, U. (2005). Understanding chance: From student voice to learning supports in a design experiment in the domain of probability. In G. M. Lloyd, M. Wilson, J. L. M. Wilkins, & S. L. Behm (Eds.), Proceedings of the Twenty Seventh Annual Meeting of the North American Chapter of the International Group for the Psychology of Mathematics Education. Roanoke, VA—Virginia Tech: PME-NA.

ABSTRACT: Six middle-school students participated in pre-intervention interviews that informed the design of learning tools for a computer-enhanced experimental unit on probability and statistics. In accord with the PME-NA XXVII conference theme, we elaborate our methodological frameworks and design principles. Our design was in response to students’ failure to solve compound-event problems. We characterize student difficulty as ‘ontological fuzziness’ regarding the stochastic device, its combinatorial space, and individual outcomes. We conclude that students need opportunities to concretize the combinatorial space. Also, we conjecture that, given suitable learning tools, students could build on their comfort with single-outcome problems to solve compound-event problems. To those ends, we designed the ‘9-block,’ a mixed-media stochastic device that can be interpreted as a compound sample of 9 independent outcomes, as a single independent event, or as a sample out of a population. We explain activities around the designed tools and outline future work on the unit.


Abrahamson, D., & Wilensky, U. (2005). Collaboration and equity in classroom activities using Statistics As Multi-Participant Learning-Environment Resource (S.A.M.P.L.E.R.). In W. Stroup & U. Wilensky (Chairs), C. D. Lee (Discussant), Patterns in group learning with next-generation network technology. Paper presented at the annual meeting of the American Educational Research Association, Montreal, Canada.

ABSTRACT: We examine an implementation of a probability-and-statistics participatory simulation activity in a networked classroom to investigate the equity it affords in terms of student learning opportunities. The simulation activity, S.A.M.P.L.E.R., was designed for middle schoolers to learn basic statistical sampling theory. Students each drew their own samples from a shared population, then each inputted a value reflecting their quantification of their samples. Classroom input was plotted and compared to the true population value. Students devised strategies to coordinate classroom sampling and mathematize perceptual judgment to achieve accurate prediction. We compare the design to a non-networked collaborative construction project enacted in the same classroom and demonstrate and analyze advantages of the networked design, which was more demanding of student participation, more supportive, more student-centered, more inclusive, more suited to capitalize on classroom social dynamics, and more equitable in terms of emergent distribution of student roles and skill development.
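The sample-then-pool dynamic can be sketched in a few lines of Python (names and numbers here are hypothetical, not the actual NetLogo/HubNet code):

```python
import random

# An illustrative sketch of the S.A.M.P.L.E.R. dynamic (all names and
# numbers are hypothetical): each "student" samples 9 squares from a shared
# green/blue population and reports a sample mean; the classroom pools
# these individual means into a grand mean.
random.seed(2)
true_greenness = 0.6
population = [1] * 600 + [0] * 400  # 1 = green square, 0 = blue square

def student_sample_mean(k=9):
    return sum(random.sample(population, k)) / k

class_means = [student_sample_mean() for _ in range(30)]  # 30 students
grand_mean = sum(class_means) / len(class_means)
# The pooled estimate tends to sit closer to 0.6 than most single samples.
print(round(grand_mean, 2))
```

Even an outlying individual sample, like Tiffany's in the video below, contributes to pulling the pooled grand mean toward the true population value.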



Abrahamson, D., & Wilensky, U. (2004). S.A.M.P.L.E.R.: Collaborative interactive computer-based statistics learning environment. In M. Niss (Ed.), Proceedings of the 10th International Congress on Mathematical Education. Copenhagen, Denmark.

INTRODUCTION: S.A.M.P.L.E.R.—Statistics As Multi-Participant Learning-Environment Resource—is a participatory simulation (Wilensky & Stroup, 1999a). In participatory simulations, a classroom of students collectively simulates a complex phenomenon that they are studying, with each student playing the role of a single agent or a set of agents in this phenomenon. For example, students each may “be” an atom in a molecule, a bird in a flock, or a distributed population sample-mean. Technology-based participatory simulations are built in the NetLogo cross-platform agent-based modeling-and-simulation environment (Wilensky, 1999) and operate through the HubNet architecture (Wilensky & Stroup, 1999b). Each student operates a NetLogo “client,”…


First experimental implementation of S.A.M.P.L.E.R. in a Chicago Public School Grade 6. We let the children lead most of the discussion. You will see them first attempt to grapple with problems of variability. In this pilot phase, the sampling is run off the server. Later we used the HubNet technology to run the activity as a participatory simulation, with each child taking samples on their own computer and uploading their data to the server.

S.A.M.P.L.E.R. Grade 6 study in District 65 (somewhere north of Chicago…). The NetLogo HubNet technology is set up for the networked classroom to engage in a participatory simulation of a statistical investigation. Students are each taking samples from the same population and uploading their data to the main server.

Student Josh is leading the classroom in discussing the distribution of sample means that they had just uploaded to the server. One student, Tiffany, is an outlier on the far right. The classroom concludes that the outlier helped the classroom grand mean get closer to the true value (compare vertical line in histogram to green/blue contour line above it). Thus S.A.M.P.L.E.R., a HubNet participatory simulation activity, draws all students into mathematical discourse by distributing statistical constructs onto the classroom social fabric, collaboration practices, and discourse norms. “Thank you, Tiffany!”

Abrahamson, D., & Wilensky, U. (2004). S.A.M.P.L.E.R.: Statistics As Multi-Participant Learning-Environment Resource. Paper presented at the annual meeting of the American Educational Research Association, San Diego, CA.

ABSTRACT: S.A.M.P.L.E.R. (Abrahamson & Wilensky, 2004a, 2004b) is a computer-based classroom learning environment built in the NetLogo (Wilensky, 1999a) and HubNet (Wilensky & Stroup, 1999) modeling environments. S.A.M.P.L.E.R. is the statistics component of ‘ProbLab’ (Abrahamson & Wilensky, 2002), a middle-school curricular unit built at The Center for Connected Learning and Computer-Based Modeling as part of the ‘Connected Probability’ project (Wilensky, 1993, 1995, 1997a). We report results from an implementation of S.A.M.P.L.E.R. (two Grade 6 classrooms, total n = 38), and frame our analysis of student discussion in terms of two novel design-research constructs: ‘learning axes’ and ‘bridging tools.’ A bridging tool (Abrahamson, 2004; Fuson & Abrahamson, 2005) is a classroom artifact designed to tap students’ previous mathematical knowledge and situational intuitions and help students build understanding of new mathematical concepts that are linked to symbols, procedures, and vocabulary. A learning axis is a space of potential learning extending between two competing perceptual interpretations of a bridging tool—students construct new concepts through reconciling the tension created by this dual interpretation. We discuss some tradeoffs inherent in the design of learning environments that use this approach.

[Note: this 2004 conference paper eventually became our 2007 paper in IJCML.]


ProbLab at District 65. A 6th grader is comparing an actual outcome distribution from a computer-based 9-block simulator (experimental probability) with the experiment’s sample space obtained through combinatorial analysis (theoretical probability). The 9-block is a 3-by-3 matrix where each cell can be either green or blue. There are 512 permutations. The classroom has created a giant poster with all the permutations, arranged in ten columns as a “combinations tower.” For this student, the sample space is very much real — it is “in this world.”

Abrahamson, D., & Wilensky, U. (2004). ProbLab: A computer-supported unit in probability and statistics. In M. J. Høines & A. B. Fuglestad (Eds.), Proceedings of the 28th Annual Meeting of the International Group for the Psychology of Mathematics Education (Vol. 1, pp. 369). Bergen, Norway: PME.

ABSTRACT: ProbLab is a computer-based middle-school curricular unit in probability and statistics designed to enrich student thinking in the domain. The ProbLab unit is part of the Connected Probability project (Wilensky, 1997) and includes a suite of interactive models authored in the NetLogo (Wilensky, 1999) modeling-and-simulation environment and using the HubNet Participatory Simulation technological infrastructure (Wilensky & Stroup, 1999). ProbLab’s design rationale and interactive materials reflect our view of the domain as constituted on three interrelated pillars: theoretical probability, empirical probability, and statistics. Students explore connections between these pillars by constructing and experimenting with domain bridging tools (Abrahamson, 2004), such as the 9-block, a 3-by-3 array of squares, each of which is either green or blue. A 9-block is at once one of all 512 permutations in its combinatorial sample space (theoretical prob.), a randomly generated compound event (empirical prob.), and a sample out of a population of squares (statistics).


ProbLab materials and activities for the Grade 6 unit. From left: One of 512 9-blocks; 6th-grade students create and assemble the combinatorial space comprising 512 unique elements; the resultant Combinations Tower, consisting of ten columns (1, 9, 36, 84, 126, 126, 84, 36, 9, 1) — the CT stretches from the floor to the ceiling; a NetLogo empirical experiment that dynamically builds frequency distributions of randomly generated 9-blocks; on his laptop, a student participating in the S.A.M.P.L.E.R. participatory simulation activity takes 9-block and 1-block samples from a hidden population.


Abrahamson, D., Berland, M. W., Shapiro, R. B., Unterman, J. W., & Wilensky, U. (2004). Leveraging epistemological diversity through computer-based argumentation in the domain of probability. In Y. B. Kafai, W. A. Sandoval, N. Enyedy, A. S. Nixon, & F. Herrera (Eds.), Proceedings of The Sixth International Conference of the Learning Sciences (pp. 28-35). Mahwah NJ: Lawrence Erlbaum Associates.

ABSTRACT:  The paper is a case study of technology-facilitated argumentation. Several graduate students, the first four authors, present and negotiate complementary interpretations of a diagram generated in a computer-simulated stochastic experiment. Individuals use informal visual metaphors, programming, and formal mathematical analysis to ground the diagram, i.e., to achieve a sense of proof, connection, and understanding. The NetLogo modeling-and-simulation environment (Wilensky, 1999) serves to structure the authors’ grounding, appropriating, and presenting of a complex mathematical construct. We demonstrate individuals’ implicitly diverse explanatory mechanisms for a shared experience. We show that this epistemological diversity, sometimes thought to undermine learning experiences, can, given appropriate learning environments and technological fluency, foster deeper understanding of mathematics and science.

[see also 2006 FLM version]


A ProbLab simulation of a geometrical probability experiment

The solution lies in geometrical analysis.

The ProbLab model “Equidistant Probability” runs a Monte Carlo solution to a geometrical problem.
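The Monte Carlo technique the model uses can be sketched on a simpler geometrical question (the actual "Equidistant Probability" problem is not reproduced here; the question below is a hypothetical stand-in):

```python
import random

# Generic sketch of the Monte Carlo technique, NOT the actual ProbLab model:
# what fraction of uniformly random points in the unit square lies within
# distance 0.5 of the center? (Exact answer: pi / 4, about 0.785.)
random.seed(3)
trials = 100_000
hits = sum(
    (random.random() - 0.5) ** 2 + (random.random() - 0.5) ** 2 <= 0.25
    for _ in range(trials)
)
print(hits / trials)  # approaches pi / 4 as trials grow
```

The same logic drives the model: sample the geometry at random, tally hits, and watch the empirical proportion converge on the analytic answer.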


Abrahamson, D., & Wilensky, U. (2003). The quest of the bell curve: A constructionist approach to learning statistics through designing computer-based probability experiments. In M. A. Mariotti (Ed.), Proceedings of the Third Conference of the European Society for Research in Mathematics Education. Pisa, Italy: University of Pisa.

ABSTRACT: This paper introduces the rationale, explains the functioning, and describes the process of developing ‘Equidistant Probability’, a NetLogo microworld that models stochastic behavior. In particular, we detail the phases in attempting to choose suitable parameters and create such graph displays as will permit an observer to witness the incremental growth of a bell-shaped curve. We argue that the process of building the model, and in particular the accountability, motivation, and frustration experienced, were conducive to ‘connected learning’ (Wilensky, 1993), through which the design of this microworld is grounded. The microworld is part of “ProbLab,” a suite of Probability-and-Statistics models, which in turn is part of “Understanding Complexity,” a middle-school curriculum, currently in development.


The marble scooper was designed by Dor Abrahamson and engineered and 3D-printed by Paulo Blikstein, when Dor and Paulo were at Northwestern University, around 2003. Here are simulations of 4-block and 9-block scoopers created during the process. The concavities are deep enough to capture marbles while scooping through a bin full of marbles, but not so deep that the marbles get stuck. The depth of the concavities also permits a teacher to hold the scooper up, at a near-vertical inclination, for a classroom to see the outcome. The spacing of the concavities allows for drainage of excess marbles, once the scoop is completed. The front of the scooper — the “cowcatcher” — facilitates penetrating among the dense pack of marbles in the bin. The handle is crafted for strength to resist the torque of scooping through the marbles.

The marbles-scooper activity has been adopted by Marja van den Heuvel-Panhuizen of the Hans Freudenthal Institute at Utrecht University.