Friday, March 4, 2011

Of numbers and words

Kari posted an interesting article on her reader feed a few weeks ago.

New Scientist article: Without language, numbers make no sense

PNAS referenced paper: Full paper available here

I urge you to read the New Scientist article (it's really short), and if you're further interested, read the paper, although it's not really necessary for my argument below. The authors' basic assertion (if you didn't read the paper) is as follows (from the abstract):

Here we examine the numerical abilities of individuals who lack conventional language for number (deaf individuals who do not have access to a usable model for language, spoken or signed) but who live in a numerate culture (Nicaragua) and thus have access to other aspects of culture that might foster the development of number. These deaf individuals develop their own gestures, called homesigns, to communicate. We show that homesigners use gestures to communicate about number. However, they do not consistently extend the correct number of fingers when communicating about sets greater than three, nor do they always correctly match the number of items in one set to a target set when that target set is greater than three. Thus, even when integrated into a numerate society, individuals who lack input from a conventional language do not spontaneously develop representations of large exact numerosities.

--

This assertion really threw me for a bit of a loop, cuz I like to think that numbers are an entity independent of language, and as fundamental to cognition, communication and understanding as language is. I argue that just because the "homesigners" do not have our number system does not exclude the possibility that they have developed their own number system. This would be consistent with, and would explain, the fact that they never came up with a matching vocabulary to associate with our number system. Simply because they do not adhere to the numerate culture of Nicaragua (and ours), it does not follow that they have no understanding of numbers. I argue that their language deficits sufficiently removed them from their surrounding society and forced them to come up with their own system of numbers and counting. I further argue that their number system may be far more fundamental to our understanding of cognition and thinking than our base-10 counting system. Perhaps this is the argument of the authors, but if it is, it is stated from a very base-10-culture point of view, and does not allow for, account for, or suggest additional possible innate number systems that are different from ours.

One excellent piece of evidence is the authors' own words. "They're not wildly off," says Spaepen. "They can approximate quantities, but they don't have a way of getting to the exact number."


Yes, OF COURSE you need to have a word for the number to IDENTIFY the exact number! That is clear and unarguable. The fact that they do not have a concept of an "exact number" within their language does not mean that our number system is based on language. What I assert is that it is one thing to state that words are required to identify a number, and totally another thing (and a false extrapolation) to assert that, without being able to actually identify the number, you don't have an understanding of numbers.

Just as one would struggle to identify a "house" or a "tree" without the words for them, identifying exact numbers would be difficult for the homesigners without the vocabulary.* However, this does not automatically equate to a lack of understanding of what a "house" or a "tree" is. Everything has a name, and not knowing the name doesn't necessarily mean you don't understand the concept behind it. Instead of precisely identifying the number, they dance around the idea a bit, in an attempt to articulate it through translation into their own language. Presumably, they never found the need to develop this vocabulary for exact numbers. This assertion is also supported by Spaepen's findings:


*There is an absolutely astounding Radiolab audio episode discussing this very issue. I won't go into details here as this blog post is absurdly long already, but there was this 27-year-old dude who was born deaf and never developed ANY language. And in a totally weird way. He didn't realize that when people were trying to talk to him (sign language included), they were trying to COMMUNICATE with him. He had NO CONCEPT of language. He had no idea there was even SOUND! Until one day this woman taught him... THAT EVERYTHING HAS A NAME!!! "Table", "Chair", "House!!!"... He had no concept of "things" connected with "identification". I know. I want to continue this story too...

To be fair, Spaepen offers some very realistic alternative explanations to the conclusion that the New Scientist article draws:

(1) Homesigners may lack stable summary symbols for each integer—symbols that stand for the cardinal value of the entire set, not just individuals within a set (fingers are easily, perhaps too easily, mapped onto individuals).

(2) They may also lack the principle of the successor function—that each natural number n has a successor that is exactly n + 1. That is, homesigners’ number gestures are not embedded in a count routine.

(3) In addition, homesigners may fail to appreciate that one-to-one correspondence guarantees numerical equivalence. That is, homesigners’ number gestures are not used as a tally system. With respect to why, several possibilities remain open. Exposure to a linguistic system that contains a count routine may be essential to develop representations of exact number (4, 6).

--

I totally agree with those possibilities. I also think it's bullshit and irresponsible of New Scientist not to address these possibilities in its article summary.
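
(If you want possibilities (2) and (3) made concrete, here's a minimal Python sketch of my own. It's nothing from the paper, just an illustration of what a successor-based count routine and one-to-one-correspondence tallying actually are.)

```python
def successor(n):
    """The successor function: the next natural number after n."""
    return n + 1

def count(items):
    """A 'count routine': walk the set, applying the successor function to a running tally."""
    tally = 0
    for _ in items:
        tally = successor(tally)
    return tally

def same_number(set_a, set_b):
    """One-to-one correspondence used as a tally system: pair items off until
    one (or both) collections run out. No number words needed."""
    a, b = list(set_a), list(set_b)
    while a and b:
        a.pop()
        b.pop()
    return not a and not b  # equal exactly when both run out together

print(count(["duck"] * 8))               # 8
print(same_number(range(5), ["x"] * 5))  # True
print(same_number(range(5), ["x"] * 6))  # False
```

The point of same_number() is that you can check "same number of things" purely by pairing items off, with no number words at all, which is exactly the trick the homesigners reportedly don't exploit.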

--

So, what does Spaepen conclude?

A cultural context in which exact number representations are valued, and a social context in which one’s communicative partners share a counting routine and an associated system of exact number concepts, are not enough to scaffold the creation of a count routine or representations of exact number that are flexible and generalize across domains.

--

I am in total agreement with that conclusion. Basically, given two languages and two potential number systems, the number systems have the potential to stay independent of one another. So, what this boils down to is that fucking New Scientist is actually the culprit here (or just a very badly written abstract), and specifically the New Scientist article's title:

Without language, numbers make no sense


is wrong. A better title would say that With OR without language, numbers make absolutely perfect sense to those who use that number system, and that number systems aren't directly related to language in any fundamental sense, as evidenced by the transfer, or lack thereof, of counting systems between languages. Basically, their title is shit, and mine isn't much better; however, mine is actually relevant to the paper we're both trying to describe. Further, their title draws a totally different, even opposite, conclusion from the paper.

I suppose, to fix all this, the title should be: Without language, our numbers don't make sense. Which makes sense!

--

So, now that we've got that New Scientist bullshit all cleared up, let's delve further into this counting and number system business, shall we? Let's go back to the idea of independent number systems introduced above, and explore what the homesigners might have in mind, so to speak.

As with most of my knowledge, the discussion below comes from an excellent Radiolab episode about numbers and their innate calibration; I'm basically borrowing what they talk about.

So, when discussing innate traits, we need to look at babies and young children. And we need to set up experiments to be able to observe, without influencing, their ability to count, and more specifically, to observe and not influence their number system.

Currently, Stanislas Dehaene is one of the leading researchers in the field of number sense, and he specifically focuses on babies and the number sense that they possess.


For a long time people thought that we came into the world with no innate sense of numbers: we arrived empty, with no concept of numbers and numbering, and developed an understanding of numbers only after we were taught them. Thanks to Dehaene, we now know that this is completely wrong.

One of Dehaene's experiments goes as follows. He hooks up 2-3 month olds to a brain activity monitor and sits them in front of a computer screen. He then passes a series of 8 ducks across the screen. The children not only react visibly to the ducks, but this is also confirmed by stimulated brain activity. So Dehaene continues: he shows 8 more ducks on the screen, then 8 more, then 8 more, and so on. Eventually, habituation occurs, and the children's brain activity and visible interest decrease.



Then he passes 8 trucks across the screen. Once again brain activity increases with the new stimulus. Then the pattern is repeated again... 8 trucks, 8 trucks, 8 trucks... until habituation once again occurs and brain stimulation decreases.

--

Now, once again he passes the 8 ducks on the screen. 8 ducks, 8 ducks, 8 ducks... Once again, initial brain stimulation, and then subsequent habituation occurs. This is where the experiment changes.

He puts 16 ducks on the screen.

What do you think happens in the brains of these 2-3 month old children?

Yup, brain stimulation. BUT, brain stimulation in a DIFFERENT AREA of the brain from where the stimulation originally occurred when the 8 trucks first came on the screen.

So, through subsequent experiments, Dehaene finds a pattern. Babies are REALLY GOOD at noticing the difference between 8 and 16 ducks, or 10 and 20 ducks. BUT, if shown 10 and then 11 ducks, there is no renewed brain stimulation. He found that babies are really good at recognizing large relative differences in quantity. It's basically a significantly different pattern that they are recognizing, and really, this shouldn't be that surprising. Babies aren't exactly "detail oriented", and this should not come as a shock.
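
If you want a toy model of that pattern (my own sketch, not Dehaene's actual model or data), the usual intuition is that whether two quantities "feel" different depends on their ratio, not on their absolute difference:

```python
def feels_different(a, b, threshold=1.5):
    """A toy 'approximate number sense': two quantities register as different
    only when their ratio is big enough (equivalently, when they're far apart
    on a log scale). The 1.5 threshold is made up purely for illustration."""
    return max(a, b) / min(a, b) >= threshold

print(feels_different(8, 16))   # True  -- 16 ducks after 8 ducks gets noticed
print(feels_different(10, 20))  # True
print(feels_different(10, 11))  # False -- too close in ratio to register
```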

However, this is where it gets interesting.

Dehaene did find a pattern in baby number recognition. Babies seem to experience these patterns, these numbers, these quantities, logarithmically. They notice number changes on a logarithmic scale. Babies seem, plainly, to be able to innately count on a logarithmic scale.

So, a little background on logarithmic numbers. Think about the distance between the numbers 1 and 2. It's 1. Now, think about the distance between 9 and 10. It's also 1. However, on a logarithmic scale, the step from 1 to 2 is a DOUBLING! 2 is TWICE as big as 1! This obviously does not hold for the step from 9 to 10.

We think about the distance between these numbers in "discrete ordered chunks". They are integers. However, logarithmically speaking (and thinking like a baby), the distance between 1 and 2 is huge, and is equivalent to the distance between 9 and 18, not 9 and 10. So, logarithmic thinking works in ratios, not in differences.
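
To put actual numbers on that (my arithmetic, not Radiolab's): measured with a logarithm, the step from 1 to 2 is exactly the same size as the step from 9 to 18, while the step from 9 to 10 is tiny:

```python
import math

# The "ruler" distances are identical...
print(2 - 1, 10 - 9)               # 1 1

# ...but the logarithmic distances are not.
print(math.log(2) - math.log(1))   # 0.693  -- 1 -> 2 is a doubling
print(math.log(10) - math.log(9))  # 0.105  -- 9 -> 10 barely moves
print(math.log(18) - math.log(9))  # 0.693  -- 9 -> 18 is the same jump as 1 -> 2
```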

--

So, why do we as adults think in terms of discrete packages when, as children, we seem to come equipped with this evolutionarily developed, innate logarithmic counting system?

Do we just naturally develop this learned discrete numbers counting system? Well, not exactly, as it turns out.

--

Dehaene argues that if left to our own devices, we'd never switch.

Dehaene's observations of Amazonian tribes confirmed that some do not have discrete number counting systems, and continue to count on a logarithmic scale! This is similar to the findings of the PNAS paper (via New Scientist) discussed at the beginning of this post.

Get this experiment.

On one side of a line you place 1 object. On the other side you place 9 objects. Dehaene then asked tribe members to tell him what number falls EXACTLY in between 1 and 9.

The answer to us is pretty obvious, right?

5.

But guess what they answered?

3.

So, ummm, why?

"This is a bit tricky, but the gist is if you're thinking in ratios (on a logarithmic scale), and you're starting at 1, and you multiply by 3, you get to 3, and then again, hey hey, you multiply by 3 again, you get to 9!

So, to them, those are equal jumps on either side.

3 is to 1 as 9 is to 3.

This jump simply feels intuitively like the middle to these people! AND, these are the numbers that we all naturally feel and have innately as children!!! ...or at least this is Dehaene's theoretical claim. Which is actually kinda AWESOME!
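
Said another way (my arithmetic, not Radiolab's): the "middle" on a ratio/logarithmic scale is the geometric mean, while our discrete-chunks middle is the arithmetic mean:

```python
import math

lo, hi = 1, 9

arithmetic_middle = (lo + hi) / 2      # the "discrete chunks" answer
geometric_middle = math.sqrt(lo * hi)  # the equal-ratio answer: 1 * 3 = 3, 3 * 3 = 9

print(arithmetic_middle)  # 5.0 -- our answer
print(geometric_middle)   # 3.0 -- their answer
```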

--

So what are the possible implications of this innate natural logarithmic number system we have as babies? Why do we think about numbers the way we do, rather than the way we innately do?? Are our discrete numbers just human constructions??

There does seem to be this human condition where we are more exact with our numbers, and this makes sense in our world. Evolutionarily, it would seem, we never needed to be exact with numbers, and therefore never developed a discrete number system. Current-day pressures in our society and culture, however, place an added burden on us. Just as with language, we are taught this "unnatural" number system. Pretty cool! So, to reiterate: given our current environment, we place a greater importance on discrete number systems, BUT that doesn't mean that the other one, namely the logarithmic number system, isn't innate in all of us. Or, to remove the double negative and follow the evidence: it appears that our innate number system may be logarithmic.

Does this logarithmic way of counting ring a bell with anyone else??:



http://learn.genetics.utah.edu/content/begin/cells/scale/


http://htwins.net/scale/

We innately count in powers of 10! We innately understand the size of everything based on the powers of 10! We just unlearn it!
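
Those scale-of-the-universe sliders work exactly that way. Here's a tiny sketch using rough, order-of-magnitude sizes (the meter values are my own ballpark figures, purely for illustration):

```python
import math

# Very rough, order-of-magnitude sizes in meters (ballpark values only).
sizes = {
    "water molecule": 3e-10,
    "red blood cell": 8e-6,
    "human": 1.7,
    "Mount Everest": 9e3,
    "Earth": 1.3e7,
}

for name, meters in sizes.items():
    print(f"{name:>15}: {meters:10.3g} m   log10 = {math.log10(meters):6.1f}")

# The meter values span roughly 17 orders of magnitude, but the log10 column
# climbs in comparable-feeling steps -- which is exactly how those scale pages feel.
```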

--

Over the first few years of child development, we gradually learn to "count". Susan Carey, Department of Psychology at Harvard University, has run experiments trying to figure out the way we learn how to count.

So, you take your two-year-old at home, and you throw a bunch of pennies in front of them. You then say to the child, "Can you give me 1 penny?" And the child will pick up one penny and give it to you.

Congratulations kid.

Now you ask, "Can I have two pennies?" Well, these Einsteins just pick up a handful of pennies and hand them to you.

And so they've handed you a bunch of pennies. In fact, it is argued that for about 9 months, the only number concepts they have are "1" and "more than 1". Then, over time, they start to understand 2. Then 3. Then 4, and so on. This gradual process usually takes about 3.5-4 years.
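
Here's a toy sketch of what that stage looks like (mine, not Carey's actual task or data): a "one-knower" gets "one" exactly right and treats everything else as "a handful":

```python
import random

def give_pennies(requested, knower_level, pile_size=20):
    """A toy 'knower-level' child: numbers up to knower_level are understood
    exactly; anything beyond that just means 'more than one', so the child
    grabs a random handful. (Illustrative only, not Carey's actual experiment.)"""
    if requested <= knower_level:
        return requested
    return random.randint(2, pile_size)  # a handful

print(give_pennies(1, knower_level=1))  # 1 -- nailed it
print(give_pennies(2, knower_level=1))  # some random handful, thanks Einstein
```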

So, after YEARS of people asking them, "Hey, do you even know how to COUNT?", they finally drop their innate "more than one" concept of numbers. This also tends to happen AFTER they actually learn the number vocabulary itself. Most of them will be able to count out the words from 1 to 10 WELL BEFORE they have any concept of what the hell they are actually talking about!

Stupid Kid.

And gradually, they develop this concept of integers... 6 is one greater than 5, and 10 is one greater than 9.

And this, Susan Carey argues, is a change from our innate number system. She further argues that no other animals have this. These exact numbers that we have are not actually innate. So, just as with language, and yet independently of language, we learn this integer system.*

*This is also fairly similar to the way we learn to connect concepts of vocabulary to language. Further discussed here. Which, thinking about it, may be an argument for the discrete number system being connected with language. However, as shown above and below, these things are developed independently. BUT, it is an interesting concept, and maybe I'll jump back into it later in the year. As in, how is "thinking"/"consciousness" connected with the integer number system and ultimately "language"? To seed this a bit: do babies, before they can count or before they develop language, have consciousness? Hmmm... spatial and mirror studies suggest that they don't. OK, back to this logarithmic number thing... for now.

--

What possible advantage would evolution gain by endowing us with a logarithmic counting system as opposed to a discrete numeric counting system? It may be interesting to look at how animals count. Let the Clever Hans discussion begin!

But seriously, there does seem to be evidence that animals count in logarithmic AND/OR discrete ways, depending on the animal. Some examples:

http://www.scientificamerican.com/article.cfm?id=how-animals-have-the-ability-to-count

Although Brannon feels that animals do not have a linguistic sense of numbers—they aren’t counting “one, two, three” in their heads—they can do a rough sort of math by summing sets of objects without actually using numbers, and she believes that ability is innate. Brannon thinks that it might have evolved from the need for territorial animals “to assess the different sizes of competing groups and for foraging animals to determine whether it is good to stay in one area given the amount of food retrieved versus the amount of time invested.”


--

Animals in general: http://www.apa.org/monitor/oct07/goforth.aspx

[M]any animals practice arithmetic, though in different, often less precise ways than humans, according to new research and summary data presented at APA's 2007 Annual Convention . . .


"As the difference between the sets gets bigger, performance increases," Beran said. "It's easier to tell the difference between six versus two than it is three versus two."

"These are mathematical abilities that appear to be part of a primitive mathematical toolkit for reasoning about numerical values," Cantlon said.

--

Of monkeys: http://www.livescience.com/2160-monkeys-math-humans.html

The researchers now want to learn more about what this primitive math system in monkeys is capable of "and whether it is the evolutionary basis of human mathematical thinking," Cantlon said. "We are also interested in whether this primitive mathematical system forms the basis of mathematical development in human children."


--

Dolphins: http://tursiops.org/modules.php?name=News&file=article&sid=1793

For a three-year study, two male Bottlenose dolphins, Talon and Rainbow, were presented with two blackboards containing differing and varying size dots.

The dolphins were then asked to choose the blackboard containing the "less" amount of dots and chose the correct blackboard eighty percent of the time.


 Too many men, Dolphins.

--

Potential conflicting study: http://www.livescience.com/7025-monkeys-babies-math.html

"We conclude that the babies are showing an internal representation of 'two-ness' or 'three-ness' that is separate from sensory modalities and, thus, reflects an abstract internal process," said Elizabeth Brannon of Duke University.

Previous work with monkeys yielded similar findings.

"These results support the idea that there is a shared system between preverbal infants and nonverbal animals for representing numbers," Brannon said.

--

Elephants count discrete numbers??: http://www.timesonline.co.uk/tol/news/world/asia/article4660924.ece

“I couldn’t believe it at first,” said Irie, “They could instantly compare numbers like six and five."

--

Why do elephants need to learn how to count???: http://www.guardian.co.uk/science/2008/aug/21/elephants.arithmetic

It is not obvious why elephants should need this mathematical faculty in the wild. "It really is tough to figure out why [elephants] would need to count," said Mya Thompson, an ecologist at Cornell University who studies elephants.

One possibility is that they use it to keep track of other members of their herd so that no individual is left behind. Asian elephants live in close-knit groups of six to eight. "You really don't want to lose your group members," she said.


--

Basically, this is all support for the idea that scientists should talk to, and read more from, scientists outside their field, but who knows. Also, there are undoubtedly standardization issues across these studies.

Anyway, it does seem that discrete number systems can arise from social and cultural groupings (elephants), but don't necessarily have to (humans).

I'm basically as confused as ever, but the implications of all this research are quite fun, and it all points to numbers and counting developing entirely independently of words and language, which is basically what I set out to show. So, awesome.

However, I'm still not sure... especially when consciousness is thrown into the mix. As in, is consciousness necessary for a discrete counting system? Forget it for now... I'm going to bed.
