# New Scientist article: Without language, numbers make no sense

PNAS referenced paper: Full paper available here. I urge you to read the New Scientist article (it's really short), and if you're further interested, read the paper, although it's not really necessary for my argument below. The authors' basic assertion (if you didn't read the paper) is as follows (from the abstract):

Here we examine the numerical abilities of individuals who lack conventional language for number (deaf individuals who do not have access to a usable model for language, spoken or signed) but who live in a numerate culture (Nicaragua) and thus have access to other aspects of culture that might foster the development of number. These deaf individuals develop their own gestures, called homesigns, to communicate. We show that homesigners use gestures to communicate about number. However, they do not consistently extend the correct number of fingers when communicating about sets greater than three, nor do they always correctly match the number of items in one set to a target set when that target set is greater than three. Thus, even when integrated into a numerate society, individuals who lack input from a conventional language do not spontaneously develop representations of large exact numerosities.

--

This assertion really threw me for a bit of a loop, cuz I like to think that numbers are an entity independent of language, and as fundamental to cognition, communication, and understanding as language is. I argue that just because the "homesigners" do not have our number system does not exclude the possibility that they have developed their own number system. This would be consistent with, and would explain, the fact that they never came up with a matching vocabulary to associate with our number system. Simply because they do not adhere to the numerate culture of Nicaragua (and ours), it does not follow that they do not have an understanding of numbers. I argue that their language deficits sufficiently removed them from their surrounding society and forced them to come up with their own system of numbers and counting. I further argue that their number system may be far more fundamental to our understanding of cognition and thinking than our base-10 counting system. Perhaps this is the argument of the authors, but if it is, it is stated from a very base-10-culture-based view, and does not allow for, account for, or suggest additional possible innate number systems different from ours.

One excellent piece of evidence is the authors' own words: "They're not wildly off," says Spaepen. "They can approximate quantities, but they don't have a way of getting to the exact number."

**The fact that they do not have a concept of an "exact number" within their language does not mean that our number system is based on language.** What I assert is that it is one thing to state that words are required to identify a number, and another thing entirely (and a false extrapolation) to assert that without being able to actually identify the number, you don't have an understanding of numbers.

Just as one would struggle to identify a "house" or a "tree" without the words for it, identifying exact numbers would be difficult for the homesigners without the vocabulary.* However, this does not automatically equate to a lack of understanding of what a "house" or a "tree" is. Everything has a name, and not knowing the name doesn't necessarily mean you don't understand the concept behind it. Instead of precisely identifying the number, they dance around the idea a bit, attempting to articulate it through translation into their own language. Presumably, they never found the need to develop this vocabulary for exact numbers. This assertion is also supported by Spaepen's findings:


*There is an absolutely astounding Radiolab audio episode discussing this very issue. I won't go into details here as this blog post is absurdly long already, but there was this 27-year-old dude who was born deaf, and never developed ANY language. And in a totally weird way. He didn't realize that when people were trying to talk to him (sign language included), they were trying to COMMUNICATE with him. He had NO CONCEPT of language. He had no idea there was even SOUND! Until one day this woman taught him... THAT EVERYTHING HAS A NAME!!! "Table", "Chair", "House!!!"... He had no concept of "things" connected with "identification". I know. I want to continue this story too...*

To be fair, Spaepen offers some very realistic alternative explanations to the conclusion that the New Scientist article draws:

(1) Homesigners may lack stable summary symbols for each integer—symbols that stand for the cardinal value of the entire set, not just individuals within a set (fingers are easily, perhaps too easily, mapped onto individuals).

(2) They may also lack the principle of the successor function—that each natural number n has a successor that is exactly n + 1. That is, homesigners’ number gestures are not embedded in a count routine.

(3) In addition, homesigners may fail to appreciate that one-to-one correspondence guarantees numerical equivalence. That is, homesigners’ number gestures are not used as a tally system. With respect to why, several possibilities remain open. Exposure to a linguistic system that contains a count routine may be essential to develop representations of exact number (4, 6).
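The second and third possibilities above are easy to make concrete. Here's a toy sketch (my own illustration, not from the paper) of the two ingredients homesigners may lack: a successor function, and the use of one-to-one correspondence as a tally. Note that the pairing-off trick establishes *exact* equivalence between two sets without ever naming a number.

```python
def same_count(set_a, set_b):
    """Tally-style one-to-one correspondence: pair items off, one from
    each side at a time. The sets match exactly iff nothing is left
    over on either side. No number words required."""
    a, b = list(set_a), list(set_b)
    while a and b:
        a.pop()
        b.pop()
    return not a and not b


def successor(n):
    """The successor principle: each natural number n has a successor
    that is exactly n + 1. This is what embeds number symbols in a
    count routine."""
    return n + 1


print(same_count(["duck"] * 8, ["truck"] * 8))  # True: exact match, no counting
print(same_count(["duck"] * 8, ["duck"] * 9))   # False: one duck left over
print(successor(7))                              # 8
```

The point of the sketch: exact number doesn't strictly require words, but it does require *some* routine like these, which is precisely what Spaepen suggests the homesigners' gestures lack.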

--

I totally agree with those possibilities. I also think it's bullshit and irresponsible of New Scientist not to address these possibilities in its article summary.

--

So, what does Spaepen conclude?

A cultural context in which exact number representations are valued, and a social context in which one’s communicative partners share a counting routine and an associated system of exact number concepts, are not enough to scaffold the creation of a count routine or representations of exact number that are flexible and generalize across domains.

--

I am in total agreement with that conclusion. Basically, given two languages and two potential number systems, the number systems have the potential to stay independent of one another. So what this boils down to is that New Scientist is the fucking culprit here (or it's just a very badly written abstract), and specifically the New Scientist article title:

# Without language, numbers make no sense

is wrong. The better title would be that

**With OR without language, numbers make absolutely perfect sense to those who use that number system, and number systems aren't directly related to language in any fundamental sense**, as evidenced by the transfer, or lack thereof, of counting systems between languages. Basically, their title is shit, and mine isn't much better; however, mine is actually relevant to the paper we're both trying to describe. Theirs, on the other hand, draws a totally different and opposite conclusion from the paper.

I suppose to fix all this, the title should be: Without language, *our* numbers don't make sense. Which, makes sense!

--

So, now that we've got that New Scientist bullshit all cleared up, let's delve further into this counting and number system, shall we? Let's go back to the ideas originally introduced with the number systems, and explore some idea of what the homesigners might have in mind, so to speak.

As with most of my knowledge, the discussion below comes from an excellent Radiolab episode about numbers and their innate calibration, where I basically borrow what they talk about.

So, when discussing innate traits, we need to look at babies and young children. And we need to set up experiments that let us **observe, without influencing,** their ability to count, and more specifically, **to observe and not influence** their number system.

Currently, Stanislas Dehaene is one of the leading researchers in the field of number sense, and he specifically focuses on babies and the number sense they possess.

For a long time, people thought that we came into the world with no innate sense of numbers — that we arrived empty, with no concept of numbers and numbering, and that we developed an understanding of numbers *only after* we were taught them. Thanks to Dehaene, we now know that this is completely wrong.

One of Dehaene's experiments is as follows. He hooks up 2-3 month olds to a brain activity monitor and sits them in front of a computer screen. He then passes a series of 8 ducks in front of them. The children not only react visibly to the ducks, but this is also confirmed by stimulated brain activity. So Dehaene continues to show 8 more ducks on the screen, then 8 more, then 8 more, and so on. Eventually, habituation occurs, and the children's brain activity and visible interest decrease.

Then he passes 8 trucks on the screen. Once again brain activity increases with the new stimuli. Then again the pattern is repeated... 8 trucks, 8 trucks, 8 trucks... until habituation once again occurs and brain stimulation decreases.

--

Now, once again he passes the 8 ducks on the screen. 8 ducks, 8 ducks, 8 ducks... Once again, initial brain stimulation, and then subsequent habituation occurs. This is where the experiment changes.

He puts 16 ducks on the screen.

What do you think happens in the brains of these 2-3 month old children?

Yup, brain stimulation. BUT, brain stimulation in a DIFFERENT AREA of the brain from where the stimulation originally occurred when the 8 trucks were on the screen.

So, through subsequent experiments, Dehaene finds a pattern. Babies are REALLY GOOD at noticing the difference between 8 and 16 ducks, or 10 and 20 ducks. BUT, if shown 10 and then 11 ducks, there is no subsequent brain stimulation. He found that babies are really good at recognizing large relative differences between numbers. It's basically a significantly different pattern that they are recognizing, and really, babies aren't exactly "detail oriented", so this shouldn't come as a shock.
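A quick toy sketch (my own illustration, not Dehaene's actual model) of why 8-vs-16 is easy while 10-vs-11 is hard: if discrimination depends on the *ratio* between two quantities rather than their absolute difference, a natural proxy is the distance between them on a log scale.

```python
import math


def log_distance(n1, n2):
    """Distance between two quantities on a logarithmic scale.
    Equal ratios give equal distances, regardless of absolute size."""
    return abs(math.log(n2) - math.log(n1))


# 8 vs 16 ducks: a ratio of 2, a big jump on the log scale -> easily noticed
print(log_distance(8, 16))   # ~0.693

# 10 vs 20 ducks: same ratio of 2, same log distance -> equally easy
print(log_distance(10, 20))  # ~0.693

# 10 vs 11 ducks: a ratio of only 1.1, a tiny log jump -> goes unnoticed
print(log_distance(10, 11))  # ~0.095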

However, this is where it gets interesting.

Dehaene did find a pattern in baby number recognition. Babies seem to experience these quantities logarithmically; they will notice number changes on a logarithmic scale.

**Babies seem, plainly, to be able to innately count within a logarithmic base.**

So, a little background on logarithmic numbers. Think about the distance between the numbers 1 and 2: it's 1. Now, think about the distance between 9 and 10: also 1. However, on a logarithmic scale, going from 1 to 2 is a DOUBLING! 2 is TWICE as big as 1! This obviously does not hold for the step from 9 to 10.

We think about the distance between these numbers in "discrete ordered chunks": they are integers. However, logarithmically speaking (and thinking like a baby), the distance between 1 and 2 is huge, and is equivalent to the distance between 9 and 18, not 9 and 10. So, logarithmic thinking works in ratios between numbers, not differences.
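You can verify the "1→2 is the same jump as 9→18" claim directly with logarithms (a sketch of my own, just to make the arithmetic concrete):

```python
import math

# On a log scale, 1 -> 2 and 9 -> 18 are the SAME jump (both doublings),
# while 9 -> 10 is a much smaller step.
jump_1_to_2 = math.log(2) - math.log(1)
jump_9_to_18 = math.log(18) - math.log(9)
jump_9_to_10 = math.log(10) - math.log(9)

print(jump_1_to_2)   # ~0.693
print(jump_9_to_18)  # ~0.693 (identical to 1 -> 2)
print(jump_9_to_10)  # ~0.105 (a far smaller jump)
```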

--

So, why do we as adults think in terms of discrete packages when as children we seem to have an evolutionarily developed, innate logarithmic counting system?

Do we just naturally develop this learned discrete numbers counting system? Well, not exactly, as it turns out.

--

Dehaene argues that if left to our own devices, we'd never switch.

Dehaene's observations of Amazonian tribes confirmed that some do not have discrete number counting systems, and continue to count on a logarithmic scale! This is similar to the findings of the PNAS paper discussed at the start of this post.

Get this experiment.

On one side of a line you place 1 object. On the other side you place 9 objects. Dehaene asked tribe members to tell him the EXACT number that falls in the middle between 1 and 9.

The answer to us is pretty obvious, right?

5.

But guess what they answered?

3.

So, ummm, why?

"This is a bit tricky, but the gist is if you're thinking in ratios (on a logarithmic scale), and you're starting at 1, and you multiply by 3, you get to 3, and then again, hey hey, you multiply by 3 again, you get to 9!

So, to them, those are equal jumps on either side.

3 is to 1 as 9 is to 3.

To these people, this jump intuitively feels like the middle! AND, these are the numbers that we all naturally feel and have innately as children!!! ... or at least, this is Dehaene's theoretical claim. Which is kinda actually AWESOME!
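The "logarithmic middle" the tribe members picked is just the geometric mean, as opposed to our arithmetic mean — a two-line check (my own illustration):

```python
import math

# Our answer: the arithmetic middle of 1 and 9
arithmetic_middle = (1 + 9) / 2       # additive halfway point

# Their answer: the geometric middle of 1 and 9
geometric_middle = math.sqrt(1 * 9)   # multiplicative halfway point

print(arithmetic_middle)  # 5.0 -- equal ADDITIVE jumps: 1 + 4 = 5, 5 + 4 = 9
print(geometric_middle)   # 3.0 -- equal MULTIPLICATIVE jumps: 1 * 3 = 3, 3 * 3 = 9
```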

--

So what are the possible implications of this innate natural logarithmic number system we have as babies? Why do we think about numbers the way we do rather than the way we innately perceive them?? Are discrete numbers just human constructions??

There does seem to be this human condition where we are more exact with our numbers, and this makes sense in our world. Evolutionarily, it would seem, we never needed to be exact with numbers, and therefore never developed a discrete number system. Current-day pressures in our society and culture, however, place an added burden on us: just like language, we are taught this "unnatural" number system. Pretty cool! So, to reiterate: given our current environment, we place a greater importance on discrete number systems, BUT that doesn't mean that the other one, namely the logarithmic number system, isn't innate in all of us. Or, to remove the double negative and follow the evidence: it appears that our innate number system may be logarithmic.

Does this logarithmic way of counting ring a bell with anyone else??:

http://learn.genetics.utah.edu/content/begin/cells/scale/