The scientific evidence for dual n-back improving fluid intelligence is actually fairly weak as far as I can tell; see this critique and the other studies discussed there: http://www.gwern.net/DNB%20FAQ#criticism
My personal theory is that solving difficult math and programming problems is a better use of time than doing working memory training exercises. I'd guess that solving math and programming problems improves working memory at least as fast as playing 'brain games', with the additional benefit of improving one's math and programming skills.
Of course, this is just a guess of mine that is unsupported by any experimental evidence.
In this book http://www.amazon.com/gp/product/0141035501/ the author says that solving simple math problems (addition, subtraction, division) quickly is much more stimulating than doing complex math.
Seems very true in my opinion. I've always excelled at math, essentially have a bachelor's in mathematics, etc. My weakness has always been getting the details of the "basic" math correct.
Higher math is about logic, abstractions and applying them to solve problems. Basic math is about holding bits of information in your head and manipulating them accurately. I absolutely believe that basic math taxes your working memory system more than advanced math does.
I just skimmed Outliers by Malcolm Gladwell. One of the things the author discusses is differences in number systems across cultures. He argues (in fact quoting Stanislas Dehaene [1]) that in eastern languages number words are shorter and faster to pronounce.
Thus, you can hold more of them in memory at a time (short-term memory is very time-sensitive).
You could say that you don't really need to 'pronounce' things to perform mental operations, but they still have a sensory form (be it visual, auditory, ...), so the argument holds.
He also brings up the regularity of eastern number systems, which makes it much easier to do calculations in those languages, to the extent that it gives eastern children a real advantage in math, developmentally speaking.
Sounds very true in my experience. This is one of the reasons why naming (in programming and everything else) is so crucial: good naming directly influences how easily one can mentally manipulate the objects in question. This is another reason why mathematicians work so hard to create concise abstractions.
I'll have to take a look at that book, sounds fascinating. Thanks.
I don't have that book with me now, but he provides brain scans from the experiments he conducted. Most of the book is just exercises (math equations, Stroop color, and word recall).
I think he helped with Brain Age, which had a good reputation at first, and then a mediocre one for actually working. Despite that, after talking it over with my wife, we're of the opinion that fast simple math may be promising.
That's a brutal critique which comes close to a charge of scientific fraud. I'd be curious to see a response. If there isn't one, I know what to think.
Please let me know your suggestions on how to improve this. I need to make a few more changes for it to be playable on iPad, but for now it works with the keyboard.
Keep going with this! I was just playing multiplication asteroids, and there is a little lag between when I "die" and when the game over screen comes up, so I can get an extra 2 or 3 answers in. Also, would it be possible to show lives remaining in asteroids?
I'd make the instructions pop up at first or at least make the link a little more prominent. I struggled with how to play the color game at first and it took me a minute to find the instructions.
The games are fun, and I could see them being used for brain training or even just as teaching aids for middle-school-aged children.
Yes, showing the lives will be a small fix. As for the instructions popping up first, that's something I'd thought about, but now that you mention it too, I think that will be added next.
Superficially, the answer to this question is, obviously, yes. We reason spatially and with mental simulation, but we also reason by applying the linguistic conventions learned from our culture (e.g., "if this statement were false, then ...").
What most people seem to fail to realize is just how different people are from each other cognitively, and how many different ways there are of getting to a particular conclusion. "Intelligence" is like a country's GDP -- a complicated, non-stationary mess that, taken together and measured in a way our culture deems important, ends up representing our "Gross Cognitive Product".
So, dual n-back probably improves working memory in most people, which will probably improve their problem solving ability, most of the time. But there are, without a doubt, many other subtle, complicated, and idiosyncratic aspects of cognition that are likely to have far more dramatic effects if appropriately tweaked.
Their sidebar explaining IQ scores seems a little silly. I'd like to see a citation for the average scores of Nobel prize winners. Also, who went back in time to administer an intelligence test to Mozart?
IQ is just defined in terms of standard deviations. Having an IQ of 130 means you are 3 standard deviations smarter than average, i.e. in the 99.87th percentile. Presumably, they just put Mozart in the 99.xxxxxx percentile by guess and calculated the resulting IQ.
Of course you have to remember that different tests use different standard deviations. For example, Mensa here in Finland uses a test with standard deviation of 15, so 130 is "only" two standard deviations smarter than average. The point is, the scores are meaningless without knowing the standard deviation used.
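The conversion being argued about here is easy to sketch. Assuming scores are normally distributed (which is how IQ tests are normed), the percentile for a given score depends entirely on the standard deviation the test uses:

```python
import math

def iq_to_percentile(iq, mean=100.0, sd=15.0):
    """Convert an IQ score to a population percentile,
    assuming scores are normally distributed."""
    z = (iq - mean) / sd
    # Standard normal CDF expressed via the error function
    return 100.0 * 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# The same raw score lands at very different percentiles
# depending on the standard deviation the test uses:
print(round(iq_to_percentile(130, sd=15), 2))  # 97.72 (2 SD above the mean)
print(round(iq_to_percentile(130, sd=10), 2))  # 99.87 (3 SD above the mean)
```

Which is exactly the point: "130" on an SD-15 test and "130" on an SD-10 test are two different claims about rarity.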
IQ tests are normed for a standard deviation of 15 points, not 10. (And I'll bet you knew that but got distracted by the '3' in 130. The "D'oh! Effect" -- a side effect of our associative memory architectures.)
> Also, who went back in time to administer an intelligence test to Mozart?
A very astute question. IQ scores are specific to a particular test taken at a particular time; even under today's standard-score definition of IQ, test-takers taking several tests very close together in time can receive noticeably different scores.
(See the sortable table, with reference to the original publication, at this page anchor location on Wikipedia. The same table appears at the page anchor link below. ALL of the Wikipedia articles on human intelligence and IQ testing need a lot of updates, because they have been subject to frequent edit-warring, but that table is quite useful.)
Nobody has an IQ score from more than about a century ago. The current standard score definition of IQ, performance on a cognitive test with the population median set at 100 and performance two standard deviations above the median being called IQ 130, began with the Wechsler adult tests in the 1950s and spread to child testing by the 1970s, and is now pretty nearly universal. But even with that definitional issue kept straight, an individual person's IQ score can bounce up and down over time, and by any kind of testing theory we can never be completely sure of a person's "true" score, as any score on any occasion of testing is an estimate of the test-taker's behavior on other occasions or with other item content. Honest IQ test-givers report scores with an error band around the score, as has always been done, for example, by the psychologist who has tested my children for appropriate educational placement.
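The error band mentioned above comes straight out of classical test theory: the standard error of measurement is SEM = SD * sqrt(1 - reliability). A minimal sketch, assuming SD 15 and a (hypothetical but typical) reliability of 0.9:

```python
import math

def score_band(score, reliability, sd=15.0, z=1.96):
    """Return a 95% confidence band around an observed test score.

    SEM = sd * sqrt(1 - reliability) is the classical-test-theory
    standard error of measurement. The reliability figure is an
    assumption; ~0.9 is typical for a well-normed IQ test.
    """
    sem = sd * math.sqrt(1.0 - reliability)
    return (score - z * sem, score + z * sem)

low, high = score_band(120, reliability=0.9)
# An observed "IQ 120" is better reported as roughly 111 to 129.
```

With those numbers the SEM is about 4.7 points, so even an honest score is really a band nearly 20 points wide.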
The excellent book Terman's Kids: The Groundbreaking Study of How the Gifted Grow Up by Joel N. Shurkin gives the full backstory to the silly estimates you see of historical figures who lived before the era of IQ testing, which were mostly made up by Terman's collaborator Catharine Cox. Her procedure was justifiably laughed at by Shurkin as he described it: she counted the lines in biographical reference works on different historical figures and supposed that the people in history who got the most ink probably had the highest IQs. There were always plenty of anomalies in her results from the very beginning, and no one takes them seriously anymore.
Lately I've been thinking about the limitations that being human puts on a programmer. We work very hard (and should) to reduce cognitive load on ourselves through good development tools that act as crutches for memory: REPLs and good debuggers allow us to try something and see what happens, as opposed to simulating a multivariate operation in our heads. IntelliSense and easily available docs allow us to cheat a bit on learning (really, memorizing) APIs.
But what if we could do these things without relying on JIT computer aids? What if I could simulate more levels of abstraction in my head? The private dream of many a Lisper (this one, anyway), writing programs that write programs that write programs, would be a bit more attainable.
I've been using Anki with great success to learn APIs and keyboard shortcuts (Anki + emacs is a match made in heaven), but I despaired at my inability to hold the whole stack, from top to bottom, in my head at once.
So I posted a badly phrased question on Stack Exchange ("How can I increase the number of levels of abstraction I can reason about at once?") and kept Googling. Eventually I came across Jaeggi's research. It looks promising, but hasn't passed the wide-replication test yet. I'm glad this came up on HN, because I'm eager to see more research in the area and get some confirmation or refutation of the findings.
In the meantime, the premise of Jaeggi's conclusion raises two questions. If working memory can be trained, can it decay with disuse? In that case, are we, with our fancy debugging tools, mere shadows of the Real Programmers who used to walk the earth? The other question is this: if the brain is likened to a computer, working memory corresponds to RAM. If we are successful at training working memory and making people "smarter," will we in the future face a bottleneck of processing speed rather than space?
One thing I've noticed about Anki is that the pain of memorizing and retaining stuff has been significantly reduced. As such, I tend to be much more willing to "just memorize the whole thing" in a lot of cases. I guess a good analogy is the impact faster CPUs and more memory have had on programming: when they stop being the limiting factor, we start using languages adapted to us, rather than to them. Similarly, as memorization has become "cheaper" to me, I find myself making choices that involve more memorization. A few days ago I made a deck specifically for all the Gmail keyboard shortcuts. It would not normally be worth my time to commit those to memory, but because the cost has fallen so much, it didn't seem like a bad idea.
Personally, I think that attaining the state of flow is what makes programming enjoyable, and I haven't experienced it in many years. There are just too many APIs, too much poor documentation, too many bugs, and too many languages I have to switch between for me to ever get into the 'zone'.
Absolutely! I totally miss this from my college days: being able to crank out a project from start to finish the day it's due, staying in the zone for many hours straight. You could hold the entire set of necessary APIs in your head at once, so there was nothing separating your ideas from the editor. It's an experience of pure creation that you simply will never achieve these days, except in very short bursts.
I cope by taking every chance I get in my day job to experience these short bursts of pure creation. Do I need an interesting algorithm implemented? I'll hand-code it myself rather than spend an hour or so trying to find and shoehorn the "standard" implementation into my application. I love the rare occasions where I notice a, dare I say, "clever" solution* to a problem. Coming up with and implementing these is a joy. I admit this isn't always the best way to solve a problem from a "software engineering" standpoint. But it keeps me in the game.
*Not clever as in obscure, WTF-worthy; but clever as in elegant and expressive, albeit perhaps inaccessible to less skilled coders.
I did some moderately serious dual n-backing a year or so back in an effort to improve my chess game. Of course it wasn't a remotely scientific experiment so I can't tell if it really helped, but I did feel that my ability to concentrate increased, if not my ability to reason. I felt more able to take a breath after working out a variation and take a few seconds to really concentrate on the final (imagined) position and scrutinize it for tricks.
(My USCF rating did go up fairly significantly around then but it's no proof of any causation, plus I was doing plenty of other things at the same time to increase my chess results anyway.)
I feel the same way while learning a new technique on an instrument. It's having that focus and attention to detail that you can draw upon in your work when you need it.
It's interesting to think about what intelligence really means. In some ways, this has been frequently discussed (i.e., multiple types of intelligence), and in other ways it seems that it has never been discussed (i.e., what are we actually talking about when we call someone intelligent?). Is intelligence the ability to learn something quickly? Is it the ability to understand something quickly? Is it the ability to solve a particular type of problem quickly?
Depending on what we mean, "Can you make yourself smarter?" has fairly obvious answers.
Start by looking at children. In one regard, we rarely learn faster than when we are kids. Everything is foreign to us, and we are constantly learning, our brains little sponges in a wet world. But clearly, our 25-year-old selves can solve far more complex problems than our 6-year-old selves. Did we get smarter between 6 and 25?
In the same vein, think about how severely retarded people are described: "He has the mind of a 4 year old." Whether that description is medically accurate or not isn't the point. We certainly think of children as intellectually inferior, even though all of our brains started out that way.
So what changed? Why is a 25 year old "more intelligent" than a 6 year old? Is it the creation of new neural pathways? Is it simply the way they've learned to look at the world or quickly apply answers and processes they already know to fit new problems?
Maybe you can provide more answers? Because it seems to me that the fact that we got to where we are today indicates that you can absolutely make yourself more intelligent, depending on how you define that. But I'm open to objections.
Yes, you certainly did get smarter between 6 and 25. Children's brains are still developing until late puberty. All sorts of cognitive tasks show development, including short-term memory, working memory, etc. It's absolutely not just factual learning that is happening.
Agreed (and studies show that brains are developing until at least 18, possibly into the early 20s).
But why? Most people seem to think that this rapid development is, for the most part, genetically driven. Doesn't it seem strange to reach this conclusion when we don't even have a solid idea about what "intelligence" really means (other than "performs above X level on some cognitive test")?
For the same reason that people grow taller until their late teens and get stronger until their early 20s.
>Most people seem to think that this rapid development is, for the most part, genetically driven.
It is. Depending on which studies you cherry-pick, the heritability of intelligence is somewhere between 0.5 and 0.8. The best way to be smart is to choose your parents.
>Doesn't it seem strange to reach this conclusion when we don't even have a solid idea about what "intelligence" really means (other than "performs above X level on some cognitive test")?
People can argue about the definition of strength just as easily. Who is the strongest person in the world? Is it whoever can bench-press the most weight? What about leg press? Clean-and-jerk? Maybe some average of these measures? Maybe we want to factor the person's weight in as well. While "strength" doesn't always have a precise definition, it's usually pretty easy for us to tell weak people from strong ones. It's the same when people talk about intelligence. The precise definition varies, but there are lots of correlating ways to measure it.
This reminds me of an old New York Times article that categorized people into two categories:
1. Those who believe that intelligence is fixed from birth, and
2. Those who believe that intelligence can be improved
The finding was that folks in category (1) tended to be fearful of being wrong, and had trouble succeeding in life whereas the folks in category (2) felt it was OK or good to make mistakes, and tended to be more successful.
When learning to program, my first major project was a feature-loaded N-back game in C#/XNA. I crammed in every option I could think of. The game and code are available at http://workingmemoryworkout.codeplex.com, and it's worth noting that the inspiration came from Brain Workshop (http://brainworkshop.sourceforge.net/).
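The core mechanic of an n-back game is simpler than all the options suggest: check whether the current stimulus matches the one from n steps back, then score the player's "match" presses against the true matches. A minimal sketch in Python (not the actual C#/XNA or Brain Workshop code):

```python
def nback_matches(stimuli, n):
    """Return the indices where the stimulus equals the one n steps back."""
    return [i for i in range(n, len(stimuli)) if stimuli[i] == stimuli[i - n]]

def score_round(stimuli, n, responses):
    """Score a round. `responses` is the set of indices where the player
    pressed 'match'. Returns (hits, misses, false_alarms)."""
    matches = set(nback_matches(stimuli, n))
    hits = len(matches & responses)
    misses = len(matches - responses)
    false_alarms = len(responses - matches)
    return hits, misses, false_alarms

# 2-back over a letter stream: positions 2, 3, and 5 are matches.
print(nback_matches(list("ABABCB"), 2))  # [2, 3, 5]
```

In a dual n-back, the same check simply runs on two independent streams (position and audio) at once.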
Another generic "intelligence" booster, for certain definitions of intelligence, is learning about cognitive biases and certain reasoning skills - "rationality". When I read the Sequences (http://wiki.lesswrong.com/wiki/Sequences) on Less Wrong (http://lesswrong.com/), I think my reasoning and analysis skills improved, and I was able to avoid some thinking mistakes, such as by training myself to be "fair" to all sides of an argument. I also found reading the Sequences fun; their subjects include interesting mental puzzles. Go check them out.
I'm probably just being overly critical here, but his example of "deciding if a number is odd or even in a matter of seconds" seems odd; I think second graders and up can tell you in half a second or less whether a number's even or odd.
He says:
"In addition to working memory, researchers are seeking to improve fluid intelligence by training other basic mental skills — perceptual speed (deciding, in a matter of seconds, whether a number is odd or even), visual tracking (on a shoot-’em-up computer game, for instance) or quickly switching between a variety of tasks."
> can tell you in half a second or less whether or not a number's even or odd.
Sure, but just because it takes half a second or less doesn't mean it's not reflecting mental performance. For example, testing reflexes takes even less time, down to tenths of a second, yet reflex time still correlates with IQ.
Working memory is convenient for mental dexterity. But it doesn't help you see deeper associations between the apparently unrelated, which is the true crux of genius and of all revolutionary insight and progress, IMHO. That comes from long-term associative memory, seeded by persevering immersion and experiment.
I don't know about the specifics in the article, but modern research is showing (has already shown, I think) that the brain has far more plasticity than was thought not long ago.
Traits (not specifically intelligence) that were thought to be invariable turn out not to be... the brain can learn.
While I can't read the article, my answer to this question has always been along the lines of: "Oh, I'm sure you can, but the problem is that, thanks to the Dunning-Kruger effect, dumb people won't feel that they need to put in the effort to get smarter."
So the smart get smarter and the dumb get dumber, so to speak.
Actually, there was a weird recent re-analysis of Jaeggi 2011 using the Big Five personality data they recorded for the subjects at the time, in which the subjects high in Conscientiousness (a rough synonym for hardworking and self-disciplined) benefited least in terms of far transfer/IQ even while their training scores improved the most: http://www.sciencedirect.com/science/article/pii/S0191886912...
Thanks for that article. In the abstract, they seem to state that they believe the conscientious didn't improve as much because they developed specific strategies for the particular game. So, they basically gamed the system instead of improving organically. It looks like you could vary the game rules and objectives to keep the conscientious from developing those strategies.