ideas

Does it seem plausible that education serves (in whole or part) as a signal of ability rather than simply a means to enhance productivity? (…)
Many MIT students will be hired by consulting firms that have no use for any of these skills. Why do these consulting firms recruit at MIT, not at Hampshire College, which produces many students with no engineering or computer science skills (let alone, knowledge of signaling models)?
Why did you choose MIT over your state university that probably costs one-third as much?
{ David Autor/MIT | PDF }
photo { Robert Frank }
economics, ideas, kids | March 30th, 2012 1:54 pm

IQ, whatever its flaws, appears to be a general factor: if you do well on one kind of IQ test, you will tend to do well on another, quite different, kind of IQ test. IQ also correlates well with many and varied real-world outcomes. But what about creativity? Is creativity general like IQ? Or is creativity more like expertise: a person can be an expert in one field, for example, but not in another. (…)
The fact that creativity can be stimulated by drugs and travel also suggests to me a general aspect. No one ever says that if you want to master calculus you should take a trip, but this does work if you are blocked on some types of creative projects.
{ Marginal Revolution | Continue reading }
illustration { Shag }
ideas, psychology | March 28th, 2012 12:19 pm

We are still living under the reign of logic: this, of course, is what I have been driving at. But in this day and age logical methods are applicable only to solving problems of secondary interest. The absolute rationalism that is still in vogue allows us to consider only facts relating directly to our experience. Logical ends, on the contrary, escape us. It is pointless to add that experience itself has found itself increasingly circumscribed. It paces back and forth in a cage from which it is more and more difficult to make it emerge. It too leans for support on what is most immediately expedient, and it is protected by the sentinels of common sense. Under the pretense of civilization and progress, we have managed to banish from the mind everything that may rightly or wrongly be termed superstition, or fancy; forbidden is any kind of search for truth which is not in conformance with accepted practices. It was, apparently, by pure chance that a part of our mental world which we pretended not to be concerned with any longer — and, in my opinion by far the most important part — has been brought back to light. For this we must give thanks to the discoveries of Sigmund Freud. On the basis of these discoveries a current of opinion is finally forming by means of which the human explorer will be able to carry his investigation much further, authorized as he will henceforth be not to confine himself solely to the most summary realities.
{ André Breton, Manifesto of Surrealism, 1924 | Continue reading }
Surrealism had the longest tenure of any avant-garde movement, and its members were arguably the most “political.” It emerged on the heels of World War I, when André Breton founded his first journal, Literature, and brought together a number of figures who had mostly come to know each other during the war years. They included Louis Aragon, Marc Chagall, Marcel Duchamp, Paul Eluard, Max Ernst, René Magritte, Francis Picabia, Pablo Picasso, Philippe Soupault, Yves Tanguy, and Tristan Tzara. Some were “absolute” surrealists and others were merely associated with the movement, which lasted into the 1950s. (…)
André Breton was its leading light, and he offered what might be termed the master narrative of the movement.
No other modernist trend had a theorist as intellectually sophisticated or an organizer quite as talented as Breton. No other was [as] international in its reach and as total in its confrontation with reality. No other [fused] psychoanalysis and proletarian revolution. No other was so blatant in its embrace of free association and “automatic writing.” No other would so use the audience to complete the work of art. There was no looking back to the past, as with the expressionists, and little of the macho rhetoric of the futurists. Surrealists prized individualism and rebellion—and no other movement would prove so commercially successful in promoting its luminaries. The surrealists wanted to change the world, and they did. At the same time, however, the world changed them. The question is whether their aesthetic outlook and cultural production were decisive in shaping their political worldview—or whether, beyond the inflated philosophical claims and ongoing esoteric qualifications, the connection between them is more indirect and elusive.
Surrealism was fueled by a romantic impulse. It emphasized the new against the dictates of tradition, the intensity of lived experience against passive contemplation, subjectivity against the consensually real, and the imagination against the instrumentally rational. Solidarity was understood as an inner bond with the oppressed.
{ Logos | Continue reading }
flashback, ideas, poetry | March 28th, 2012 6:12 am

People often compare education to exercise. If exercise builds physical muscles, then education builds “mental muscles.” If you take the analogy seriously, however, then you’d expect education to share both the virtues and the limitations of exercise. Most obviously: The benefits of exercise are fleeting. If you stop exercising, the payoff quickly evaporates. (…) Exercise physiologists call this detraining. As usual, there’s a big academic literature on it.
{ EconLib | Continue reading }
health, ideas, sport | March 27th, 2012 3:58 pm

“Eskimo has one hundred words for snow.” The Great Eskimo Vocabulary Hoax [PDF | Wikipedia] was demolished many years ago. (…)
People who proffer the factoid seem to think it shows that the lexical resources of a language reflect the environment in which its native speakers live. As an observation about language in general, it’s a fair point to make. Languages tend to have the words their users need and not to have words for things never used or encountered. But the Eskimo story actually says more than that. It tells us that a language and a culture are so closely bound together as to be one and the same thing. “Eskimo language” and the “snowbound world of the Eskimos” are mutually dependent things. That’s a very different proposition, and it lies at the heart of arguments about the translatability of different tongues.
Explorer-linguists observed quite correctly that the languages of peoples living in what were for them exotic locales had lots of words for exotic things, and supplied subtle distinctions among many different kinds of animals, plants, tools, and ritual objects. Accounts of so-called primitive languages generally consisted of word lists elicited from interpreters or from sessions of pointing and asking for names. But the languages of these remote cultures seemed deficient in words for “time,” “past,” “future,” “language,” “law,” “state,” “government,” “navy,” or “God.”
More particularly, the difficulty of expressing “abstract thought” of the Western kind in many Native American and African languages suggested that the capacity for abstraction was the key to the progress of the human mind… The “concrete languages” of the non-Western world were not just the reflection of the lower degree of civilization of the peoples who spoke them but the root cause of their backward state. By the dawn of the twentieth century, “too many concrete nouns” and “not enough abstractions” became the conventional qualities of “primitive” tongues.
That’s what people actually mean when they repeat the story about Eskimo words for snow. (…)
If you go into a Starbucks and ask for “coffee,” the barista most likely will give you a blank stare. To him the word means absolutely nothing. There are at least thirty-seven words for coffee in my local dialect of Coffeeshop Talk.
{ David Bellos/Big Think | Continue reading }
Linguistics | March 27th, 2012 12:56 pm

We need to understand why we often have trouble agreeing on what is true (what some have labeled science denialism). Social science has taught us that human cognition is innately, and inescapably, a process of interpreting the hard data about our world – its sights and sounds and smells and facts and ideas – through subjective affective filters that help us turn those facts into the judgments and choices and behaviors that help us survive. The brain’s imperative, after all, is not to reason. Its job is survival, and subjective cognitive biases and instincts have developed to help us make sense of information in the pursuit of safety, not so that we might come to know “THE universal absolute truth.” This subjective cognition is built-in, subconscious, beyond free will, and unavoidably leads to different interpretations of the same facts. (…)
Our subjective system of cognition can be dangerous. It can produce perceptions that conflict with the evidence, what I call The Perception Gap.
{ Big Think | Continue reading }
ideas, psychology | March 27th, 2012 7:54 am

Neuroscientists have long known where creative thinking takes place in the brain: mainly in the frontal lobe. The part of the brain behind our forehead is responsible for producing ideas that are not only original, rare and uncommon but also appropriate, and thus useful and adaptive. Against that background, findings that frontal lobe damage impairs creative cognition are not surprising.
The neuroscientists Shamay-Tsoory et al., however, looked at this relationship a little more closely. Lesions in the frontal lobe do not always entail negative consequences for creativity; in some cases, quite the contrary.
{ United Academics | Continue reading }
artwork { Femke Hiemstra }
brain, ideas | March 26th, 2012 1:12 pm

Today’s word: the business term for membership services (gyms, streaming music) being paid for but not used is “breakage.”
{ Sasha Frere-Jones }
photo { Nathaniel Ward }
Linguistics, photogs | March 26th, 2012 7:44 am

Researchers have established a direct link between the number of friends you have on Facebook and the degree to which you are a “socially disruptive” narcissist, confirming the conclusions of many social media skeptics.
People who scored highly on the Narcissistic Personality Inventory questionnaire had more friends on Facebook, tagged themselves more often and updated their newsfeeds more regularly.
The research comes amid growing evidence that young people are becoming increasingly narcissistic, and obsessed with self-image and shallow friendships.
A number of previous studies have linked narcissism with Facebook use, but this is some of the first evidence of a direct relationship between Facebook friends and the most “toxic” elements of narcissistic personality disorder.
{ Guardian | Continue reading }
photo { Leo Berne }
quote { Hamilton Nolan/Gawker }
gross, ideas, psychology, social networks, technology | March 25th, 2012 11:18 am

Defamiliarization or ostranenie is the artistic technique of forcing the audience to see common things in an unfamiliar or strange way, in order to enhance perception of the familiar.
{ Wikipedia | Continue reading }
photo { Philip-Lorca diCorcia }
Linguistics, ideas, photogs | March 23rd, 2012 12:13 pm

The problem with the web is that it largely began as a world separate from meatspace. Today, most people use their real names, but this wasn’t always the case. When I started going online in the mid-90s, no one even knew my gender. I preferred that, not because I was hiding, but because I feel very strongly that I should be judged by my thoughts, not who people assume I am by seeing I am a woman, by attaching a handful of preconceived notions to what I am saying because they see my photo and think I’m too young or too old or attractive or unattractive. (…)
There is nothing wrong with wishing that it were possible to compartmentalize your digital conversations in the same way you do your meatspace exchanges. Unfortunately for us, this is not the direction the web is going, which is why pseudonymous accounts and the networks that accept them are so very, very important.
{ AV Flox | Continue reading }
image { Mark Gmehling }
experience, ideas, technology | March 23rd, 2012 11:48 am

A Christian missionary sets out to convert a remote Amazonian tribe. He lives with them for years in primitive conditions, learns their extremely difficult language, risks his life battling malaria, giant anacondas, and sometimes the tribe itself. In a plot twist, instead of converting them he loses his faith, morphing from an evangelist trying to translate the Bible into an academic determined to understand the people he’s come to respect and love.
Along the way, the former missionary discovers that the language these people speak doesn’t follow one of the fundamental tenets of linguistics, a finding that would seem to turn the field on its head, undermine basic assumptions about how children learn to communicate, and dethrone the discipline’s long-reigning king, who also happens to be among the most well-known and influential intellectuals of the 20th century.
{ The Chronicle of Higher Education | Continue reading }
image { The Connected Poster }
Linguistics | March 23rd, 2012 11:35 am

Philosophically, a realist is someone who holds that our theories are descriptions of how the world really is. Yet realist explanations of the behaviour of elementary particles face a fundamental challenge. Although electrons, for example, would seem to be simple entities – they have no internal structure – a satisfactory description of their behaviour in classical Newtonian terms, as if they were little balls, proved impossible. Theorists therefore turned to building mathematical models which could predict electron behaviour rather than explain electrons in realist terms.
Instrumentalism is the view that theories are useful for explaining and predicting phenomena, rather than that they necessarily describe the world. Yet the mathematical apparatus underpinning such instrumentalism, such as Dirac’s use of infinite-dimensional Hilbert [abstract mathematical] spaces and Feynman’s sums-over-all-possible-histories, is so powerful, and so beautiful, that the lack of a convincing realist alternative has so far not proved to be a significant handicap in the development of quantum physics and its technological applications. In fact, quantum theory in its instrumental form has provided the most successful explanation of all time, by being the most powerful of all scientific theories. But Deutsch wants a realist quantum theory.
{ Philosophy Now | Continue reading }
artwork { Olaf Brzeski, Dream, Spontaneous Combustion, 2008 }
ideas, theory | March 23rd, 2012 11:15 am

Codex Seraphinianus, originally published in 1981, is an illustrated encyclopedia of an imaginary world, created by the Italian artist, architect and industrial designer Luigi Serafini over thirty months, from 1976 to 1978. The book is approximately 360 pages long (depending on edition) and written in a strange, generally unintelligible alphabet. (…)
The book is an encyclopedia in manuscript with copious hand-drawn colored-pencil illustrations of bizarre and fantastical flora, fauna, anatomies, fashions, and foods. It has been compared to the Voynich manuscript, “Tlön, Uqbar, Orbis Tertius”, and the works of M.C. Escher and Hieronymus Bosch. (…)
In a talk at the Oxford University Society of Bibliophiles held on 12 May 2009, Serafini stated that there is no meaning hidden behind the script of the Codex, which is asemic; that his own experience in writing it was closely similar to automatic writing; and that what he wanted his alphabet to convey to the reader is the sensation that children feel in front of books they cannot yet understand, although they see that their writing does make sense for grown-ups.
{ Wikipedia | Continue reading | Thanks to Adam John Williams }
books, visual design | March 23rd, 2012 10:05 am

Love is an alien invasion coordinated with sleeper cell revolts. Someone penetrates you and leaves behind a colony that allows the monster inside them to ventriloquize your thoughts. It’s the opportunity that that part of us which cries out to be dominated and longs to be victimized has been waiting for ever since we were born. Everyone knows the best scene in The Manchurian Candidate is the one where Frank Sinatra and Janet Leigh meet on the train. It captures the indistinguishability of love from brainwashing, even and especially at its inception. Love is a cancer. You can’t just cut it out. You have to poison your whole body to beat it, kill yourself just enough to keep on living. Love is White Power. Love is Vichy France.
– Rick Santorum, National Association of Women Against Women, inaugural address
{ If you can read this you’re lying | Continue reading }
painting { Jules Lefebvre, Odalisque, 1874 }
ideas, relationships | March 21st, 2012 2:58 pm

A team of physicists published a paper drawing on Google’s massive collection of scanned books. They claim to have identified universal laws governing the birth, life course and death of words.
Published in Science, that paper gave the best-yet estimate of the true number of words in English—a million, far more than any dictionary has recorded (the 2002 Webster’s Third New International Dictionary has 348,000). More than half of the language, the authors wrote, is “dark matter” that has evaded standard dictionaries.
The paper also tracked word usage through time (each year, for instance, 1% of the world’s English-speaking population switches from “sneaked” to “snuck”). It also showed that we seem to be putting history behind us more quickly, judging by the speed with which terms fall out of use. References to the year “1880” dropped by half in the 32 years after that date, while the half-life of “1973” was a mere decade. (…)
English continues to grow—the 2011 Culturomics paper suggested a rate of 8,500 new words a year. The new paper, however, says that the growth rate is slowing. Partly because the language is already so rich, the “marginal utility” of new words is declining: Existing things are already well described. This led them to a related finding: The words that manage to be born now become more popular than new words used to get, possibly because they describe something genuinely new (think “iPod,” “Internet,” “Twitter”).
{ WSJ | Continue reading }
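A note on the arithmetic: the half-life figures above fit a simple exponential-decay picture of how mentions of a year fade. The minimal Python sketch below is only an illustration of that picture, not the paper’s method; the function name and the frequency numbers are hypothetical, chosen to show how a half-life falls out of two mention-frequency observations once exponential decay is assumed.

import math

def half_life(freq_start, freq_end, years_elapsed):
    # Assume mentions decay exponentially: f(t) = f(0) * exp(-lam * t)
    lam = math.log(freq_start / freq_end) / years_elapsed  # decay rate per year
    return math.log(2) / lam  # half-life = ln(2) / lam

# Hypothetical numbers: a term whose mention frequency falls from
# 100 to 25 per million words over 20 years has halved twice,
# so the estimated half-life is about 10 years.
print(half_life(100, 25, 20))  # -> 10.0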
image { Larry Welz }
Linguistics | March 20th, 2012 12:32 pm

Throughout the centuries hair has been a symbol of status for wealth, class, rank, slavery, royalty, strength, manliness or femininity. And it still holds a place of value in our postmodern era. Two novels that include hair as a major theme are Bret Easton Ellis’s American Psycho (1991) and Don DeLillo’s Cosmopolis (2003). These two authors create characters heavily involved in the Wall Street world but also swallowed up in their depthless narcissism.
In American Psycho we are introduced to twenty-eight-year-old Patrick Bateman, a psychopathic serial killer who is also a wealthy, narcissistic Wall Street broker. Bateman’s first-person narrative is filled with the gruesome crimes he commits, but they are narrated in such a way that it is never clear whether he is actually performing the murders. He is obsessed with the material world and confines his life to eating at fancy restaurants, womanizing, purchasing name-brand clothes and the most up-to-date electronics, and ensuring the preciseness of his hair.
Twenty-eight-year-old Eric Packer of Cosmopolis shares many of his predecessor’s traits, but instead of wandering aimlessly from fashionable restaurants to name-brand designer shops, Packer sets out across New York on a mini-epic journey for a haircut. In his maximum-security limousine, Packer views the world from multiple digital screens and purposefully loses his entire financial portfolio in a gamble against the Japanese yen. Within these two novels, hair functions as the external symbol of inward turmoil. Hair becomes a primary symbol through which these two authors create a harsh critique of the modern world of technology, consumerism, and emptiness.
{ Wayne E. Arnold | Continue reading }
books, hair | March 20th, 2012 12:20 pm

Most of us would agree that King Tut and the other mummified ancient Egyptians are dead, and that you and I are alive. Somewhere in between these two states lies the moment of death. But where is that? The old standby — and not such a bad standard — is the stopping of the heart. But the stopping of a heart is anything but irreversible. We’ve seen hearts start up again on their own inside the body, outside the body, even in someone else’s body. Christiaan Barnard was the first to show us that a heart could stop in one body and be fired up in another. Due to the mountain of evidence to the contrary, it is comical to consider that “brain death” marks the moment of legal death in all fifty states. (…)
Sorensen says that the idea of “irreversibility” makes the determination of death problematic. What was irreversible, say, twenty years ago, may be routinely reversible today. He cites the example of strokes. Brain damage from stroke that was irreversible and led irrevocably to death in the 1940s was reversible in the 1980s. In 1996 the FDA approved tissue plasminogen activator (tPA), a clot-dissolving agent, for use against stroke. This drug has increased the reversibility of a stroke from an hour after symptoms begin to three hours.
In other words, prior to 1996, MRIs of the brains of stroke victims an hour after the onset of symptoms were putative photographs of the moment of death, or at least brain death. Today those images are meaningless.
{ Salon | Continue reading }
health, ideas | March 20th, 2012 12:09 pm

You’re famous for denying that propositions have to be either true or false (and not both or neither) but before we get to that, can you start by saying how you became a philosopher?
Well, I was trained as a mathematician. I wrote my doctorate on (classical) mathematical logic. So my introduction into philosophy was via logic and the philosophy of mathematics. But I suppose that I’ve always had an interest in philosophical matters. I was brought up as a Christian (not that I am one now). And even before I went to university I was interested in the philosophy of religion – though I had no idea that that was what it was called. Anyway, by the time I had finished my doctorate, I knew that philosophy was more fun than mathematics, and I was very fortunate to get a job in a philosophy department (at the University of St Andrews), teaching – of all things – the philosophy of science. In those days, I knew virtually nothing about philosophy and its history. So I have spent most of my academic life educating myself – usually by teaching things I knew nothing about; it’s a good way to learn! Knowing very little about the subject has, I think, been an advantage, though. I have been able to explore without many preconceptions. And I have felt free to engage with anything in philosophy that struck me as interesting.
{ Graham Priest interviewed by Richard Marshall | Continue reading }
photo { Robert Whitman }
ideas | March 19th, 2012 3:04 pm

Several theories claim that dreaming is a random by-product of REM sleep physiology and that it does not serve any natural function.
Phenomenal dream content, however, is not as disorganized as such views imply. The form and content of dreams are not random but organized and selective: during dreaming, the brain constructs a complex model of the world in which certain types of elements, when compared to waking life, are underrepresented whereas others are overrepresented. Furthermore, dream content is consistently and powerfully modulated by certain types of waking experiences.
On the basis of this evidence, I put forward the hypothesis that the biological function of dreaming is to simulate threatening events, and to rehearse threat perception and threat avoidance.
To evaluate this hypothesis, we need to consider the original evolutionary context of dreaming and the possible traces it has left in the dream content of the present human population. In the ancestral environment, human life was short and full of threats. Any behavioral advantage in dealing with highly dangerous events would have increased the probability of reproductive success. A dream-production mechanism that tends to select threatening waking events and simulate them over and over again in various combinations would have been valuable for the development and maintenance of threat-avoidance skills.
Empirical evidence from normative dream content, children’s dreams, recurrent dreams, nightmares, post-traumatic dreams, and the dreams of hunter-gatherers indicates that our dream-production mechanisms are in fact specialized in the simulation of threatening events, and thus provides support to the threat simulation hypothesis of the function of dreaming.
{ Antti Revonsuo/Behavioral and Brain Sciences | PDF }
archives, psychology, science, theory | March 19th, 2012 3:02 pm