science

Orpheus with his lute made trees

News of the successful use of ether anesthesia on October 16, 1846, spread rapidly through the world. […] Incredibly, this option was not accepted by all, and opposition to the use of anesthesia persisted among some sections of society decades after its introduction.

We examine the social and medical factors underlying this resistance. […] Complications of anesthesia, including death, were reported in the press, and many avoided anesthesia to minimize the considerable risk associated with surgery. Modesty prevented female patients from seeking unconsciousness during surgery, where many men would be present. Biblical passages stating that women would bear children in pain were used to discourage them from seeking analgesia during labor. […] In certain geographical areas, notably Philadelphia, physicians resisted this Boston-based medical advance, citing unprofessional behavior and profit seeking.

{ Journal of Anesthesia History | Continue reading }

photo { Peter Martin, Greenwich Village Nudes, Figure #1, 1951 }

And all the clouds that lour’d upon our house in the deep bosom of the ocean buried

Seasonal affective disorder (SAD) is based on the theory that some depressions occur seasonally in response to reduced sunlight. SAD has attracted cultural and research attention for more than 30 years and influenced the DSM through inclusion of the seasonal variation modifier for the major depression diagnosis. This study was designed to determine if a seasonally related pattern of occurrence of major depression could be demonstrated in a population-based study. A cross-sectional U.S. survey of adults completed the Patient Health Questionnaire–8 Depression Scale. […] Depression was unrelated to latitude, season, or sunlight. Results do not support the validity of a seasonal modifier in major depression. The idea of seasonal depression may be strongly rooted in folk psychology, but it is not supported by objective data.

{ Clinical Psychological Science | Continue reading }

photo { Daido Moriyama }

We choose to go to the moon in this decade and do the other things

{ Margaret Hamilton standing next to listings of the Apollo Guidance Computer source code, January 1969 | Margaret Hamilton (born August 17, 1936) is a computer scientist, systems engineer, and business owner. She was Director of the Software Engineering Division of the MIT Instrumentation Laboratory, which developed on-board flight software for the Apollo space program. | Wikipedia | Continue reading }

Eat the meat and spit out the bones

In an update on an old story, an investment banker asks the client to pay by placing one penny on the first square of a chessboard, two pennies on the second square, four on the third, doubling the number on each square that follows. If the banker had asked for this on only the white squares, the initial penny would double thirty-one times to $21,474,836 on the last square. Using both the black and the white squares, the sum on the last square is $92,233,720,368,547,758.
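
A minimal sketch in Python (not from Romer’s text; added only to verify the two figures above, using exact integer arithmetic in pennies):

```python
def last_square_payment(doublings: int) -> str:
    """Amount on the final square, starting from one penny and
    doubling `doublings` times, formatted without float rounding."""
    dollars, cents = divmod(2 ** doublings, 100)  # work in pennies
    return f"${dollars:,}.{cents:02d}"

# 32 white squares: the initial penny doubles 31 times.
print(last_square_payment(31))  # $21,474,836.48
# All 64 squares: the penny doubles 63 times.
print(last_square_payment(63))  # $92,233,720,368,547,758.08
```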

People are reasonably good at estimating how things add up, but for compounding, which involves repeated multiplication, we fail to appreciate how quickly things grow. As a result, we often lose sight of how important even small changes in the average rate of growth can be. For an investment banker, the choice between a payment that doubles with every square on the chessboard and one that doubles with every other square is more important than any other part of the contract. […]

Growth rates for nations drive home the point that modest changes in growth rates are possible and that over time, these have big effects. […]

If economic growth could be achieved only by doing more and more of the same kind of cooking, we would eventually run out of raw materials and suffer from unacceptable levels of pollution and nuisance. Human history teaches us, however, that economic growth springs from better recipes, not just from more cooking.

{ Paul Romer | Continue reading }

Why did the cat go to Minnesota? To get a mini soda!

After 2.5 millennia of philosophical deliberation and psychological experimentation, most scholars have concluded that humor arises from incongruity. We highlight 2 limitations of incongruity theories of humor.

First, incongruity is not consistently defined. The literature describes incongruity in at least 4 ways: surprise, juxtaposition, atypicality, and a violation.

Second, regardless of definition, incongruity alone does not adequately differentiate humorous from nonhumorous experiences.

We suggest revising incongruity theory by proposing that humor arises from a benign violation: something that threatens a person’s well-being, identity, or normative belief structure but that simultaneously seems okay.

Six studies, which use entertainment, consumer products, and social interaction as stimuli, reveal that the benign violation hypothesis better differentiates humorous from nonhumorous experiences than common conceptualizations of incongruity. A benign violation conceptualization of humor improves accuracy by reducing the likelihood that joyous, amazing, and tragic situations are inaccurately predicted to be humorous.

{ Journal of Personality and Social Psychology }

photo { William Klein }

Did they ever tell Cousteau

Many of our errors, the researchers found, stem from a basic mismatch between how we analyze ourselves and how we analyze others. When it comes to ourselves, we employ a fine-grained, highly contextualized level of detail. When we think about others, however, we operate at a much higher, more generalized and abstract level. For instance, when answering the same question about ourselves or others — how attractive are you? — we use very different cues. For our own appearance, we think about how our hair is looking that morning, whether we got enough sleep, how well that shirt matches our complexion. For that of others, we form a surface judgment based on overall gist. So, there are two mismatches: we aren’t quite sure how others are seeing us, and we are incorrectly judging how they see themselves.

If, however, we can adjust our level of analysis, we suddenly appear much more intuitive and accurate. In one study, people became more accurate at discerning how others see them when they thought their photograph was going to be evaluated a few months later, as opposed to the same day, while in another, the same accuracy shift happened if they thought a recording they’d made describing themselves would be heard a few months later. Suddenly, they were using the same abstract lens that others are likely to use naturally.

{ Maria Konnikova, The Confidence Game | Continue reading }

The monkish monsignor, with a head full of plaster

Previous studies have found that facial appearance can predict both the selection and performance of leaders. Little is known about the specific facial features responsible for this relationship, however.

One possible feature is mouth width, which correlates with the propensity for physical combat in primates and could therefore be linked to one’s perceived dominance and achievement of greater social rank. […]

We observed that mouth width correlated with judgments of CEOs’ leadership ability and with a measure of their actual leadership success. Individuals with wider mouths were also more likely to have won U.S. Senate, but not gubernatorial, races. Mouth width may therefore be a valid cue to leadership selection and success.

{ Journal of Experimental Social Psychology | Continue reading }

photo { Gregory Crewdson }

‘There’s only one corner of the universe you can be certain of improving, and that’s your own self.’ –Aldous Huxley

After medicine in the 20th century focused on healing the sick, now it is more and more focused on upgrading the healthy, which is a completely different project. And it’s a fundamentally different project in social and political terms, because whereas healing the sick is an egalitarian project […] upgrading is by definition an elitist project. […] This opens the possibility of creating huge gaps between the rich and the poor […] Many people say no, it will not happen, because we have the experience of the 20th century, that we had many medical advances, beginning with the rich or with the most advanced countries, and gradually they trickled down to everybody, and now everybody enjoys antibiotics or vaccinations or whatever. […]

There were peculiar reasons why medicine in the 20th century was egalitarian, why the discoveries trickled down to everybody. These unique conditions may not repeat themselves in the 21st century. […] When you look at the 20th century, it’s the era of the masses, mass politics, mass economics. Every human being has value, has political, economic, and military value. […] This goes back to the structures of the military and of the economy, where every human being is valuable as a soldier in the trenches and as a worker in the factory.

But in the 21st century, there is a good chance that most humans will lose, they are losing, their military and economic value. This is true for the military, it’s done, it’s over. The age of the masses is over. We are no longer in the First World War, where you take millions of soldiers, give each one a rifle and have them run forward. And the same thing perhaps is happening in the economy. Maybe the biggest question of 21st century economics is what will be the need in the economy for most people in the year 2050.

And once most people are no longer really necessary, for the military and for the economy, the idea that you will continue to have mass medicine is not so certain. Could be. It’s not a prophecy, but you should take very seriously the option that people will lose their military and economic value, and medicine will follow.

{ Edge | Continue reading }

A gradual decline into disorder

Physicist Enrico Fermi famously asked the question “Where are they?” to express his surprise over the absence of any signs for the existence of other intelligent civilizations in the Milky Way Galaxy. […]

Observations have shown that the Milky Way contains no fewer than a billion Earth-size planets orbiting Sun-like (or smaller) stars in the “Goldilocks” region that allows for liquid water to exist on the planet’s surface (the so-called habitable zone). Furthermore, the search for extraterrestrial intelligent life has recently received a significant boost in the form of “Breakthrough Listen”—a $100-million decade-long project aimed at searching for non-natural transmissions in the electromagnetic bandwidth from 100 megahertz to 50 gigahertz.

Simple life appeared on Earth almost as soon as the planet cooled sufficiently to support water-based organisms. To be detectable from a distance, however, life has to evolve to the point where it dominates the planetary surface chemistry and has significantly changed the atmosphere, creating chemical “biosignatures” that can in principle be detected remotely. For instance, Earth itself would probably not have been detected as a life-bearing planet during the first two billion years of its existence. […]

[A]n excellent first step in the quest for signatures of simple extrasolar life in the relatively near future would be to search for oxygen, but try to back it up with other biosignatures. […]

One would ideally like to go beyond biosignatures and seek the clearest sign of an alien technological civilization. This could be the unambiguous detection of an intelligent, non-natural signal, most notably via radio transmission, the aim of the SETI (Search for Extraterrestrial Intelligence) program. Yet there is a distinct possibility that radio communication might be considered archaic to an advanced life form. Its use might have been short-lived in most civilizations, and hence rare over large volumes of the universe. What might then be a generic signature? Energy consumption is a hallmark of an advanced civilization that appears to be virtually impossible to conceal. […]

More pessimistically, biologically-based intelligence may constitute only a very brief phase in the evolution of complexity, followed by what futurists have dubbed the “singularity”—the dominance of artificial, inorganic intelligence. If this is indeed the case, most advanced species are likely not to be found on a planet’s surface (where gravity is helpful for the emergence of biological life, but is otherwise a liability). But they probably must still be near a fuel supply, namely a star, because of energy considerations. Even if such intelligent machines were to transmit a signal, it would probably be unrecognizable and non-decodable to our relatively primitive organic brains.

{ Scientific American | Continue reading }

Upon the knife of my youth

Researchers have created a digital audio platform that can modify the emotional tone of people’s voices while they are talking, to make them sound happier, sadder or more fearful. New results show that while listening to their altered voices, participants’ emotional states change in accordance with the new emotion. […]

The study found that the participants were unaware that their voices were being manipulated, while their emotional state changed in accordance with the manipulated emotion portrayed. This indicates that people do not always control their own voice to meet a specific goal and that people listen to their own voice to learn how they are feeling.

{ eurekAlert | Continue reading }

drawing { Julia Randall }

Tragedy on the stage is no longer enough for me

A technique called optogenetics has transformed neuroscience during the past 10 years by allowing researchers to turn specific neurons on and off in experimental animals. By flipping these neural switches, it has provided clues about which brain pathways are involved in diseases like depression and obsessive-compulsive disorder. “Optogenetics is not just a flash in the pan,” says neuroscientist Robert Gereau of Washington University in Saint Louis. “It allows us to do experiments that were not doable before. This is a true game changer like few other techniques in science.” […]

The new technology relies on opsins, a type of ion channel consisting of proteins that conduct neurons’ electrical signaling. Neurons contain hundreds of different types of ion channels, but opsins open in response to light. Some opsins are found in the human retina, but those used in optogenetics are derived from algae and other organisms. The first opsins used in optogenetics, called channelrhodopsins, open to allow positively charged ions to enter the cell when activated by a flash of blue light, which causes the neuron to fire an electrical impulse. Other opsin proteins pass inhibitory, negatively charged ions in response to light, making it possible to silence neurons as well. […]

The main challenge before optogenetic therapies become a reality is getting opsin genes into the adult human neurons to be targeted in a treatment. In rodents, researchers have employed two main strategies. One is transgenics, in which mice are bred to make opsins in specific neurons—an option unsuitable for use in humans. The other uses a virus to deliver the gene into a neuron. Viruses are currently being used for other types of gene therapy in humans, but challenges remain. The virus must penetrate mature neurons and deliver its gene cargo without spurring an immune reaction. Then the neuron has to express the opsin in the right place, and it has to go on making the protein continuously—ideally forever.

{ Scientific American | Continue reading }

Question: Tell me what you think about me, I buy my own diamonds and I buy my own rings.

A basic feature of psychological processes is their irreversibility. Every experience changes a person in a way that cannot be completely undone… one must assume that persons are continuously and irreversibly changing. […]

The logic of inductive inference entails that what is observed under given conditions at one time will occur again under the same conditions at a later time. But this logic can only be applied when it is possible to replicate the same initial conditions, and this is strictly impossible in the case of irreversible processes.

As a result, no psychological theory can attain the status of a “law”, and no result will be perfectly replicable.

{ Neuroskeptic | Continue reading }