nswd

ideas

Hot mockturtle vapour and steam of newbaked jampuffs rolypoly poured out from Harrison’s

Observe your own mood, and that of others, in the context of how recently they have eaten. If there’s a hothead in your circle, notice that his anger is greatest before meals, when hunger is highest, and rarely does he explode during meals or just after. When you feel agitated, try eating some carbs. They’re like a miracle drug. I suspect that anger is evolution’s way of telling you to go kill something so you can eat.

{ Scott Adams | Continue reading }

‘Reality doesn’t impress me.’ –Anais Nin

In 1889, when Friedrich Nietzsche suffered the mental collapse that ended his career, he was virtually unknown. Yet by the time of his death in 1900 at the age of 55, he had become the philosophical celebrity of his age. From Russia to America, admirers echoed his estimation of himself as a titanic figure who could alter the course of history. (…)

Suffering from violent migraines, Nietzsche resigned his academic post when he was 34 and began the life of a little-heeded nomad-intellectual in European resorts. With escalating intensity, he issued innovative works of philosophy that challenged every element of European civilization. He celebrated the artistic heroism of Beethoven and Goethe; denigrated the “slave morality” of Christianity, which transfigured weakness into virtue and vital strength into sin; and called on the strong in spirit to bring about a “transvaluation of all values.” The “higher man” — or as Nietzsche sometimes called him, the “overman” or “Übermensch” — did not succumb to envy or long for the afterlife; rather he willed that his life on earth repeat itself over and over exactly as it was. (…)

If God was dead, so too were equally fictitious entities like the self. There was no objective truth, only the truth-effects engendered by the workings of power and the instabilities of language. (…) More brilliantly than anyone, Nietzsche understood the peril of modern nihilism and the need to cultivate robust souls who would strive to achieve excellence without authoritative religious belief. (…)

Several decades before Nietzsche wrote, “What does not kill me makes me stronger,” Emerson wrote, “In general, every evil to which we do not succumb, is a benefactor.”

{ NY Times | Continue reading }

Now you steppin wit a G, from Los Angeles, where the helicopters got cameras

The third (and last) time I went to New Orleans was in September of 1978. I was living in Marin County, and I took the red-eye out of San Francisco, flying on a first-class ticket paid for by Universal Pictures, the studio that was financing the movie I was contracted to write. The story was to be loosely based on an article written by Hunter Thompson that had been recently published in Rolling Stone magazine. Titled “The Banshee Screams for Buffalo Meat,” the 30,000-word piece detailed many of the (supposedly) true-life adventures Hunter had experienced with Oscar Zeta Acosta, the radical Chicano lawyer who he’d earlier canonized in Fear and Loathing in Las Vegas.

Hunter and I were in New Orleans to attend the hugely anticipated rematch between Muhammad Ali and Leon Spinks, the former Olympic champion who, after only seven fights, had defeated Ali in February. The plan was to meet up at the Fairmont, a once-elegant hotel that was located in the center of the business district and within walking distance of the historic French Quarter. Although Hunter was not in his room when I arrived, he’d instructed the hotel management to watch for me and make sure I was treated with great respect.

“I was told by Mister Thompson to mark you down as a VIP, that you were on a mission of considerable importance,” said Inga, the head of guest services, as we rode the elevator up to my floor. “Since he was dressed quite eccentrically, in shorts and a Hawaiian shirt, I assumed he was pulling my leg. The bellman who fetched his bags said he was a famous writer. Are you a writer also?” I told her I wrote movies. “Are you famous?”

“No.”

“Do you have any cocaine?”

I stared at her. Her smile was odd, both reassuring and intensely hopeful. In the cartoon balloon I saw over her head were the words: I’m yours if you do. “Yes, I do.”

“That is good.”

Inga called the hotel manager from my room and told him, in a voice edged with professional disappointment, that she was leaving early because of a “personal matter.” After she hung up, she dialed room service and handed me the phone. She directed me to order two dozen oysters, a fifth of tequila, and two Caesar salads. Then, with a total absence of modesty, she quickly stripped off her clothes, walked into the bathroom, and a moment later I heard the water running in the shower.

{ LA Review of Books | Continue reading }

photo { Richard Kern | More: Shot by Kern | videos }

Think no more about that. After one. Timeball on the ballast office is down. Dunsink time.

One day in 1945, a man named Percy Spencer was touring one of the laboratories he managed at Raytheon in Waltham, Massachusetts, a supplier of radar technology to the Allied forces. He was standing by a magnetron, a vacuum tube which generates microwaves to boost the sensitivity of radar, when he felt a strange sensation. Checking his pocket, he found his candy bar had melted. Surprised and intrigued, he sent for a bag of popcorn, and held it up to the magnetron. The popcorn popped. Within a year, Raytheon made a patent application for a microwave oven.

The history of scientific discovery is peppered with breakthroughs that came about by accident. The most momentous was Alexander Fleming’s discovery of penicillin in 1928, prompted when he noticed how a mould that floated into his Petri dish killed off the surrounding bacteria. Spencer and Fleming didn’t just get lucky. Spencer had the nous and the knowledge to turn his observation into innovation; only an expert on bacteria would have been ready to see the significance of Fleming’s stray spore. As Louis Pasteur wrote, “In the field of observation, chance favours only the prepared mind.”

The word that best describes this subtle blend of chance and agency is “serendipity.” (…)

Today’s internet plies us with “relevant” information and screens out the rest. Two different people will receive subtly different results from Google, adjusted for what Google knows about their interests. Newspaper websites are starting to make stories more prominent to you if your friends have liked them on Facebook. We spend our online lives inside what the writer Eli Pariser calls “the filter bubble.”

{ More Intelligent Life | Continue reading }

photo { Luigi Ghirri }

There it is Red Murray said

Thousands of characters — letters and obscure symbols — filled the more than 100 pages of a centuries-old text that had been located in East Berlin after the end of the Cold War. No one knew what the text meant, or even what language it was in. It was a mystery that USC computer scientist Kevin Knight and two Swedish researchers sought to solve. (…)

After months of painstaking work and a few trips down the wrong path, the moment finally came when the team knew it was on to something. Out of what had been gibberish emerged one word: ceremonie — a variation of the German word for ceremony. Knight said they figured out the rest from there.

Breaking the code on the document known as the Copiale Cipher revealed the rituals and political observations of an 18th century secret German society, as well as the group’s unusual fascination with eye surgery and ophthalmology.

But the larger significance of the team’s work wasn’t necessarily the discovery, it was how they arrived at it. (…)

“You start to see patterns, then you reach the magic point where a word appears,” he said. It was then, he said, “you no longer even care what the document’s about.”

The team ran statistical analyses of 80 languages, initially believing that the code lay in the Roman letters between the symbols that dotted the pages. Using a combination of brain power and computer wizardry, they broke the code by figuring out the symbols.
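
As a rough illustration of the kind of statistical analysis described here, the Python sketch below ranks candidate languages by how closely their character-frequency profile matches the mystery text’s. The tiny reference strings are hypothetical stand-ins; the actual work compared large corpora for each of the 80 languages, against a far harder homophonic cipher.

```python
# A minimal sketch of statistical language identification: rank candidate
# languages by how closely their character-frequency profile matches the
# mystery text's. The "corpora" here are toy stand-ins for real texts.
from collections import Counter

def profile(text):
    # Relative frequency of each alphabetic character.
    letters = [c for c in text.lower() if c.isalpha()]
    counts = Counter(letters)
    total = sum(counts.values()) or 1
    return {c: n / total for c, n in counts.items()}

def distance(p, q):
    # Sum of squared differences between two frequency profiles.
    keys = set(p) | set(q)
    return sum((p.get(k, 0) - q.get(k, 0)) ** 2 for k in keys)

references = {
    "german": profile("der die das und ist nicht ein eine ich sie wir"),
    "english": profile("the and is not a an I they we of to in that"),
    "latin": profile("et in est non cum ad qui quae quod esse"),
}

mystery = profile("die geheime zeremonie der gesellschaft und ihre zeichen")
ranking = sorted(references, key=lambda lang: distance(references[lang], mystery))
print(ranking)  # closest-matching candidate language first
```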

{ LA Times | Continue reading }

photo { Darren Almond }

No taxi cause she hated it

In a new study from the Centre for Addiction and Mental Health (CAMH), people with schizophrenia showed greater brain activity during tests that induce a brief, mild form of delusional thinking. This effect wasn’t seen in a comparison group without schizophrenia.

“We studied a type of delusion called a delusion of reference, which occurs when people feel that external stimuli such as newspaper articles or strangers’ overheard conversations are about them,” says CAMH Scientist Dr. Mahesh Menon, adding that this type of delusion occurs in up to two-thirds of people with schizophrenia. “Then they come up with an explanation for this feeling to make sense of it or give it meaning.”

The study was an initial exploration of the theory that the overactive firing of dopamine neurons in specific brain regions is involved in converting neutral, external information into personally relevant information among people with schizophrenia. This may lead to symptoms of delusions. “We wanted to see if we could find a way to ‘see’ these delusions during Magnetic Resonance Imaging scanning,” says Dr. Menon.

{ EurekAlert | Continue reading }

Schizophrenia is a mental disorder characterized by a breakdown of thought processes and by poor emotional responsiveness. It most commonly manifests itself as auditory hallucinations, paranoid or bizarre delusions, or disorganized speech and thinking, and it is accompanied by significant social or occupational dysfunction. The onset of symptoms typically occurs in young adulthood, with a global lifetime prevalence of about 0.3–0.7%.

Genetics, early environment, neurobiology, and psychological and social processes appear to be important contributory factors; some recreational and prescription drugs appear to cause or worsen symptoms. Current research is focused on the role of neurobiology, although no single isolated organic cause has been found. The many possible combinations of symptoms have triggered debate about whether the diagnosis represents a single disorder or a number of discrete syndromes.

Despite the etymology of the term from the Greek roots skhizein (“to split”) and phren- (“mind”), schizophrenia does not imply a “split mind” and it is not the same as dissociative identity disorder—also known as “multiple personality disorder” or “split personality”—a condition with which it is often confused in public perception.

The mainstay of treatment is antipsychotic medication, which primarily suppresses dopamine (and sometimes serotonin) receptor activity.

{ Wikipedia | Continue reading }

photo { Brian James }

Rhythm-A-Ning

Carissa Kowalski Dougherty explores how album covers moved from the purely functional to graphic works of art that conveyed the tone, mood, and feel of the lyric-less jazz music contained within. Dougherty also investigates how race is designated on the covers, a question, she says, that is inextricably linked to the music itself.

During the postwar period, African-American artists and musicians were confronting the same issues in their respective fields: how to retain their identity as black Americans while being recognized as skilled artists regardless of race; how to convey their own personal experiences; how to overcome discrimination; how to succeed in their field, and how to express pride in their African heritage—all without the aid of words.

{ MIT/Design Issues | Link to PDF }

From the beginnin’ to end, losers lose, winners win

Professor Trifonov analyzes the vocabulary of 123 existing definitions of life in order to chart a possible path toward minimal agreement among scientists. To this purpose, he compares the definitions from a linguistic point of view and ranks the terms used in them by frequency. (…)

The outcome of this analysis is a definition of life as “self-reproduction with variations.” (…)

Is “self-reproduction with variations” a good definition? Can this definition actually provide a minimal basis of consensus?
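
As a sketch of the frequency-ranking step Trifonov describes, the Python snippet below counts how many definitions each term appears in; the three sample definitions are invented stand-ins for his corpus of 123.

```python
# A minimal sketch of the vocabulary analysis: pool the words from many
# definitions of life and rank each term by how many definitions use it.
from collections import Counter

definitions = [
    "life is a self-sustaining chemical system capable of darwinian evolution",
    "life is self-reproduction with variations",
    "life is a system capable of evolution by natural selection",
]

stopwords = {"is", "a", "of", "by", "with", "the"}
counts = Counter()
for d in definitions:
    # Count each term at most once per definition.
    counts.update(set(d.split()) - stopwords)

for term, n in counts.most_common(5):
    print(term, n)  # e.g. life, system, evolution rank near the top
```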

{ Fabrizio Macagno/SSRN | Continue reading }

Confusion occurs, comin up in the cold world

Names of countries in foreign languages (exonyms) often bear no relationship to the names of the same countries in their own official language or languages (endonyms). Such differences are generally accepted without complaint; the fact that English speakers refer to Deutschland as Germany and Nihon as Japan is not a problem for the governments or the people of those countries.

Occasionally, however, diplomats from a given country request that other governments change its name. (…)

Over the past several years, Georgia has been trying to convince a number of countries to call it “Georgia,” even though the Georgian name for the country is Sakart’velo.

{ GeoCurrents | Continue reading }

‘The writing’s easy, it’s the living that is sometimes difficult.’ –Charles Bukowski

This is what I mean when I say I would like to swim against the stream of time. I would like to erase the consequences of certain events and restore an initial condition. But every moment of my life brings with it an accumulation of new facts, and each of these new facts brings with it its consequences; so the more I seek to return to the zero moment from which I set out, the further I move away from it: though all my actions are bent on erasing the consequences of previous actions and though I manage to achieve appreciable results in this erasure, enough to open my heart to hopes of immediate relief, I must, however, bear in mind that my every move to erase previous events provokes a rain of new events, which complicate the situation worse than before and which I will then, in their turn, have to erase. Therefore I must calculate carefully every move so as to achieve the maximum of erasure with the minimum of recomplication.

{ Italo Calvino, If on a winter’s night a traveler, 1979 | Continue reading }

If on a winter’s night a traveler begins with a chapter on the art and nature of reading, and is subsequently divided into twenty-two passages. The odd-numbered passages and the final passage are narrated in the second person. That is, they concern events purportedly happening to the novel’s reader. (Some contain further discussions about whether the man narrated as “you” is the same as the “you” who is actually reading.) These chapters concern the reader’s adventures in reading Italo Calvino’s novel, If on a winter’s night a traveler. Eventually the reader meets a woman, who is also addressed in her own chapter, separately, and also in the second person.

{ Wikipedia | Continue reading }

Him: My magic watch says that you don’t have any underwear on. Her: Yes I do. Him: Damn! It must be 15 minutes fast.

{ Todd McLellan }

Do you think I care whether you agree with me? No. I’m telling you why I disagree with you. That, I do care about.

Measuring power and influence on the web is a matter of huge interest. Indeed, algorithms that distill rankings from the pattern of links between webpages have made huge fortunes for companies such as Google. One of the most famous of these is the Hyperlink-Induced Topic Search, or HITS, algorithm, which hypothesises that important pages fall into two categories–hubs and authorities–and are deemed important if they point to other important pages and if other important pages point to them. This kind of thinking led directly to Google’s search algorithm PageRank. The father of this idea is Jon Kleinberg, a computer scientist now at Cornell University in Ithaca, who has achieved a kind of cult status through this and other work. It’s fair to say that Kleinberg’s work has shaped the foundations of the online world.
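
For a concrete sense of the hubs-and-authorities idea, here is a minimal Python sketch of HITS-style power iteration over a toy link graph. The graph and page names are invented for illustration; real implementations handle convergence tests and web-scale graphs far more carefully.

```python
# A minimal sketch of HITS: hub and authority scores computed by power
# iteration over a link graph, given as a dict of page -> outgoing links.

def hits(graph, iterations=50):
    pages = set(graph) | {p for targets in graph.values() for p in targets}
    hubs = {p: 1.0 for p in pages}
    auths = {p: 1.0 for p in pages}
    for _ in range(iterations):
        # A page's authority score sums the hub scores of pages linking to it.
        auths = {p: sum(hubs[q] for q in graph if p in graph[q]) for p in pages}
        # A page's hub score sums the authority scores of pages it links to.
        hubs = {p: sum(auths[q] for q in graph.get(p, [])) for p in pages}
        # Normalize so the scores stay bounded across iterations.
        for d in (auths, hubs):
            norm = sum(v * v for v in d.values()) ** 0.5 or 1.0
            for p in d:
                d[p] /= norm
    return hubs, auths

links = {"portal": ["paper1", "paper2"], "blog": ["paper1"],
         "paper1": [], "paper2": ["paper1"]}
hubs, auths = hits(links)
print(sorted(auths.items(), key=lambda kv: -kv[1]))  # paper1 is top authority
```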

Today, Kleinberg and a few pals put forward an entirely different way of measuring power and influence, one that may one day have equally far-reaching consequences.

These guys have worked out how to measure power differences between individuals using the patterns of words they speak or write. In other words, they say the style of language during a conversation reveals the pecking order of the people talking.

“We show that in group discussions, power differentials between participants are subtly revealed by how much one individual immediately echoes the linguistic style of the person they are responding to,” say Kleinberg and co.

The key to this is an idea called linguistic co-ordination, in which speakers naturally copy the style of their interlocutors. Human behaviour experts have long studied the way individuals can copy the body language or tone of voice of their peers; some have even studied how this effect reveals the power differences between members of the group.

Now Kleinberg and co say the same thing happens with language style. They focus on the way that interlocutors copy each other’s use of certain types of words in sentences. In particular, they look at functional words that provide a grammatical framework for sentences but lack much meaning in themselves (“that,” “for,” and “in,” for example). Functional words fall into categories such as articles, auxiliary verbs, conjunctions, high-frequency adverbs and so on.

The question that Kleinberg and co ask is this: given that one person uses a certain type of functional word in a sentence, what is the chance that the responder also uses it?
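
One simple way to operationalize that question is sketched below in Python: compare how often a reply uses a class of function words when the preceding utterance did against the replier’s baseline rate. This is an illustrative estimator, not necessarily the paper’s exact measure, and the example exchanges are invented.

```python
# A toy sketch of linguistic coordination on one functional-word class
# (here: articles). For exchange pairs (utterance_a, reply_b), compare
# P(b uses an article | a used one) with b's overall baseline rate.

ARTICLES = {"a", "an", "the"}

def uses(cls, utterance):
    return any(w in cls for w in utterance.lower().split())

def coordination(pairs, cls=ARTICLES):
    triggered = [uses(cls, b) for a, b in pairs if uses(cls, a)]
    baseline = [uses(cls, b) for a, b in pairs]
    if not triggered or not baseline:
        return 0.0
    return sum(triggered) / len(triggered) - sum(baseline) / len(baseline)

# Hypothetical exchanges: replies echo articles more when the prompt used one.
pairs = [
    ("the motion is granted", "the record supports it"),
    ("counsel may proceed", "we object strongly"),
    ("read the brief", "the brief is clear"),
    ("any questions", "none from us"),
]
print(coordination(pairs))  # positive value = echoing the speaker's style
```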

To find the answer they’ve analysed two types of text in which the speakers or writers have specific goals in mind: transcripts of oral arguments in the US Supreme Court and editorial discussions between Wikipedia editors (a key condition in this work is that the conversations cannot be idle chatter; something must be at stake in the discussion).

Wikipedia editors are divided between those who are administrators, and so have greater access to online articles, and non-administrators who do not have such access. Clearly, the admins have more power than the non-admins.

By looking at the changes in linguistic style that occur when people make the transition from non-admin to admin roles, Kleinberg and co cleverly show that the pattern of linguistic co-ordination changes too. Admins become less likely to co-ordinate with others. At the same time, lower ranking individuals become more likely to co-ordinate with admins.

A similar effect also occurs in the Supreme Court (where power differences are more obvious in any case).

Curiously, people seem entirely unaware that they are doing this. “If you are communicating with someone who uses a lot of articles — or prepositions, or personal pronouns — then you will tend to increase your usage of these types of words as well, even if you don’t consciously realize it,” say Kleinberg and co.

{ The Physics arXiv Blog | Continue reading }

photo { Robert Whitman, F***ed Up In Minneapolis | Black & White Gallery, Brooklyn, NY, until Jan 14 }

A pickle for the knowing ones, or plain truths in a homespun dress

“Lord” Timothy Dexter (1748–1806) was an eccentric American businessman noted for a series of lucky transactions and his writing. (…)

He made his fortune by investing in Continental Dollars during the Revolutionary War, when they could be purchased for a tiny percentage of their face value. After the war was over, and the U.S. government made good on the dollars, he became wealthy. (…)

His 1802 memoir A Pickle for the Knowing Ones, or Plain Truths in a Homespun Dress is entirely misspelled and contains no punctuation. At first he handed his book out for free, but it became popular and was reprinted in eight editions. In the second edition Dexter added an extra page which consisted of 13 lines of punctuation marks. Dexter instructed readers to “peper and solt it as they plese.”

Dexter faked his own death and urged people to prepare for his burial. About 3,000 people attended his mock wake. The crowd was disappointed when they heard a still-living Dexter screaming at his wife that she was not grieving enough.

{ Wikipedia | Continue reading | Literary historian Paul Collins discusses Lord Timothy’s lasting appeal | NPR | Life of Lord Timothy Dexter }

I’ve got my heart but my heart’s no good

Your book starts with the idea, which was very prominent and commonly believed by a large group of people, that fat–eating fat and fat in your diet, particularly animal fat–isn’t good for you and it leads to heart disease. How did that come to be accepted wisdom in the medical profession?

First, let me say I think it’s still commonly believed by most people, and the latest dietary guidelines are trying to get us to limit our fat intake, and limit our saturated fat intake. This is an hypothesis that grew out of the observations of one very zealous University of Minnesota nutritionist in the 1950s, a fellow named Ancel Keys, who came up with this idea that dietary fat raised cholesterol, and it was raised cholesterol that caused heart disease. At the time there was effectively no meaningful experimental data to support–I’ll rephrase that: There was no experimental data to support that observation. It seemed plausible, though. It seemed plausible, compelling. Keys was a persuasive fellow. And by 1960 or so, the American Heart Association (AHA) got behind it in part because Keys and a fellow-proponent of this hypothesis, a cardiologist from Chicago named Jeremiah Stamler, got onto the AHA, got involved with an ad hoc committee, and were able to publish a report basically saying we should all cut our fat intake. This was 1961. Like I said, no data to support it; no experimental data at all. And once the AHA got behind it, it got a kind of believability. The attitude was: It’s probably right, and all we have to do is test it. Or, we’re going to believe it’s true, but we don’t have the data yet because we haven’t done the tests yet.

And researchers start doing the tests, experimental trials, taking a population. For instance, a famous study at the VA hospital in Los Angeles, where you randomize half of them to a cholesterol-lowering diet which is not actually low in fat, by the way–it’s low in saturated fat and high in polyunsaturated fat. And then the other half of your subjects eat a control diet and you look for heart disease over a number of years and see what happens. And trial after trial was sort of unable to prove the hypothesis true. But the more we studied it, the more people simply believed it must be true. And meanwhile, the AHA is pushing it; other observations are being compiled to support it even though in order to support it you have to ignore the observations that don’t support it. So, you pay attention to the positive evidence, ignore the negative evidence. One Scottish researcher who I interviewed memorably called this “Bing Crosby epidemiology” where you “accentuate the positive, eliminate the negative.” Basic human nature. But this is what happened.

And as the AHA gets behind it, the journalists see the AHA as honest brokers of information on this, so they have no reason to doubt the AHA. And the AHA was honest brokers–they just were bad scientists. Or they were not scientists. So, then the press gets behind it, and as the press gets behind it, politicians begin to think maybe we should do something about it, and a Congressional subcommittee gets involved, run by George McGovern, that had originally been founded in the late 1960s to address hunger in America; and they did a lot of good things with school lunch programs and food stamps. And by the mid-1970s they were running out of things to do, so they decided: Since we’ve been dealing with under-nutrition, which is not enough food, they would get involved with over-nutrition, which is a problem of too much food and obesity and diabetes and heart disease.

And they had one day of hearings, McGovern’s subcommittee, and they assign a former labor reporter from the Providence, RI, Journal to write the first dietary goals for the United States–the first document ever from a government body of any kind suggesting that a low fat diet is a healthy diet. And once McGovern comes out with this document, written by a former labor reporter who knew nothing about nutrition and health, now the USDA feels they have to get involved; and you get this kind of cascade or domino effect. To the point that by 1984 the National Institutes of Health (NIH) holds a consensus conference saying that we have a consensus of opinion that we should all eat low fat diets, when they still don’t have a single meaningful experiment showing that a low fat diet or cholesterol lowering diet will reduce the risk of heart disease, or at least make you live longer. Because a few of the studies suggested that you could reduce the risk of heart disease but you would increase cancer. And one study–the biggest study ever done, which was in Minnesota–actually suggested that if you put people on cholesterol-lowering diets you increase mortality; they had more deaths in the intervention group than the control group. (…)

Japanese women in Japan have very low rates of breast cancer. So when Japanese women come to the United States, by the second generation they have rates of breast cancer as high as any other ethnic group, and one possibility is it’s because they come over here and they eat more fat. But the problem with those observational studies, those comparisons, is you don’t know what you are looking at. So, you focus on fat because that’s what your hypothesis is about–and this is an endemic problem in public health–and you just don’t pay attention to anything else. So, sugar consumption is very low in Japan and very high here. So, maybe it’s sugar that’s the cause of heart disease, or the absence of sugar is the reason the Japanese are so relatively healthy; and if you don’t look at sugar, you don’t know.

{ Gary Taubes/EconTalk | Continue reading }

photo { Robert Mapplethorpe }

‘The desire to die was my one and only concern; to it I have sacrificed everything, even death.’ –Cioran

The most extreme proponent of anti-natalism is probably David Benatar, author of Better Never to Have Been, which maintains that:

(1) Coming into existence is always a serious harm.
(2) It is always wrong to have children.
(3) It is wrong not to abort fetuses at the earlier stages of gestation.
(4) It would be better if, as a result of there being no new people, humanity became extinct.

{ EconLib | Continue reading }

‘Doubt is the origin of wisdom.’ –Descartes

One of the better-known psychology factoids is that 80% of people tend to think they are above average (if you don’t know this, you’re clearly in the “below-average” 20%).

A new study explains this tendency by finding evidence for what the researchers call the “better-than-my-average” effect.

Essentially, we evaluate how good we really are by looking at our best performances, but when we evaluate others we tend to focus on their average performance.
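
A toy simulation makes the arithmetic of that asymmetry concrete: if each person judges themselves by their best performance but judges the group by its average performance, nearly everyone comes out “above average.” The numbers below are invented for illustration.

```python
# A toy simulation of the "better-than-my-average" effect: everyone rates
# themselves by their best performance but rates the group by its average.
import random

random.seed(0)
people = [[random.gauss(0, 1) for _ in range(10)] for _ in range(1000)]

group_average = sum(sum(p) for p in people) / sum(len(p) for p in people)
self_ratings = [max(p) for p in people]  # each person recalls their best outing

above = sum(r > group_average for r in self_ratings)
print(f"{above / len(people):.0%} rate themselves above the group average")
# With 10 performances each, nearly everyone's personal best beats the mean.
```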

{ peer-reviewed by my neurons | Continue reading }

You know, it’s kinda like… Success is subjective, you know. It could be an opinion.

If someone asked you to describe the psychological aspects of personhood, what would you say? Chances are, you’d describe things like thought, memory, problem-solving, reasoning, maybe emotion. In other words, you’d probably list the major headings of a cognitive psychology textbook. In cognitive psychology, we seem to take it for granted that these are, objectively, the primary components of “the mind.” (…)

In fact, this conception of the mind is heavily influenced by a particular (Western) cultural background. Other cultures assign different characteristics and abilities to the psychological aspects of personhood. (…)

Cross-linguistic research shows that, generally speaking, every culture has a folk model of a person consisting of visible and invisible (psychological) aspects. While there is agreement that the visible part of the person refers to the body, there is considerable variation in how different cultures think about the invisible (psychological) part. In the West, and, specifically, in the English-speaking West, the psychological aspect of personhood is closely related to the concept of “the mind” and the modern view of cognition. (…)

In Korean, the concept “maum” replaces the concept “mind.” “Maum” has no English counterpart, but is sometimes translated as “heart.” Apparently, “maum” is the “seat of emotions, motivation, and ‘goodness’ in a human being.” (…)

The Japanese have yet another concept for the invisible part of the person — “kokoro.” “Kokoro” is a “seat of emotion, and also, a source of culturally valued attention to, and empathy with, other people.”

{ Notes from Two Scientific Psychologists | Continue reading }

painting { Eugène Delacroix, Orphan Girl at the Cemetery, 1824 }

Don’t give up your miracle is on the way

1. The medieval theologian Thomas Aquinas reasoned that the universe must have a First Cause, to which he assigned the name God.

2. Modern physicists in their way are likewise in search of a First Cause. (…)

A useful proxy for the First Cause is energy. (…) Yet no one thinks energy bears any resemblance to God in the traditional religious sense. It has neither knowledge nor will. It’s not a person. It doesn’t summon us to paradise or command us to embrace the good and shun evil. It provides our lives with no meaning. It’s just there.

{ The Straight Dope | Continue reading }

photo { Christiane Wöhler Friedebach }

‘All paid jobs absorb and degrade the mind.’ –Aristotle

It is not new to talk about the need to acquire “irreplaceable” skills. But what is not properly appreciated is the scope of the challenge this poses to people in all kinds of jobs, and the exact defining characteristic of what will make a skill “irreplaceable.”

The basic rule of economics after the Industrial Revolution is: if a task can be automated, it will be. Or to put it differently, if a worker can be replaced by a machine, he will be. Call it the principle of expendability. The only thing that has changed since the first power loom is the number and nature of the tasks that can be automated. The first thing the Industrial Revolution did was to automate physical tasks. But now we are beginning to automate mental tasks, and what we are just beginning to see is the scope of the mental work that can be automated. It is much wider than you probably think.

An awful lot of work that is usually considered to require human intelligence really doesn’t. Instead, these tasks require complex memorization and pattern recognition, perceptual-level skills that can be reduced to mechanical, digitized processes and done by a machine. These include many tasks that currently fill the days of highly educated, well-paid professionals.

Take doctors. A recent article by Farhad Manjoo, the technology columnist for Slate, describes how computers have begun to automate the screening of cervical cancer tests. A task that used to be done by two physicians, who could only process 90 images per day, can now be done with better results by one doctor and a machine, processing 170 images per day. (…)

One more example. I recently came across a story about a composer and music theorist who created a computer program that writes cantatas in the style of Johann Sebastian Bach. (A cantata is a short piece with a well-defined structure, which makes the task a little easier.) The climax of the story is a concert in which an orchestra played a mixture of the computer’s compositions and actual Bach cantatas. An audience of music experts could not reliably determine which was which.

{ Robert Tracinski/Real Clear Markets | Continue reading }

illustration { Julian Murphy }

Three friends face mid-life crises. Paul is a writer who’s blocked. François has lost his ideals and practices medicine for the money; his wife grows distant, even hostile. The charming Vincent, everyone’s favorite, faces bankruptcy, his mistress leaves him, and his wife, from whom he’s separated, wants a divorce.

Researchers are great at finding correlations between lifestyle and health. Here are four study results you’ve probably seen.

1. People who have a drink or two each day live longer.

2. People who own pets live longer.

3. People who exercise 20 minutes a day live longer.

4. Religious people live longer.

What do all four of those lifestyle choices have in common in terms of a possible root cause explanation? Read the list again and see if you can find it. (…) The pattern I noticed is that each of the lifestyle choices directly lowers stress by improving a person’s attitude.

My hypothesis is that stress is the root cause of most health problems.
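
A toy Python simulation shows how a single hidden factor could produce all four correlations at once. Here a latent “low stress” trait raises both the chance of owning a pet and expected lifespan, with no direct pet effect at all; the effect sizes are invented.

```python
# A toy simulation of confounding: a latent low-stress trait independently
# raises both pet ownership and lifespan, so the two correlate even though
# owning a pet has no direct effect in this model.
import random

random.seed(1)
rows = []
for _ in range(10_000):
    low_stress = random.random() < 0.5
    owns_pet = random.random() < (0.7 if low_stress else 0.3)
    lifespan = random.gauss(82 if low_stress else 76, 5)
    rows.append((owns_pet, lifespan))

with_pet = [l for p, l in rows if p]
without = [l for p, l in rows if not p]
print(sum(with_pet) / len(with_pet))  # pet owners live longer on average...
print(sum(without) / len(without))    # ...even though pets do nothing here
```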

{ Scott Adams | Continue reading }


