Is the answer to this question “no”?


On 25 October 1946, Karl Popper, then at the London School of Economics, was invited to present a paper entitled “Are There Philosophical Problems?” at a meeting of the Cambridge University Moral Sciences Club, which was chaired by Ludwig Wittgenstein.

The two started arguing vehemently over whether there existed substantial problems in philosophy, or merely linguistic puzzles—the position taken by Wittgenstein.

Wittgenstein used a fireplace poker to emphasize his points, gesturing with it as the argument grew more heated. Eventually, Wittgenstein claimed that philosophical problems were nonexistent.

In response, Popper claimed there were many issues in philosophy, such as setting a basis for moral guidelines. Wittgenstein then thrust the poker at Popper, challenging him to give any example of a moral rule. Popper later claimed to have said:

“Not to threaten visiting lecturers with pokers”

{ Wikipedia | Continue reading }

Parnet: Let’s move on to “W”.

Deleuze: There’s nothing in “W”.

Parnet: Yes, there’s Wittgenstein. I know he’s nothing for you, but it’s only a word.

Deleuze: I don’t like to talk about that… For me, it’s a philosophical catastrophe. It’s the very example of a “school”, it’s a regression of all philosophy, a massive regression. […] They imposed a system of terror in which, under the pretext of doing something new, it’s poverty instituted in all grandeur… […] the Wittgensteinians are mean and destructive. […] They are assassins of philosophy.

{ The Deleuze Seminars | Continue reading }

‘She had deceived herself in supposing that she could be whatever she wanted to be.’ —Tolstoy


Sartre, it will be recalled, had asserted a kind of absolute freedom for the conscious human being. It was this claim that Merleau-Ponty disputed. […] If freedom were everywhere, as seemed to be the case in Sartre’s Being and Nothingness, then freedom in effect would be nowhere […] “Free action, in order to be discernible, has to stand out from a background of life from which it is entirely, or almost entirely, absent.” (Merleau-Ponty, Phenomenology of Perception, 1945) […]

While Sartre properly emphasized the subject’s freedom, he distorted the scope of this freedom by rendering it absolute. The subject, argued Merleau-Ponty, always faced a previously established situation, an environment and world not of its own making. Its life, as intersubjectively open, acquired a social atmosphere which it did not itself constitute. Social roles pressed upon the individual as plausible courses for his life to take. Certain modes of behavior became habitual. Probably, this world, these habits, a familiar comportment: probably these would not change overnight. It was unlikely that an individual would suddenly choose to be something radically other than what he had already become. The Sartre of Being and Nothingness underestimated the weight of this realm of relative constraint and habitual inertia.

{ Merleau-Ponty: The Ambiguity of History | Continue reading }

Cognitive science is lacking conceptual tools to describe how an agent’s motivations, as such, can play a role in the generation of its behavior. […] a new kind of non-reductive theory is proposed: Irruption Theory. […] irruptions are associated with increased unpredictability of (neuro)physiological activity, and they should hence be quantifiable in terms of information-theoretic entropy. Accordingly, evidence that action, cognition, and consciousness are linked to higher levels of neural entropy can be interpreted as indicating higher levels of motivated agential involvement.

{ PsyArXiv | Continue reading }
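The abstract’s claim that irruptions “should be quantifiable in terms of information-theoretic entropy” can be illustrated with a minimal sketch: bin a recorded signal and compute its Shannon entropy, so that more variable, less predictable activity scores higher. The binning scheme and the toy signals below are illustrative assumptions, not the paper’s actual method.

```python
import math
from collections import Counter

def shannon_entropy(samples, n_bins=8):
    """Shannon entropy (bits) of a signal after uniform binning."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / n_bins or 1.0  # avoid zero width for constant signals
    bins = [min(int((x - lo) / width), n_bins - 1) for x in samples]
    total = len(samples)
    return -sum(c / total * math.log2(c / total) for c in Counter(bins).values())

# A flat, fully predictable signal has zero entropy; a varying one scores higher.
flat = [0.0] * 100
varied = [math.sin(0.7 * i) + 0.3 * math.sin(2.3 * i) for i in range(100)]
print(shannon_entropy(flat) < shannon_entropy(varied))  # True
```

On this reading, the theory’s prediction is simply that episodes of motivated involvement should coincide with the second kind of signal rather than the first.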

Plato has Socrates describe a group of people who have lived chained to the wall of a cave all of their lives, facing a blank wall. The people watch shadows projected on the wall from objects passing in front of a fire behind them, and give names to these shadows.


In mid-1947, a United States Army Air Forces balloon crashed at a ranch near Roswell, New Mexico. Following wide initial interest in the crashed “flying disc”, the US military stated that it was merely a conventional weather balloon. Interest subsequently waned until the late 1970s, when ufologists began promoting a variety of increasingly elaborate conspiracy theories, claiming that one or more alien spacecraft had crash-landed and that the extraterrestrial occupants had been recovered by the military, which then engaged in a cover-up.

In the 1990s, the US military published two reports disclosing the true nature of the crashed object: a nuclear test surveillance balloon from Project Mogul.

{ Wikipedia | Continue reading }

photo { W. Eugene Smith, Untitled [man holding bottle, S-shaped foam form emerging from it], Springfield, Massachusetts, 1952 }

‘Pittacus said that half was more than the whole.’ —Diogenes Laërtius


The idea that unconscious thought is sometimes more powerful than conscious thought is attractive, and echoes ideas popularized by books such as writer Malcolm Gladwell’s best-selling Blink.

But within the scientific community, ‘unconscious-thought advantage’ (UTA) has been controversial. Now Dutch psychologists have carried out the most rigorous study yet of UTA—and find no evidence for it. […] The report adds to broader concerns about the quality of psychology studies and to an ongoing controversy about the extent to which unconscious thought in general can influence behaviour.

{ Scientific American | Continue reading }

art { Bronzino, Portrait of Lucrezia Panciatichi, 1545 }

Spinoza on why there can only be one substance


It is just possible to discern some points beneath the heated rhetoric in which Patricia Churchland indulges. But none of these points is right. If you hold that “mental processes are actually processes in the brain,” to quote Churchland, then you are committed to the thesis that it is sufficient to understand the mind that one understands the brain, and not merely necessary. This is just the well-known “identity theory” of mind and brain: mental processes are identical to brain processes; and the identity of a with b entails the sufficiency of a for b. To hold the weaker thesis that knowledge of the brain is merely necessary for knowledge of the mind is consistent even with being a heavy-duty Cartesian dualist, since even such a dualist accepts that mind depends causally on brain.

{ Patricia Churchland vs. Colin McGinn/NY Review of Books | Continue reading }

‘a book that is just every time pinochio is eaten by the whale in every iteration of the story, printed on sheets of lead’ —@BAKKOOONN


Looks like they did a pretty simple edit job. I’ve done more retouching on basic portrait work.

Then you’re a crap photographer.

lololol colour balance, brightness, levels, these are all totally normal things to alter.

You can’t get a good shot in the first place, you’re the problem. Photographers did not always have retouching to fall back on, and they got some pretty damned good shots without it. You are advertising that you are unable to do that.

After spending 15 years as a photographer and countless hours in the darkroom, I am authorized to say you don’t have a fucking clue what you’re talking about. Dodge and burn, fool. Dodge and burn.

Argument from authority, which tends to be problematic in the first place (look it up, fool), and also pre-invalidated by the very subject.

Dude, this is Annie Freakin’ Leibovitz. You’re not a better authority than she is, and she screwed up massively here.

Have I personally offended you? I’ve seen you on here before and know you’re not a troll. Are you just a massive fucking asshole, or what is your deal? My point was that these aren’t particularly edited shots and that they were fairly true to the original photos, so the bounty on the pictures did not serve any purpose because there was almost nothing to reveal. And when I said I’ve done more editing on basic portrait work, I clearly, to anyone who isn’t you, was saying that these were edited with such a light touch that even regular old portrait work requires more editing (i.e. not much).

You’re just wrong, that’s all.

{ Jezebel | Continue reading }

I love origami, but HATE paper cuts. Dilemma.


There is no scientific evidence that psychiatric diagnoses such as schizophrenia and bipolar disorder are valid or useful, according to the leading body representing Britain’s clinical psychologists.

In a groundbreaking move that has already prompted a fierce backlash from psychiatrists, the British Psychological Society’s division of clinical psychology (DCP) will on Monday issue a statement declaring that, given the lack of evidence, it is time for a “paradigm shift” in how the issues of mental health are understood. The statement effectively casts doubt on psychiatry’s predominantly biomedical model of mental distress – the idea that people are suffering from illnesses that are treatable by doctors using drugs.

Dr Lucy Johnstone, a consultant clinical psychologist who helped draw up the DCP’s statement, said it was unhelpful to see mental health issues as illnesses with biological causes.

“On the contrary, there is now overwhelming evidence that people break down as a result of a complex mix of social and psychological circumstances – bereavement and loss, poverty and discrimination, trauma and abuse,” Johnstone said. The provocative statement by the DCP has been timed to come out shortly before the release of DSM-5, the fifth edition of the American Psychiatric Association’s Diagnostic and Statistical Manual of Mental Disorders.

The manual has been attacked for expanding the range of mental health issues that are classified as disorders.

{ The Observer | Continue reading }

images { Kazuki Takamatsu | Maja Daniels }

Alone on deck, in dark alpaca, yellow kitefaced


One in four of us will struggle with a mental illness this year, the most common being depression and anxiety. The upcoming publication of the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM) will expand the list of psychiatric classifications, further increasing the number of people who meet criteria for disorder. But will this increase in diagnoses really mean more people are getting the help they need? And to what extent are we pathologising normal human behaviours, reactions and mood swings?

The revamping of the DSM – an essential tool for mental health practitioners and researchers alike, often referred to as the ‘psychiatry bible’ – is long overdue; the previous version was published in 1994. This revision provides an excellent opportunity to scrutinise what qualifies as psychiatric illness and the criteria used to make these diagnoses. But will the experts make the right calls?

The complete list of new diagnoses was released recently and included controversial disorders such as ‘excessive bereavement after a loss’ and ‘internet gaming disorder’. The inclusion of these syndromes raises the important question of what actually qualifies as pathology.

{ King’s Review | Continue reading }

photo { Francesca Woodman, Untitled, Providence, Rhode Island (Self-portrait on the telephone), 1975-1976 }

‘If you always do what interests you, at least one person is pleased.’ –Katharine Hepburn


According to animal rights theory, respecting the interests of animals in this way would mean abolishing the use of them as resources. So we’d all have to become vegans who neither eat animals nor use any other animal products. Vegan advocates face a daunting challenge, though, since most of us have a strong prejudice in favour of humans. This makes it relatively difficult for us to empathise with non-humans, so we are reluctant to give up the spoils of animal domination — meat, eggs, cheese, wool, fur and leather — and exchange them for tofu, pleather (plastic leather) and animal liberation. […]

Suppose that we are doing our usual thing of exploiting animals because they aren’t smart or powerful enough to fight back. An alien species that is smarter and more powerful than us lands on Earth and decides to follow our example by exploiting and killing us. Why shouldn’t aliens use their technological and cerebral edge to turn us into food, clothes, entertainment and research subjects, just as we do to animals now? […]

This argument resonates because most of us have picked up a version of ‘do as you would be done by’ somewhere along the way, no matter how secular our upbringings. Could it be, then, that if we want to be consistent with our own values, the animal activists are right that we need to go vegan? […]

Sure, if we were replaced as the dominant animals on the planet, we’d probably prefer the new ruling species to be vegan. But if aliens with superior technology and minds came here and were determined to treat us the way that vegan humans treat animals on this planet, we’d still be in serious trouble. Veganism would hardly figure as a safeguard of our wellbeing.

Universal veganism wouldn’t stop the road-building, logging, urban and suburban development, pollution, resource consumption, and other forms of land transformation that kills animals by the billions. So what does veganism do exactly? Theoretically, it ends the raising, capture and exploitation of living animals, and it stops a particular kind of killing that many vegans claim is the worst and least excusable: the intentional killing of animals in order to use their bodies as material goods.

Veganism, as a whole, requires us to stop using animals for entertainment, food, pharmaceutical testing, and clothing. If it were to become universal, factory farming and animal testing would end, which would be excellent news for all the animals that we capture or raise for these purposes. But it would accomplish next to nothing for free-roaming wild animals except to stop hunting, which is the least of their problems. […]

Neutrality is impossible in a world with limited resources. Everything we take is a loss for other animals, and since we want to live, enjoy our lives and reproduce (just as they do), we will never stop bypassing animals’ desires for our own, so long as we are here.

{ Rhys Southan/Aeon | Continue reading }

The sadness will last forever


According to an influential and controversial theory, autism is the manifestation of an “Extreme Male Brain.” The reasoning goes something like this: the condition is far more prevalent in males than females; people with autism think in a distinctive style that’s more commonly observed in men than women (that is, high in systematising and low in empathising); and greater testosterone exposure in the womb appears to go hand in hand with an infant exhibiting more autism-like traits in later childhood.

Simon Baron-Cohen, the psychologist who first proposed the theory, always conjectured that there may also be such a thing as an “Extreme Female Brain.” Now in a new paper, a pair of researchers in the USA have made the case that the Extreme Female Brain exists, it’s highly empathic, and it comes with its own problematic consequences, in terms of a fear of negative evaluation by others, and related to that, a greater risk of eating disorders (which are known to be far more prevalent in women than men).

{ BPS | Continue reading }

photo { Andrew Miksys }

If everyone says I am right then who is wrong


We have all had arguments. Occasionally these reach an agreed-upon conclusion, but usually the parties involved either agree to disagree or end up thinking the other party hopelessly stupid, ignorant or irrationally stubborn. Very rarely do people consider the possibility that it is they who are ignorant, stupid, irrational or stubborn, even when they have good reason to believe that the other party is at least as intelligent or educated as themselves.

Sometimes the argument was about something factual where the facts could be easily checked, e.g. who won a certain football match in 1966.

Sometimes the facts aren’t so easily checked because they are difficult to understand but the problem is clear and objective. (…)

Sometimes the facts aren’t as mathematical or logical as the Monty Hall solution. Each party to the argument appeals to ‘facts’ which the other party disputes. (…)

Sometimes the arguments boil down to differences in values. For example, what tastes better, chocolate or vanilla ice cream, or who is prettier, Jane or Mary? In these cases there isn’t really a correct answer – even when a large majority favors a particular alternative. Values also have a strong way of influencing what people accept as evidence or indeed what they perceive at all.

The interesting thing is that when the disagreement isn’t a pure values difference it should always be possible to reach agreement.

{ Garth Zietsman | Continue reading }
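The Monty Hall problem mentioned above is a good example of a dispute that is “clear and objective” yet hard to accept: a short simulation settles it. The door numbering and trial count below are arbitrary choices for illustration.

```python
import random

def monty_hall_trial(switch, rng):
    doors = [0, 1, 2]
    car = rng.choice(doors)
    pick = rng.choice(doors)
    # The host opens a door that hides a goat and wasn't picked.
    opened = rng.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Switch to the one remaining closed door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

rng = random.Random(42)
trials = 100_000
wins = sum(monty_hall_trial(switch=True, rng=rng) for _ in range(trials))
print(f"switching wins {wins / trials:.3f} of the time")  # close to 2/3
```

Staying with the first pick wins only about one time in three, which is exactly the kind of checkable fact that arguments nonetheless fail to resolve.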

He risked a second nervous look at the strong, almost cruel lines of her face


In fact, it was because of my feminism that I wanted to like Erotic Capital: Whether from nature or nurture, women have traditionally excelled at “soft skills” like taking the emotional temperature of others, listening, adjusting one’s behavior to any given situation, and cooperating. These all happen to be skills that, until fairly recently, have been undercompensated in the workplace. In Hakim’s book I anticipated a deftly written argument that would reclaim the value of women’s work so that maybe we’d eventually start paying people in the professions that make use of those skills — say, teaching and nursing — their true value.

That’s the book I wanted to read. The book I actually read was more like this: Men supposedly have higher sex drives than women, creating a “male sex deficit,” which means men are always in a state of wanting more of what women supply. (…) So women who are willing to address that deficit, by either having actual sex with men suffering from it or presenting themselves in an enchanting manner to exploit it, have erotic capital that can be traded for other forms of capital.

Erotic capital has many guises: from “trophy wives” whose skilled self-presentation becomes a part of a man’s public persona, to men or women who style themselves in such a way as to garner attention at their workplace, to women with otherwise limited means who sell their erotic capacity (whether forthrightly, as with sex workers and performers, or more covertly, as with sales jobs) to establish themselves. It’s “sell yourself” meets “sex sells.” What’s most surprising about all this is that Hakim seems to think she’s saying something new. (…)

That she fails to name a single feminist who has actually come out against presenting oneself well (as opposed to presenting oneself as stereotypically feminine) indicates that she’s attacking a straw feminist, not an actual one. Where are the radical feminists urging women to not use their people skills on the job? Who are these radical feminists who blame women for wearing makeup to work instead of directing their critiques at institutions that demand women do so? Hakim falsely asserts that feminists have been fighting for the eradication of charisma and charm instead of the eradication of coyness and the deployment of sex appeal as woman’s strongest — or only — weapons.

{ The New Inquiry | Continue reading }

I’ve got my heart but my heart’s no good


Your book starts with the idea, which was very prominent and commonly believed by a large group of people, that fat–eating fat and fat in your diet, particularly animal fat–isn’t good for you and it leads to heart disease. How did that come to be accepted wisdom in the medical profession?

First, let me say I think it’s still commonly believed by most people, and the latest dietary guidelines are trying to get us to limit our fat intake, and limit our saturated fat intake. This is an hypothesis that grew out of the observations of one very zealous University of Minnesota nutritionist in the 1950s, a fellow named Ancel Keys, who came up with this idea that dietary fat raised cholesterol, and it was raised cholesterol that caused heart disease. At the time there was effectively no meaningful experimental data to support–I’ll rephrase that: There was no experimental data to support that observation. It seemed plausible, though. It seemed plausible, compelling. Keys was a persuasive fellow. And by 1960 or so, the American Heart Association (AHA) got behind it in part because Keys and a fellow-proponent of this hypothesis, a cardiologist from Chicago named Jeremiah Stamler, got onto the AHA, got involved with an ad hoc committee, and were able to publish a report basically saying we should all cut our fat intake. This was 1961. Like I said, no data to support it; no experimental data at all. And once the AHA got behind it, it got a kind of believability. The attitude was: It’s probably right, and all we have to do is test it. Or, we’re going to believe it’s true, but we don’t have the data yet because we haven’t done the tests yet.

And researchers start doing the tests, experimental trials, taking a population. For instance, a famous study at the VA hospital in Los Angeles, where you randomize half of them to a cholesterol-lowering diet which is not actually low in fat, by the way–it’s low in saturated fat and high in polyunsaturated fat. And then the other half of your subjects eat a control diet and you look for heart disease over a number of years and see what happens. And trial after trial was sort of unable to prove the hypothesis true. But the more we studied it, the more people simply believed it must be true. And meanwhile, the AHA is pushing it; other observations are being compiled to support it even though in order to support it you have to ignore the observations that don’t support it. So, you pay attention to the positive evidence, ignore the negative evidence. One Scottish researcher who I interviewed memorably called this “Bing Crosby epidemiology” where you “accentuate the positive, eliminate the negative.” Basic human nature. But this is what happened. And as the AHA gets behind it, the journalists see the AHA as honest brokers of information on this, so they have no reason to doubt the AHA. And the AHA was honest brokers–they just were bad scientists. Or they were not scientists. So, then the press gets behind it, and as the press gets behind it, politicians begin to think maybe we should do something about it, and a Congressional subcommittee gets involved, run by George McGovern, that had originally been founded in the late 1960s to address hunger in America; and they did a lot of good things with school lunch programs and food stamps. And by the mid-1970s they were running out of things to do, so they decided: Since we’ve been dealing with under-nutrition, which is not enough food, they would get involved with over-nutrition, which is a problem of too much food and obesity and diabetes and heart disease. 
And they had one day of hearings, McGovern’s subcommittee, and they assign a former labor reporter from the Providence, RI, Journal to write the first dietary goals for the United States–the first document ever from a government body of any kind suggesting that a low fat diet is a healthy diet. And once McGovern comes out with this document, written by a former labor reporter who knew nothing about nutrition and health; now the USDA feels they have to get involved; and you get this kind of cascade or domino effect. To the point that by 1984 the National Institute of Health (NIH) holds a consensus conference saying that we have a consensus of opinion that we should all eat low fat diets, when they still don’t have a single meaningful experiment showing that a low fat diet or cholesterol lowering diet will reduce the risk of heart disease, or at least make you live longer. Because a few of the studies suggested that you could reduce the risk of heart disease but you would increase cancer. And not one study–the biggest study ever done, which was in Minnesota, actually suggested that if you put people on cholesterol-lowering diets you increase mortality; they had more deaths in the intervention group than the control group. (…)

Japanese women in Japan have very low rates of breast cancer. So when Japanese women come to the United States, by the second generation they have rates of breast cancer as high as any other ethnic group, and one possibility is it’s because they come over here and they eat more fat. But the problem with those observational studies, those comparisons, is you don’t know what you are looking at. So, you focus on fat because that’s what your hypothesis is about–and this is an endemic problem in public health–and you just don’t pay attention to anything else. So, sugar consumption is very low in Japan and very high here. So, maybe it’s sugar that’s the cause of heart disease, or the absence of sugar is the reason the Japanese are so relatively healthy; and if you don’t look at sugar, you don’t know.

{ Gary Taubes/EconTalk | Continue reading }

photo { Robert Mapplethorpe }

‘The desire to die was my one and only concern; to it I have sacrificed everything, even death.’ –Cioran


The most extreme proponent of anti-natalism is probably David Benatar, author of Better Never to Have Been, which maintains that:

(1) Coming into existence is always a serious harm.
(2) It is always wrong to have children.
(3) It is wrong not to abort fetuses at the earlier stages of gestation.
(4) It would be better if, as a result of there being no new people, humanity became extinct.

{ EconLib | Continue reading }

Strawberries for the teeth: nettles and rainwater: oatmeal they say steeped in buttermilk.


“Most people are simply not designed to eat pasta:” evolutionary explanations for obesity in the low-carbohydrate diet movement

Low-carbohydrate diets, notably the Atkins Diet, were particularly popular in Britain and North America in the late 1990s and early 2000s. On the basis of a discourse analysis of bestselling low-carbohydrate diet books, I examine and critique genetic and evolutionary explanations for obesity and diabetes as they feature in the low-carbohydrate literature. Low-carbohydrate diet books present two distinct neo-Darwinian explanations of health and body-weight. First, evolutionary nutrition is based on the premise that the human body has adapted to function best on the diet eaten in the Paleolithic era. Second, the thrifty gene theory suggests that feast-or-famine conditions during human evolutionary development naturally selected for people who could store excess energy as body fat for later use. However, the historical narratives and scientific arguments presented in the low-carbohydrate literature are beset with generalisations, inconsistencies and errors.

{ SAGE | Continue reading }

related { Habit makes bad food too easy to swallow }

Known as ‘Ocean of Wisdom’ and ‘Buddha of Compassion’


Genes determine 50 percent of the likelihood that you will vote. Half of your altruism. One-quarter of your financial decisions. How do we know? Twin studies.

Researchers compare some behavior or trait in a set of pairs of monozygotic (identical) twins and a set of pairs of dizygotic (fraternal) twins. In theory, the siblings in each pair have been raised in the same way—i.e., they have “nurture” in common. But their “natures” might be different: Identical twins come from the same sperm and egg and are assumed to share their entire genomes; fraternal twins match up at only about half their genes. So if the pairs of monozygotic twins tend to share a trait more often than the pairs of dizygotic twins—be it the likelihood they will vote, a tendency toward altruism, or a strategy for managing their financial portfolios—the difference can be chalked up to genetics.

Some call this approach beautiful in its simplicity, but critics say it’s crude, potentially misleading, and based on an antiquated view of genetics. The implications of the studies are also just a little bit dangerous, because they suggest, for example, that some people just aren’t cut out for being nice to one another.

The idea of using twins to study the heritability of traits was the brainchild of the 19th-century British intellectual Sir Francis Galton. He’s not exactly the progenitor you might want for your scientific methods. Galton coined the term “eugenics” and was the inspiration for the push to manipulate human evolution through selective breeding. The movement eventually gave us forced sterilization and the most offensive passage in the history of the U.S. Supreme Court (and that’s really saying something): “Three generations of imbeciles are enough.” (…)

Twin studies rest on two fundamental assumptions: 1) Monozygotic twins are genetically identical, and 2) the world treats monozygotic and dizygotic twins equivalently (the so-called “equal environments assumption”). The first is demonstrably and absolutely untrue, while the second has never been proven. (…)

Twin studies also rely on the false assumption that genetics are constant throughout one’s lifetime. Mutations and environmental factors cause measurable changes to the genome as life progresses. Charney cites the example of exercise, which can accelerate the formation of new neurons and potentially increase genetic variation among individual brain cells. By the time a pair of twins reaches middle age, it’s very difficult to make any assumptions whatsoever about the similarity of their genes.

{ Slate | Continue reading }
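The doubling logic described above (identical twins share essentially all their genes, fraternal twins about half, so twice the gap between their trait correlations gets attributed to genetics) is usually formalised as Falconer’s formula. The article doesn’t name the formula, and the correlation values below are hypothetical; note that the estimate inherits the equal-environments assumption the critics dispute.

```python
def falconer_heritability(r_mz, r_dz):
    """Classic Falconer estimate of broad-sense heritability.

    MZ twins share ~100% of genes, DZ twins ~50%, so doubling the gap
    between the two trait correlations attributes that gap to genetics.
    Valid only under the equal-environments assumption the article questions.
    """
    return 2.0 * (r_mz - r_dz)

# Hypothetical correlations for some trait across twin pairs:
print(falconer_heritability(0.75, 0.50))  # 0.5, i.e. "50% heritable"
```

The fragility the critics point to is visible in the arithmetic: any non-genetic reason MZ pairs correlate more than DZ pairs (being treated more alike, for instance) is silently counted as heritability.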

Well my baby’s so fine, even her car looks good from behind


Intuition is one of those iffy concepts. Its purpose, use, and ontology have been heavily debated in its long and contentious history. Western proverbial jargon illustrates this: we’ve been told that he who hesitates is lost, but shouldn’t we look before we leap? And believe that we shouldn’t judge a book by its cover, but don’t the clothes make the man?

Now, psychology is weighing in. However, in place of armchair-rationality, it is using empirical data to illustrate how we actually behave. With concrete data, it seems like the intuition debate could finally be put to rest. But the opposite has occurred. Psychology has shown both the powers and perils of intuition only to complicate matters. (…)

First, there is a question about perception: How much do we see? (…)

Second, there is a question about judgment and decision-making: Should I go with my gut? Or think things through?

{ Why We Reason | Continue reading }

oil on canvas { Ingres, Comtesse d’Haussonville, 1845 }

Travis Bickle: Now I see this clearly. My whole life is pointed in one direction.


Burundanga is a scary drug. (…) The scale of the problem in Latin America is not known, but a recent survey of emergency hospital admissions in Bogotá, Colombia, found that around 70 per cent of patients drugged with burundanga had also been robbed, and around three per cent sexually assaulted. “The most common symptoms are confusion and amnesia,” says Juliana Gomez, a Colombian psychiatrist. (…)

News reports allude to another, more sinister, effect: that the drug removes free will, effectively turning victims into suggestible human puppets. Although not fully understood by neuroscience, free will is seen as a highly complex neurological ability and one of the most cherished of human characteristics.

{ Wired UK | Continue reading }

increasingly describe our behaviour as the result of a chain of cause-and-effect, in which one physical brain state or pattern of neural activity inexorably leads to the next, culminating in a particular action or decision. With little space for free choice in this chain of causation, the conscious, deliberating self seems to be a fiction. From this perspective, all the real action is occurring at the level of synapses and neurotransmitters.

For now most of us are content to believe that we have control over our own lives, but what would happen if we lost our faith in free will?

{ Susan Sayler | Continue reading }

oil on canvas { Aron Wiesenfeld, The Wedding Party }

I was born in North Dakota a long time ago, see. And now I’m lucky enough to be here with you.


Dr Bryan Caplan, an academic and economist from George Mason University in Virginia, believes parents are working far too hard at bringing up their children. (…)

“Quit fretting over how much TV your kids watch. Don’t force them to do a million activities they hate. Accept that your children’s lives are shaped mostly by their genes and their own choices, not by the sacrifices you make in hopes of turning them into successful adults.”

Caplan points to scientific evidence to support the idea of “serenity parenting.” Research on twins and on adopted children shows, he says, that parents’ long-term effects range from small to zero for a wide range of outcomes such as health and success. (…)

Research also shows that a child’s intelligence can be increased by parental interaction when they are very young, but by the time the child reaches 12 the effect has disappeared.

{ Guardian | Continue reading | More: Bryan Caplan on Parenting | EconTalk | Audio + Transcript }

You murdered the future. That’s negative, Cam. Defeatist. Disappoints me to hear you talk that way.


If there’s one topic likely to generate spit-flecked ire, it is the controversy over the potential health threat posed by cell phone signals.

That debate is likely to flare following the publication today of some new ideas on this topic from Bill Bruno, a theoretical biologist at Los Alamos National Laboratory in New Mexico.

The big question is whether signals from cell phones or cell phone towers can damage biological tissue.

On the one hand, there is a substantial body of evidence in which cell phone signals have supposedly influenced human health and behavior. The list of symptoms includes depression, sleep loss, changes in brain metabolism, headaches and so on.

On the other hand, there is a substantial body of epidemiological evidence that finds no connection between adverse health effects and cell phone exposure.

What’s more, physicists point out that the radiation emitted by cell phones cannot damage biological tissue because microwave photons do not have enough energy to break chemical bonds.

The absence of a mechanism that can do damage means that microwave photons must be safe, they say.

That’s been a powerful argument. Until now.

Today, Bruno points out that there is another way in which photons could damage biological tissue, which has not yet been accounted for.

He argues that the traditional argument only applies when the number of photons is less than one in a volume of space equivalent to a cubic wavelength.

When the density of photons is higher than this, other effects can come into play because photons can interfere constructively.

{ The Physics arXiv Blog | Continue reading }
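Bruno’s cubic-wavelength threshold can be checked with back-of-the-envelope numbers. The sketch below estimates photons per cubic wavelength near an idealised isotropic emitter; the handset power, frequency, and distance are illustrative assumptions on my part, not figures from the paper.

```python
import math

h = 6.626e-34   # Planck constant, J*s
c = 3.0e8       # speed of light, m/s

def photons_per_cubic_wavelength(power_w, freq_hz, distance_m):
    """Photon number density near an isotropic emitter, expressed in
    photons per cubic wavelength (the threshold Bruno's argument uses)."""
    wavelength = c / freq_hz
    photon_energy = h * freq_hz                       # J per photon
    flux = power_w / (4 * math.pi * distance_m**2)    # W/m^2
    number_density = flux / (c * photon_energy)       # photons/m^3
    return number_density * wavelength**3

# Hypothetical handset figures: ~1 W at 1.9 GHz, a few centimetres from the head.
print(photons_per_cubic_wavelength(1.0, 1.9e9, 0.03))  # ~1e15, far above one
```

Near a handset the count comes out many orders of magnitude above one photon per cubic wavelength, which is why, on Bruno’s argument, the single-photon bond-breaking argument alone does not settle the question there.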

photo { George Tice }