
(He looks up. Beside her mirage of datepalms a handsome woman in Turkish costume stands before him. Opulent curves fill out her scarlet trousers and jacket, slashed with gold. A wide yellow cummerbund girdles her. A white yashmak, violet in the night, covers her face, leaving free only her large dark eyes and raven hair.)


Although it might be called a form of lying, most societies have highly valued storytelling. (…)

If “magic” is the creation of subjective realities in the minds of other people, then we moderns have learned how to perform magical incantations on a vast, industrial scale.

And now comes an era when we live immersed in computer-generated “virtual” realities, rendered through lavish games where ersatz selves get to do countless things that our mundane, fleshy selves cannot. Is it any wonder that some people have been talking about a near future when this process may reach its ultimate conclusion? When the denizens of Reality will not be able to verify, by any clear-cut means, that they aren’t living in—or even existing because of—a simulation?

{ David Brin/IEET | Continue reading }

Cry not yet! There’s many a smile to Nondum.


If a tree falls in the forest, and there’s nobody around to hear, does it make a sound?

Of course, the answer depends on how we choose to interpret the use of the word ‘sound’. (…)

Here the word ‘sound’ is used to describe a physical phenomenon: the wave disturbance. But sound is also a human experience, the result of physical signals delivered by human sense organs which are synthesized in the mind as a form of perception.

Now, to a large extent, we can interpret the actions of human sense organs in much the same way we interpret mechanical measuring devices. The human auditory apparatus simply translates one set of physical phenomena into another, leading eventually to stimulation of those parts of the brain cortex responsible for the perception of sound. It is here that the distinction comes. Everything to this point is explicable in terms of physics and chemistry, but the process by which we turn electrical signals in the brain into human perception and experience in the mind remains, at present, unfathomable.

Philosophers have long argued that sound, colour, taste, smell and touch are all secondary qualities which exist only in our minds. We have no basis for our common-sense assumption that these secondary qualities reflect or represent reality as it really is. So, if we interpret the word ‘sound’ to mean a human experience rather than a physical phenomenon, then when there is nobody around there is a sense in which the falling tree makes no sound at all.

This business about the distinction between ‘things-in-themselves’ and ‘things-as-they-appear’ has troubled philosophers for as long as the subject has existed, but what does it have to do with modern physics, specifically the story of quantum theory? In fact, such questions have dogged the theory almost from the moment of its inception in the 1920s.

{ OUP | Continue reading }

photo { Walter Pickering }

‘When we remember that we are all mad, the mysteries disappear and life stands explained.’ –Mark Twain


Belief in a causal loop between the stomach and the mind never disappeared. (…) In the 1880s, Nietzsche diagnosed the whole Western philosophical tradition as a case of indigestion.

{ Review of A Modern History of the Stomach: Gastric Illness, Medicine and British Society, 1800-1950 by Ian Miller | London Review of Books | Continue reading }

But, notwithstanding, we feel and know that we are eternal


Take a look at your hands. In them, you find atoms that once belonged to stars dead more than five billion years ago. Those stars, bigger than our sun, forged much of the chemistry of life during their last moments, before exploding into giant supernovae. The elements they forged spread through the interstellar medium, collecting here and there in self-gravitating hydrogen clouds. Occasionally, these clouds would become unstable under their own gravity and contract. These contracting nebulae gave rise to stars and their orbiting planets, trillions of them in our Milky Way alone.

In at least one of them, elements combined in incredibly complex ways to create living creatures. And of these myriad beings, one developed mind, the ability to sustain complex thoughts and to wonder about its origins.

We are, in a very real sense, self-aware stardust. (…)

There are many gaps to fill in this cosmic narrative, and this is what makes science exciting. As we thrust ahead, we learn more about the universe and our place in it. Perhaps one of the most controversial questions that follows from this discussion concerns our inevitability. Is our existence an inevitable consequence of the laws of Nature? Or are we an accident, and the cosmos could equally well exist without us?

{ Marcelo Gleiser | Continue reading }

If you want breakfast in bed, sleep in the kitchen


Where does psychological health end and mental illness begin? (…)

We are in the midst of a mental illness epidemic. Office visits by children and adolescents treated for bipolar disorder jumped forty-fold from 1994 to 2003. According to the National Institute of Mental Health, nearly half of all Americans have suffered from mental illness—depression, anxiety, even psychosis—at some time in their lives. Is one out of every two Americans mentally ill, or could it be that the system of psychiatric diagnosis too often mistakes the emotional problems of everyday life for psychopathology?

This system is codified in the Diagnostic and Statistical Manual of Mental Disorders, the official handbook of the American Psychiatric Association. Now in its fourth edition, the psychiatric bible, as it is sometimes called, spells out the criteria for over 360 different diagnoses. The DSM serves as the basic text for training practitioners, for insurance companies that rely on it to determine coverage, for social service agencies that use it to assess disabilities, and for the courts, which turn to it to resolve questions of criminal culpability, competence to stand trial, and insanity.

Despite the vast influence of DSM and the best efforts of its architects, the manual has failed to clear up the murky border between health and sickness.

{ The New Republic | Continue reading }

‘From the cradle to the coffin underwear comes first.’ –Bertolt Brecht


The ad industry is quickly evolving into a new industry, one that won’t offer only the limited menu of services that’s attributed to it today. I’m not sure if this new industry should even be called advertising anymore, as the term itself can be an albatross to innovation. But whatever the name is, it’ll be even more exciting and productive than in its current incarnation.

When I invented the 4th Amendment Wear brand for my consultancy, I didn’t realize at the time that it would teach me such an important lesson about where we’re headed. (…)

It’s one thing to create an ad. It’s a whole other beast to invent new technology, create products using that technology, tap into social media, and orchestrate a marketing campaign to reach millions. (…)

While much of 4th Amendment Wear’s success can be attributed to the brand being in the right place at the right time, the truth is, all brands need to be.

{ Tim Geoghegan | Continue reading | 4th Amendment Wear picked up the Gold Lion for Promo & Activation at Cannes. }

Heineken Star Player… (…) Whether this piece of work gets recognized at Cannes this week or not is not relevant or even important. What’s important is that it wasn’t the regular copywriter + art director duo who came up with the Idea. It was a combination of a Storyteller and a Software Developer who conceived it.

{ Rei Inamoto | Continue reading }

Good old world (Waltz)


Look at eggs. Today, a couple of workers can manage an egg-laying operation of almost a million chickens laying 240,000,000 eggs a year. How can two people pick up those eggs or feed those chickens or keep them healthy with medication? They can’t. The hen house does the work—it’s really smart. The two workers keep an eye on a highly mechanized, computerized process that would have been unimaginable 50 years ago.

But should we call this progress? In a sense it sounds like a deal with the devil. Replace workers with machines in the name of lower costs. Profits rise. Repeat. It’s a wonder unemployment is only 9.1%. Shouldn’t the economy put people ahead of profits?

Well, it does. The savings from higher productivity don’t just go to the owners of the textile factory or the mega hen house who now have lower costs of doing business. Lower costs don’t always mean higher profits. Or not for long. Those lower costs lead to lower prices as businesses compete with each other to appeal to consumers.

The result is a higher standard of living for consumers. (…)

Somehow, new jobs get created to replace the old ones. Despite losing millions of jobs to technology and to trade, even in a recession we have more total jobs than we did when the steel and auto and telephone and food industries had a lot more workers and a lot fewer machines.

{ WSJ | Continue reading }

‘To lead the people, walk behind them.’ –Lao Tzu


I am fascinated with zombies. Always have been, but even more so since I took an interest in microbiology. The zombie apocalypse is the best known and best chronicled viral infection which hasn’t happened. But it could happen any day. (…)

One problem that has to do with zombification is the loss of free will. Do zombies have free will? More to the point, do humans have free will?

In two papers entitled A Wasp Manipulates Neuronal Activity in the Sub-Esophageal Ganglion to Decrease the Drive for Walking in Its Cockroach Prey and On predatory wasps and zombie cockroaches — Investigations of “free will” and spontaneous behavior in insects, Ram Gal and Frederic Libersat from Ben Gurion University explore free will in cockroaches. Do cockroaches have free will, or are they just sophisticated automatons? And where do we draw the line between the two?

Gal and Libersat use the following definition for free will: the expression of patterns of “endogenously-generated spontaneous behavior”. That is, a behavior which has a pattern (i.e. not just random fluctuations) and must come from within (i.e. not entirely in response to external stimuli). They cite studies where such behavior — which they define as a “precursor of free will in insects” — is observed. They then show how this behavior is removed from cockroaches when the roaches are attacked by a wasp. (…)

So how about molecular zombies?

{ Byte Size Biology | Continue reading }

In our dreams (writes Coleridge) images represent the sensations we think they cause


Tradition relates that, upon waking, he felt that he had received and lost an infinite thing, something he would not be able to recuperate or even glimpse, for the machinery of the world is much too complex for the simplicity of men.

{ J. L. Borges, Parables, Inferno, I, 32 | Selected Stories & Other Writings by Jorge Luis Borges | PDF }

painting { Henry Fuseli, The Shepherd’s Dream, 1793 }

That day, she was amazed to discover that when he was saying ‘As you wish,’ what he meant was, ‘I love you.’ And even more amazing was the day she realized she truly loved him back.


Inspiration is highly overrated. If you sit around and wait for the clouds to part, it’s not liable to ever happen. More often than not, work is salvation. (…)

The choice not to do something is almost always more interesting than the choice to do something. (…)

If you’re by nature an optimistic person, which I am, that puts you in a lot better position to be lucky.

{ Chuck Close | Continue reading }

Down down with a fuzzbox checking out what it could do


Social hierarchies are quite complicated. In the animal world, hierarchies differ wildly with social context, species, and environmental factors. For some animals, such as bull elephant seals, hierarchies are unstable—individuals spend a relatively short time at the top—and what these alpha males gain in mating preference, they pay for dearly in physical fighting, aggressive confrontation, and threats from rival males. In unstable hierarchies, it’s hard to be at the top.

Most hierarchies are much more stable than the example of the bull elephant seal. For instance, in human social life, social hierarchies are typically stable within a specific context. For example, you and your boss aren’t likely to switch roles halfway through the year. And there is good reason for that. If people were allowed to switch willy-nilly between high and low status roles, it would be hard to know who to turn to for advice or guidance, whose directions should be followed, and who should take responsibility for the group’s failures. (…)

Hierarchies, in this case, are an essential way in which people can organize their social lives around others. So in some instances, having some people with low status and some people with high status is good. (…)

There are, of course, some important caveats to this reasoning. (…) A large history of research on socioeconomic status suggests that being low in socioeconomic status is bad for your health. In short, you die sooner when you are lower in socioeconomic status relative to others.

{ Psych Your Mind | Continue reading }

You’ve got the brawn, I’ve got the brains

Creative cultural transmission as chaotic sampling

First, chaos: some formulas produce unpredictable trajectories, for instance the Lorenz attractor. Here’s what part of a trajectory looks like:

[image: a segment of a Lorenz-attractor trajectory]

You can play with the dynamics using this applet.

The trajectory will not pass through the same point twice, but is not completely random. Lorenz attractors have been used to re-sample sequences in the following way: Imagine you have a sequence of musical notes. Pick a starting point on the Lorenz trajectory and associate each note with successive points. Now you have your notes laid out on the Lorenz attractor so that for any point in the space you can find the closest associated note. If you start on the Lorenz trajectory from a different point, you can sample the notes in a different sequence. This sample will be different from the original, but tends to preserve some of the structure. That is, the Lorenz attractor scrambles the sample, but in a chaotic way, not a random one.
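Here is a minimal Python sketch of that re-sampling scheme (the note list, the forward-Euler integration, and the starting points are illustrative assumptions, not details from the post):

```python
# Sketch of Lorenz-attractor re-sampling. Notes, step size, and starting
# points are invented for illustration.
import numpy as np

def lorenz_trajectory(start, n_steps, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Integrate the Lorenz system with a simple forward-Euler step."""
    pts = np.empty((n_steps, 3))
    x, y, z = start
    for i in range(n_steps):
        x, y, z = (x + dt * sigma * (y - x),
                   y + dt * (x * (rho - z) - y),
                   z + dt * (x * y - beta * z))
        pts[i] = (x, y, z)
    return pts

notes = ["C", "D", "E", "F", "G", "A", "B", "C5"]

# Lay the notes out on the attractor: note i sits at trajectory point i.
anchors = lorenz_trajectory((1.0, 1.0, 1.0), len(notes))

# Re-sample: walk a trajectory from a different starting point and, at each
# step, emit the note whose anchor point is nearest in space.
walk = lorenz_trajectory((-5.0, -5.0, 20.0), len(notes))
resampled = [notes[int(np.argmin(np.linalg.norm(anchors - p, axis=1)))]
             for p in walk]
print(resampled)  # scrambled, but not random: attractor geometry constrains it
```

Because nearby trajectories diverge exponentially, the choice of starting point decides the reordering, while the attractor’s geometry keeps the scramble structured rather than random.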

{ Replicated Typo | Continue reading }

related { Gilles Deleuze, Difference and Repetition, 1968 }

Don’t really talk much (uh huh)


As Zadie Smith argued in a recent New York Review of Books article, Facebook’s private-in-public mode of operation traps us:

It feels important to remind ourselves, at this point, that Facebook, our new beloved interface with reality, was designed by a Harvard sophomore with a Harvard sophomore’s preoccupations. What is your relationship status? (Choose one. There can be only one answer. People need to know.) Do you have a “life”? (Prove it. Post pictures.)

The juvenile mentality built into the medium pushes us to broadcast our private lives and to expect that the details we share will be obsessively dissected. We sense, more or less consciously, that with the capability to broadcast our lives comes an obligation to be entertaining. (…)

Thanks to social media, we are no longer obliged to disguise our voyeuristic impulses. Voyeurism has been culturally legitimized. We can turn to the real events of our lives as we have retold them and to the reactions they have prompted. On the internet, our personal lives have become our television shows.

{ The New Inquiry | Continue reading }

related { Facebook Lost Nearly 6 Million Users in U.S. in May }

And the more I grow, the more you decline


The exceptional thing about Apple is not that it’s the most valuable consumer-facing brand in the world, that it has a market cap larger than Microsoft, or that its stock performance over the past decade bested Google. No, what’s different about Apple is that for a really long time—more than 20 of the 33 years it has been on this earth—it was a niche player. (…)

For the first two-thirds of their existence on this planet, no mammal ever got much bigger or more ambitious than a modern-day rodent. They were niche players. For about 135 of the 200 million years they have existed, mammals lived in the shadow of the dinosaurs.

The dinosaurs that had a death grip on all the available ecological niches in the tech ecosystem—mostly Microsoft—kept out other players through competitive exclusion. Competitive exclusion means, basically, that if someone’s already really good at being, for example, the top predator, there’s no room for anyone else to come in and assume that mantle.
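The logic of competitive exclusion can be made concrete with a toy Lotka-Volterra competition model (my own sketch for illustration; the growth rate, carrying capacity, and competition strength are invented parameters):

```python
# Toy Lotka-Volterra competition between two species sharing one niche.
# All parameters are illustrative assumptions.
def compete(n1, n2, r=1.0, K=100.0, a=1.5, dt=0.01, steps=200_000):
    """With a > 1, pressure from the rival exceeds self-limitation, so
    whichever species starts with the advantage drives the other to zero."""
    for _ in range(steps):
        d1 = r * n1 * (K - n1 - a * n2) / K
        d2 = r * n2 * (K - n2 - a * n1) / K
        n1, n2 = n1 + dt * d1, n2 + dt * d2
    return round(n1, 1), round(n2, 1)

print(compete(90.0, 5.0))   # (100.0, 0.0): the incumbent excludes the newcomer
print(compete(5.0, 90.0))   # (0.0, 100.0): remove the incumbent and roles flip
```

Nothing about the loser’s traits changes between the two runs; only the incumbency does, which is the point of the asteroid (or Internet) that follows.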

Then something crazy happened—an asteroid strike, some climate change—and the dinosaurs, the largest creatures ever to walk the earth, were wiped out. In this analogy, it was the Internet, or the early days of the post-PC transition (e.g., the iPod) that eroded the lead of Microsoft. Of course, Microsoft is still with us, and this transition is far from complete—even the dinosaurs weren’t wiped out overnight, and plenty of their relatives are still with us today.

After that extinction event, all the advantages that Apple—and mammals—already possessed were suddenly useful. A high metabolism, good design, live birth, devices that “just work.” These things were present all along, but they were meaningless as long as something bigger and stronger was present to prevent a growth in the market share of Apple, or an expansion of the niches occupied by mammals.

Taken together, these forces explain how it’s possible that a company, or an entire class of animals, could spend so much time humming along in a tiny niche, and then achieve “overnight” success by deploying traits they had been developing all along.

{ Christopher Mims/Technology Review | Continue reading | Read more: How the Rise of Google’s Chromebook Is Like the Rise of Multicellular Life | And: The first commercial deployment of SPDY (vs HTTP), a protocol designed by Google to make websites faster, launched last week. }

photo { Jesse Chehak }

‘What is a poet? An unhappy man who conceals profound anguish in his heart, but whose lips are so fashioned that when sighs and groans pass over them they sound like beautiful music.’ –Kierkegaard


You talk a lot in the book about the power of creativity and innovation, and I’m very sympathetic to that. But not everybody has a creative job. What about those folks?

Well, first of all, not everybody wants a creative job, either. I’ve met plenty of people in my life who really do want to come in at 9, be given a list of tasks they must accomplish, and be able to go home when the clock strikes 5. They are not really interested in creativity. That’s fine.

Or at least not on the job. From your point of view, what about those who do seek to exercise their creativity and have no outlet for it in their workplace?

As I say in the book, there are certainly plenty of people who have miserable jobs and miserable bosses. They may have miserable home lives as well. But when you look at the data, and if you take the United States in particular and ask, are you satisfied or pretty satisfied with your job, the numbers are pretty high. About two-thirds or more of the working population are satisfied or very satisfied with their jobs. Is it for me to judge whether you should be satisfied? The UPS man is trotting up to my door as we speak. He looks happy; the pay is decent. Would I be happy? My back might ache after a couple of days on the job. The question is, in what kind of society, with what kind of political and economic system, are people given a greater choice among occupations? If you look back through history, we’ve been on a march toward more freedom; in Milton Friedman’s words, the freedom to choose what sort of job to do. Back in olden times, if your father was a blacksmith, you would be a blacksmith, as your forefathers had been. The same was true of many vocations: we all know that families with the surname Smith carry it because their ancestors, going back a thousand years in Europe, were smiths and blacksmiths. We no longer have those constraints, and I think that gives us greater opportunity for creativity, even if there are people, legitimately discontented, who can’t quite figure out how to use it. I live in southern California, and they say every cab driver here has a screenplay in his trunk. He doesn’t want to be driving a cab. He wants Steven Spielberg to call him up for a meeting. That’s probably not going to happen. But is there another society, another economic arrangement, that would make it more likely? I don’t really think so.

{ Todd Buchholz/EconTalk | Continue reading }

There’s a place on my arm where I’ve written his name, next to mine


Not many authors can boast of having written a best-selling pornographic novel, much less one regarded as an erotica classic—but Pauline Réage could. Make that Dominique Aury. No: Anne Desclos.

All three were the same woman, but for years the real name behind the incendiary work was among the best-kept secrets in the literary world. Forty years after the publication of the French novel Histoire d’O, the full truth was finally made public. Even then, some still considered it the most shocking book ever written. When the book came out, its purported author was “Pauline Réage,” widely believed to be a pseudonym. Although shocking for its graphic depictions of sadomasochism, the novel was admired for its reticent, even austere literary style. It went on to achieve worldwide success, selling millions of copies, and has never been out of print. (…)

Desclos (or, rather, Aury, as she became known in her early thirties) was obsessed with her married lover, Jean Paulhan. She wrote the book to entice him, claim him, and keep him—and she wrote it exclusively for him. It was the ultimate love letter. (…)

Story of O, the title of the English edition, is an account of a French fashion photographer, known only as O, who descends into debasement, torment, humiliation, violence, and bondage, all in the name of devotion to her lover, René. Over the course of the novel she is blindfolded, chained, flogged, pierced, branded, and more.

{ Guernica | Continue reading }

photo { J. Kursel }

‘There is no individual thing in nature, than which there is not another more powerful and strong. Whatsoever thing be given, there is something stronger whereby it can be destroyed.’ –Spinoza


Is Aging a Disease?

One argument against treating aging is that it is not a disease. To an extent, this view stems from the fact that the word aging refers to different things. One is the experience of the passage of time. Another is the acquisition of experience and wisdom that can come from living long. To avoid confusion with these benign aspects, biologists use the term “senescence” for the increasing frailty and risk of disease and death that come with aging. Put more precisely, then, the question at hand is this: Is human senescence a disease?

One approach to defining illness has been to compare a given condition to good health. Is someone’s condition typical of a person of a given gender or age? For instance, the possession of ovaries is healthy for a woman, but not a man. Likewise, one might consider muscle wasting to indicate serious disease in a 20-year-old, but not a 90-year-old. Given that everyone who lives long enough will eventually experience senescence, I can appreciate the view that it is a normal condition and therefore not pathological. Still, from my perspective as someone working on the biological basis of aging, it is hard not to see it as a disease.

Senescence is a process involving dysfunction and deterioration at the molecular, cellular and physiological levels. This endemic malfunction causes diseases of aging. Even if one ages well, escaping the ravages of cancer or type II diabetes, one still dies in the end, and one dies of something. Moreover, in evolutionary terms, aging appears to serve no real purpose, meaning it does not contribute to evolutionary fitness. Why, then, has aging evolved?

The main theory dates back to the 1930s and was developed by J. B. S. Haldane and, later, Peter Medawar—both of University College London—and by the American biologist George C. Williams of the State University of New York, Stony Brook. It argues that aging reflects the decline in the force of natural selection against mutations that exert harmful effects late in life. An inherited mutation causing severe pathology in childhood will reduce the chances of reproduction and so disappear from the population. By contrast, another mutation with similar effects—but which surfaces after a person’s reproductive years—is more likely to persist. Natural selection can even favor mutations that enhance fitness early in life but reduce late-life health. This is because the early-life effects of genes have much stronger effects on fitness. Consequently, populations accumulate mutations that exert harmful effects in late life, and the sum of these effects is aging. Here evolutionary biology delivers a grim message about the human condition: Aging is essentially a multifactor genetic disease. It differs from other genetic diseases only in that we all inherit it. This universality does not mean that aging is not a disease. Instead, it is a special sort of disease.
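A toy calculation (my illustration; the flat survival rate, the reproductive window, and one expected offspring per fertile year are invented assumptions) shows why selection stops “seeing” mutations that act late:

```python
# Toy model of the declining force of selection (Haldane/Medawar/Williams).
# All numbers are assumptions for illustration, not data from the article.
SURVIVAL_PER_YEAR = 0.90      # assumed flat annual survival probability
FERTILE_AGES = range(15, 45)  # assumed reproductive window

def lifetime_reproduction(lethal_age=None):
    """Expected offspring for a carrier of a mutation lethal at lethal_age."""
    total, alive = 0.0, 1.0   # alive = probability of reaching this age
    for age in range(90):
        if lethal_age is not None and age >= lethal_age:
            alive = 0.0
        if age in FERTILE_AGES:
            total += alive    # one expected offspring per fertile year survived
        alive *= SURVIVAL_PER_YEAR
    return total

baseline = lifetime_reproduction()
for lethal_age in (10, 30, 50, 70):
    s = 1 - lifetime_reproduction(lethal_age) / baseline
    print(f"lethal at age {lethal_age:2d}: selection against it ~ {s:.2f}")
# lethal at 10 -> 1.00 (quickly purged); lethal at 30 -> ~0.17;
# lethal at 50 or 70 -> 0.00, so such mutations drift and accumulate.
```

A mutation that kills at 10 erases all of its carrier’s reproduction and is purged; one that kills at 50 or 70 erases none, so selection cannot remove it, and the accumulated burden of such mutations is, on this theory, what we call aging.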

{ American Scientist | Continue reading }

photo { Noritoshi Hirakawa }

Each thing, as far as it can by its own power, strives to persevere in its being


Conatus (Latin for effort; endeavor; impulse, inclination, tendency; undertaking; striving) is a term used in early philosophies of psychology and metaphysics to refer to an innate inclination of a thing to continue to exist and enhance itself. This “thing” may be mind, matter or a combination of both.

Over the millennia, many different definitions and treatments have been formulated by philosophers. Seventeenth-century philosophers René Descartes, Baruch Spinoza, and Gottfried Leibniz, and their Empiricist contemporary Thomas Hobbes made important contributions.

The history of the term conatus is that of a series of subtle tweaks in meaning and clarifications of scope developed over the course of two and a half millennia. Successive philosophers to adopt the term put their own personal twist on the concept, each developing the term differently such that it now has no concrete and universally accepted definition.

{ Wikipedia | Continue reading }

image { 3D Simulation of Gravitational Waves produced by merging black holes, representing the largest astrophysical calculation ever performed on a NASA supercomputer. }

And it’s out where your memories lie, well the road’s out before me


Our house in the western Catskills overlooks the Pepacton Reservoir, a 20-mile ribbon of water between Margaretville and Downsville. Maps on the Internet, depending on their scale and detail, will show you where the reservoir is in relation to nearby towns and roads. What they won’t show you, although every resident of the area knows about them, are the four towns — Arena, Shavertown, Union Grove and Pepacton — that were flooded in the middle ‘50s so that the reservoir could be constructed. (Today, after more than 50 years, resentment against New York City remains strong.) (…)

An apparently empirical project like geography is, and always has been, interpretive through and through. “The map has always been a political agent” (Lize Mogel), has always had a “generative power” (Emily Eliza Scott), and that power can only be released and studied by those who approach their work in the manner of literary critics.

{ NY Times | Continue reading }

related { Some maps contain deliberate errors or distortions, either as propaganda or as a “watermark” helping the copyright owner identify infringement if the error appears in competitors’ maps. The latter often come in the form of nonexistent, misnamed, or misspelled trap streets. | Wikipedia }

If the way which I have pointed out as leading to this result seems exceedingly hard, it may nevertheless be discovered. Needs must it be hard, since it is so seldom found.


What is a person? What is a human being? What is consciousness? There is a tremendous amount of enthusiasm at the moment about these questions.

They are usually framed as questions about the brain, about how the brain makes consciousness happen, how the brain constitutes who we are, what we are, what we want—our behavior. The thing I find so striking is that, at the present time, we actually can’t give any satisfactory explanations about the nature of human experience in terms of the functioning of the brain.

What explains this is really quite simple. You are not your brain. You have a brain, yes. But you are a living being that is connected to an environment; you are embodied, and dynamically interacting with the world. We can’t explain consciousness in terms of the brain alone because consciousness doesn’t happen in the brain alone.

In many ways, the new thinking about consciousness and the brain is really just the old-fashioned style of traditional philosophical thinking about these questions, presented in a new neuroscience package. People interested in consciousness have tended to make certain assumptions, to take certain things for granted. They take for granted that thinking, feeling, wanting, consciousness in general, is something that happens inside of us. They take for granted that the world, and the rest of our body, matters for consciousness only as a source of causal impingement on what is happening inside of us. On this view, action has no intimate connection to thought, feeling, consciousness, and experience. They tend to assume that we are fundamentally intellectual—that the thing inside of us which thinks and feels and decides is, in its basic nature, a problem solver, a calculator, a something whose nature is to figure out what there is and what we ought to do in light of what is coming in.

We should reject the idea that the mind is something inside of us that is basically just a calculating machine. There are different reasons to reject this. But one is, simply put: there is nothing inside us that thinks and feels and is conscious. Consciousness is not something that happens in us. It is something we do.

{ Alva Noë/Edge | Continue reading }

photo { William Klein }


