nswd

ideas

And it’s out where your memories lie, well the road’s out before me

Our house in the western Catskills overlooks the Pepacton Reservoir, a 20-mile ribbon of water between Margaretville and Downsville. Maps on the Internet, depending on their scale and detail, will show you where the reservoir is in relation to nearby towns and roads. What they won’t show you, although every resident of the area knows about them, are the four towns — Arena, Shavertown, Union Grove and Pepacton — that were flooded in the mid-1950s so that the reservoir could be constructed. (Today, after more than 50 years, resentment against New York City remains strong.) (…)

An apparently empirical project like geography is, and always has been, interpretive through and through. “The map has always been a political agent” (Lize Mogel), has always had a “generative power” (Emily Eliza Scott), and that power can only be released and studied by those who approach their work in the manner of literary critics.

{ NY Times | Continue reading }

related { Some maps contain deliberate errors or distortions, either as propaganda or as a “watermark” helping the copyright owner identify infringement if the error appears in competitors’ maps. The latter often come in the form of nonexistent, misnamed, or misspelled trap streets. | Wikipedia }

If the way which I have pointed out as leading to this result seems exceedingly hard, it may nevertheless be discovered. Needs must it be hard, since it is so seldom found.

What is a person? What is a human being? What is consciousness? There is a tremendous amount of enthusiasm at the moment about these questions.

They are usually framed as questions about the brain, about how the brain makes consciousness happen, how the brain constitutes who we are, what we are, what we want—our behavior. The thing I find so striking is that, at the present time, we actually can’t give any satisfactory explanations about the nature of human experience in terms of the functioning of the brain.

What explains this is really quite simple. You are not your brain. You have a brain, yes. But you are a living being that is connected to an environment; you are embodied, and dynamically interacting with the world. We can’t explain consciousness in terms of the brain alone because consciousness doesn’t happen in the brain alone.

In many ways, the new thinking about consciousness and the brain is really just the old-fashioned style of traditional philosophical thinking about these questions but presented in a new, neuroscience package. People interested in consciousness have tended to make certain assumptions, take certain things for granted. They take for granted that thinking, feeling, wanting, consciousness in general, is something that happens inside of us. They take for granted that the world, and the rest of our body, matters for consciousness only as a source of causal impingement on what is happening inside of us. On this view, action has no intimate connection to thought, feeling, consciousness, and experience. They tend to assume that we are fundamentally intellectual—that the thing inside of us which thinks and feels and decides is, in its basic nature, a problem solver, a calculator, a something whose nature is to figure out what there is and what we ought to do in light of what is coming in.

We should reject the idea that the mind is something inside of us that is basically just a calculating machine. There are different reasons to reject this. But one is, simply put: there is nothing inside us that thinks and feels and is conscious. Consciousness is not something that happens in us. It is something we do.

{ Alva Noë/Edge | Continue reading }

photo { William Klein }

‘Maybe he hasn’t called because he’s washing his hands.’ –Blacky II

Why did I self-publish?

Advances are quickly going to zero. Margins are going to zero for publishers. There’s no financial benefit to going with a publisher if advances are going to zero and royalties are a few percentage points. The publishing industry does minimal editing. The time between book acceptance and release is too long (often a year or more). That’s insane and makes zero sense in a print-on-demand world when Kindle versions are outselling print versions.

Most importantly, the book industry sells “books”. What they need to do is sell their “authors”. Authors now are brands, they are businesses, they are mini-empires. Publishers do nothing to help 95% of their authors build their platforms and their own brands. This would increase author loyalty and make the lack of a meaningful advance almost worth it.

{ James Altucher | Continue reading }

The machines clanked in threefour time. Thump, thump, thump.

…the Dunning-Kruger Effect — our incompetence masks our ability to recognize our incompetence. But just how prevalent is this effect? In search of more details, I called David Dunning at his offices at Cornell:

DAVID DUNNING:  Well, my specialty is decision-making. How well do people make the decisions they have to make in life? And I became very interested in judgments about the self, simply because, well, people tend to say things, whether it be in everyday life or in the lab, that just couldn’t possibly be true. And I became fascinated with that. Not just that people said these positive things about themselves, but they really, really believed them. Which led to my observation: if you’re incompetent, you can’t know you’re incompetent.

{ Errol Morris/NY Times | Continue reading }

photo { Roger Ballen }

Tell me what’s on your mind when you’re alone

What Thomas Young considered his greatest achievement (and he had a few) was overthrowing Newton’s century-old notions of light. In their place, he argued that light was not made up of particles, but was instead a wave, quite like the ripples on the surface of water.

At first, he met with huge resistance to his ideas. But in 1803, Young convinced his skeptics with a simple, game-changing experiment. (…)

So Young performed this experiment with light. To everyone’s surprise (but his), he found that light doesn’t act like the bullets of a machine gun. What he saw on the screen was an interference pattern – alternating bands of light and dark. The interpretation was unambiguous – light behaves like a wave, not like a bunch of particles. (…)

And so the wave theory of light took over for the next century, until no less a figure than Albert Einstein came onto the scene. In his amazing year 1905, Einstein explained a famous experiment – the photoelectric effect – by invoking the idea that light is made of particles that carry energy. He would later win the Nobel Prize for this achievement. Somewhat embarrassed by Newton’s corpuscles, physicists rebranded these particles with a new name – photons.

And soon after, engineers were building devices that could make noises whenever they detected light. Rather than hearing some kind of continuous splish-splosh that you may expect from a wave, they would hear a sound like individual raindrops – tick, tick, tick. Each of those ticks was an individual photon striking the detector.

Now, if you’re with me so far, this is a point where you can stop and scratch your head. On the one hand, Young proved that light is a wave. But then you have Einstein and these detectors. They’re practically screaming in our ears that light is a particle. So what’s really going on here?

This is the dilemma that gave rise to quantum mechanics – depending on what experiment you do, light seems to behave like a wave, or like a particle. It turns out, as physicists later discovered, that this is true for any kind of stuff, not just light.

{ Empirical Zeal | Continue reading }
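The alternating bands Young saw fall out of simple wave arithmetic: light from the two slits travels slightly different distances to each point on the screen, reinforcing where the path difference is a whole number of wavelengths and cancelling where it is half a wavelength. A minimal sketch of the resulting fringe pattern, in the small-angle approximation and with illustrative numbers (not Young’s actual apparatus):

```python
import math

# Two-slit interference, small-angle approximation:
# relative intensity on the screen is I(y) = cos^2(pi * d * y / (wavelength * L)).
wavelength = 500e-9  # metres (green light; illustrative)
d = 0.1e-3           # slit separation, metres (illustrative)
L = 1.0              # distance from slits to screen, metres (illustrative)

def intensity(y):
    """Relative brightness at height y (metres) on the screen, from 0 to 1."""
    phase = math.pi * d * y / (wavelength * L)
    return math.cos(phase) ** 2

fringe = wavelength * L / d    # spacing between bright bands (5 mm here)
print(intensity(0.0))          # centre of the screen: a bright band
print(intensity(fringe / 2))   # half a fringe away: a dark band
print(intensity(fringe))       # one full fringe away: bright again
```

A stream of particles fired through two slits would pile up behind each slit instead; the cosine-squared fringes are what mark light as a wave in this experiment.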

‘Whenever in my dreams, I see the dead, they always appear silent.’ –Nabokov

Seventy years ago, in 1940, a popular science magazine published a short article that set in motion one of the trendiest intellectual fads of the 20th century. (…) Benjamin Lee Whorf let loose an alluring idea about language’s power over the mind, and his stirring prose seduced a whole generation into believing that our mother tongue restricts what we are able to think.

In particular, Whorf announced, Native American languages impose on their speakers a picture of reality that is totally different from ours, so their speakers would simply not be able to understand some of our most basic concepts, like the flow of time or the distinction between objects (like “stone”) and actions (like “fall”). For decades, Whorf’s theory dazzled academics and the general public alike. In his shadow, others made a whole range of imaginative claims about the supposed power of language, from the assertion that Native American languages instill in their speakers an intuitive understanding of Einstein’s concept of time as a fourth dimension to the theory that the nature of the Jewish religion was determined by the tense system of ancient Hebrew.

Eventually, Whorf’s theory crash-landed on hard facts and solid common sense, when it transpired that there had never actually been any evidence to support his fantastic claims. The reaction was so severe that for decades, any attempts to explore the influence of the mother tongue on our thoughts were relegated to the loony fringes of disrepute. But 70 years on, it is surely time to put the trauma of Whorf behind us. And in the last few years, new research has revealed that when we learn our mother tongue, we do after all acquire certain habits of thought that shape our experience in significant and often surprising ways.

Whorf, we now know, made many mistakes. The most serious one was to assume that our mother tongue constrains our minds and prevents us from being able to think certain thoughts. (…)

Consider this example. Suppose I say to you in English that “I spent yesterday evening with a neighbor.” You may well wonder whether my companion was male or female, but I have the right to tell you politely that it’s none of your business. But if we were speaking French or German, I wouldn’t have the privilege to equivocate in this way, because I would be obliged by the grammar of the language to choose between voisin or voisine; Nachbar or Nachbarin. These languages compel me to inform you about the sex of my companion whether or not I feel it is remotely your concern. This does not mean, of course, that English speakers are unable to understand the differences between evenings spent with male or female neighbors, but it does mean that they do not have to consider the sexes of neighbors, friends, teachers and a host of other persons each time they come up in a conversation, whereas speakers of some languages are obliged to do so.

On the other hand, English does oblige you to specify certain types of information that can be left to the context in other languages. If I want to tell you in English about a dinner with my neighbor, I may not have to mention the neighbor’s sex, but I do have to tell you something about the timing of the event: I have to decide whether we dined, have been dining, are dining, will be dining and so on. Chinese, on the other hand, does not oblige its speakers to specify the exact time of the action in this way, because the same verb form can be used for past, present or future actions. Again, this does not mean that the Chinese are unable to understand the concept of time. But it does mean they are not obliged to think about timing whenever they describe an action. (…)

In a different experiment, French and Spanish speakers were asked to assign human voices to various objects in a cartoon. When French speakers saw a picture of a fork (la fourchette), most of them wanted it to speak in a woman’s voice, but Spanish speakers, for whom el tenedor is masculine, preferred a gravelly male voice for it. More recently, psychologists have even shown that “gendered languages” imprint gender traits for objects so strongly in the mind that these associations obstruct speakers’ ability to commit information to memory.

Of course, all this does not mean that speakers of Spanish or French or German fail to understand that inanimate objects do not really have biological sex. Nonetheless, once gender connotations have been imposed on impressionable young minds, they lead those with a gendered mother tongue to see the inanimate world through lenses tinted with associations and emotional responses that English speakers — stuck in their monochrome desert of “its” — are entirely oblivious to.

{ NY Times | Continue reading | Thanks Tim }

‘To study the meaning of man and of life — I am making significant progress here.’ –Dostoevsky

In the last decade, human vanity has taken a major hit. Traits once thought to be uniquely, even definingly human have turned up in the repertoire of animal behaviors: tool use, for example, is widespread among non-human primates, at least if a stick counts as a tool. We share moral qualities, such as a capacity for altruism, with dolphins, elephants and others; our ability to undertake cooperative ventures, such as hunting, can also be found among lions, chimpanzees and sharks. Chimps are also capable of “culture,” in the sense of socially transmitted skills and behaviors peculiar to a particular group or band. Creatures as unrelated as sea gulls and bonobos indulge in homosexuality and other nonreproductive sexual activities. There are even animal artists: male bowerbirds, who construct complex, obsessively decorated structures to attract females; dolphins who draw dolphin audiences to their elaborately blown sequences of bubbles. Whales have been known to enact what look, to human divers, very much like rituals of gratitude. (…)

Bit by bit, we humans have had to cede our time-honored position at the summit of the “great chain of being” and acknowledge that we share the planet — not very equitably or graciously of course — with intelligent, estimable creatures worthy of moral consideration. (…)

As Paul Trout makes clear in his fascinating Deadly Powers: Animal Predators and the Mythic Imagination, the important distinction, from a human point of view, is not between animals and humans, but between animals that we eat and animals that eat us.

Trout’s book is the most ambitious survey to date of the relationship between humans and the wild carnivores that have preyed on them as long as Homo sapiens, or our hominid ancestors, have existed. (…)

It is the “other animals” who of course have paid the highest price for the human ascent to the top of the food chain. In no small part because of our own terrifying prehistory as prey, humans could not seem to stop killing, as if we had to keep reassuring ourselves, over and over, that we had indeed evolved from prey to predator. Many explanations have been offered for the massive extinctions of large animals (megafauna) that began about 12,000 years ago — viruses, meteor hits, climate changes — but the soundest hypothesis is summarized by the word “overkill.” Humans killed what they needed to eat and then killed much more, eliminating animal populations as they spread out over the globe on foot or by sea. In the Americas, the Pacific Islands, and Australia, megafaunal extinctions follow closely upon the arrival of humans.

{ LA Review of Books | Continue reading }

related { I discovered my incredible strength at the age of 13, and, almost immediately afterwards, promised myself that, one of these days, I would fight a lion. If he chooses to withdraw, or surrender, and lets me tie him up, then I will not kill him and the fight will end. But if it comes down to either me or him, I will have to kill him. If this battle does not get the positive reaction I’m expecting, then I will be forced to leave the country and go somewhere where they can appreciate a man like me: the strongest man in the world. | Foreign Policy | Continue reading }

related { Top 10 Heroic Animals }

‘I suppose every child has a world of his own — and every man, too, for the matter of that. I wonder if that’s the cause for all the misunderstanding there is in life?’ –Lewis Carroll

Mr. Kim belongs to an elite cadre of “puzzle masters” who spend their days building logical mazes and brain teasers. In more than 20 years as a professional puzzle designer, Mr. Kim has worked on everything from word, number and logic puzzles to toys. (…)

Mr. Kim defines puzzles as “problems that are fun to solve and have a right answer,” as opposed to everyday problems like traffic, which, he noted, “are not very well-designed puzzles.” (…)

He likes changing locations frequently throughout the day, moving from his office to the kitchen table, then to the library or a coffee shop. Each time he changes surroundings, he tackles the problem anew. “I often find that the amount of progress I make is proportional to the number of times I start,” he said. He’s constantly doodling and carries a 3-by-5-inch notebook to record ideas, notes and images.

He borrows ideas for puzzles from architecture, music, science and art (favorite designers include Milton Glaser and Charles and Ray Eames). Occasionally, he gets ideas from dreams. After he dreamed he was surfing on waves of color, Mr. Kim had an idea for a computer game whose goal is to stay on the red wave. (…)

He defines a good puzzle as one that gets people to look at the problem in a new or counterintuitive way.

{ WSJ | Continue reading }

‘The history of the world is the history of a privileged few.’ –Henry Miller

The nothing to hide argument is one of the primary arguments made when balancing privacy against security. In its most compelling form, it is an argument that the privacy interest is generally minimal to trivial, thus making the balance against security concerns a foreordained victory for security. Sometimes the nothing to hide argument is posed as a question: “If you have nothing to hide, then what do you have to fear?” Others ask: “If you aren’t doing anything wrong, then what do you have to hide?”

In this essay, I will explore the nothing to hide argument and its variants in more depth. Grappling with the nothing to hide argument is important, because the argument reflects the sentiments of a wide percentage of the population. In popular discourse, the nothing to hide argument’s superficial incantations can readily be refuted. But when the argument is made in its strongest form, it is far more formidable.

In order to respond to the nothing to hide argument, it is imperative that we have a theory about what privacy is and why it is valuable.

{ Daniel J. Solove, “I’ve Got Nothing to Hide” and Other Misunderstandings of Privacy | SSRN | Continue reading }

‘Genius is the recovery of childhood at will.’ –Arthur Rimbaud

The great thing about cities, the thing that is amazing about cities is as they grow, so to speak, their dimensionality increases. That is, the space of opportunity, the space of functions, the space of jobs just continually increases. And the data shows that. If you look at job categories, it continually increases. I’ll use the word “dimensionality.”  It opens up. And in fact, one of the great things about cities is that it supports crazy people. You walk down Fifth Avenue, you see crazy people. There are always crazy people. Well, that’s good. Cities are tolerant of extraordinary diversity.

This is in complete contrast to companies. The Google boys in the back garage, so to speak, with ideas of the search engine, were no doubt promoting all kinds of crazy ideas and maybe having even crazy people around them. Well, Google is a bit of an exception, because it still tolerates some of that. But most companies start out probably with some of that buzz. But the data indicates that at about 50 to a hundred employees that buzz starts to stop. A company that was more multi-dimensional, more evolved, becomes uni-dimensional. It closes down.

Indeed, if you go to General Motors or you go to American Airlines or you go to Goldman Sachs, you don’t see crazy people.

{ Geoffrey West/Edge | Continue reading }

related { The Pierre hotel has suspended a supervisor and agreed to equip all room attendants with panic buttons in the wake of two alleged sexual attacks on Manhattan hotel housekeepers in about as many weeks. }

‘War is over… If you want it.’ –John Lennon

Very few studies have used an evolutionary approach to help understand fictional heroes, and none have directly addressed how the sex of the author might influence the characteristics of the hero. If evolved behavioral differences in the sexes have influenced the subconscious tendencies of human males and females, these differences should be reflected in the fictional characters each creates. Based on sexual selection and inclusive fitness theory, I predicted that females will be more likely than males to create heroes who have family members, and that family members will be more important in the plotlines of female-generated stories. Information collected from twenty children’s fantasy novels published after 1994 displays the predicted trends.

In addition, male authors often created parents who were problematic (insane, irresponsible, or evil), something the female authors never did.

{ Victoria Ingalls, The Hero’s relationship to family: Sex differences in hero characteristics, 2010 | Continue reading | PDF }

artwork { Nina Hoffmann }

Hey bro I got somethin’ that’ll blow ya mind

There is an extensive literature dealing with English imperative sentences.

As is well known, these sentences have no overt grammatical subject: (1) Close the door. There is general agreement among scholars that these sentences have deep structures involving an underlying subject, “you,” which is deleted by a transformation.

There is a widespread misconception that utterances such as (2) Fuck you, which also appear to have the form of a transitive verb followed by a noun phrase and preceded by no overt subject, are also imperative. This paper will study the syntax of sentences such as (2).

{ Doug Lemoine | Continue reading }

‘All of this beauty of old times is an effect of and not a reason for nostalgia. I know very well that it is our own invention. But it’s quite good to have this kind of nostalgia, just as it’s good to have a good relationship with your own childhood if you have children. It’s a good thing to have nostalgia toward some periods on the condition that it’s a way to have a thoughtful and positive relation to your own present.’ –Michel Foucault

{ Jeff Luker }

Omnia per omnia

Let me toss out the idea that, as our markets discover and respond to what consumers most want, our technology has become extremely adept at creating products that correspond to our fantasy ideal of an erotic relationship, in which the beloved object asks for nothing and gives everything, instantly, and makes us feel all powerful, and doesn’t throw terrible scenes when it’s replaced by an even sexier object and is consigned to a drawer.

{ NY Times | Continue reading }

The ones we rub under our arms

If art is a kind of lying, then lying is a form of art, albeit of a lower order—as Oscar Wilde and Mark Twain have observed. Both liars and artists refuse to accept the tyranny of reality. Both carefully craft stories that are worthy of belief—a skill requiring intellectual sophistication, emotional sensitivity and physical self-control (liars are writers and performers of their own work). Such parallels are hardly coincidental, as I discovered while researching my book on lying. Indeed, lying and artistic storytelling spring from a common neurological root—one that is exposed in the cases of psychiatric patients who suffer from a particular kind of impairment.

A case study published in 1985 by Antonio Damasio, a neurologist, tells the story of a middle-aged woman with brain damage caused by a series of strokes. She retained cognitive abilities, including coherent speech, but what she actually said was rather unpredictable. Checking her knowledge of contemporary events, Damasio asked her about the Falklands War. This patient spontaneously described a blissful holiday she had taken in the islands, involving long strolls with her husband and the purchase of local trinkets from a shop. Asked what language was spoken there, she replied, “Falklandese. What else?”

In the language of psychiatry, this woman was ‘confabulating’. Chronic confabulation is a rare type of memory problem that affects a small proportion of brain-damaged people. In the literature it is defined as “the production of fabricated, distorted or misinterpreted memories about oneself or the world, without the conscious intention to deceive”. Whereas amnesiacs make errors of omission—there are gaps in their recollections they find impossible to fill—confabulators make errors of commission: they make things up. Rather than forgetting, they are inventing.

{ The Economist | Continue reading }

The true and the false

Both neuroscience and social science suggest that we are more optimistic than realistic. On average, we expect things to turn out better than they wind up being. People hugely underestimate their chances of getting divorced, losing their job or being diagnosed with cancer; expect their children to be extraordinarily gifted; envision themselves achieving more than their peers; and overestimate their likely life span (sometimes by 20 years or more).

The belief that the future will be much better than the past and present is known as the optimism bias. It abides in every race, region and socioeconomic bracket. (…)

A growing body of scientific evidence points to the conclusion that optimism may be hardwired by evolution into the human brain. (…)

Scientists who study memory proposed an intriguing answer: memories are susceptible to inaccuracies partly because the neural system responsible for remembering episodes from our past might not have evolved for memory alone. Rather, the core function of the memory system could in fact be to imagine the future — to enable us to prepare for what has yet to come. The system is not designed to perfectly replay past events, the researchers claimed. It is designed to flexibly construct future scenarios in our minds. As a result, memory also ends up being a reconstructive process, and occasionally, details are deleted and others inserted.

{ Time | Continue reading }

Our sun formed 4.5 billion years ago, but it’s got 6 billion more before the fuel runs out

{ The ten-member teenage rap collective Odd Future Wolf Gang Kill Them All, led by Tyler the Creator, and including Earl Sweatshirt, Hodgy Beats, Mike G, Left Brain, Domo Genesis, Syd the Kid, Frank Ocean, Taco Bennet, and Jasper Dolphin, with their propensity for punk-inspired beats and obscene lyrics. | Malcolm Harris/The New Inquiry | full story | Plus: Where’s Earl? | The New Yorker | Thanks Daniel }

‘All beginnings are involuntary.’ –Pessoa

{ Pessoa, ‪The book of disquiet‬ | Continue reading }

It’s over. Thanks so much, it was lovely. I’ll get the rest of my stuff later.

Have too many foreclosed properties? Why not give them away?

That’s what Bank of America plans to do with as many as 150 vacant and abandoned properties in and around Chicago through a new “collaboration” with the city that’s intended to address the problem of abandoned properties.

{ Bankrate | Continue reading }

The psychologist Jens Rasmussen talks about three kinds of error: slips, mistakes, and violations. So, a slip is: you just do something you immediately realize wasn’t what you meant to do: pushed the wrong button, locked yourself out of your house, forgot your car keys. Mistakes are things you do because your view of the world is wrong. So, you took out a subprime mortgage and bought a house because you thought house prices would continue to rise and you would be able to remortgage your house. Then there’s a violation: something you know is against the rules but you did it anyway, for whatever reason. So, maybe you falsified your income.

{ Tim Harford/EconTalk }

photo { Tim Geoghegan }

‘Beyond a certain point there is no return. This point has to be reached.’ –Kafka

In Excavating Kafka, James Hawes tackles some of the myths that have built up around the writer. He suggests that Kafka is generally touted, both in ‘popular culture’ and in the worthy avenues of academe, as a gaunt, melancholy, saint-like type, staring out of blurred black-and-white photographs with anguished eyes. He was a man who ordered in his will that his works should be destroyed, who languished in obscurity throughout his lifetime, who was ‘crushed by a dead-end bureaucratic job’ and, equally, by a tyrannical father. This Kafka was an all-round seer who had no interest in the reception of his work, so preoccupied was he by his ‘Kafkaesque’ imagination. ‘These are the building blocks of the K-myth,’ writes Hawes in his introduction. ‘Unfortunately, they are all rubbish.’

Hawes, a former academic who spent 10 years studying and teaching Kafka, insists that he was not a ‘lonely Middle European Nostradamus’. Rather, he lived with his parents and was set up with a relatively cushy job (six hours a day for the equivalent of £58,000 today), leaving him plenty of time to write. Thanks to his literary connections, he won a major literary prize in his early thirties before even publishing a book. He was not tragically unrequited in his love affairs; nor was he virtually unknown in his lifetime (‘we see him named three times in two entirely different articles in a single edition of the Prague Daily News in 1918’). Hawes even proposes that Kafka didn’t really want his work to be burned after his death and knew full well that the loyal Max Brod would never do it.

{ Guardian | Continue reading }


