ideas
Why does AI feel so human if it’s just a “calculator for words”? […]
Most language users are only indirectly aware of the extent to which their interactions are the product of statistical calculations.
Think, for example, about the discomfort of hearing someone say “pepper and salt” rather than “salt and pepper”. Or the odd look you would get if you ordered “powerful tea” rather than “strong tea” at a cafe.
The rules that govern the way we select and order words, and many other sequences in language, come from the frequency of our social encounters with them. The more often you hear something said a certain way, the less viable any alternative will sound.
In linguistics, the vast field dedicated to the study of language, these sequences are known as “collocations”. They’re just one of many phenomena that show how humans calculate multiword patterns based on whether they “feel right” – whether they sound appropriate, natural and human.
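The frequency mechanism described above can be sketched as a toy count over word sequences. Everything here is invented for illustration — real collocation studies use corpora of millions of words, not five sentences:

```python
from collections import Counter

# Invented toy corpus standing in for real linguistic exposure.
sentences = [
    "pass the salt and pepper please",
    "salt and pepper on everything",
    "I like salt and pepper",
    "his pepper and salt hair",  # the rarer ordering does occur, just less often
]

# Count word triples so we can compare "X and Y" against "Y and X".
trigrams = Counter()
for s in sentences:
    words = s.split()
    trigrams.update(zip(words, words[1:], words[2:]))

def ordering_preference(a, b, link="and"):
    """Share of 'a link b' among both orderings of the pair."""
    ab = trigrams[(a, link, b)]
    ba = trigrams[(b, link, a)]
    total = ab + ba
    return ab / total if total else 0.0

print(ordering_preference("salt", "pepper"))  # 0.75: the conventional ordering dominates
```

On real corpus counts the asymmetry is far starker, which is why "pepper and salt" sounds off to most English speakers.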
{ Science Alert | Continue reading }
Linguistics, robots & ai | September 10th, 2025 3:12 am
Why do philosophers keep debating the same big questions—about free will, morality, knowledge, and political authority—without ever settling them? This piece explores several possible answers. Maybe philosophy makes progress by spinning off answerable questions into the sciences. Maybe some problems are just too hard for minds like ours. Or maybe the trouble lies in language: our concepts are vague, our disagreements often verbal, or the questions themselves may be confused. […]
I suggest that philosophy’s value doesn’t lie in delivering final answers, but in helping us clarify our assumptions, explore alternatives, and better understand the questions that matter most, even when we can’t resolve them.
{ Michael Hannon | Continue reading }
ideas | August 7th, 2025 8:17 am
While the free will debate tends to focus primarily on the implications of determinism for freedom, a long line of philosophers has also argued that free will would not be compatible with indeterminism either. These arguments typically take the form of a so-called Luck Objection: a family of related arguments which all seek to show, roughly, that if an action is not causally pre-determined then it must be a sort of random happening, over which the agent lacks the control required for free will. […]
We develop an empirically plausible model of agential decision-making and apply this to the problem of luck. We argue that, under such a model, it is entirely natural to think of an agent’s actions as both ‘undetermined’ (in the sense of being under-determined) and under their own control.
{ Chance, Choice, and Control: Free Will in an Indeterministic Universe | Continue reading }
ideas | July 31st, 2025 7:39 am
An artificial intelligence firm downloaded for free millions of copyrighted books in digital form from pirate sites on the internet. The firm also purchased copyrighted books (some overlapping with those acquired from the pirate sites), tore off the bindings, scanned every page, and stored them in digitized, searchable files. All the foregoing was done to amass a central library of “all the books in the world” to retain “forever.”
From this central library, the AI firm selected various sets and subsets of digitized books to train various large language models under development to power its AI services. Some of these books were written by plaintiff authors, who now sue for copyright infringement.
[…]
Defendant Anthropic PBC is an AI software firm founded by former OpenAI employees in January 2021. Its core offering is an AI software service called Claude. When a user prompts Claude with text, Claude quickly responds with text — mimicking human reading and writing. Claude can do so because Anthropic trained Claude — or rather trained large language models or LLMs underlying various versions of Claude — using books and other texts selected from a central library Anthropic had assembled. Claude was first released publicly in March 2023. Seven successive versions of Claude have been released since. Users may ask Claude some questions for free. Demanding users and corporate clients pay to use Claude, generating over one billion dollars in annual revenue.
[…]
This order grants summary judgment for Anthropic that the training use was a fair use. And, it grants that the print-to-digital format change was a fair use for a different reason. But it denies summary judgment for Anthropic that the pirated library copies must be treated as training copies.
We will have a trial on the pirated copies used to create Anthropic’s central library and the resulting damages, actual or statutory (including for willfulness). That Anthropic later bought a copy of a book it earlier stole off the internet will not absolve it of liability for the theft but it may affect the extent of statutory damages. Nothing is foreclosed as to any other copies flowing from library copies for uses other than for training LLMs.
{ Judge rules Anthropic training on books it purchased was “fair use,” but not for the ones it stole | United States District Court, Northern District of California | Full Order | PDF }
books, law, robots & ai | June 25th, 2025 4:41 am
Tomorrow’s US military must approach warfighting with an alternate mindset that is prepared to leverage all elements of national power to influence the ideological spheres of future enemies by engaging them with alternate means—memes—to gain advantage.
{ MEMETICS—A GROWTH INDUSTRY IN US MILITARY OPERATIONS | PDF }
fights, marketing, media, strategy | May 18th, 2025 11:13 am
psychologists were grappling with how to define and measure creativity in humans. The prevailing theory—that creativity was a product of intelligence and high IQ—was fading, but psychologists weren’t sure what to replace it with. The Dartmouth organizers had one of their own. “The difference between creative thinking and unimaginative competent thinking lies in the injection of some randomness,” they wrote, adding that such randomness “must be guided by intuition to be efficient.”
Nearly 70 years later, following a number of boom-and-bust cycles in the field, we now have AI models that more or less follow that recipe. While large language models that generate text have exploded in the last three years, a different type of AI, based on what are called diffusion models, is having an unprecedented impact on creative domains. By transforming random noise into coherent patterns, diffusion models can generate new images, videos, or speech, guided by text prompts or other input data. The best ones can create outputs indistinguishable from the work of people, as well as bizarre, surreal results that feel distinctly nonhuman.
Now these models are marching into a creative field that is arguably more vulnerable to disruption than any other: music. AI-generated creative works—from orchestra performances to heavy metal—are poised to suffuse our lives more thoroughly than any other product of AI has done yet. The songs are likely to blend into our streaming […]
Music models can now create songs capable of eliciting real emotional responses, presenting a stark example of how difficult it’s becoming to define authorship and originality in the age of AI.
The courts are actively grappling with this murky territory. Major record labels are suing the top AI music generators, alleging that diffusion models do little more than replicate human art without compensation to artists. The model makers counter that their tools are made to assist in human creation.
In deciding who is right, we’re forced to think hard about our own human creativity. Is creativity, whether in artificial neural networks or biological ones, merely the result of vast statistical learning and drawn connections, with a sprinkling of randomness? If so, then authorship is a slippery concept. If not—if there is some distinctly human element to creativity—what is it? […]
We can first divide the human creative process into phases, including an ideation or proposal step, followed by a more critical and evaluative step that looks for merit in ideas. A leading theory on what guides these two phases is called the associative theory of creativity, which posits that the most creative people can form novel connections between distant concepts. […] For example, the word apocalypse is more closely related to nuclear power than to celebration. Studies have shown that highly creative people may perceive very semantically distinct concepts as close together. Artists have been found to generate word associations across greater distances than non-artists. […]
A new study, led by researchers at Harvard Medical School and published in February, suggests that creativity might even involve the suppression of particular brain networks, like ones involved in self-censorship.
{ Technology Review | Continue reading }
Ask any creativity expert today what they mean by “creativity,” and they’ll tell you it’s the ability to generate something new and useful. That something could be an idea, a product, an academic paper—whatever. But the focus on novelty has remained an aspect of creativity from the beginning. It’s also what distinguishes it from other similar words, like imagination or cleverness. […]
The kinds of LLMs that Silicon Valley companies have put forward are meant to appear “creative” in those conventional senses. Now, whether or not their products are meaningful or wise in a deeper sense, that’s another question. If we’re talking about art, I happen to think embodiment is an important element. Nerve endings, hormones, social instincts, morality, intellectual honesty—those are not things essential to “creativity” necessarily, but they are essential to putting things out into the world that are good, and maybe even beautiful in a certain antiquated sense. That’s why I think the question of “Can machines be ‘truly creative’?” is not that interesting, but the questions of “Can they be wise, honest, caring?” are more important if we’re going to be welcoming them into our lives as advisors and assistants.
{ Technology Review | Continue reading }
ideas, music, robots & ai | May 18th, 2025 7:21 am

In Plato’s dialogue Symposium, seven varied speeches are made on the meaning of love at an all-male drinking party set in ancient Athens in 416 BCE. One of the participants is the philosopher Socrates, and when it comes to his turn to speak, he is made to say something surprising: he proposes to ‘tell the truth’ about love. It’s surprising because in other Platonic dialogues, where Socrates addresses questions such as ‘What is knowledge?’, ‘What is excellence?’, and ‘What is courage?’, he has no positive answers to give about these central areas of human thought and experience: in fact, Socrates was well known for having laid no claim to knowledge, and for asserting that ‘the only thing I know is that I do not know’. How is it, then, that Socrates can claim to know the truth about something as fundamental and potentially all-encompassing as love?
The answer is that, in the Symposium, Socrates claims to know the truth only because he learned it from someone else. […]
The doctrine Socrates attributes to Diotima in the Symposium is that love – or, more precisely, the divine spirit Eros – operates on various levels. At the lowest level, love engenders erotic feelings towards the body of someone to whom one is attracted. However, what attracts us about that body is, Diotima says, a quality that we call its ‘beauty’, which in turn leads to a recognition that many other bodies possess this quality and are equally capable of inspiring erotic feelings. By recognising the presence of beauty in many bodies, one comes to understand that what is attractive to us is not the bodies themselves, but the abstract quality of beauty of which the bodies partake. […]
according to Diotima, the commonplace erotic desire that we feel towards a person we consider to be beautiful can lead us up the ‘ladder’ of love, rung by rung, ascending from the particular object of desire to a general appreciation of the abstract quality of beauty and, beyond that, to moral goodness. What begins as physical lust is ennobled by the way it encourages the lover to mount upwards to the highest goodness imaginable, the abstract ‘form of the good’.
{ Aeon | Continue reading }
ideas, relationships | April 23rd, 2025 7:24 am

SIMON: We have free will in the sense that our resulting behavior will depend on who we are and the situation we are in. People respond differently when confronting the same situation.
BORGES: So, when faced with a situation in which there is a choice to be made between two possible behaviors, we can choose one of them?
SIMON: Your mental programming does the choosing.
It seems to me that Simon is here arguing for what philosophers call “compatibilism” — the idea that determinism can coexist with meaningful human choice and responsibility.
[…]
BORGES: Now, does this account for all of our actions? That is, if my right hand is resting on my left hand, is it because it has to be this way? I believe people do quite a lot of things without any thinking.
SIMON: That’s the doing of our subconscious mind. […] that’s because we are heavily programmed. […] when we study a person who is in the process of solving a problem, we start from the assumption that every little thing has a cause. We are not always able to identify those causes.
{ When Jorge Luis Borges met one of the founders of AI | Continue reading }
oil and charcoal on linen { Chris Ofili, Iscariot Blues, 2006 }
ideas | April 12th, 2025 10:13 am
The rape of the Sabine women, also known as the abduction of the Sabine women, was an incident in the legendary history of Rome in which the men of Rome committed bride kidnapping: the mass abduction, for the purpose of marriage, of women from other cities in the region. It has been a frequent subject of painters and sculptors, particularly since the Renaissance.
The word “rape” is the conventional translation of the Latin word raptio used in the ancient accounts of the incident. The Latin word means “taking”, “abduction” or “kidnapping”, but when used with women as its object, sexual assault is usually implied. […]
According to Roman historian Livy, the abduction of Sabine women occurred in the early history of Rome shortly after its founding in the mid-8th century BC and was perpetrated by Romulus [legendary founder and first king of Rome] and his predominantly male followers; it is said that after the foundation of the city, the population consisted solely of Latins and other Italic peoples, in particular male bandits. With Rome growing at such a steady rate in comparison to its neighbors, Romulus became concerned with maintaining the city’s strength. His main concern was that with few women inhabitants there would be no chance of sustaining the city’s population, without which Rome might not last longer than a generation. On the advice of the Senate, the Romans then set out into the surrounding regions in search of wives to establish families with. The Romans negotiated unsuccessfully with all the peoples that they appealed to, including the Sabines, who populated the neighboring areas. […]
The Romans devised a plan to abduct the Sabine women during the festival of Neptune Equester. They planned and announced a festival of games to attract people from all the nearby towns. At the festival, […] the Romans grabbed the Sabine women and fought off the Sabine men. […] All of the women abducted at the festival were said to have been virgins except for one married woman, Hersilia, who became Romulus’s wife and would later be the one to intervene and stop the ensuing war between the Romans and the Sabines.
{ Wikipedia | Continue reading }
Linguistics, crime, flashback | March 1st, 2025 3:47 am
Spending time alone is a virtually inevitable part of daily life that can promote or undermine well-being.
Here, we explore how the language used to describe time alone—such as “me-time,” “solitude,” or “isolation”—influences how it is perceived and experienced […]
linguistic framing affected what people thought about, but not what they did, while alone […]
simple linguistic shifts may enhance subjective experiences of time alone
{ PsyArXiv | Continue reading }
ideas, psychology | November 4th, 2024 12:56 pm
It begins each day at nightfall. As the light disappears, billions of zooplankton, crustaceans and other marine organisms rise to the ocean surface to feed on microscopic algae, returning to the depths at sunrise. The waste from this frenzy – Earth’s largest migration of creatures – sinks to the ocean floor, removing millions of tonnes of carbon from the atmosphere each year.
This activity is one of thousands of natural processes that regulate the Earth’s climate. Together, the planet’s oceans, forests, soils and other natural carbon sinks absorb about half of all human emissions. […]
Findings by an international team of researchers show that the amount of carbon absorbed by land in 2023 temporarily collapsed. The final result was that forests, plants and soil – as a net category – absorbed almost no carbon.
There are warning signs at sea, too. Greenland’s glaciers and Arctic ice sheets are melting faster than expected, which is disrupting the Gulf Stream ocean current and slowing the rate at which oceans absorb carbon. For the algae-eating zooplankton, melting sea ice is exposing them to more sunlight – a shift scientists say could keep them in the depths for longer, disrupting the vertical migration that stores carbon on the ocean floor.
{ Guardian | Continue reading }
climate, elements, eschatology, incidents | October 15th, 2024 7:07 am
I have just now come from a party where I was its life and soul; witticisms streamed from my lips, everyone laughed and admired me, but I went away — yes, the dash should be as long as the radius of the earth’s orbit ——————————— and wanted to shoot myself.
{ Søren Kierkegaard, Journal, March 1836 | Continue reading }
experience, ideas | May 10th, 2024 4:54 am
Do you surf yourself?
No, I tried. I did it for about a week, 20 years ago. You have to dedicate yourself to these great things. And I don’t believe in being good at a lot of things—or even more than one. But I love to watch it. I think if I get a chance to be human again, I would do just that. You wake up in the morning and you paddle out. You make whatever little money you need to survive. That seems like the greatest life to me.
Or you could become very wealthy in early middle-age, stop doing the hard stuff, and go off and become a surfer.
No, no. You want to be broke. You want it to be all you’ve got. That’s when life is great. People are always trying to add more stuff to life. Reduce it to simpler, pure moments. That’s the golden way of living, I think.
{ Jerry Seinfeld | GQ | Continue reading }
related { Anecdote on Lowering the work ethic }
eudaemonism, sport | April 22nd, 2024 12:31 pm

Any viral post on X now almost certainly includes A.I.-generated replies, from summaries of the original post to reactions written in ChatGPT’s bland Wikipedia-voice, all to farm for follows. Instagram is filling up with A.I.-generated models, Spotify with A.I.-generated songs. Publish a book? Soon after, on Amazon there will often appear A.I.-generated “workbooks” for sale that supposedly accompany your book (which are incorrect in their content; I know because this happened to me). Top Google search results are now often A.I.-generated images or articles. Major media outlets like Sports Illustrated have been creating A.I.-generated articles attributed to equally fake author profiles. Marketers who sell search engine optimization methods openly brag about using A.I. to create thousands of spammed articles to steal traffic from competitors.
Then there is the growing use of generative A.I. to scale the creation of cheap synthetic videos for children on YouTube. Some example outputs are Lovecraftian horrors, like music videos about parrots in which the birds have eyes within eyes, beaks within beaks, morphing unfathomably while singing in an artificial voice, “The parrot in the tree says hello, hello!” The narratives make no sense, characters appear and disappear randomly, and basic facts like the names of shapes are wrong. After I identified a number of such suspicious channels on my newsletter, The Intrinsic Perspective, Wired found evidence of generative A.I. use in the production pipelines of some accounts with hundreds of thousands or even millions of subscribers. […]
There’s so much synthetic garbage on the internet now that A.I. companies and researchers are themselves worried, not about the health of the culture, but about what’s going to happen with their models. As A.I. capabilities ramped up in 2022, I wrote on the risk of culture’s becoming so inundated with A.I. creations that when future A.I.s are trained, the previous A.I. output will leak into the training set, leading to a future of copies of copies of copies, as content became ever more stereotyped and predictable.
{ NY Times | Continue reading }
and { When Marie was first approached by Arcads in December 2023, the company explained they were seeking test subjects to see whether they could turn someone’s voice and likeness into AI. […] Marie doesn’t worry that by giving up her rights to an AI company, she’s bringing about the end of her work—as many actors fear. […] Hyperrealistic deepfakes and AI-generated content have rapidly saturated our digital lives. The impact of this ‘hidden in plain sight’ dynamic is increasing distrust of all digital media—that anything could be faked. }
eschatology, robots & ai | March 30th, 2024 6:18 am
At 40, Franz Kafka (1883-1924), who never married and had no children, was walking through a park one day in Berlin when he met a girl who was crying because she had lost her favourite doll. She and Kafka searched for the doll unsuccessfully. Kafka told her to meet him there the next day and they would come back to look for her.
The next day, when they had not yet found the doll, Kafka gave the girl a letter “written” by the doll saying “please don’t cry. I took a trip to see the world. I will write to you about my adventures.” Thus began a story which continued until the end of Kafka’s life.
During their meetings, Kafka read the letters of the doll carefully written with adventures and conversations that the girl found adorable. Finally, Kafka brought back the doll (he bought one) that had returned to Berlin.
“It doesn’t look like my doll at all,” said the girl. Kafka handed her another letter in which the doll wrote: “my travels have changed me.” The little girl hugged the new doll and brought the doll with her to her happy home. A year later Kafka died. Many years later, the now-adult girl found a letter inside the doll. In the tiny letter signed by Kafka it was written: “Everything you love will probably be lost, but in the end, love will return in another way.”
{ Avi | a true anecdote, unproven }
books, kids, toys | February 14th, 2024 7:03 am
For a quarter century, Gerry Fialka, an experimental film-maker from Venice, California, has hosted a book club devoted to a single text: James Joyce’s Finnegans Wake, one of the most famously difficult texts in literary history.
Starting in 1995, between 10 and 30 people would show up to monthly meetings at a local library. At first they read two pages a month, eventually slowing to just one page per discussion. At that pace, the group – which now meets on Zoom – reached the final page in October. It took them 28 years. […]
This November, they started back on page three.
“There is no next book,” Fialka told me. “We’re only reading one book. Forever.”
{ The Guardian | Continue reading }
Finnegans Wake was first published in 1939 and it is widely regarded as being one of the most challenging novels in English literature.
Written in a torrent of idiosyncratic language over more than 600 pages, it includes made-up words in several languages, puns and arcane allusions to Greek mythology.
{ The Times | Continue reading }
The club is among several around the world devoted to collectively untangling the meaning of Joyce’s 1939 novel, which tells many stories simultaneously, and is dense with neologisms and allusions. Critics have considered the work perplexing; a review in The New Yorker suggested it might have been written by a “god, talking in his sleep.” […]
Margot Norris, a professor emerita of English at the University of California, Irvine, and a Joyce scholar, described “Finnegans Wake” as “dramatic poetry” that instead of following a typical plot plays with the very nature of language. “We get words in ‘Finnegans Wake’ that aren’t words,” Dr. Norris said, referring to a passage of seemingly nonsense phrases: “This is Rooshious balls. This is a ttrinch. This is mistletropes. This is Canon Futter with the popynose.” The novel, she added, “draws your attention to language, but the language isn’t going to be exactly the language that you know.” […]
“People think they’re reading a book, they’re not,” he said. “They’re breathing and living together as human beings in a room; looking at printed matter, and figuring out what printed matter does to us.”
{ NY Times | Continue reading }
previously { Joyce invented a unique polyglot-language or idioglossia solely for the purpose of this work. }
James Joyce | December 8th, 2023 2:17 pm

On 25 October 1946, Karl Popper (at the London School of Economics) was invited to present a paper entitled “Are There Philosophical Problems?” at a meeting of the Cambridge University Moral Sciences Club, which was chaired by Ludwig Wittgenstein.
The two started arguing vehemently over whether there existed substantial problems in philosophy, or merely linguistic puzzles—the position taken by Wittgenstein.
Wittgenstein used a fireplace poker to emphasize his points, gesturing with it as the argument grew more heated. Eventually, Wittgenstein claimed that philosophical problems were nonexistent.
In response, Popper claimed there were many issues in philosophy, such as setting a basis for moral guidelines. Wittgenstein then thrust the poker at Popper, challenging him to give any example of a moral rule. Popper (later) claimed to have said:
“Not to threaten visiting lecturers with pokers”
{ Wikipedia | Continue reading }
Parnet: Let’s move on to “W”.
Deleuze: There’s nothing in “W”.
Parnet: Yes, there’s Wittgenstein. I know he’s nothing for you, but it’s only a word.
Deleuze: I don’t like to talk about that… For me, it’s a philosophical catastrophe. It’s the very example of a “school”, it’s a regression of all philosophy, a massive regression. […] They imposed a system of terror in which, under the pretext of doing something new, it’s poverty instituted in all grandeur… […] the Wittgensteinians are mean and destructive. […] They are assassins of philosophy.
{ The Deleuze Seminars | Continue reading }
buffoons, controversy, fights, ideas | October 15th, 2023 10:03 am
Sartre, it will be recalled, had asserted a kind of absolute freedom for the conscious human being. It was this claim that Merleau-Ponty disputed. […] If freedom were everywhere, as seemed to be the case in Sartre’s Being and Nothingness, then freedom in effect would be nowhere […] “Free action, in order to be discernible, has to stand out from a background of life from which it is entirely, or almost entirely, absent.” (Merleau-Ponty, Phenomenology of Perception, 1945) […]
While Sartre properly emphasized the subject’s freedom, he distorted the scope of this freedom by rendering it absolute. The subject, argued Merleau-Ponty, always faced a previously established situation, an environment and world not of its own making. Its life, as intersubjectively open, acquired a social atmosphere which it did not itself constitute. Social roles pressed upon the individual as plausible courses for his life to take. Certain modes of behavior became habitual. Probably, this world, these habits, a familiar comportment: probably these would not change overnight. It was unlikely that an individual would suddenly choose to be something radically other than what he had already become. The Sartre of Being and Nothingness underestimated the weight of this realm of relative constraint and habitual inertia.
{ Merleau-Ponty: The Ambiguity of History | Continue reading }
Cognitive science is lacking conceptual tools to describe how an agent’s motivations, as such, can play a role in the generation of its behavior. […] a new kind of non-reductive theory is proposed: Irruption Theory. […] irruptions are associated with increased unpredictability of (neuro)physiological activity, and they should hence be quantifiable in terms of information-theoretic entropy. Accordingly, evidence that action, cognition, and consciousness are linked to higher levels of neural entropy can be interpreted as indicating higher levels of motivated agential involvement.
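The excerpt’s claim that irruptions “should be quantifiable in terms of information-theoretic entropy” can be illustrated with a generic Shannon-entropy calculation over a binned signal. This is a minimal sketch of the entropy measure itself, not the authors’ actual analysis pipeline; the signals are invented:

```python
import math
from collections import Counter

def shannon_entropy(samples, bins=8, lo=0.0, hi=1.0):
    """Discretize a 1-D signal into equal-width bins and return the
    Shannon entropy (in bits) of the resulting bin distribution."""
    width = (hi - lo) / bins
    n = len(samples)
    counts = Counter(min(int((x - lo) / width), bins - 1) for x in samples)
    h = -sum(c / n * math.log2(c / n) for c in counts.values())
    return max(0.0, h)  # clamp the -0.0 a single-bin signal would produce

# An unpredictable (spread-out) signal carries more entropy
# than a highly regular (concentrated) one.
spread = [i / 100 for i in range(100)]
regular = [0.5] * 100

print(shannon_entropy(spread))   # close to 3.0 bits, the maximum for 8 bins
print(shannon_entropy(regular))  # 0.0
```

On this measure, the theory’s prediction reads: more motivated agential involvement, less predictable activity, higher entropy.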
{ PsyArXiv | Continue reading }
controversy, ideas, theory | April 27th, 2023 6:35 am
Humans can think about possible states of the world without believing in them, an important capacity for high-level cognition.
Here we use fMRI and a novel “shell game” task to test two competing theories about the nature of belief and its neural basis.
According to the Cartesian theory, information is first understood, then assessed for veracity, and ultimately encoded as either believed or not believed. According to the Spinozan theory, comprehension entails belief by default, such that understanding without believing requires an additional process of “unbelieving”. […]
findings are consistent with a version of the Spinozan theory whereby unbelieving is an inhibitory control process.
{ PsyArXiv | Continue reading }
neurosciences, spinoza | June 22nd, 2022 11:07 am

Inside, Mr. Pierrat found a literary treasure trove: long-lost manuscripts by Louis-Ferdinand Céline, the acclaimed but equally reviled French author who wrote classics like “Journey to the End of the Night,” published in 1932, as well as virulently antisemitic tracts. […] Céline always maintained that the manuscripts had been stolen from his Paris apartment after he escaped to Germany in 1944, fearing that he would be punished as a collaborator when the Allies liberated the city. […] David Alliot, a literary researcher, said the issue for many French was that while Céline was a “literary genius,” he was a deeply flawed human being. […]
Mr. Thibaudat said he was given the manuscripts by an undisclosed benefactor, or benefactors — he declined to elaborate — about 15 years ago. But he had kept the stash secret, waiting for Céline’s widow to die, at the request of the benefactor, whose wish was that an “antisemitic family” would not profit from the trove, he said in an interview. […]
the manuscripts include the complete version of the novel “Casse-pipe,” partly published in 1949, and a previously unknown novel titled “Londres” […]
With his lawyer by his side, Mr. Thibaudat met Céline’s heirs in June 2020. It did not go well. Mr. Thibaudat suggested that the manuscripts be given to a public institution to make them accessible to researchers. François Gibault, 89, and Véronique Chovin, 69, the heirs to Céline’s work through their connections as friends to the family, were outraged, and sued Mr. Thibaudat, demanding compensation for years of lost revenues.
“Fifteen years of non-exploitation of such books is worth millions of euros,” said Jérémie Assous, the lawyer and longtime friend of Céline’s heirs. “He’s not protecting his source, he’s protecting a thief.”
In July, Mr. Thibaudat finally handed over the manuscripts on the orders of prosecutors. During a four-hour interview with the police, Mr. Thibaudat refused to name his source. The investigation is continuing.
{ NY Times | Continue reading }
books, economics | October 26th, 2021 12:48 pm
In building your book I wanted to pursue my own process of decomposition. I began to think about the ways in which paper degrades. Rotting in the ground, exposure to rain, chemicals (I used Xylene, a paint thinner, for the image transfers on the cover), and fire. Although rain or burying paper in the ground would have created unique and unpredictable patterns of ruin in the paper, these seemed like passive processes, whereas burning paper could achieve some level of stochastic design but in a more involved, active, and risk-exposed situation. I followed the traditional recipe for Chinese blackpowder: 75% potassium nitrate, or saltpeter, 15% carbon, 10% sulphur. […]
On a hot plate, outside, the potassium nitrate is usually dissolved in a pot of water; instead of water, however, I poured into the potassium nitrate a jar of my stale, sunbaked urine, since it accelerates the burn process.
{ Big Other | Continue reading }
books, chem | June 24th, 2021 10:36 am
Canada, one of the most real estate-obsessed nations on earth — and one of the least affected by the 2008 crash — is up 42+% in the past year alone.
Even in Ethiopia, where my wife grew up, a three-bedroom detached house in the capital can cost you $1+ million USD.
Until recently, most people’s house price paradigm looked something like this:
A house’s market price is the maximum amount that a buyer can expect to afford over the next 25–40 years. But because wages are flatlined and purchasing parity is the same as in 1978, the only rational explanation for this current price explosion is a giant debt bubble.
But what if the paradigm — the baseline assumption of what dictates house prices — is changing?
What if the newly-redefined value of shelter is the maximum amount of annual rent that can be extracted per unit of housing? […]
As reader Valerie Kittell put it: “Airbnb-type models altered the market irreversibly by proving on a large scale that short term rentals were more lucrative than stable long-term residents.”
We’re in the middle of a paradigm shift to corporate serfdom.
Stop enriching corrupt banks — pay off your mortgages and never look back. Parents and grandparents with means: Help your kids get a start in housing before it’s out of their reach forever.
{ Jared A. Brock | Continue reading }
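The two paradigms described above imply very different price ceilings for the same unit. A minimal sketch, with all figures (payment capacity, interest rate, rent, cap rate) invented for illustration rather than drawn from the article:

```python
def affordability_price(annual_payment, rate, years):
    """Old paradigm: price as the present value of the mortgage payments
    a buyer can sustain over 25-40 years (standard annuity formula)."""
    return annual_payment * (1 - (1 + rate) ** -years) / rate

def rent_extraction_price(annual_rent, cap_rate):
    """New paradigm: price as the capitalized value of the rent that can
    be extracted per unit (annual rent divided by the cap rate)."""
    return annual_rent / cap_rate

# Hypothetical buyer: can commit $24,000/year for 30 years at 4% interest.
buyer_ceiling = affordability_price(24_000, 0.04, 30)    # roughly $415,000

# Hypothetical investor: capitalizes $36,000/year of short-term rent at 4%.
investor_ceiling = rent_extraction_price(36_000, 0.04)   # $900,000

print(f"affordability-based ceiling: ${buyer_ceiling:,.0f}")
print(f"rent-extraction ceiling:     ${investor_ceiling:,.0f}")
```

Under these assumed numbers, an investor capitalizing short-term rent can outbid an affordability-constrained buyer by a wide margin, which is the mechanism the excerpt describes.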
housing, theory | June 18th, 2021 8:16 am

In 2008, the [Global Trends] report warned about the potential emergence of a pandemic originating in East Asia and spreading rapidly around the world.
The latest report, Global Trends 2040, [was] released last week […] “Large segments of the global population are becoming wary of institutions and governments that they see as unwilling or unable to address their needs.” […] Experts in Washington who have read these reports said they do not recall a gloomier one.
{ NY Times | Continue reading }
art { Günter Fruhtrunk, Rote Vibration, 1970 | Bridget Riley, Ra, 1981 }
eschatology | April 16th, 2021 12:14 am

The Narrative of Arthur Gordon Pym of Nantucket (1838) is the only complete novel written by American writer Edgar Allan Poe. […] The story starts out as a fairly conventional adventure at sea, but it becomes increasingly strange and hard to classify. […]
Peters, Pym, and Augustus hatch a plan to seize control of the ship […] soon the three men are masters of the Grampus: all the mutineers are killed or thrown overboard except one, Richard Parker, whom they spare to help them run the vessel. […] As time passes, with no sign of land or other ships, Parker suggests that one of them should be killed as food for the others. They draw straws, following the custom of the sea, and Parker is sacrificed.
{ Wikipedia | Continue reading }
On 19 May 1884 four men set sail from Southampton in a small yacht. They were professional sailors tasked with taking their vessel, the Mignonette, to its new owner in Australia. […] The Mignonette’s captain, Tom Dudley, was 31 years old and a proven yachtsman. Of his crew, Ned Brooks and mate Edwin Stephens were likewise seasoned sailors. The final crew-member, cabin boy Richard Parker, was just 17 years old and making his first voyage on the open sea. […]
On 5 July, sailing from Madeira to Cape Town, the Mignonette was sunk by a giant wave. […] Adrift in an open boat in the South Atlantic, hundreds of miles from land, they had little in the way of provisions. They had no water, and for food, only two 1lb tins of turnips grabbed during the Mignonette’s final moments.
Over the next 12 days, these turnips were scrupulously rationed out […] For water […] they resorted to drinking their own urine, although this too was a diminishing resource as their bodies became increasingly dehydrated.
By 17 July all supplies on board the little dinghy had been exhausted. After a further three days, the inexperienced Richard Parker could not resist gulping down sea water in an attempt to allay his thirst. It is now known that small quantities of sea water can help to sustain life in survival situations, but in that period it was widely believed to be fatal. Parker also drank far in excess of modern recommendations and he was soon violently unwell, collapsing in the bottom of the boat with diarrhea.
Even before Parker fell ill, Tom Dudley had broached the fearful topic of the “custom of the sea,” the practice of drawing lots to select a sacrificial victim who could be consumed by his crew-mates. […] According to their subsequent depositions, however, no lots were drawn. Instead, Dudley told Stephens to hold Parker’s legs should he struggle, before kneeling and thrusting his penknife into the boy’s jugular. […] Parker’s body was then stripped and butchered. The heart and liver were eaten immediately; strips of flesh were cut from his limbs and set aside as future rations. What remained of the young man was heaved overboard.
{ History Extra | Continue reading }
books, flashback, mystery and paranormal | March 27th, 2021 4:44 am
eschatology | March 5th, 2021 2:02 pm

Speakers take a lot for granted. That is, they presuppose information. As we wrote this, we presupposed that readers would understand English. We also presupposed as we wrote the last sentence, repeated in (1), that there was a time when we wrote it, for otherwise the fronted phrase “as we wrote this” would not have identified a time interval.
(1) As we wrote this, we presupposed that readers would understand English.
Further, we presupposed that the sentence was jointly authored, for otherwise “we” would not have referred. And we presupposed that readers would be able to identify the reference of “this”, i.e., the article itself. And we presupposed that there would be at least two readers, for otherwise the bare plural “readers” would have been inappropriate. And so on.
{ Stanford Encyclopedia of Philosophy | Continue reading }
photo { Pieter Hugo, Escort Kama, Enugu, Nigeria from Nollywood, 2008 }
ideas | January 20th, 2021 2:55 pm
Fette Sans: I want to cut you open and pour out all your insides, chew on your liver and your heart, and then sew you back together using your intestine and maybe I will pack you better than you were and there will be some left to crochet myself a necklace.
{ Forty-one reflections on 2020 | Continue reading }
ideas | December 25th, 2020 3:55 pm

Here’s a puzzle […] It’s called “Cain’s Jawbone,” in which people are challenged to put the shuffled pages of a murder mystery novel in their proper order. Since its creation in 1934, it has only been solved by two people — until now.
British comedian John Finnemore made it his quarantine project to crack “Cain’s Jawbone” — and he succeeded, making him just the third person to solve it in its nearly 90-year history. […]
The puzzle takes the form of 100 cards, each containing a page of a murder mystery novel. In order to solve the puzzle, participants must put all the cards in the proper order and determine who murders whom in the story. There are 32 million possible combinations, which makes finding the correct result quite a feat.
{ The World | Continue reading }
books, leisure | December 3rd, 2020 5:49 pm

An astrophysicist of the University of Bologna and a neurosurgeon of the University of Verona compared the network of neuronal cells in the human brain with the cosmic network of galaxies, and surprising similarities emerged. […]
The human brain functions thanks to its wide neuronal network that is deemed to contain approximately 69 billion neurons. On the other hand, the observable universe can count upon a cosmic web of at least 100 billion galaxies. Within both systems, only 30% of their masses are composed of galaxies and neurons. Within both systems, galaxies and neurons arrange themselves in long filaments or nodes between the filaments. Finally, within both systems, 70% of the distribution of mass or energy is composed of components playing an apparently passive role: water in the brain and dark energy in the observable Universe. […]
Probably, the connectivity within the two networks evolves following similar physical principles, despite the striking and obvious difference between the physical powers regulating galaxies and neurons.
{ Università di Bologna | Continue reading }
oil on canvas { Karel Appel, Portrait, 1966 }
brain, ideas, space | November 23rd, 2020 7:00 am

life expectancy for men in 1907 was 45.6 years; by 1957 it rose to 66.4; in 2007 it reached 75.5. Unlike the most recent increase in life expectancy (which was attributable largely to a decline in half of the leading causes of death including heart disease, homicide, and influenza), the increase in life expectancy between 1907 and 2007 was largely due to a decreasing infant mortality rate, which was 9.99 percent in 1907; 2.63 percent in 1957; and 0.68 percent in 2007.
But the inclusion of infant mortality rates in calculating life expectancy creates the mistaken impression that earlier generations died at a young age; Americans were not dying en masse at the age of 46 in 1907. The fact is that the maximum human lifespan — a concept often confused with “life expectancy” — has remained more or less the same for thousands of years. The idea that our ancestors routinely died young (say, at age 40) has no basis in scientific fact. […]
If a couple has two children and one of them dies in childbirth while the other lives to be 90, stating that on average the couple’s children lived to be 45 is statistically accurate but meaningless.
{ LiveScience | Continue reading | BBC }
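The statistical point above is easy to verify. A toy cohort (the 10% infant-mortality and 75-year figures below are illustrative assumptions, not historical data) shows how folding infant deaths into the mean produces a “life expectancy” far below the age most adults actually reached:

```python
# Toy cohort: 10% die in infancy, the remaining 90% die at age 75.
ages_at_death = [0] * 10 + [75] * 90

# Life expectancy at birth: the mean over everyone, infants included.
life_expectancy_at_birth = sum(ages_at_death) / len(ages_at_death)
print(life_expectancy_at_birth)  # 67.5

# The same cohort, excluding infant deaths: what survivors actually saw.
adult_deaths = [a for a in ages_at_death if a > 0]
print(sum(adult_deaths) / len(adult_deaths))  # 75.0

# The article's own example: one child dies at birth, one lives to 90.
print(sum([0, 90]) / 2)  # 45.0 -- "statistically accurate but meaningless"
```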
flashback, health, time | September 27th, 2020 4:11 pm

Say you travelled in time, in an attempt to stop COVID-19’s patient zero from being exposed to the virus. However if you stopped that individual from becoming infected, that would eliminate the motivation for you to go back and stop the pandemic in the first place. This is a paradox, an inconsistency that often leads people to think that time travel cannot occur in our universe. […] In the coronavirus patient zero example, you might try and stop patient zero from becoming infected, but in doing so you would catch the virus and become patient zero, or someone else would. No matter what you did, the salient events would just recalibrate around you. Try as you might to create a paradox, the events will always adjust themselves, to avoid any inconsistency.
{ Popular Mechanics | Continue reading | More: Classical and Quantum Gravity }
theory, time | September 27th, 2020 10:40 am

I examine the relationship between unhappiness and age using data from eight well-being data files on nearly 14 million respondents across forty European countries and the United States and 168 countries from the Gallup World Poll. […] Unhappiness is hill-shaped in age and the average age where the maximum occurs is 49 with or without controls.
{ Journal of Economic Behavior & Organization | Continue reading }
A large empirical literature has debated the existence of a U-shaped happiness-age curve. This paper re-examines the relationship between various measures of well-being and age in 145 countries. […] The U-shape of the curve is forcefully confirmed, with an age minimum, or nadir, in midlife around age 50 in separate analyses for developing and advanced countries as well as for the continent of Africa. The happiness curve seems to be everywhere.
{ Journal of Population Economics | PDF }
photo { Joseph Szabo }
eudaemonism | September 12th, 2020 10:41 am

In the year 1930, John Maynard Keynes predicted that, by century’s end, technology would have advanced sufficiently that countries like Great Britain or the United States would have achieved a 15-hour work week. There’s every reason to believe he was right. In technological terms, we are quite capable of this. And yet it didn’t happen. Instead, technology has been marshaled, if anything, to figure out ways to make us all work more. In order to achieve this, jobs have had to be created that are, effectively, pointless. […]
productive jobs have, just as predicted, been largely automated away […] But rather than allowing a massive reduction of working hours to free the world’s population to pursue their own projects, pleasures, visions, and ideas […] It’s as if someone were out there making up pointless jobs just for the sake of keeping us all working. And here, precisely, lies the mystery. In capitalism, this is precisely what is not supposed to happen. Sure, in the old inefficient socialist states like the Soviet Union, where employment was considered both a right and a sacred duty, the system made up as many jobs as they had to (this is why in Soviet department stores it took three clerks to sell a piece of meat). But, of course, this is the very sort of problem market competition is supposed to fix. According to economic theory, at least, the last thing a profit-seeking firm is going to do is shell out money to workers they don’t really need to employ. Still, somehow, it happens.
{ David Graeber | Continue reading }
what I am calling “bullshit jobs” are jobs that are primarily or entirely made up of tasks that the person doing that job considers to be pointless, unnecessary, or even pernicious. Jobs that, were they to disappear, would make no difference whatsoever. Above all, these are jobs that the holders themselves feel should not exist.
Contemporary capitalism seems riddled with such jobs.
{ The Anarchist Library | Continue reading }
image { Alliander, ElaadNL, and The incredible Machine, Transparent Charging Station, 2017 }
economics, ideas | September 3rd, 2020 11:51 am
Moringa oleifera, an edible tree found worldwide in the dry tropics, is increasingly being used for nutritional supplementation. Its nutrient-dense leaves are high in protein quality, leading to its widespread use by doctors, healers, nutritionists and community leaders, to treat under-nutrition and a variety of illnesses. Despite the fact that no rigorous clinical trial has tested its efficacy for treating under-nutrition, the adoption of M. oleifera continues to increase. The “Diffusion of innovations theory” describes well the evidence for growth and adoption of dietary M. oleifera leaves, and it highlights the need for a scientific consensus on the nutritional benefits. […]
The regions most burdened by under-nutrition, (in Africa, Asia, Latin America, and the Caribbean) all share the ability to grow and utilize an edible plant, Moringa oleifera, commonly referred to as “The Miracle Tree.” For hundreds of years, traditional healers have prescribed different parts of M. oleifera for treatment of skin diseases, respiratory illnesses, ear and dental infections, hypertension, diabetes, cancer treatment, water purification, and have promoted its use as a nutrient dense food source. The leaves of M. oleifera have been reported to be a valuable source of both macro- and micronutrients and is now found growing within tropical and subtropical regions worldwide, congruent with the geographies where its nutritional benefits are most needed.
Anecdotal evidence of benefits from M. oleifera has fueled a recent increase in adoption of and attention to its many healing benefits, specifically the high nutrient composition of the plant’s leaves and seeds. Trees for Life, an NGO based in the United States, has promoted the nutritional benefits of Moringa around the world, and their nutritional comparison has been widely copied and is now taken on faith by many: “Gram for gram fresh leaves of M. oleifera have 4 times the vitamin A of carrots, 7 times the vitamin C of oranges, 4 times the calcium of milk, 3 times the potassium of bananas, ¾ the iron of spinach, and 2 times the protein of yogurt” (Trees for Life, 2005).
Feeding animals M. oleifera leaves results in both weight gain and improved nutritional status. However, scientifically robust trials testing its efficacy for undernourished human beings have not yet been reported. If the wealth of anecdotal evidence (not cited herein) can be supported by robust clinical evidence, countries with a high prevalence of under-nutrition might have at their fingertips, a sustainable solution to some of their nutritional challenges. […]
The “Diffusion of Innovations” theory explains the recent increase in M. oleifera adoption by various international organizations and certain constituencies within undernourished populations, in the same manner as it has been so useful in explaining the adoption of many of the innovative agricultural practices of the 1940s–1960s. […] A sigmoidal curve (Figure 1) illustrates the adoption process starting with innovators (traditional healers in the case of M. oleifera), who communicate with and influence early adopters (international organizations), who then broadcast over time new information on M. oleifera adoption, in the wake of which the adoption rate steadily increases.
{ Ecology of Food and Nutrition | Continue reading }
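The sigmoidal adoption curve the abstract refers to is a logistic function. A minimal sketch, with the growth rate and midpoint chosen arbitrarily for illustration (not fitted to any Moringa adoption data):

```python
import math

def adoption_fraction(t, k=0.5, t_mid=10.0):
    """Logistic (S-shaped) curve: fraction of eventual adopters at time t,
    with steepness k and midpoint t_mid."""
    return 1.0 / (1.0 + math.exp(-k * (t - t_mid)))

# Innovators adopt first, early adopters follow, then adoption accelerates:
for t in (0, 5, 10, 15, 20):
    print(t, round(adoption_fraction(t), 3))
```

The slow start corresponds to the innovators (traditional healers), the steep middle to the early adopters (international organizations) broadcasting the innovation.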
Dendrology, economics, food, drinks, restaurants, theory | September 1st, 2020 4:54 pm

Currently, we produce ∼10^21 digital bits of information annually on Earth. Assuming a 20% annual growth rate, we estimate that after ∼350 years from now, the number of bits produced will exceed the number of all atoms on Earth, ∼10^50. After ∼300 years, the power required to sustain this digital production will exceed 18.5 × 10^15 W, i.e., the total planetary power consumption today, and after ∼500 years from now, the digital content will account for more than half Earth’s mass, according to the mass-energy–information equivalence principle. Besides the existing global challenges such as climate, environment, population, food, health, energy, and security, our estimates point to another singular event for our planet, called information catastrophe.
{ AIP Advances | Continue reading }
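The paper’s headline figure is easy to sanity-check: compound ∼10^21 bits/year at 20% growth until annual production exceeds the ∼10^50 atoms on Earth.

```python
import math

bits_per_year_now = 1e21  # current annual digital-bit production (paper's estimate)
atoms_on_earth = 1e50     # approximate number of atoms on Earth
growth = 1.20             # assumed 20% annual growth rate

# Solve bits_per_year_now * growth**n >= atoms_on_earth for the smallest n.
years = math.ceil(math.log(atoms_on_earth / bits_per_year_now) / math.log(growth))
print(years)  # 367 -- the same order as the paper's ~350-year estimate
```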
It is estimated that a week’s worth of the New York Times contains more information than a person was likely to come across in a lifetime in the 18th century. […] The amount of new information is doubling every two years. By 2010, it’s predicted to double every 72 hours. […] The lunatic named Bobby Fischer “despised the media”: “They’re destroying reality, turning everything into media.” “News exceed reality” writes Thomas Bernhard somewhere. The saturation and repetitions in Basquiat’s paintings. The high-frequency trading. “an immense accumulation of nothing” (Imp Kerr, 2009). An immense accumulation of ignorance. […]
From what precedes it necessarily follows that the inescapable future of knowledge is banality, falsehood, and overabundance, which sum is a form of ignorance.
{ The New Inquiry | Continue reading }
eschatology, ideas, media | August 20th, 2020 5:48 am

Enjoying short-term pleasurable activities that don’t lead to long-term goals contributes at least as much to a happy life as self-control, according to new research. […]
simply sitting about more on the sofa, eating more good food and going to the pub with friends more often won’t automatically make for more happiness.
{ UZH | Continue reading }
eudaemonism | August 2nd, 2020 6:08 am

Parrondo’s paradox has been described as: A combination of losing strategies becomes a winning strategy. […]
Consider two games, Game A and Game B, with the following rules:
1. In Game A, you simply lose $1 every time you play.
2. In Game B, you count how much money you have left. If it is an even number, you win $3. Otherwise you lose $5.
Say you begin with $100 in your pocket. If you start playing Game A exclusively, you will obviously lose all your money in 100 rounds. Similarly, if you decide to play Game B exclusively, you will also lose all your money in 100 rounds.
However, consider playing the games alternatively, starting with Game B, followed by A, then by B, and so on (BABABA…). It should be easy to see that you will steadily earn a total of $2 for every two games.
Thus, even though each game is a losing proposition if played alone, because the results of Game B are affected by Game A, the sequence in which the games are played can affect how often Game B earns you money, and subsequently the result is different from the case where either game is played by itself.
{ Wikipedia | Continue reading }
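The alternating-game arithmetic described above can be checked directly. A short simulation of this simplified example (not of the original capital-dependent Parrondo games):

```python
def play_A(money):
    """Game A: lose $1 every time."""
    return money - 1

def play_B(money):
    """Game B: win $3 on an even balance, lose $5 on an odd one."""
    return money + 3 if money % 2 == 0 else money - 5

def run(sequence, start=100, rounds=100):
    """Play `rounds` games, cycling through `sequence` (e.g. "BA")."""
    games = {"A": play_A, "B": play_B}
    money = start
    for i in range(rounds):
        money = games[sequence[i % len(sequence)]](money)
    return money

print(run("A"))   # 0   -- Game A alone: broke after 100 rounds
print(run("B"))   # 0   -- Game B alone: broke after 100 rounds
print(run("BA"))  # 200 -- alternating BABA...: +$2 per pair of games
```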
ideas | July 19th, 2020 5:53 pm
What is the feasibility of survival on another planet and being self-sustaining? […] I show here that a mathematical model can be used to determine the minimum number of settlers and the way of life for survival on another planet, using Mars as the example. […] The minimum number of settlers has been calculated and the result is 110 individuals.
{ Nature | Continue reading }
eschatology, theory | June 25th, 2020 7:55 am
eudaemonism, kids | April 13th, 2020 9:37 am

David Silver [the creator of AlphaZero] hasn’t answered my question about whether machines can set up their own goals. He talks about subgoals, but that’s not the same. That’s a certain gap in his definition of intelligence. We set up goals and look for ways to achieve them. A machine can only do the second part.
So far, we see very little evidence that machines can actually operate outside of these terms, which is clearly a sign of human intelligence. Let’s say you accumulated knowledge in one game. Can it transfer this knowledge to another game, which might be similar but not the same? Humans can. With computers, in most cases you have to start from scratch.
{ Gary Kasparov/Wired | Continue reading }
photo { Kelsey Bennett }
chess, ideas, psychology | February 23rd, 2020 8:28 pm

The madman theory is a political theory commonly associated with U.S. President Richard Nixon’s foreign policy. He and his administration tried to make the leaders of hostile Communist Bloc nations think Nixon was irrational and volatile. According to the theory, those leaders would then avoid provoking the United States, fearing an unpredictable American response.
{ Wikipedia | Continue reading }
The author finds that perceived madness is harmful to general deterrence and is sometimes also harmful in crisis bargaining, but may be helpful in crisis bargaining under certain conditions.
{ British Journal of Political Science | Continue reading }
black smoke shells fitted with computer chips { Cai Guo-Qiang, Wreath (Black Ceremony), 2011 }
U.S., fights, theory | February 18th, 2020 4:47 pm

Founded in 1945 by University of Chicago scientists who had helped develop the first atomic weapons in the Manhattan Project, the Bulletin of the Atomic Scientists created the Doomsday Clock two years later, using the imagery of apocalypse (midnight) and the contemporary idiom of nuclear explosion (countdown to zero) to convey threats to humanity and the planet. The decision to move (or to leave in place) the minute hand of the Doomsday Clock is made every year by the Bulletin’s Science and Security Board in consultation with its Board of Sponsors, which includes 13 Nobel laureates. The Clock has become a universally recognized indicator of the world’s vulnerability to catastrophe from nuclear weapons, climate change, and disruptive technologies in other domains.
To: Leaders and citizens of the world
Re: Closer than ever: It is 100 seconds to midnight
Date: January 23, 2020
{ Bulletin of the Atomic Scientists | Continue reading }
eschatology | January 27th, 2020 9:19 pm
[W]hile time moves forward in our universe, it may run backwards in another, mirror universe that was created on the “other side” of the Big Bang.
{ PBS (2014) | Continue reading }
Physics, space, theory, time | January 23rd, 2020 6:54 pm
ideas, showbiz | January 22nd, 2020 4:55 pm

Using Music as Medicine – finding the optimum music listening ‘dosage’
There was a general agreement of dosage time across 3 of the 4 domains, with 11 minutes being the most common amount of time it took for people to receive the therapeutic benefit from their self-selected music preferences. The only exception was the domain of happiness, where the most common length of time for people to become happier after listening to their chosen music was reduced to 5 minutes, suggesting that happy music takes less time to take effect than other music.
{ British Academy of Sound Therapy.com | PDF | More }
photo { Sarah Illenberger }
eudaemonism, music, psychology | January 12th, 2020 5:07 pm

Most of the research on happiness has documented that income, marriage, employment and health affect happiness. Very few studies examine whether happiness itself affects income, marriage, employment and health. […] Findings show that happier Indonesians in 2007 earned more money, were more likely to be married, were less likely to be divorced or unemployed, and were in better health when the survey was conducted again seven years later.
{ Applied Research in Quality of Life | Continue reading }
image { Maurizio Cattelan and Pierpaolo Ferrari, Toilet Paper #1, June 2010 }
eudaemonism, psychology | November 30th, 2019 1:24 pm

English speakers have been deprived of a truly functional second-person plural pronoun since we let “ye” fade away a few hundred years ago.
“You” may address one person or a bunch, but it can be imprecise and unsatisfying. “You all”—as in “I’m talking to you all,” or “Hey, you all!”—sounds wordy and stilted. “You folks” or “you gang” both feel self-conscious. Several more economical micro-regional varieties (youz, yinz) exist, but they lack wide appeal.
But here’s what’s hard to explain: The first [“y’all”], a gender-neutral option, mainly thrives in the American South and hasn’t been able to steal much linguistic market share outside of its native habitat. The second [“you guys”], an undeniable reference to a group of men, is the default everywhere else, even when the “guys” in question are women, or when the speaker is communicating to a mixed gender group.
“You guys,” rolls off the tongues of avowed feminists every day, as if everyone has agreed to let one androcentric pronoun pass, while others (the generic “he” or “men” as stand-ins for all people) belong to the before-we-knew-better past. […]
One common defense of “you guys” that Mallinson encounters in the classroom and elsewhere is that it is gender neutral, simply because we use it that way. This argument also appeared in the New Yorker recently, in a column about a new book, The Life of Guy: Guy Fawkes, the Gunpowder Plot, and the Unlikely History of an Indispensable Word by writer and educator Allan Metcalf.
“Guy” grew out of the British practice of burning effigies of the Catholic rebel Guy Fawkes, Metcalf explains in the book. The flaming likenesses, first paraded in the early 1600s, came to be called “guys,” which evolved to mean a group of male lowlifes, he wrote in a recent story for Time. Then, by the 18th century, “guys” simply meant “men” without any pejorative connotations. By the 1930s, according to the Washington Post, Americans had made the leap to calling all persons “guys.”
{ Quartz | Continue reading }
Linguistics | November 29th, 2019 10:13 pm
kids, showbiz, time | November 14th, 2019 8:00 am

of course there is no behind the scenes, no real self, no authenticity, etc. just a precession of simulacra; influencers sort of serve the same function Baudrillard thought Disneyland served: to make everyone else feel “authentic”
{ Rob Horning }
ideas, social networks | September 16th, 2019 4:17 pm

[S]ome languages—such as Japanese, Basque, and Italian—are spoken more quickly than others. […]
Linguists have spent more time studying not just speech rate, but the effort a speaker has to exert to get a message across to a listener. By calculating how much information every syllable in a language conveys, it’s possible to compare the “efficiency” of different languages. And a study published today in Science Advances found that more efficient languages tend to be spoken more slowly. In other words, no matter how quickly speakers chatter, the rate of information they’re transmitting is roughly the same across languages.
The basic problem of “efficiency,” in linguistics, starts with the trade-off between effort and communication. It takes a certain amount of coordination, and burns a certain number of calories, to make noises come out of your mouth in an intelligible way. And those noises can be more or less informative to a listener, based on how predictable they are. If you and I are discussing dinosaurs, you wouldn’t be surprised to hear me rattle off the names of my favorite species. But if a stranger walks up to you on the street and announces, “Diplodocus!” it’s unexpected. It narrows the scope of possible conversation topics greatly and is therefore highly informative.
{ The Atlantic | Continue reading }
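The trade-off can be illustrated with toy numbers. Treating per-syllable information as the entropy of the syllable distribution (uniform here for simplicity), a fast language with few distinct syllables and a slow language with many can land on the same information rate. All inventories and speech rates below are invented for illustration, not measurements from the study:

```python
import math

def bits_per_syllable(n_syllables):
    """Entropy of a uniform distribution over n equally likely syllables."""
    return math.log2(n_syllables)

# "Fast" language: small syllable inventory, high speech rate.
fast_rate = 8.0 * bits_per_syllable(64)    # 8 syl/s * 6 bits/syl
# "Slow" language: rich syllable inventory, lower speech rate.
slow_rate = 6.0 * bits_per_syllable(256)   # 6 syl/s * 8 bits/syl

print(fast_rate, slow_rate)  # 48.0 48.0 -- same bits per second
```

In the study’s real data the syllable distributions are of course not uniform, but the compensating pattern (speech rate falling as per-syllable information rises) is the same.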
image { Six soap bubbles inside one another, from The Windsor Magazine, 1902 }
Linguistics | September 8th, 2019 8:51 am

Cooping was an alleged form of electoral fraud in the United States cited in relation to the death of Edgar Allan Poe in October 1849, by which unwilling participants were forced to vote, often several times over, for a particular candidate in an election. According to several of Poe’s biographers, these innocent bystanders would be grabbed off the street by so-called ‘cooping gangs’ or ‘election gangs’ working on the payroll of a political candidate, and they would be kept in a room, called the “coop”, and given alcoholic beverages in order for them to comply. If they refused to cooperate, they would be beaten or even killed. Often their clothing would be changed to allow them to vote multiple times. Sometimes the victims would be forced to wear disguises such as wigs, fake beards or mustaches to prevent them from being recognized by voting officials at polling stations.
{ Wikipedia | Continue reading }
On October 3, 1849, Edgar Allan Poe was found delirious on the streets of Baltimore, “in great distress, and… in need of immediate assistance”, according to Joseph W. Walker who found him. He was taken to the Washington Medical College where he died on Sunday, October 7, 1849 at 5:00 in the morning. He was not coherent long enough to explain how he came to be in his dire condition and, oddly, was wearing clothes that were not his own.
He is said to have repeatedly called out the name “Reynolds” on the night before his death, though it is unclear to whom he was referring.
All medical records and documents, including Poe’s death certificate, have been lost, if they ever existed.
Newspapers at the time reported Poe’s death as “congestion of the brain” or “cerebral inflammation”, common euphemisms for death from disreputable causes such as alcoholism.
The actual cause of death remains a mystery. […] One theory dating from 1872 suggests that cooping was the cause of Poe’s death, a form of electoral fraud in which citizens were forced to vote for a particular candidate, sometimes leading to violence and even murder. […] Cooping had become the standard explanation for Poe’s death in most of his biographies for several decades, though his status in Baltimore may have made him too recognizable for this scam to have worked. […]
Immediately after Poe’s death, his literary rival Rufus Wilmot Griswold wrote a slanted high-profile obituary under a pseudonym, filled with falsehoods that cast him as a lunatic and a madman, and which described him as a person who “walked the streets, in madness or melancholy, with lips moving in indistinct curses, or with eyes upturned in passionate prayers, (never for himself, for he felt, or professed to feel, that he was already damned)”.
The long obituary appeared in the New York Tribune signed “Ludwig” on the day that Poe was buried. It was soon republished throughout the country. The piece began, “Edgar Allan Poe is dead. He died in Baltimore the day before yesterday. This announcement will startle many, but few will be grieved by it.” “Ludwig” was soon identified as Griswold, an editor, critic, and anthologist who had borne a grudge against Poe since 1842. Griswold somehow became Poe’s literary executor and attempted to destroy his enemy’s reputation after his death.
{ Wikipedia | Continue reading }
books, flashback, mystery and paranormal, scams and heists | August 25th, 2019 2:46 pm

“The maximum speed required to break through the earth’s gravitational pull is seven miles a second,” says David Wojnarowicz. “Since economic conditions prevent us from gaining access to rockets or spaceships, we would have to learn to run awful fast to achieve escape from where we all are heading.”
{ The New Inquiry | Continue reading }
ideas | July 17th, 2019 6:49 am

In mid-1947, a United States Army Air Forces balloon crashed at a ranch near Roswell, New Mexico. Following wide initial interest in the crashed “flying disc”, the US military stated that it was merely a conventional weather balloon. Interest subsequently waned until the late 1970s, when ufologists began promoting a variety of increasingly elaborate conspiracy theories, claiming that one or more alien spacecraft had crash-landed and that the extraterrestrial occupants had been recovered by the military, which then engaged in a cover-up.
In the 1990s, the US military published two reports disclosing the true nature of the crashed object: a nuclear test surveillance balloon from Project Mogul.
{ Wikipedia | Continue reading }
photo { W. Eugene Smith, Untitled [man holding bottle, S-shaped foam form emerging from it], Springfield, Massachusetts, 1952 }
U.S., controversy, flashback | July 11th, 2019 8:07 am

Suppose you live in a deeply divided society: 60% of people strongly identify with Group A, and the other 40% strongly identify with Group B. While you plainly belong to Group A, you’re convinced this division is bad: It would be much better if everyone felt like they belonged to Group AB. You seek a cohesive society, where everyone feels like they’re on the same team.
What’s the best way to bring this cohesion about? Your all-too-human impulse is to loudly preach the value of cohesion. But on reflection, this is probably counter-productive. When members of Group B hear you, they’re going to take “cohesion” as a euphemism for “abandon your identity, and submit to the dominance of Group A.” None too enticing. And when members of Group A notice Group B’s recalcitrance, they’re probably going to think, “We offer Group B the olive branch of cohesion, and they spit in our faces. Typical.” Instead of forging As and Bs into one people, preaching cohesion tears them further apart.
What’s the alternative? Simple. Instead of preaching cohesion, reach out to Group B. Unilaterally show them respect. Unilaterally show them friendliness. They’ll be distrustful at first, but cohesion can’t be built in a day.
{ The Library of Economics and Liberty | Continue reading }
photo { Stephen Shore, Queens, New York, April 1972 }
ideas, photogs, psychology, relationships | June 16th, 2019 2:01 pm

The mind-body problem enjoyed a major rebranding over the last two decades and is generally known now as the “hard problem” of consciousness […] Fast forward to the present era and we can ask ourselves now: Did the hippies actually solve this problem? My colleague Jonathan Schooler of the University of California, Santa Barbara, and I think they effectively did, with the radical intuition that it’s all about vibrations … man. Over the past decade, we have developed a “resonance theory of consciousness” that suggests that resonance—another word for synchronized vibrations—is at the heart of not only human consciousness but of physical reality more generally. […]
Stephen Strogatz provides various examples from physics, biology, chemistry and neuroscience to illustrate what he calls “sync” (synchrony) […] Fireflies of certain species start flashing their little fires in sync in large gatherings of fireflies, in ways that can be difficult to explain under traditional approaches. […] The moon’s rotation is exactly synced with its orbit around the Earth such that we always see the same face. […]
The panpsychist argues that consciousness (subjectivity) did not emerge; rather, it’s always associated with matter, and vice versa (they are two sides of the same coin), but mind as associated with most of the matter in our universe is generally very simple. An electron or an atom, for example, enjoys just a tiny amount of consciousness. But as matter “complexifies,” so mind complexifies, and vice versa.
{ Scientific American | Continue reading | Thanks Tim }
brain, ideas, neurosciences | June 13th, 2019 2:03 pm

Despite variation in lifestyle and environment, first signs of human facial aging show between the ages of 20–30 years. It is a cumulative process of changes in the skin, soft tissue, and skeleton of the face. As quantifications of facial aging in living humans are still scarce, we set out to study age-related changes in three-dimensional facial shape using geometric morphometrics.
We collected surface scans of 88 human faces (aged 26–90 years) from the coastal town Split (Croatia) and neighboring islands. Based on a geometric morphometric analysis of 585 measurement points (landmarks and semi-landmarks), we modeled sex-specific trajectories of average facial aging.
Age-related facial shape change was similar in both sexes until around age 50, at which time the female aging trajectory turned sharply. The overall magnitude of facial shape change (aging rate) was higher in women than men, especially in early postmenopause. Aging was generally associated with a flatter face, sagged soft tissue (“broken” jawline), deeper nasolabial folds, smaller visible areas of the eyes, thinner lips, and longer nose and ears. In postmenopausal women, facial aging was best predicted by the years since last menstruation and mainly attributable to bone resorption in the mandible.
{ Physical Anthropology | Continue reading }
faces, science, time | June 13th, 2019 1:46 pm

Can events be accurately described as historic at the time they are happening?
Claims of this sort are in effect predictions about the evaluations of future historians; that is, that they will regard the events in question as significant.
Here we provide empirical evidence in support of earlier philosophical arguments that such claims are likely to be spurious and that, conversely, many events that will one day be viewed as historic attract little attention at the time.
{ Nature Human Behaviour | Continue reading }
photo { David Sims }
ideas, photogs | June 13th, 2019 12:33 pm
Picture some serious non-fiction tomes. The Selfish Gene; Thinking, Fast and Slow; Guns, Germs, and Steel; etc. Have you ever had a book like this—one you’d read—come up in conversation, only to discover that you’d absorbed what amounts to a few sentences? I’ll be honest: it happens to me regularly. Often things go well at first. I’ll feel I can sketch the basic claims, paint the surface; but when someone asks a basic probing question, the edifice instantly collapses. Sometimes it’s a memory issue: I simply can’t recall the relevant details. But just as often, as I grasp about, I’ll realize I had never really understood the idea in question, though I’d certainly thought I understood when I read the book. Indeed, I’ll realize that I had barely noticed how little I’d absorbed until that very moment.
{ Andy Matuschak | Continue reading }
books, experience | May 13th, 2019 10:19 am

Throughout her life, Bly—born Elizabeth Jane Cochran 155 years ago on May 5, 1864—refused to be what other people wanted her to be. That trait, some philosophers say, is the key to human happiness, and Bly’s life shows why.
{ Quartz | Continue reading }
oil on linen { Susannah Martin, Helium, 2017 }
eudaemonism, flashback | May 6th, 2019 8:01 am

S is a woman if and only if:
S is systematically subordinated along some dimension (economic, legal, political, social, etc) and S is ‘marked’ as a target for this treatment by observed or imagined bodily features presumed to be evidence of a female’s biological role in reproduction.
To be a woman is to be subordinated in some way because of real or imagined biological features that are meant to indicate one’s female role in reproduction.
{ Aeon | Continue reading }
ideas, relationships | May 1st, 2019 3:15 pm

According to the 2019 World Happiness Report, negative feelings are rising around the world—and the United States is particularly hard hit with an “epidemic of addictions.” Tellingly, the report also shows a widening happiness gap, with some people reporting much more well-being and others showing much less within each country. […]
Negative feelings—worry, sadness, and anger—have been rising around the world, up by 27 percent from 2010 to 2018. […]
“The U.S. is suffering an epidemic of addictions.” This includes an addiction to technology, which researcher Jean Twenge largely blames for the worrying mental health trends among U.S. adolescents. In her chapter of the report, she argues that screen time is displacing activities that are key to our happiness, like in-person social contact. Forty-five percent of adolescents are online “almost constantly,” and the average high school senior spends six hours a day texting, on social media or on the internet.
But we’re hooked on more than just technology. According to researcher Steve Sussman, around half of Americans suffer from at least one addiction. Some of the most prevalent are alcohol, food, and work—which each affect around 10 percent of adults—as well as drugs, gambling, exercise, shopping, and sex.
There’s another possible explanation for unhappiness, though: Governments are losing their way. […] According to survey results since 2005, people across the globe are more satisfied with life when their governments are more effective, enforce the rule of law, have better regulation, control corruption, and spend in certain ways—more on health care and less on military.
{ Yes | Continue reading }
eudaemonism, psychology | May 1st, 2019 10:54 am

Where Does Time Go When You Blink?
Retinal input is frequently lost because of eye blinks, yet humans rarely notice these gaps in visual input. […]
Here, we investigated whether the subjective sense of time is altered by spontaneous blinks. […]
The results point to a link between spontaneous blinks, previously demonstrated to induce activity suppression in the visual cortex, and a compression of subjective time.
{ bioRxiv | Continue reading }
photo { Helmut Newton, A cure for a black eye, Jerry Hall, 1974 }
eyes, time | April 18th, 2019 12:50 pm
Based on the analysis of 190 studies (17,887 participants), we estimate that the average silent reading rate for adults in English is 238 words per minute (wpm) for non-fiction and 260 wpm for fiction. The difference can be predicted by the length of the words, with longer words in non-fiction than in fiction. The estimates are lower than the numbers often cited in scientific and popular writings. […] The average oral reading rate (based on 77 studies and 5,965 participants) is 183 wpm.
{ PsyArXiv | Continue reading }
Linguistics | April 13th, 2019 10:26 am

The Twelve Labours of Heracles are a series of episodes concerning a penance carried out by Heracles, the greatest of the Greek heroes, whose name was later romanised as Hercules. They were accomplished over 12 years in the service of King Eurystheus.
[…]
Driven mad by Hera (queen of the gods), Hercules slew his son, daughter, and wife Megara. After recovering his sanity, Hercules deeply regretted his actions; he was purified by King Thespius, then traveled to Delphi to inquire how he could atone for his actions. Pythia, the Oracle of Delphi, advised him to go to Tiryns and serve his cousin King Eurystheus for twelve years, performing whatever labors Eurystheus might set him; in return, he would be rewarded with immortality.
[…]
Eurystheus originally ordered Hercules to perform ten labours. Hercules accomplished these tasks, but Eurystheus refused to recognize two: the slaying of the Lernaean Hydra, as Hercules’ nephew and charioteer Iolaus had helped him; and the cleansing of the Augean stables, because Hercules accepted payment for the labour. Eurystheus set two more tasks (fetching the Golden Apples of Hesperides and capturing Cerberus), which Hercules also performed, bringing the total number of tasks to twelve.
[…]
The twelve labours:
1. Slay the Nemean lion.
2. Slay the nine-headed Lernaean Hydra.
3. Capture the Ceryneian Hind.
4. Capture the Erymanthian Boar.
5. Clean the Augean stables in a single day.
6. Slay the Stymphalian birds.
7. Capture the Cretan Bull.
8. Steal the Mares of Diomedes.
9. Obtain the girdle of Hippolyta.
10. Obtain the cattle of the monster Geryon.
11. Steal the apples of the Hesperides.
12. Capture and bring back Cerberus.
{ Wikipedia | Continue reading }
helmet, acrylic and crayon { Jean-Michel Basquiat, AARON, 1981 }
allegories, flashback | April 13th, 2019 10:20 am

As we shall see, the story of the great flood and the voyage of the ark contains so many incredible “violations of the laws of nature” that it cannot possibly be accepted by any thinking person. […]
From the moment the impending storm is announced (Genesis 6:7, 13, 17) and Jehovah sets forth the design and dimensions of the ark (Genesis 6:14-16), problems start appearing. […]
The ark is to be made out of gopher wood according to a plan that calls for the ark to be three hundred cubits long, fifty cubits wide, and thirty cubits tall (450 × 75 × 45 feet, according to most creationists. See Segraves, p. 11). It is to contain three floors, a large door in the side, and a one cubit square window at the top. The floors are to be divided into rooms, and all the walls, inside and out, are to be pitched with pitch. Since the purpose of the ark is to hold animals and plants, particularly two of “every living thing of all flesh . . . to keep them alive with thee” (Genesis 6:19), it will have to be constructed accordingly.
Before he could even contemplate such a project, Noah would have needed a thorough education in naval architecture and in fields that would not arise for thousands of years such as physics, calculus, mechanics, and structural analysis. There was no shipbuilding tradition behind him, no experienced craftspeople to offer advice. Where did he learn the framing procedure for such a Brobdingnagian structure? How could he anticipate the effects of roll, pitch, yaw, and slamming in a rough sea? How did he solve the differential equations for bending moment, torque, and shear stress? […]
As if the rough construction of the ship weren’t headache enough, the internal organization had to be honed to perfection. With space at a premium every cubit had to be utilized to the maximum; there was no room for oversized cages and wasted space. The various requirements of the myriads of animals had to be taken into account in the design of their quarters, especially considering the length of the voyage. The problems are legion: feeding and watering troughs need to be the correct height for easy access but not on the floor where they will get filthy; the cages for horned animals must have bars spaced properly to prevent their horns from getting stuck, while rhinos require round “bomas” for the same reason; a heavy leather body sling is “indispensable” for transporting giraffes; primates require tamper-proof locks on their doors; perches must be the correct diameter for each particular bird’s foot (Hirst; Vincent). Even the flooring is important, for, if it is too hard, hooves may be injured, if too soft, they may grow too quickly and permanently damage ankles (Klos); rats will suffer decubitus (ulcers) with improper floors (Orlans), and ungulates must have a cleated surface or they will slip and fall (Fowler). These and countless other technical problems all had to be resolved before the first termite crawled aboard, but there were no wildlife management experts available for consultation. Even today the transport requirements of many species are not fully known, and it would be physically impossible to design a single carrier to meet them all. […]
Genetic problems […]
Marine animals […]
Having drawn up a passenger list, the next order of business is to gather them all at dockside. At this point, the creationists themselves are unable to propound any sort of scenario in which Noah and his sons could perform such a feat, so they resort to the convenient dumping ground of the inexplicable: miracles. God himself intervened by implanting in the chosen pair from each species the instinct of migration, and by this mechanism they gathered from the four corners of the world and headed for the Plains of Shinar […] However accurate their suddenly acquired instinct, for many animals it could not have been enough to overcome the geographical barriers between them and the ark. The endemic fauna of the New World, Australia, and other remote regions, as well as animals unable to survive the Near Eastern environment, would find the journey too difficult no matter how desperately they yearned to go. Flood theorists are unperturbed by such obstacles, however, for they simply gerrymander the map to give us an antediluvian world of undivided continents and a uniform, semitropical, spring-like climate.
{ Creation/Evolution Journal | Continue reading }
art { Nobuhiko Yoshida, from JCA Annual 4, 1982 }
allegories, animals, archives | April 10th, 2019 12:51 pm

Ten years after the financial dramas of Autumn 2008, I take stock of what we have learned, what we have done, and what we have yet to do if we would avoid a repeat performance.
The primary lessons I draw are that income and wealth distribution, the endogeneity of credit-money, and finance system structure all matter profoundly not only where justice, but also where systemic stability is concerned.
The longer-term tasks still before us include a much broader and financially engineered diffusion of capital ownership over our population, citizen central banking, a permanent national investment authority, continuous public open labor market operations, debt-free or low-debt education and health insurance, and an updated form of segregating capital-raising primary from asset-trading secondary markets in the financial sector.
Shorter-term tasks include debt-forgiveness, a restoration of labor rights and countercyclical progressive taxation, and restored citizen-ownership of our secondary market makers in home mortgage and higher education debt.
{ LawArXiv | Continue reading }
U.S., economics, theory | March 12th, 2019 7:27 am

GPT-2 is trained with a simple objective: predict the next word, given all of the previous words within some text. […] GPT-2 displays a broad set of capabilities, including the ability to generate conditional synthetic text samples of unprecedented quality, where we prime the model with an input and have it generate a lengthy continuation. […]
SYSTEM PROMPT (HUMAN-WRITTEN)
In a shocking finding, scientist discovered a herd of unicorns living in a remote, previously unexplored valley, in the Andes Mountains. Even more surprising to the researchers was the fact that the unicorns spoke perfect English.
MODEL COMPLETION (MACHINE-WRITTEN, 10 TRIES)
The scientist named the population, after their distinctive horn, Ovid’s Unicorn. These four-horned, silver-white unicorns were previously unknown to science. Now, after almost two centuries, the mystery of what sparked this odd phenomenon is finally solved. […] While examining these bizarre creatures the scientists discovered that the creatures also spoke some fairly regular English. Pérez stated, “We can see, for example, that they have a common ‘language,’ something like a dialect or dialectic.” Dr. Pérez believes that the unicorns may have originated in Argentina, where the animals were believed to be descendants of a lost race of people who lived there before the arrival of humans in those parts of South America. While their origins are still unclear, some believe that perhaps the creatures were created when a human and a unicorn met each other in a time before human civilization. According to Pérez, “In South America, such incidents seem to be quite common.”
{ Open AI | Continue reading }
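The next-word objective described above can be illustrated with a toy counting model. This is a minimal sketch for intuition only: GPT-2 itself is a large Transformer that conditions on the full preceding context, and the corpus, function names, and greedy decoding here are invented for the example.

```python
from collections import Counter, defaultdict

# Toy corpus standing in for training text (invented for this sketch).
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each word (a bigram model; GPT-2
# conditions on all previous words, not just the last one).
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent continuation of `word` in the corpus."""
    return follows[word].most_common(1)[0][0]

def generate(seed, n=5):
    """Greedily extend `seed` by repeatedly predicting the next word,
    mimicking how a primed model produces a continuation."""
    out = [seed]
    for _ in range(n):
        out.append(predict_next(out[-1]))
    return " ".join(out)

print(predict_next("the"))  # "cat": it follows "the" most often here
```

Scaling this idea up, with learned representations in place of raw counts, is what lets a model primed with an input "generate a lengthy continuation" as in the unicorn sample above.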
related { The technology behind OpenAI’s fiction-writing, fake-news-spewing AI, explained }
more { Japanese scientists used A.I. to read minds + NONE of these people exist | Thanks Tim }
quote { Who is Descartes’ Evil Genius? }
Linguistics, robots & ai | February 16th, 2019 10:28 am

Given that the fictional narratives found in novels, movies, and television shows enjoy wide public consumption, memorably convey information, minimize counter-arguing, and often emphasize politically-relevant themes, we argue that greater scholarly attention must be paid to theorizing and measuring how fiction affects political attitudes.
We argue for a genre-based approach for studying fiction effects, and apply it to the popular dystopian genre.
Results across three experiments are striking: we find consistent evidence that dystopian narratives enhance the willingness to justify radical—especially violent—forms of political action. […]
Our research not only reinforces past work showing that people often fail to distinguish between fact and fiction in learning about the world, but also illustrates that the lessons of fiction may not be what they seem. […] Rather than creating political cynicism in readers and viewers or showing them that girls can be powerful too—both lessons that are at this point probably amply supplied by the American news media and lived experience—dystopian fiction seems to be teaching them a more subtle and perhaps more concerning message: that violence and illegal activities may be both legitimate and necessary to pursue justice. Dystopian fiction appears to subtly expand the political imagination of viewers and readers to encompass a range of scenarios outside the normal realm of democratic politics, and what people then consider reasonable and thinkable appears to expand accordingly.
These results should also highlight the peril for political scientists in assuming that fiction is just entertainment. The stories we tell ourselves have profound implications for how we think about political ethics and political possibilities, and as scholars of politics, we can and should do more to map out the effects of politically-inflected fiction and entertainment.
{ Cambridge Core | Continue reading }
still { Harriet Andersson in Ingmar Bergman’s Summer with Monika, 1953 }
ideas, media | December 22nd, 2018 4:08 pm

One of the curious features of language is that it varies from one place to another.
Even among speakers of the same language, regional variations are common, and the divide between these regions can be surprisingly sharp. […]
For example, the term “you guys” is used most often in the northern parts of the US, while “y’all” is used more in the south.
{ Technology Review | Continue reading }
Linguistics, U.S. | November 29th, 2018 7:39 am

Let us consider a counterfactual history in which Szilard invents nuclear fission and realizes that a nuclear bomb could be made with a piece of glass, a metal object, and a battery arranged in a particular configuration. What happens next? Szilard becomes gravely concerned. He sees that his discovery must be kept secret at all costs. But how? His insight is bound to occur to others. He could talk to a few of his physicist friends, the ones most likely to stumble upon the idea, and try to persuade them not to publish anything on nuclear chain reactions or on any of the reasoning steps leading up to the dangerous discovery. (That is what Szilard did in actual history.)
Here Szilard faces a dilemma: either he doesn’t explain the dangerous discovery, but then he will not be effective in persuading many of his colleagues to stop publishing; or he tells them the reason for his concern, but then he spreads the dangerous knowledge further. Either way he is fighting a losing battle. The general advance of scientific knowledge will eventually make the dangerous insight more accessible. Soon, figuring out how to initiate a nuclear chain reaction with pieces of metal, glass, and electricity will no longer take genius but will be within reach of any STEM student with an inventive mindset.
The situation looks hopeless, but Szilard does not give up. He decides to take a friend into his confidence, a friend who is also the world’s most famous scientist—Albert Einstein. He successfully persuades Einstein of the danger (again following actual history). Now Szilard has the support of a man who can get him a hearing with any government. The two write a letter to President Franklin D. Roosevelt. After some committee wranglings and report-writing, the top levels of the U.S. government are eventually sufficiently convinced to be ready to take serious action.
What the U.S. government did, after having digested the information provided by Einstein and Szilard, […] was to launch the Manhattan Project in order to weaponize nuclear fission as quickly as possible. […]
But how would things have played out if there had been an easy way to make nukes? Maybe Szilard and Einstein could persuade the U.S. government to ban all research in nuclear physics (outside high-security government facilities)? […] Let us suppose that President Roosevelt could somehow mobilize enough political support to drive through a ban, and that the U.S. Supreme Court could somehow find a way of regarding it as constitutionally valid. We then confront an array of formidable practical difficulties. All university physics departments would have to be closed, and security checks initiated. A large number of faculty and students would be forced out. Intense speculations would swirl concerning the reason for all these heavy-handed measures. Groups of physics PhD students and faculty banned from their research field would sit around and speculate about what the secret danger might be. Some of them would figure it out. And among those who figured it out, some would feel compelled to use the knowledge to impress their colleagues; and those colleagues would want to tell yet others, to show they were in the know. Alternatively, somebody who opposed the ban would unilaterally decide to publish the secret, maybe in order to support their view that the ban is ineffective or that the benefits of publication outweigh the risks. […] Even if, by some miracle, the secret never leaked in the United States, scientists in other countries would independently discover it, thereby multiplying the sources from which it could spread. […]
An alternative approach would be to eliminate all glass, metal, or sources of electrical current. Given the ubiquity of these materials, such an undertaking would be extremely daunting. […] Metal use is almost synonymous with civilization, and would not be a realistic target for elimination. Glass production could be banned, and existing glass panes confiscated; but pieces of glass would remain scattered across the landscape for a long time. Batteries and magnets could be seized, though some people would have stashed away these materials before they could be collected by the authorities. […]
We now know that one cannot trigger a nuclear explosion with just a sheet of glass, some metal, and a battery. Making an atomic bomb requires several kilograms of fissile material, which is difficult to produce. We pulled out a grey ball that time. Yet with each act of invention, we reach into the urn anew.
Let us introduce the hypothesis that the urn of creativity contains at least one black ball. We can refer to this as the vulnerable world hypothesis. Intuitively, the hypothesis is that there is some level of technology at which civilization almost certainly gets destroyed unless quite extraordinary and historically unprecedented degrees of preventive policing and/or global governance are implemented.
{ Nick Bostrom | PDF }
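Bostrom’s urn metaphor is easy to make quantitative: if each new invention independently has some small probability of being a black ball, the chance of having avoided one decays geometrically with the number of draws. A minimal sketch, where the per-draw probability is an arbitrary illustrative value and not a figure from the paper:

```python
def survival_probability(p_black, n_draws):
    """Chance that none of n_draws independent draws from the urn is a
    black ball, when each draw is black with probability p_black."""
    return (1.0 - p_black) ** n_draws

# A tiny per-invention risk still compounds as technology accumulates:
# with p = 0.001, survival drops below 40% after a thousand draws.
for n in (10, 100, 1000):
    print(f"after {n:4d} draws: {survival_probability(0.001, n):.3f}")
```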
related { Nick Bostrom on the Great Filter | PDF }
eschatology, ideas | November 22nd, 2018 6:30 am

Achieving most goals in everyday life requires persistence. Despite an abundance of relevant theoretical and empirical work, no theory details the universal causes of persistence (and non-persistence) across all goal types and settings. To address this gap in the literature, we introduce the Continuing and Returning Model of persistence. […]
[T]he goals that people pursue in daily life are varied and complex. These goals may be concrete or abstract, short-term or of indefinite length. They vary in difficulty, importance, and in how they work with or against other goals that a person holds. Some goals are intended to be started and finished in one continuous episode, but other goals are episodic, meaning that pursuit occurs across multiple episodes. People’s most important goals (the ones that they most want to accomplish) are typically not achieved in one episode of pursuit.
Everyday goals can be episodic for many reasons. Goals can be episodic by their nature (e.g., “write daily”), but more often they are episodic by choice. Breaking a goal into smaller tasks can be beneficial, as can taking breaks to rest and recover. People manage many goals in daily life and minimally have goals to eat, sleep, and maintain their social relationships. People often cannot focus on one goal at a time; instead, they manage multiple goals, allocating their time and attention among competing behavioral choice options. Thus, factors unrelated to focal goals can affect their persistence. Even goals intended to be pursued in one continuous episode (e.g., writing an email) are often deliberately stopped or interrupted by factors unrelated to someone’s motivation for the focal goal such scheduled events, other prioritized goals, and extraneous thoughts. The potentially episodic nature of everyday goal pursuit and the fact that everyday persistence can be influenced by factors external to the goal has implications for how and why persistence occurs and fails to occur. […]
In contrast to the intuitive notion that persistence is inherently good and reflective of commitment to the focal goal or to one’s personal character, the Continuing and Returning Model depicts persistence as a goal pursuit process that is dynamic, multiply determined, and separable from goal attainment and other metrics of success. Persistence is not a binary behavior but rather a process that is affected by many factors. Furthermore, persistence is not always adaptive. Indeed, sometimes persistence reflects inefficient goal pursuit. Similarly, non-persistence isn’t always maladaptive. People who stop pursuing a goal are often prioritizing or accommodating other important goals, people, or events.
{ PsyArXiv | Continue reading }
related { How The Brain Switches Between Different Sets Of Rules }
oil on wood panel { Leonardo da Vinci, Lady with an Ermine, 1488-1490 }
experience, ideas | November 21st, 2018 2:58 pm

The ship of Theseus is a thought experiment that raises the question of whether a ship that has had all of its components replaced remains fundamentally the same object.
Suppose that the famous ship sailed by the hero Theseus in a great battle has been kept in a harbour as a museum piece. As the years go by some of the wooden parts begin to rot and are replaced by new ones. After a century or so, all of the parts have been replaced. Is the “restored” ship still the same object as the original?
{ Wikipedia | Continue reading }
Does the Human Body Really Replace Itself Every 7 Years?
Recent research has confirmed that different tissues in the body replace cells at different rates, and some tissues never replace cells. So the statement that we replace every cell in the body every seven years or every ten years is wrong. […]
Neurons in the cerebral cortex are never replaced.
Fat cells are replaced at the rate of about 10% per year in adults. […]
Cardiomyocyte cells [muscle cells of the heart] are replaced at a reducing rate as we age. At age 25, about 1% of cells are replaced every year. Replacement slows gradually to about 0.5% at age 70. Even in people who have lived a very long life, less than half of the cardiomyocyte cells have been replaced. Those that aren’t replaced have been there since birth.
{ Ask a Naturalist | Continue reading }
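The quoted rates are enough to check the “less than half” claim. A minimal sketch, assuming turnover is random and the rate falls linearly between the two quoted figures (the linear interpolation is my assumption, not the article’s): the fraction of birth cells still present is exp(−∫rate).

```python
import math

def replacement_rate(age):
    """Hypothetical annual cardiomyocyte turnover: 1%/yr up to age 25,
    declining linearly to 0.5%/yr at age 70 (flat afterwards)."""
    if age <= 25:
        return 0.01
    if age >= 70:
        return 0.005
    return 0.01 - 0.005 * (age - 25) / 45

# Fraction of birth cells replaced by age 70, treating turnover as random:
surviving = math.exp(-sum(replacement_rate(a) for a in range(70)))
replaced = 1 - surviving  # comes out well under 0.5, matching "less than half"
```

Under these assumptions roughly 45% of the cells present at birth have been replaced by 70, consistent with the excerpt.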
Your lungs are six weeks old - and your taste buds just ten days! […]
Liver cells only have a life span of around 150 days. […] “I can take 70 per cent of a person’s liver away in an operation and around 90 per cent of it will grow back within two months,” explains David Lloyd, consultant liver surgeon at Leicester Royal Infirmary. […]
Your eyes are one of the few body parts that don’t really change during your life. The only part that is constantly being renewed is the cornea, the transparent top layer.
{ Daily Mail | Continue reading }
A collection of the replacement rates of different cells in our body:

[…]
We note that hair elongates at about 1 cm per month while fingernails grow at about 0.3 cm per month, which is about the same speed as the continental spreading in plate tectonics that increases the distance between North America and Europe.
{ Cell Biology by the Numbers | Continue reading }
For decades, scientists believed that neurogenesis—the creation of new neurons—whirs along nicely in the brains of embryos and infants, but grinds to a halt by adulthood. But from the 1980s onward, this dogma started to falter. Researchers showed that neurogenesis does occur in the brains of various adult animals, and eventually found signs of newly formed neurons in the adult human brain. Hundreds of these cells are supposedly added every day to the hippocampus—a comma-shaped structure involved in learning and memory. The concept of adult neurogenesis is now so widely accepted that you can find diets and exercise regimens that purportedly boost it.
The trouble is: This stream of fresh neurons might not actually exist.
In a new study, and one of the biggest yet, a team led by Arturo Alvarez-Buylla at the University of California at San Francisco completely failed to find any trace of young neurons in dozens of hippocampus samples, collected from adult humans.
{ The Atlantic, March 2018 | Continue reading }
People as old as 79 may still generate new brain cells, US researchers said Thursday. […] Using autopsied brain samples from 28 people who died suddenly between the ages of 14-79, researchers looked at “newly formed neurons and the state of blood vessels within the entire human hippocampus soon after death.” […]
A study last month led by Arturo Alvarez-Buylla of the University of California in San Francisco found the opposite, however.
{ Medical Express, April 2018 | Continue reading }
The generation of cells in the human body has been difficult to study, and our understanding of cell turnover is limited. Testing of nuclear weapons resulted in a dramatic global increase in the levels of the isotope 14C in the atmosphere, followed by an exponential decrease after 1963.
We show that the level of 14C in genomic DNA closely parallels atmospheric levels and can be used to establish the time point when the DNA was synthesized and cells were born. We use this strategy to determine the age of cells in the cortex of the adult human brain and show that whereas nonneuronal cells are exchanged, occipital neurons are as old as the individual, supporting the view that postnatal neurogenesis does not take place in this region.
{ Cell | PDF }
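The dating logic in the abstract can be sketched numerically. Assuming, purely for illustration, that the post-1963 atmospheric ¹⁴C excess decays exponentially with a ~16-year time constant (the shape is from the excerpt; the constant is my assumption), a measured excess in genomic DNA can be inverted to a synthesis year:

```python
import math

PEAK_YEAR = 1963.0   # atmospheric bomb-pulse maximum
TAU = 16.0           # assumed e-folding time (years) of the post-peak decline

def atmospheric_excess(year):
    """Relative atmospheric 14C excess (1.0 at the 1963 peak)."""
    if year < PEAK_YEAR:
        raise ValueError("sketch only covers the post-peak decline")
    return math.exp(-(year - PEAK_YEAR) / TAU)

def birth_year_from_excess(measured_excess):
    """Invert the decay curve: when was DNA with this excess synthesized?"""
    return PEAK_YEAR - TAU * math.log(measured_excess)

# DNA carrying 29% of the peak excess was synthesized around 1983.
year = birth_year_from_excess(0.29)
```

A neuron whose DNA never shows less than the excess of the owner’s birth year is, by this logic, as old as the individual.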
If the cells of our skin are replaced regularly, why do scars and tattoos persist indefinitely?
The cells in the superficial or upper layers of skin, known as the epidermis, are constantly replacing themselves. This process of renewal is basically exfoliation (shedding) of the epidermis. But the deeper layers of skin, called the dermis, do not go through this cellular turnover and so do not replace themselves. Thus, foreign bodies, such as tattoo dyes, implanted in the dermis will remain.
{ Scientific American | PDF }
inkjet and acrylic on canvas { imp kerr (b.1980), not confirmed as alive, 59th st, nyc, 1977, 2018 }
ideas, science | November 16th, 2018 2:48 pm

People often conduct visual searches in which multiple targets are possible (e.g., medical x-rays can contain multiple abnormalities). In this type of search, observers are more likely to miss a second target after having found a first one.
{ PsyArXiv | Continue reading }
The streetlight effect, or the drunkard’s search principle, is a type of observational bias that occurs when people only search for something where it is easiest to look.
{ Wikipedia | Continue reading }
Modern scientific instruments can extensively process “observations” before they are presented to the human senses, and particularly with computerized instruments, there is sometimes a question as to where in the data processing chain “observing” ends and “drawing conclusions” begins. This has recently become an issue with digitally enhanced images published as experimental data in papers in scientific journals. The images are enhanced to bring out features that the researcher wants to emphasize, but this also has the effect of supporting the researcher’s conclusions.
{ Wikipedia | Continue reading }
related { Errors/Biases in Clinical Decision Making }
photo { Richard Avedon, Christy Turlington for Revlon, 1990 }
ideas | November 5th, 2018 1:40 pm

“What are the odds, if everything is random?” Wang wondered.
In a new paper, Wang investigates whether “hot-streak” periods are more than just a lucky coincidence. […]
Looking at the career histories of thousands of scientists, artists, and film directors, the team found evidence that hot streaks are both real and ubiquitous, with virtually everyone experiencing one at some point in their career. While the timing of an individual’s greatest successes is indeed random, their top hits are highly likely to appear in close proximity. […]
“If we know where your best work is, then we know very well where your second-best work is, and your third,” he says, “because they’re just around the corner.”
{ Kellogg School of Management | Continue reading }
economics, strategy | August 20th, 2018 4:29 am

The new “eyes wide shut” illusion uses a standard enlarging (shaving or makeup) mirror. Close one eye and look at the closed eye in the mirror; the eye should take up most of the mirror. Switch eyes to see the other closed eye. Switch back and forth a few times, then open both eyes. You see an open eye. Which eye is it? To find out, close one eye. Whichever you close, that’s the eye you see. How can this be possible? The brain is fusing two images of the two eyes.
{ Perception | Continue reading | Thanks Brad! }
However, no one has hitherto laid down the limits to the powers of the body, that is, no one has as yet been taught by experience what the body can accomplish solely by the laws of nature, in so far as she is regarded as extension. No one hitherto has gained such an accurate knowledge of the bodily mechanism, that he can explain all its functions; nor need I call attention to the fact that many actions are observed in the lower animals, which far transcend human sagacity, and that somnambulists do many things in their sleep, which they would not venture to do when awake: these instances are enough to show, that the body can by the sole laws of its nature do many things which the mind wonders at.
Again, no one knows how or by what means the mind moves the body, nor how many various degrees of motion it can impart to the body, nor how quickly it can move it.
{ Spinoza, Ethics, III, Proposition II, Scholium | Continue reading }
unrelated { eye colour may not be a priority when choosing a partner }
brain, eyes, spinoza | August 9th, 2018 2:46 pm

The familiarity of the phrase ‘much ado about nothing’ belies its complexity. In Shakespeare’s day ‘nothing’ was pronounced the same as ‘noting’, and the play contains numerous punning references to ‘noting’, both in the sense of observation and in the sense of ‘notes’ or messages. […]
‘Nothing’ was Elizabethan slang for the vagina (a vacancy, ‘no-thing’ or ‘O thing’). Virginity — a state of potentiality rather than actuality — is also much discussed in the play, and it is these twin absences — the vagina and virginity — that lead, in plot terms, to the ‘much ado’ of the title.
{ The Guardian | Continue reading }
photo { Olivia Rocher, I Fought the Law (Idaho), 2016 }
Linguistics, allegories, poetry, sex-oriented | August 1st, 2018 3:26 pm

What happens when we unexpectedly see an attractive potential partner? Previous studies in laboratorial settings suggest that the visualization of attractive and unattractive photographs influences time. The major aim of this research is to study time perception and attraction in a realistic social scenario, by investigating if changes in subjective time measured during a speed dating are associated with attraction. […]
When there is a perception of the partner as being physically more attractive, women tend to overestimate the duration of that meeting, whereas men tend to underestimate its duration.
{ University of Minho | Continue reading }
relationships, time | July 30th, 2018 5:18 pm

Two theoretical frameworks have been proposed to account for the representation of truth and falsity in human memory: the Cartesian model and the Spinozan model. Both models presume that during information processing a mental representation of the information is stored along with a tag indicating its truth value. However, the two models disagree on the nature of these tags. According to the Cartesian model, true information receives a “true” tag and false information receives a “false” tag. In contrast, the Spinozan model claims that only false information receives a “false” tag, whereas untagged information is automatically accepted as true. […]
The results of both experiments clearly contradict the Spinozan model but can be explained in terms of the Cartesian model.
{ Memory & Cognition | PDF }
art { Richard Long, Dusty Boots Line, The Sahara, 1988 }
neurosciences, spinoza | July 8th, 2018 9:42 am

Let’s begin with a simple fact: time passes faster in the mountains than it does at sea level. The difference is small but can be measured with precision timepieces that can be bought today for a few thousand pounds. This slowing down can be detected between levels just a few centimetres apart: a clock placed on the floor runs a little more slowly than one on a table.
It is not just the clocks that slow down: lower down, all processes are slower. Two friends separate, with one of them living in the plains and the other going to live in the mountains. They meet up again years later: the one who has stayed down has lived less, aged less, the mechanism of his cuckoo clock has oscillated fewer times. He has had less time to do things, his plants have grown less, his thoughts have had less time to unfold … Lower down, there is simply less time than at altitude. […]
Times are legion: a different one for every point in space. The single quantity “time” melts into a spiderweb of times.
{ Guardian | Continue reading }
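The effect Rovelli describes is gravitational time dilation; to first order near Earth’s surface, a clock raised by height h runs fast by a fractional factor of about gh/c². A rough back-of-the-envelope sketch (constants standard, the scenario illustrative):

```python
G_SURFACE = 9.81        # m/s^2, Earth surface gravity
C = 299_792_458.0       # m/s, speed of light

def fractional_rate_gain(height_m):
    """First-order gravitational time dilation near Earth's surface:
    a clock raised by height_m runs fast by roughly g*h/c^2."""
    return G_SURFACE * height_m / C**2

SECONDS_PER_YEAR = 365.25 * 86400
# Living 1,000 m up, the friend in the mountains ages extra by
# a few microseconds per year:
extra = fractional_rate_gain(1000) * SECONDS_PER_YEAR
```

Tiny, but exactly the kind of difference the article says precision timepieces can now resolve over centimetres.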
photo { Julie Blackmon }
time | April 16th, 2018 9:29 am

With a few minor exceptions, there are really only two ways to say “tea” in the world. One is like the English term—té in Spanish and tee in Afrikaans are two examples. The other is some variation of cha, like chay in Hindi.
Both versions come from China. How they spread around the world offers a clear picture of how globalization worked before “globalization” was a term anybody used. The words that sound like “cha” spread across land, along the Silk Road. The “tea”-like phrasings spread over water, by Dutch traders bringing the novel leaves back to Europe.
{ Quartz | Continue reading }
art { Josef Albers, Interaction of Color, 1963 }
Linguistics, food, drinks, restaurants | February 16th, 2018 12:34 pm

Here, we analysed 200 million online conversations to investigate transmission between individuals. We find that the frequency of word usage is inherited over conversations, rather than only the binary presence or absence of a word in a person’s lexicon. We propose a mechanism for transmission whereby for each word someone encounters there is a chance they will use it more often. Using this mechanism, we measure that, for one word in around every hundred a person encounters, they will use that word more frequently. As more commonly used words are encountered more often, this means that it is the frequencies of words which are copied.
{ Journal of the Royal Society Interface | Continue reading }
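The proposed mechanism (a roughly 1-in-100 chance of adopting each word you hear) can be sketched as a toy simulation; the vocabulary, counts, and conversation sizes below are illustrative, not from the study:

```python
import random

ADOPT_PROB = 0.01  # the paper's measured rate: ~1 adoption per 100 encounters

def converse(speaker, listener, n_words=100, rng=random):
    """Speaker emits n_words drawn from their usage-frequency table; for each
    heard word the listener bumps their own count with probability ADOPT_PROB."""
    vocab = list(speaker)
    weights = [speaker[w] for w in vocab]
    for word in rng.choices(vocab, weights=weights, k=n_words):
        if rng.random() < ADOPT_PROB:
            listener[word] = listener.get(word, 0) + 1

rng = random.Random(42)
speaker = {"cool": 80, "groovy": 20}   # usage counts, not just presence/absence
listener = {"cool": 1, "groovy": 1}
for _ in range(1000):
    converse(speaker, listener, rng=rng)
# "cool" is heard four times as often, so it is adopted roughly four times
# as often: the *frequencies* of words, not just the lexicon, are copied.
```

This is the point of the excerpt: transmission acts on how often a word is used, not merely on whether it is known.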
Linguistics | February 15th, 2018 1:11 pm

Grapheme-color synesthesia is a neurological phenomenon in which viewing a grapheme elicits an additional, automatic, and consistent sensation of color.
Color-to-letter associations in synesthesia are interesting in their own right, but also offer an opportunity to examine relationships between visual, acoustic, and semantic aspects of language. […]
Numerous studies have reported that for English-speaking synesthetes, “A” tends to be colored red more often than predicted by chance, and several explanatory factors have been proposed that could explain this association.
Using a five-language dataset (native English, Dutch, Spanish, Japanese, and Korean speakers), we compare the predictions made by each explanatory factor, and show that only an ordinal explanation makes consistent predictions across all five languages, suggesting that the English “A” is red because the first grapheme of a synesthete’s alphabet or syllabary tends to be associated with red.
We propose that the relationship between the first grapheme and the color red is an association between an unusually-distinct ordinal position (“first”) and an unusually-distinct color (red).
{ Cortex | Continue reading }
A Black, E white, I red, U green, O blue: vowels,
Someday I shall tell of your mysterious births
{ Arthur Rimbaud | Continue reading }
art { Roland Cat, The pupils of their eyes, 1985 }
Linguistics, colors, neurosciences | February 12th, 2018 1:27 pm

If you write clearly, then your readers may understand your mathematics and conclude that it isn’t profound. Worse, a referee may find your errors. Here are some tips for avoiding these awful possibilities.
1. Never explain why you need all those weird conditions, or what they mean. For example, simply begin your paper with two pages of notations and conditions without explaining that they mean that the varieties you are considering have zero-dimensional boundary. In fact, never explain what you are doing, or why you are doing it. The best-written paper is one in which the reader will not discover what you have proved until he has read the whole paper, if then.
2. Refer to another obscure paper for all the basic (nonstandard) definitions you use, or never explain them at all. This almost guarantees that no one will understand what you are talking about.
[…]
11. If all else fails, write in German.
{ J.S. Milne | Continue reading }
photos { Left: William Henry Jackson, Pike’s Peak from the Garden of the Gods, Colorado Midland Series, ca.1880 | Right: Ye Rin Mok }
guide, ideas, mathematics | January 25th, 2018 1:37 pm

I argue that the state of boredom (i.e., the transitory and non-pathological experience of boredom) should be understood to be a regulatory psychological state that has the capacity to promote our well-being by contributing to personal growth and to the construction (or reconstruction) of a meaningful life.
{ Philosophical Psychology }
oil on canvas { Piet Mondrian, Composition in Black and White, with Double Lines, 1934 }
ideas | January 25th, 2018 1:05 pm

Several temporal paradoxes exist in physics. These include General Relativity’s grandfather and ontological paradoxes and Special Relativity’s Langevin-Einstein twin-paradox. General relativity paradoxes can exist due to a Gödel universe that follows Gödel’s closed timelike curves solution to Einstein’s field equations.
A novel biological temporal paradox of General Relativity is proposed based on reproductive biology’s phenomenon of heteropaternal fecundation. Herein, dizygotic twins from two different fathers are the result of concomitant fertilization during one menstrual cycle. In this case an Oedipus-like individual exposed to a Gödel closed timelike curve would sire a child during his maternal fertilization cycle.
As a consequence of heteropaternal superfecundation, he would father his own dizygotic twin and would therefore generate a new class of autofraternal superfecundation, and by doing so creating a ‘twin-father’ temporal paradox.
{ Progress in Biophysics & Molecular Biology | Continue reading }
ideas, time | January 22nd, 2018 3:23 pm

More recently we have supertasks such as Benardete’s Paradox of the Gods,
A man decides to walk one mile from A to B. A god waits in readiness to throw up a wall blocking the man’s further advance when the man has travelled ½ a mile. A second god (unknown to the first) waits in readiness to throw up a wall of his own blocking the man’s further advance when the man has travelled ¼ mile. A third god … etc. ad infinitum. (Benardete 1964, pp. 259-60)
Since for any place after A, a wall would have stopped him reaching it, the traveller cannot move from A. The gods have kept him still without ever raising a wall. Yet how could they cause him to stay still without causally interacting with him? Only a wall can stop him and no wall is ever raised, since for each wall he must reach it for it to be raised but he would have been stopped at an earlier wall. So he can move from A.
[…]
In the Nothing from Infinity paradox we will see an infinitude of finite masses and an infinitude of energy disappear entirely, and do so despite the conservation of energy in all collisions. I then show how this leads to the Infinity from Nothing paradox, in which we have the spontaneous eruption of infinite mass and energy out of nothing. […]
{ European Journal for Philosophy of Science | Continue reading }
photo { Alex Prager, Crowd #2 (Emma), 2012 }
ideas | January 19th, 2018 3:56 pm

Whereas women of all ages prefer slightly older sexual partners, men—regardless of their age—have a preference for women in their 20s. Earlier research has suggested that this difference between the sexes’ age preferences is resolved according to women’s preferences. This research has not, however, sufficiently considered that the age range of considered partners might change over the life span.
Here we investigated the age limits (youngest and oldest) of considered and actual sex partners in a population-based sample of 2,655 adults (aged 18-50 years). Over the investigated age span, women reported a narrower age range than men and women tended to prefer slightly older men. We also show that men’s age range widens as they get older: While they continue to consider sex with young women, men also consider sex with women of their own age or older.
Contrary to earlier suggestions, men’s sexual activity thus reflects also their own age range, although their potential interest in younger women is not likely converted into sexual activity. Homosexual men are more likely than heterosexual men to convert a preference for young individuals into actual sexual behavior, supporting female-choice theory.
{ Evolutionary Psychology | PDF }
related { Longest ever personality study finds no correlation between measures taken at age 14 and age 77 }
photo { Joan Crawford photographed by George Hurrell, 1932 }
relationships, time | February 7th, 2017 12:53 pm

Postmodernism has, to a large extent, run its course [despite having made the considerable innovation of presenting] the first text that was highly self-conscious, self-conscious of itself as text, self-conscious of the writer as persona, self-conscious about the effects that narrative had on readers and the fact that the readers probably knew that. […] A lot of the schticks of post-modernism — irony, cynicism, irreverence — are now part of whatever it is that’s enervating in the culture itself.
{ David Foster Wallace | Continue reading }
photo { Francesca Woodman, Self-portrait at 13, Boulder, Colorado, 1972 | Photography tends not to have prodigies. Woodman, who committed suicide in 1981 at age 22, is considered a rare exception. | NY Review of Books | full story }
ideas, photogs | February 7th, 2017 12:35 pm

This paper approaches the subject of God or a supernatural being that created the universe from a mathematical and physical point of view. It sets up a hypothesis that when the God who existed before the Big Bang as an unconscious being became conscious, the energy produced during the process became a Cosmic Egg of both extreme density and infinite temperature, which exploded to create the current universe. This assumption is demonstrated by mathematical formulas and the laws of physics, which provide a solid scientific foundation for the aforementioned theory.
{ International Education and Research Journal | Continue reading }
art { Jean-Michel Basquiat, Head, 1981 }
theory | October 31st, 2016 12:47 pm

When used in speech, hesitancies can indicate a pause for thought; when read in a transcript, they indicate uncertainty. In a series of experiments the perceived uncertainty of the transcript was shown to be higher than the perceived uncertainty of the spoken version, with almost no overlap for any respondent.
{ arXiv | Continue reading }
art { Paul Klee, After the drawing 19/75, 1919 }
ideas | September 20th, 2016 7:48 am

For hundreds of years, Koreans have used a different method to count age than most of the world. […] A person’s Korean age goes up a year on new year’s day, not on his or her birthday. So when a baby is born on Dec. 31, he or she actually turns two the very next day.
{ Quartz | Continue reading }
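The counting rule in the excerpt reduces to simple arithmetic: you are 1 at birth and gain a year every January 1st. A minimal illustrative helper:

```python
def korean_age(birth_year, current_year):
    """Traditional Korean age: 1 in the year of birth, +1 every Jan. 1st."""
    return current_year - birth_year + 1

# A baby born Dec. 31, 2015 is 1 that day and turns 2 the very next day:
assert korean_age(2015, 2015) == 1
assert korean_age(2015, 2016) == 2
```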
asia, time | September 16th, 2016 8:59 am

Is our perceptual experience a veridical representation of the world or is it a product of our beliefs and past experiences? Cognitive penetration describes the influence of higher level cognitive factors on perceptual experience and has been a debated topic in philosophy of mind and cognitive science.
{ Consciousness and Cognition | Continue reading }
photo { Can you think a thought which isn’t yours? A remarkable new study suggests you can }
ideas, psychology | June 24th, 2016 8:23 am

Cioffi endorses the Oxford comma, the one before and in a series of three or more. On the question of whether none is singular or plural, he is flexible: none can mean not a single one and take a singular verb, or it can mean not any and take a plural verb. His sample “None are boring” (from the New Yorker, where I work) was snipped from a review of a show of photographs by Richard Avedon. Cioffi would prefer the singular in this instance — “None is boring” — arguing that it “emphasizes how not a single, solitary one of these Avedon photographs is boring”. To me, putting so much emphasis on the photos’ not being boring suggests that the critic was hoping for something boring. I would let it stand. […]
“that usually precedes elements that are essential to your sentence’s meaning [restrictive], while which typically introduces ‘nonessential’ elements [non-restrictive], and usually refers to the material directly before it.” Americans sometimes substitute which for that, thinking it makes us sound more proper (i.e. British). On both sides of the Atlantic, the classic nonrestrictive which is preceded by a comma.
{ The Times Literary Supplement | Continue reading }
Linguistics | June 13th, 2016 11:30 am

The publication of Richard Krafft-Ebbing’s masterwork Psychopathia Sexualis in 1886 represented a landmark in thinking about human sexuality and the bizarre forms that it can take. In addition to describing different types of sexual expression that the author regarded as “perverse” (usually any form of sex that didn’t lead to procreation), it quickly became one of the most influential books on human sexuality ever written and introduced numerous new terms into common usage. One of these terms was “masochism,” which Krafft-Ebbing defined as the opposite of sadism (which he also coined). While the latter is the desire to cause pain and use force, the former is the wish to suffer pain and be subjected to force.
One person in particular who was less than pleased with the new term was the Austrian author, Leopold von Sacher-Masoch. Krafft-Ebbing justified naming this new sexual anomaly after the prominent author whom he described as “the poet of Masochism” due to his erotic writings and because of his own eccentric personal life. […]
Venus in Furs, the short novel for which Sacher-Masoch is best known, was published in 1870, and has become an erotic classic in its own right. In this book, the hero Severin asks to be treated as a slave and to be abused by Wanda (the “Venus in furs” of the story). The fact that Sacher-Masoch often acted out these fantasies in real-life with his wives and mistresses was not well-known. […]
It may be a coincidence that his health went into a decline shortly after Psychopathia Sexualis came out, but by March of 1895 he was delusional and violent. After he attempted to kill his then-wife Hulda, she arranged for him to be discreetly moved to an asylum in Lindheim, Hesse. Although his official obituary states that he died that year, there are claims that Sacher-Masoch lived on as an anonymous asylum inmate and actually died years later.
{ Providentia | Continue reading }
Onomastics, flashback, psychology | June 13th, 2016 11:20 am

With tens or even hundreds of billions of potentially habitable planets within our galaxy, the question becomes: are we alone?
Many scientists and commentators equate “more planets” with “more E.T.s”. However, the violence and instability of the early formation and evolution of rocky planets suggests that most aliens will be extinct fossil microbes.
Just as dead dinosaurs don’t walk, talk or breathe, microbes that have been fossilised for billions of years are not easy to detect by the remote sampling of exoplanetary atmospheres.
In research published [PDF] in the journal Astrobiology, we argue that early extinction could be the cosmic default for life in the universe. This is because the earliest habitable conditions may be unstable. […] Inhabited planets may be rare in the universe, not because emergent life is rare, but because habitable environments are difficult to maintain during the first billion years.
Our suggestion that the universe is filled with dead aliens might disappoint some, but the universe is under no obligation to prevent disappointment.
{ The Conversation | Continue reading }
previously { Where is the Great Filter? Behind us, or not behind us? If the filter is in our past, there must be some extremely improbable step in the sequence of events whereby an Earth-like planet gives rise to an intelligent species comparable in its technological sophistication to our contemporary human civilization. }
still { The Day the Earth Stood Still, 1951 }
ideas, mystery and paranormal, space | June 8th, 2016 10:59 am

In “Rat Ethics” I am primarily concerned with moral arguments about the rat, in particular, Rattus norvegicus. I argue that there is a complex bias against the animal which reduces it to ‘a pest, vermin, or mischievous’. This predominant bias against rats is a product of cultural stereotyping rather than objective reasoning. A cultural and philosophical examination of the rat can expose and provide grounds for rejecting this bias. I argue that the three main types of rats we encounter (i.e., liminal, research, companion) should be given full moral consideration and determine certain basic moral rights which are distinct to each encounter. I examine the Norway rat from a historical, cultural, philosophical, and practical perspective. I conclude that we must re-evaluate our moral relations with this animal and democratically support the basic rights its moral liberation demands. The fundamental rights of all rats are: 1) the moral right to have reasonable consideration, and 2) the moral right to freedom from unnecessary suffering. Further, contract-based rights are suggested for companion rats, which take the form of additional regulation regarding breeders, retailers, and consumers.
{ Joshua Duffy | Continue reading }
images { ad for The Rats Are Coming! The Werewolves Are Here!, 1972 | Rat Fink by Adam Cruz }
animals, ideas | June 7th, 2016 4:17 am

As Goethe observed in 1797, “the publisher always knows the profit to himself and his family whereas the author is totally in the dark.” This problem of lopsided information was aggravated by the near-absence of copyright protection in the 18th and 19th century. A bestseller could be expected to spawn an abundance of pirated versions. Charles Dickens, on his first trip to the United States in 1842, complained endlessly about the pirating of his works for the U.S. market. This lack of intellectual property protection led to further conflicts of interest and opinion between authors and publishers: it was standard practice among publishers — even respectable ones — to have multiple print runs without an author’s permission, and writers sometimes tried to sell near-identical editions of the same title to multiple publishers. Because authors couldn’t trust the sales numbers if and when their publishers provided them, 19th-century book contracts were for a fixed fee rather than per-copy royalty payments. […]
Goethe engineered the following mechanism […]
I am inclined to offer Mr. Vieweg from Berlin an epic poem, Hermann and Dorothea, which will have approximately 2000 hexameters. …Concerning the royalty we will proceed as follows: I will hand over to Mr. Counsel Böttiger [Goethe’s lawyer] a sealed note which contains my demand, and I wait for what Mr. Vieweg will suggest to offer for my work. If his offer is lower than my demand, then I take my note back, unopened, and the negotiation is broken. If, however, his offer is higher, then I will not ask for more than what is written in the note to be opened by Mr. Böttiger.
Scholars had treated Goethe’s proposition as one of the enigmas left behind by one of history’s greatest literary figures. But the economists argue that there’s no mystery to Goethe’s choice of mechanism. The author wanted to know how much he was worth to Vieweg, and he devised this peculiar “auction” to get Vieweg to tell him.
{ The Millions | Continue reading }
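In modern terms, Goethe’s sealed note is a second-price-style mechanism: the price never depends on Vieweg’s own offer, only on whether it clears the hidden reserve, so offering his true valuation is Vieweg’s best strategy. A hypothetical sketch (the amounts are invented; treating a tied offer as a deal is a simplification the letter leaves ambiguous):

```python
def goethe_auction(reserve, offer):
    """Goethe's sealed note holds `reserve`; Vieweg responds with `offer`.
    The deal happens iff offer >= reserve, at a price equal to the reserve."""
    if offer >= reserve:
        return reserve   # Vieweg pays Goethe's demand, never his own bid
    return None          # the note is taken back unopened; no deal

# Bidding one's true value is optimal: overbidding risks winning at a price
# above one's value, underbidding risks losing a profitable deal, and the
# price paid never depends on the bid itself.
assert goethe_auction(1000, 1200) == 1000
assert goethe_auction(1000, 900) is None
```

That is the economists’ point: the mechanism induces Vieweg to reveal what Goethe was worth to him.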
books, economics | June 1st, 2016 6:27 am

Have you heard the one about the biologist, the physicist, and the mathematician? They’re all sitting in a cafe watching people come and go from a house across the street. Two people enter, and then some time later, three emerge. The physicist says, “The measurement wasn’t accurate.” The biologist says, “They have reproduced.” The mathematician says, “If now exactly one person enters the house then it will be empty again.”
{ Nautilus | Continue reading }
Physics, ideas | May 4th, 2016 5:05 am

The Devil looks you in the eyes and offers you a bet. Pick a number and if you successfully guess the total he’ll roll on two dice you get to keep your soul. If any other number comes up, you go to burn in eternal hellfire.
You call “7” and the Devil rolls the dice.
A two and a four, so the total is 6 — that’s bad news.
But let’s not dwell on the incandescent pain of your infinite and inescapable future; let’s think about your choice immediately before the dice were rolled.
Did you make a mistake? Was choosing “7” an error?
In one sense, obviously yes. You should have chosen 6.
But in another important sense you made the right choice. There are more combinations of dice outcomes that add to 7 than to any other number. The chances of winning if you bet 7 are higher than for any other single number.
The distinction is between a particular choice which happens to be wrong, and a choice strategy which is actually as good as you can do in the circumstances. If we replace the Devil’s Wager with the situations the world presents you, and your choice of number with your actions in response, then we have a handle on what psychologists mean when they talk about “cognitive error” or “bias”.
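The arithmetic behind that choice strategy is quick to verify; a short Python sketch (mine, not from the original piece):

```python
from collections import Counter

# All 36 equally likely outcomes of two dice, tallied by total.
totals = Counter(a + b for a in range(1, 7) for b in range(1, 7))

# The total with the most ways of occurring is the best single guess.
best_guess, ways = max(totals.items(), key=lambda kv: kv[1])
print(best_guess, ways, ways / 36)  # 7 is the most likely total: 6 of 36 ways
```

Calling 7 wins one roll in six; every other single number does worse.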
{ Mind Hacks | Continue reading }
ideas, psychology | April 27th, 2016 1:21 pm

The classic argument is that those of our ancestors who saw more accurately had a competitive advantage over those who saw less accurately and thus were more likely to pass on their genes that coded for those more accurate perceptions, so after thousands of generations we can be quite confident that we’re the offspring of those who saw accurately, and so we see accurately. That sounds very plausible. But I think it is utterly false. It misunderstands the fundamental fact about evolution, which is that it’s about fitness functions — mathematical functions that describe how well a given strategy achieves the goals of survival and reproduction. […]
Evolution has shaped us with perceptions that allow us to survive. But part of that involves hiding from us the stuff we don’t need to know. And that’s pretty much all of reality, whatever reality might be.
{ Quanta | Continue reading }
evolution, ideas | April 25th, 2016 12:38 pm

The Sorrows of Young Werther was published in 1774, when Goethe (1749–1832) was just twenty-five years old. A product of true literary genius, it not only represents one of the greatest works of literature ever written, but it also offers keenly intuitive insight into one of the most terrible and mystifying emotional disorders that plague humankind.
Well before Sigmund Freud, and most probably destined to become an important source of Freud’s understanding of melancholic depression, Goethe was able to peer into the soul of those afflicted with what is now termed Major Depressive Disorder (and some forms of Bipolar Disorder) and see what is taking place within those who are suffering from it. It is impressive how clearly Goethe grasped the twin roles played in melancholia of narcissistic object choice and extreme ambivalence toward a love object.
{ The Psychoanalytic Quarterly | PDF }
books, ideas, psychology, relationships | February 16th, 2016 12:39 pm

The problem we will address can be characterized in either one of two ways. The first is this: why do people pursue art that evokes negative emotions, when they tend to avoid things that evoke such emotions? The emphasis here is on the disagreeable nature of certain mental states. The second characterization emphasizes the disagreeable nature of their causes (which are also, typically, their objects): why do we appreciate tragic events in art when we don’t appreciate tragic events in life? […]
We think both questions involved in the paradox can be answered with reference to the fact that sad art acknowledges sad aspects of life. […] Acknowledging involves recognizing, giving credit, honoring, or doing justice. We think that sad art does just this for its subject matter. In this respect, works of sad art have much in common with monuments to real life tragedies. The difference is that since sad art typically touches on universal themes, it ‘commemorates’ not only specific events, but general aspects of life. […]
The acknowledgement theory says that people derive pleasure from the fact that certain aspects of life are acknowledged in works of art, and answers the question why we pursue tragic art with reference to this pleasure.
{ Philosophical Studies | Continue reading }
beaux-arts, ideas | February 15th, 2016 12:33 pm

After 2.5 millennia of philosophical deliberation and psychological experimentation, most scholars have concluded that humor arises from incongruity. We highlight 2 limitations of incongruity theories of humor.
First, incongruity is not consistently defined. The literature describes incongruity in at least 4 ways: surprise, juxtaposition, atypicality, and a violation.
Second, regardless of definition, incongruity alone does not adequately differentiate humorous from nonhumorous experiences.
We suggest revising incongruity theory by proposing that humor arises from a benign violation: something that threatens a person’s well-being, identity, or normative belief structure but that simultaneously seems okay.
Six studies, which use entertainment, consumer products, and social interaction as stimuli, reveal that the benign violation hypothesis better differentiates humorous from nonhumorous experiences than common conceptualizations of incongruity. A benign violation conceptualization of humor improves accuracy by reducing the likelihood that joyous, amazing, and tragic situations are inaccurately predicted to be humorous.
{ Journal of Personality and Social Psychology }
photo { William Klein }
Linguistics, psychology | January 21st, 2016 1:55 pm

Physicist Enrico Fermi famously asked the question “Where are they?” to express his surprise over the absence of any signs for the existence of other intelligent civilizations in the Milky Way Galaxy. […]
Observations have shown that the Milky Way contains no fewer than a billion Earth-size planets orbiting Sun-like (or smaller) stars in the “Goldilocks” region that allows for liquid water to exist on the planet’s surface (the so-called habitable zone). Furthermore, the search for extraterrestrial intelligent life has recently received a significant boost in the form of “Breakthrough Listen”—a $100-million decade-long project aimed at searching for non-natural transmissions in the electromagnetic bandwidth from 100 megahertz to 50 gigahertz.
Simple life appeared on Earth almost as soon as the planet cooled sufficiently to support water-based organisms. To be detectable from a distance, however, life has to evolve to the point where it dominates the planetary surface chemistry and has significantly changed the atmosphere, creating chemical “biosignatures” that can in principle be detected remotely. For instance, Earth itself would probably not have been detected as a life-bearing planet during the first two billion years of its existence. […]
[A]n excellent first step in the quest for signatures of simple extrasolar life in the relatively near future would be to search for oxygen, but to try to back it up with other biosignatures. […]
One would ideally like to go beyond biosignatures and seek the clearest sign of an alien technological civilization. This could be the unambiguous detection of an intelligent, non-natural signal, most notably via radio transmission, the aim of the SETI (Search for Extraterrestrial Intelligence) program. Yet there is a distinct possibility that radio communication might be considered archaic to an advanced life form. Its use might have been short-lived in most civilizations, and hence rare over large volumes of the universe. What might then be a generic signature? Energy consumption is a hallmark of an advanced civilization that appears to be virtually impossible to conceal. […]
More pessimistically, biologically-based intelligence may constitute only a very brief phase in the evolution of complexity, followed by what futurists have dubbed the “singularity”—the dominance of artificial, inorganic intelligence. If this is indeed the case, most advanced species are likely not to be found on a planet’s surface (where gravity is helpful for the emergence of biological life, but is otherwise a liability). But they probably must still be near a fuel supply, namely a star, because of energy considerations. Even if such intelligent machines were to transmit a signal, it would probably be unrecognizable and non-decodable to our relatively primitive organic brains.
{ Scientific American | Continue reading }
space, theory | January 13th, 2016 3:03 pm

…the differences between “U” (Upper-class) and “non-U” (Middle Class) usages […]
The genteel offer ale rather than beer; invite one to step (not come) this way; and assist (never help) one another to potatoes. […]
When Prince William and Kate Middleton split up in 2007, the press blamed it on Kate’s mother’s linguistic gaffes at Buckingham Palace, where she reputedly responded to the Queen’s How do you do? with the decidedly non-U Pleased to meet you (the correct response being How do you do?), and proceeded to ask to use the toilet (instead of the U lavatory).
{ The Conversation | Continue reading }
Linguistics | January 12th, 2016 1:14 pm

Under ancient Jewish law, if a suspect on trial was unanimously found guilty by all judges, then the suspect was acquitted. This reasoning sounds counterintuitive, but the legislators of the time had noticed that unanimous agreement often indicates the presence of systemic error in the judicial process, even if the exact nature of the error is yet to be discovered. They intuitively reasoned that when something seems too good to be true, most likely a mistake was made.
[A] team of researchers has further investigated this idea, which they call the “paradox of unanimity.” […] The researchers demonstrated the paradox in the case of a modern-day police line-up, in which witnesses try to identify the suspect out of a line-up of several people. The researchers showed that, as the group of unanimously agreeing witnesses increases, the chance of them being correct decreases until it is no better than a random guess.
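A toy Bayesian model (my simplification, not the paper’s actual formulation) reproduces the effect: as the unanimous group grows, the systemic-error explanation comes to dominate, and confidence in the identification falls toward chance:

```python
def p_correct_given_unanimous(n, p=0.8, e=0.01, k=5):
    """With probability e the line-up procedure is systemically biased:
    every witness names the same person, who is right only at the chance
    level 1/k. Otherwise witnesses are independent, each correct with
    probability p. (Coincidental unanimous errors in the unbiased case
    are ignored for simplicity.)"""
    unbiased = (1 - e) * p ** n   # unanimous because everyone is right
    biased = e                    # unanimous because of the bias
    return (unbiased + biased / k) / (unbiased + biased)

# Confidence falls toward 1/k = 0.2 (random guessing) as n grows.
for n in (1, 3, 10, 30):
    print(n, round(p_correct_given_unanimous(n), 3))
```

The intuition: honest unanimity becomes exponentially unlikely as witnesses are added, so a large unanimous panel is better explained by a broken procedure than by a run of independent correct identifications.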
{ Phys.org | Continue reading }
ideas | January 7th, 2016 4:09 pm

A basic feature of psychological processes is their irreversibility. Every experience changes a person in a way that cannot be completely undone… one must assume that persons are continuously and irreversibly changing. […]
The logic of inductive inference entails that what is observed under given conditions at one time will occur again under the same conditions at a later time. But this logic can only be applied when it is possible to replicate the same initial conditions, and this is strictly impossible in the case of irreversible processes.
As a result, no psychological theory can attain the status of a “law”, and no result will be perfectly replicable.
{ Neuroskeptic | Continue reading }
ideas, psychology | January 3rd, 2016 8:31 am

The present research examined whether possessing multiple social identities (i.e., groups relevant to one’s sense of self) is associated with creativity. In Study 1, the more identities individuals reported having, the more names they generated for a new commercial product (i.e., greater idea fluency). […] Results suggest that possessing multiple social identities is associated with enhanced creativity via cognitive flexibility.
{ Personality and Social Psychology Bulletin | Continue reading }
photo { Gregory Miller }
ideas, photogs, psychology | December 29th, 2015 4:00 pm

In Women After All: Sex, Evolution, and the End of Male Supremacy, Melvin Konner argues that male domination is an anomaly of human history, not a natural state for the human species. Specifically, Konner suggests that male supremacy is largely an effect of an oppressive social arrangement, namely civilization, which began with the invention of agriculture when humans began to form permanent settlements. Permanent settlements enabled men to accumulate resources and allowed population densities to increase mainly through higher birth rates. Higher population densities placed more intense pressure on the land’s resources. Therefore, it became necessary for men to form coalitions with neighbors to defend against intruders. Power became concentrated in the hands of a few men, leading to a stratified society where male supremacy and female subordination reigned and male violence and war intensified. Konner argues that today technology limits the need for the muscle and strength of men, and that male domination has outlived its purpose and is maladaptive. Therefore, empowering women is the next step in human evolution. Through empowering women, equality between the sexes will be restored and man-made disasters, such as wars, sex scandals, and financial corruption, will significantly decrease or be eliminated since women (who Konner claims are less emotional than men) will be in positions of leadership and power.
{ Evolutionary Psychology | Continue reading }
evolution, flashback, theory | December 12th, 2015 9:40 am

Penrose and many others argue from practical considerations, Gödel’s theorem, and on philosophical grounds, that consciousness or awareness is non-algorithmic and so cannot be generated by a system that can be described by classical physics, such as a conventional computer, but could perhaps be generated by a system requiring a quantum (Hilbert space) description. Penrose suspects that aspects of quantum physics not yet understood might be needed to explain consciousness. In this paper we shall see that only known quantum physics is needed to explain perception.
{ James A. Donald | Continue reading }
photo { Martin Parr }
Physics, mystery and paranormal, theory | December 3rd, 2015 1:40 pm

We now have four good Darwinian reasons for individuals to be altruistic, generous or ‘moral’ towards each other. First, there is the special case of genetic kinship. Second, there is reciprocation: the repayment of favours given, and the giving of favours in ‘anticipation’ of payback. Following on from this there is, third, the Darwinian benefit of acquiring a reputation for generosity and kindness. And fourth, if Zahavi is right, there is the particular additional benefit of conspicuous generosity as a way of buying unfakeably authentic advertising.
{ Richard Dawkins | Continue reading }
photo { Todd Fisher }
evolution, theory | November 13th, 2015 2:27 pm

Most people own things that they don’t really need. It is worth thinking about why. […]
A policy aimed at curbing luxury shopping might involve higher marginal tax rates or, as a more targeted intervention, a consumption tax. As it becomes harder to afford a Rolex, people will devote more money to pleasures that really matter. Less waste, more happiness.
{ Boston Review | Continue reading }
photo { Teale Coco by Ben Simpson }
economics, ideas | November 5th, 2015 8:55 am

An influential theory about the malleability of memory comes under scrutiny in a new paper in the Journal of Neuroscience.
The ‘reconsolidation’ hypothesis holds that when a memory is recalled, its molecular trace in the brain becomes plastic. On this view, a reactivated memory has to be ‘saved’ or consolidated all over again in order for it to be stored.
A drug that blocks memory formation (‘amnestic’) will, therefore, not just block new memories but will also cause reactivated memories to be forgotten, by preventing reconsolidation.
This theory has generated a great deal of research interest and has led to speculation that blocking reconsolidation could be used as a tool to ‘wipe’ human memories.
However, Gisquet-Verrier et al. propose that amnestic drugs don’t in fact block reconsolidation, but instead add an additional element to a reactivated memory trace. This additional element is a memory of the amnestic itself – essentially, ‘how it feels’ to be intoxicated with that drug.
In other words, the proposal is that amnestics tag memories with ‘amnestic-intoxication’ which makes these memories less accessible due to the phenomenon of state dependent recall. This predicts that the memories could be retrieved by giving another dose of the amnestic.
So, Gisquet-Verrier et al. are saying that (sometimes) an ‘amnestic’ drug can actually improve memory.
{ Neuroskeptic | Continue reading }
related { Kids can remember tomorrow what they forgot today }
memory, theory | September 22nd, 2015 2:41 pm

In 1734, in Scotland, a 23-year-old was falling apart.
As a teenager, he’d thought he had glimpsed a new way of thinking and living, and ever since, he’d been trying to work it out and convey it to others in a great book. The effort was literally driving him mad. His heart raced and his stomach churned. He couldn’t concentrate. Most of all, he just couldn’t get himself to write his book. His doctors diagnosed vapors, weak spirits, and “the Disease of the Learned.” Today, with different terminology but no more insight, we would say he was suffering from anxiety and depression. […]
The young man’s name was David Hume. Somehow, during the next three years, he managed not only to recover but also, remarkably, to write his book. Even more remarkably, it turned out to be one of the greatest books in the history of philosophy: A Treatise of Human Nature.
In his Treatise, Hume rejected the traditional religious and philosophical accounts of human nature. Instead, he took Newton as a model and announced a new science of the mind, based on observation and experiment. That new science led him to radical new conclusions. He argued that there was no soul, no coherent self, no “I.” “When I enter most intimately into what I call myself,” he wrote in the Treatise, “I always stumble on some particular perception or other, of heat or cold, light or shade, love or hatred, pain or pleasure. I never can catch myself at any time without a perception, and never can observe any thing but the perception.” […]
Until Hume, philosophers had searched for metaphysical foundations supporting our ordinary experience, an omnipotent God or a transcendent reality outside our minds. But Hume undermined all that. When you really look hard at everything we think we know, he argued, the foundations crumble. […]
Ultimately, the metaphysical foundations don’t matter. Experience is enough all by itself. What do you lose when you give up God or “reality” or even “I”? The moon is still just as bright; you can still predict that a falling glass will break, and you can still act to catch it; you can still feel compassion for the suffering of others. Science and work and morality remain intact. Go back to your backgammon game after your skeptical crisis, Hume wrote, and it will be exactly the same game. […]
That sure sounded like Buddhist philosophy to me—except, of course, that Hume couldn’t have known anything about Buddhist philosophy.
Or could he have?
{ The Atlantic | Continue reading }
related { Neuroscience backs up the Buddhist belief that “the self” isn’t constant, but ever-changing }
photo { Rodney Smith }
ideas | September 22nd, 2015 2:41 pm

Today, of course everybody knows that “Hardball,” “Rivera Live” and similar shows are nothing but a steady stream of guesses about the future. The Sunday morning talk shows are pure speculation. They have to be. Everybody knows there’s no news on Sunday.
But television is entertainment. Let’s look at the so-called serious media. For example, here is The New York Times for March 6, the day Dick Farson told me I was giving this talk. The column one story for that day concerns Bush’s tariffs on imported steel. Now we read: Mr. Bush’s action “is likely to send the price of steel up sharply, perhaps as much as ten percent…” American consumers “will ultimately bear” higher prices. America’s allies “would almost certainly challenge” the decision. Their legal case “could take years to litigate in Geneva, is likely to hinge” on thus and such.
Also note the vague and hidden speculation. The Allies’ challenge would be “setting the stage for a major trade fight with many of the same countries Mr. Bush is trying to hold together in the fractious coalition against terrorism.” In other words, the story speculates that tariffs may rebound against the fight against terrorism.
By now, under the Faludi Standard I have firmly established that media are hopelessly riddled with speculation, and we can go on to consider its ramifications.
You may read this tariff story and think, what’s the big deal? The story’s not bad. Isn’t it reasonable to talk about effects of current events in this way? I answer, absolutely not. Such speculation is a complete waste of time. It’s useless. It’s bullshit on the front page of the Times.
The reason why it is useless, of course, is that nobody knows what the future holds.
Do we all agree that nobody knows what the future holds? Or do I have to prove it to you? I ask this because there are some well-studied media effects which suggest that simply appearing in media provides credibility. There was a well-known series of excellent studies by Stanford researchers that have shown, for example, that children take media literally. If you show them a bag of popcorn on a television set and ask them what will happen if you turn the TV upside down, the children say the popcorn will fall out of the bag. This result would be amusing if it were confined to children. But the studies show that no one is exempt. All human beings are subject to this media effect, including those of us who think we are self-aware and hip and knowledgeable.
Media carries with it a credibility that is totally undeserved. You have all experienced this, in what I call the Murray Gell-Mann Amnesia effect. […]
Briefly stated, the Gell-Mann Amnesia effect is as follows. You open the newspaper to an article on some subject you know well. In Murray’s case, physics. In mine, show business. You read the article and see the journalist has absolutely no understanding of either the facts or the issues. Often, the article is so wrong it actually presents the story backward—reversing cause and effect. I call these the “wet streets cause rain” stories. Paper’s full of them.
In any case, you read with exasperation or amusement the multiple errors in a story, and then turn the page to national or international affairs, and read as if the rest of the newspaper was somehow more accurate about Palestine than the baloney you just read. You turn the page, and forget what you know.
That is the Gell-Mann Amnesia effect. I’d point out it does not operate in other arenas of life. In ordinary life, if somebody consistently exaggerates or lies to you, you soon discount everything they say. In court, there is the legal doctrine of falsus in uno, falsus in omnibus, which means untruthful in one part, untruthful in all. But when it comes to the media, we believe against evidence that it is probably worth our time to read other parts of the paper.
{ Michael Crichton | Continue reading }
ideas, media | September 16th, 2015 2:16 pm

State-of-the-art forensic technology from South Africa has been used to try and unravel the mystery of what was smoked in tobacco pipes found in the Stratford-upon-Avon garden of William Shakespeare.
Residue from clay tobacco pipes more than 400 years old from the playwright’s garden was analysed. […] Results of this study (including 24 pipe fragments) indicated cannabis in eight samples, nicotine in at least one sample, and in two samples definite evidence for Peruvian cocaine from coca leaves.
{ The Independent | Continue reading }
photos { 1 | John K. }
books, drugs, flashback, smoking | August 10th, 2015 4:09 pm

Around 1930, the director of an evening newspaper had hired Georges Simenon as an advertising attraction. He’d had a cage constructed in the hall of his newspaper where Simenon, under the eyes of the public, was to write a serial, non-stop. But on the eve of the big day, the newspaper went bankrupt. Simenon wrote the book in his room.
{ Paris Match | Continue reading }
In 1927 the publisher of Paris-Soir proposed to place Simenon in a glass cage, where he would spend three days and three nights writing a novel in public.
{ NY Times | Continue reading }
photo { Mark Heithoff }
art, books, flashback | August 7th, 2015 3:55 pm

Dogs can infer the name of an object and have been shown to learn the names of over 1,000 objects. Dogs can follow the human pointing gesture; even nine week old puppies can follow a basic human pointing gesture without being taught.
New Guinea singing dogs, half-wild proto-dogs endemic to the remote alpine regions of New Guinea, as well as dingoes in the remote outback of Australia, are also capable of this.
These examples demonstrate an ability to read human gestures that arose early in domestication and did not require human selection. “Humans did not develop dogs, we only fine-tuned them down the road.”
Like the chimpanzee, the bonobo is a close genetic cousin of humans. Unlike the chimpanzee, bonobos are not aggressive and do not participate in lethal intergroup aggression or kill within their own group. The most distinctive features of a bonobo are its cranium, which is 15% smaller than a chimpanzee’s, and its less aggressive and more playful behavior. Dogs mirror these differences relative to wild wolves: a dog’s cranium is 15% smaller than an equally heavy wolf’s, and the dog is less aggressive and more playful. The guinea pig’s cranium is 13% smaller than that of its wild cousin, the cavy, and domestic fowl show a similar reduction relative to their wild cousins. Possession of a smaller cranium for holding a smaller brain is a telltale sign of domestication. Bonobos appear to have domesticated themselves.
In the “farm fox” experiment, humans selectively bred foxes against aggression which caused a domestication syndrome. The foxes were not selectively bred for smaller craniums and teeth, floppy ears, or skills at using human gestures but these traits were demonstrated in the friendly foxes.
Natural selection favors those that are the most successful at reproducing, not the most aggressive. Selection against aggression made possible the ability to cooperate and communicate among foxes, dogs and bonobos. Perhaps it did the same thing for humans.
{ Wikipedia | Continue reading }
animals, flashback, theory | August 6th, 2015 11:00 am

Hospitality is always a matter of urgency, always a question of speeds. The unexpected guests arrive and there is always a rush of activity: a hurried welcoming at the door, a quick cleaning up, a surreptitious rearranging or putting back into order, a preparing of food and drink. But even when the guest is expected, has been expected for a long time, there is a sense of urgency. The guests arrive — always too early or too late, even if they are ‘on time.’ Coats are taken; tours are given of the immaculate, impossibly ordered home; drinks are served, food presented. For there to be a place for hospitality, for hospitality to take (the) place, the host must hurry.
{ Sean Gaston | via Austerity Kitchen/TNI | Continue reading }
food, drinks, restaurants, household, ideas | August 6th, 2015 10:28 am

By definition, exponential growth means the thing that comes next will be equal in importance to everything that came before. […]
[T]his exponential growth has given us terrible habits. One of them is to discount the present.
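For the simplest case, steady doubling, the opening claim is exact arithmetic: each new step adds one more than the sum of everything that came before it.

```python
# Steady doubling: 1 + 2 + 4 + ... + 2**(n-1) == 2**n - 1, so each new
# term equals the sum of all previous terms, plus one.
history = [2 ** i for i in range(10)]
for n in range(1, 10):
    assert history[n] == sum(history[:n]) + 1
print(sum(history[:9]), history[9])  # 511 so far, 512 in the next step alone
```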
{ Idle Worlds | Continue reading }
ideas, technology | July 22nd, 2015 2:09 am

In 1908, an asteroid measuring perhaps 90-190 meters across struck Siberia, damaging over 2,000 square kilometers of Russian forest – an area that measures larger than New York’s five boroughs. Scientists estimate that the energy of that explosion was about 1,000 times that of the atomic bomb the U.S. dropped on Hiroshima in 1945.
This is far from the only close call that humans have had with asteroids. In 2004, an asteroid big enough to have its own small moon narrowly missed the planet. In 2013, an asteroid struck the Russia countryside with many times the force of the Hiroshima bomb, and was widely captured on video.
And of course, it was an asteroid, smashing into the Earth with the force of more than a billion Hiroshima bombs, which nixed the dinosaurs and allowed humans to take over the Earth in the first place. [Previously: The event appears to have hit all continents at the same time | more] […]
The probability that you’ll die from an asteroid may be surprisingly large – about the same probability as dying from a plane crash, according to research.
{ Washington Post | Continue reading }
eschatology | July 15th, 2015 7:56 am

Einstein wondered what would happen if the Sun were to suddenly explode. Since the Sun is so far away that it takes light eight minutes to travel to Earth, we wouldn’t know about the explosion straight away. For eight glorious minutes we’d be completely oblivious to the terrible thing that was about to happen.
But what about gravity? The Earth moves in an ellipse around the Sun, due to the Sun’s gravity. If the Sun wasn’t there, it would move off in a straight line. Einstein’s puzzle was when that would happen: straight away, or after eight minutes? According to Newton’s theory, the Earth should know immediately that the Sun had disappeared. But Einstein said that couldn’t be right. Because, according to him, nothing can travel faster than the speed of light — not even the effects of gravity. […]
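The eight-minute figure follows directly from the Earth–Sun distance and the speed of light:

```python
AU_METERS = 1.495978707e11  # mean Earth-Sun distance (one astronomical unit)
C = 299_792_458             # speed of light in meters per second
minutes = AU_METERS / C / 60
print(round(minutes, 1))    # about 8.3 minutes
```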
Before Einstein people thought of space as a stage on which the laws of physics play out. We could throw in some stars or some planets and they would move around on this stage.
Einstein realised that space isn’t as passive as that. It is dynamic and it responds to what’s happening within it. If you put something heavy in space — let’s say a planet like Earth — then space around it gives a little. The presence of the planet causes a small dent in space (and in fact, in time as well). When something else moves close to the planet — say the Moon — it feels this dent in space and rolls around the planet like a marble rolling in a bowl. This is what we call gravity. […] Stars and planets move, causing space to bend in their wake, causing other stars and planets to move, causing space to bend in their wake. And so on. This is Einstein’s great insight. Gravity is the manifestation of the curvature of space and time.
{ Plus Magazine | Part One | Part Two }
Physics, space, time | June 6th, 2015 12:54 pm

Two pennies can be considered the same — both are pennies, just as two elephants can be considered the same, as both are elephants. Despite the vast difference between pennies and elephants, we easily notice the common relation of sameness that holds for both pairs. Analogical ability — the ability to see common relations between objects, events or ideas — is a key skill that underlies human intelligence and differentiates humans from other apes.
While there is considerable evidence that preschoolers can learn abstract relations, it remains an open question whether infants can as well. In a new Northwestern University study, researchers found that infants are capable of learning the abstract relations of same and different after only a few examples.
“This suggests that a skill key to human intelligence is present very early in human development, and that language skills are not necessary for learning abstract relations,” said lead author Alissa Ferry, who conducted the research at Northwestern.
{ Lunatic Laboratories | Continue reading }
Linguistics, psychology | May 26th, 2015 11:57 am

What happens to people when they think they’re invisible?
Using a 3D virtual reality headset, neuroscientists at the Karolinska Institute in Stockholm gave participants the sensation that they were invisible, and then examined the psychological effects of apparent invisibility. […] “Having an invisible body seems to have a stress-reducing effect when experiencing socially challenging situations.” […]
“Follow-up studies should also investigate whether the feeling of invisibility affects moral decision-making, to ensure that future invisibility cloaking does not make us lose our sense of right and wrong, which Plato asserted over two millennia ago,” said the report’s co-author, Henrik Ehrsson. […]
In Book II of Plato’s Republic, one of Socrates’s interlocutors tells a story of a shepherd, an ancestor of the ancient Lydian king Gyges, who finds a magic ring that makes the wearer invisible. The power quickly corrupts him, and he becomes a tyrant.
The premise behind the story of the Ring of Gyges, which inspired HG Wells’s seminal 1897 science fiction novel, The Invisible Man, is that we behave morally so that we can be seen doing so.
{ CS Monitor | Continue reading }
photo { Ren Hang }
ideas, neurosciences | April 24th, 2015 1:59 pm

This paper argues that there are at least five reasons why the claim that the Bible is to be taken literally defies logic or otherwise makes no sense, and why literalists are in no position to claim that they have the only correct view of biblical teachings.
First, many words are imprecise and therefore require interpretation, especially to fill in gaps between general words and their application to specific situations. Second, if you are reading an English version of the Bible you are already dealing with the interpretations of the translator since the earliest Bibles were written in other languages. Third, biblical rules have exceptions, and those exceptions are often not explicitly set forth. Fourth, many of the Bible’s stories defy logic and our experiences of the world. Fifth, there are sometimes two contrary versions of the same event, so if we take one literally then we cannot take the second one literally. In each of these five cases, there is no literal reading to be found.
Furthermore, this paper sets forth three additional reasons why such a literalist claim probably should not be made even if it did not defy logic to make such a claim. These include The Scientific Argument: the Bible contradicts modern science; The Historical Argument: the Bible is historically inaccurate; and The Moral Argument: the Bible violates contemporary moral standards.
{ Open Journal of Philosophy | PDF }
photo { Roger Mimick }
Linguistics, flashback | April 22nd, 2015 11:00 am

Movies are, for the most part, made up of short runs of continuous action, called shots, spliced together with cuts. With a cut, a filmmaker can instantaneously replace most of what is available in your visual field with completely different stuff. This is something that never happened in the 3.5 billion years or so that it took our visual systems to develop. You might think, then, that cutting might cause something of a disturbance when it first appeared. And yet nothing in contemporary reports suggests that it did. […]
What if we could go back in time and collect the reactions of naïve viewers on their very first experience with film editing?
It turns out that we can, sort of. There are a decent number of people on the planet who still don’t have TVs, and the psychologists Sermin Ildirar and Stephan Schwan have capitalised on their existence to ask how first-time viewers experience cuts. […] There was no evidence that the viewers found cuts in the films to be shocking or incomprehensible. […]
I think the explanation is that, although we don’t think of our visual experience as being chopped up like a Paul Greengrass fight sequence, actually it is.
Simply put, visual perception is much jerkier than we realise. First, we blink. Blinks happen every couple of seconds, and when they do we are blind for a couple of tenths of a second. Second, we move our eyes. Want to have a little fun? Take a close-up selfie video of your eyeball while you watch a minute’s worth of a movie on your computer or TV. You’ll see your eyeball jerking around two or three times every second.
{ Aeon | Continue reading }
Against those who defined Italian neo-realism by its social content, Bazin put forward the fundamental requirement of formal aesthetic criteria. According to him, it was a matter of a new form of reality, said to be dispersive, elliptical, errant or wavering, working in blocs, with deliberately weak connections and floating events. The real was no longer represented or reproduced but “aimed at.” Instead of representing an already deciphered real, neo-realism aimed at an always ambiguous, to be deciphered, real; this is why the sequence shot tended to replace the montage of representations. […]
[I]n Umberto D, De Sica constructs the famous sequence quoted as an example by Bazin: the young maid going into the kitchen in the morning, making a series of mechanical, weary gestures, cleaning a bit, driving the ants away from a water fountain, picking up the coffee grinder, stretching out her foot to close the door with her toe. And her eyes meet her pregnant woman’s belly, and it is as though all the misery in the world were going to be born. This is how, in an ordinary or everyday situation, in the course of a series of gestures, which are insignificant but all the more obedient to simple sensory-motor schemata, what has suddenly been brought about is a pure optical situation to which the little maid has no response or reaction. The eyes, the belly, that is what an encounter is … […] The Lonely Woman [Viaggio in Italia] follows a female tourist struck to the core by the simple unfolding of images or visual cliches in which she discovers something unbearable, beyond the limit of what she can personally bear. This is a cinema of the seer and no longer of the agent.
What defines neo-realism is this build-up of purely optical situations (and sound ones, although there was no synchronized sound at the start of neo-realism), which are fundamentally distinct from the sensory-motor situations of the action-image in the old realism. […]
It is clear from the outset that cinema had a special relationship with belief. […] The modern fact is that we no longer believe in this world. We do not even believe in the events which happen to us, love, death, as if they only half concerned us. It is not we who make cinema; it is the world which looks to us like a bad film. […] The link between man and the world is broken. Henceforth, this link must become an object of belief: it is the impossible which can only be restored within a faith. Belief is no longer addressed to a different or transformed world. Man is in the world as if in a pure optical and sound situation.
{ Gilles Deleuze, Cinema 2, The Time-Image, 1985 | PDF, 17.2 MB }
deleuze, eyes, showbiz | April 20th, 2015 3:55 am

Aircraft are an interesting set of examples because they’re so well studied and corrected. We don’t spend time correcting hospital mistakes with nearly the speed and detail we do aircraft accidents, for example.
It used to be that airliners broke up in the sky because of small cracks in the window frames. So we fixed that. It used to be that aircraft crashed because of outward opening doors. So we fixed that. Aircraft used to fall out of the sky from urine corrosion, so we fixed that with encapsulated plastic lavatories. […] And so we add more rules, like requiring two people in the cockpit from now on. Who knows what the mental capacity is of the flight attendant that’s now allowed in there with one pilot, or what their motives are. At some point, if we wait long enough, a flight attendant is going to take over an airplane having only to incapacitate one, not two, pilots. And so we’ll add more rules about the type of flight attendant allowed in the cockpit and on and on.
There’s a wonderful story of the five whys.
The Lincoln Memorial stonework was being damaged. Why? By cleaning spray eroding it. Why? Because it’s used to clean bird poop. So they tried killing the birds. Didn’t work. Why are the birds there? To eat insects. Let’s kill the insects! Didn’t work. Why are the insects there? Because the lights are on after dusk. So let’s just turn the lights off. That works.
{ Steve Coast | Continue reading }
economics, ideas, incidents | April 1st, 2015 11:12 am

It’s hard to read the old-fashioned way, slowly and deliberately. Few of us have the patience, the concentration, or the time. When we do read, we skim, trying to get a quick “take” on the topics of the day, often conveniently served up as prepackaged excerpts by our modern media machine. We flit from one thing to the next, never pausing to think about what we’ve just read, because in our media-saturated, technology-obsessed age we just don’t have time. Worse, our bad reading habits are symptomatic of a deeper malaise. Real learning, real knowledge, and real culture have been supplanted by the shallow, utilitarian instrumentalism of modern life. The evidence is mounting. Humanities departments are losing students to the sciences and other more useful majors, where they are stuffed with facts and outfitted with skills, better to serve the state as productive citizens; our cultural models are the average heroes of a popular culture. Our culture is in decline. And we read only the headlines.
That may sound like the latest jeremiad in The New Criterion or The New Republic, but it’s actually a paraphrase of Friedrich Nietzsche’s preface to a series of lectures he delivered in the winter of 1872. […]
Nietzsche saw this image of modern print culture embodied in modern journalism’s endless pursuit of the news. In the face of the modern media machine, he longed for timelessness, but one not simply stripped of its time and place. Instead, it was an ethos of active resistance to the “idolatrous” need for the new, the latest headline, the latest commentary, the latest feuilleton. It was intended to enlist those few who were not, as he put it in the Basel lectures, “caught up in the dizzying haste of our hurtling era” and dependent on its short-lived pleasures. It was a call for calm readers.
{ The Hedgehog Review | Continue reading }
photo { Ana Cecilia Alvarez }
media, nietzsche | March 30th, 2015 4:42 am

15 years ago, the neurosciences defined the main function of brains in terms of processing input to compute output: “brain function is ultimately best understood in terms of input/output transformations and how they are produced” wrote Mike Mauk in 2000.
Since then, a lot of things have been discovered that make this stimulus-response concept untenable and potentially based largely on laboratory artifacts.
For instance, it was discovered that the likely ancestral state of behavioral organization is one of probing the environment with ongoing, variable actions first and evaluating sensory feedback later (i.e., the inverse of stimulus response). […]
In humans, functional magnetic resonance imaging (fMRI) studies over the last decade and a half revealed that the human brain is far from passively waiting for stimuli, but rather constantly produces ongoing, variable activity, and just shifts this activity over to other networks when we move from rest to task or switch between tasks.
{ Björn Brembs | Continue reading }
neurosciences, theory | March 27th, 2015 12:30 pm

Anthropodermic bibliopegy is the practice of binding books in human skin. Though extremely uncommon in modern times, the technique dates back to at least the 17th century. The practice is inextricably connected with the practice of tanning human skin, often done in certain circumstances after a corpse has been dissected.
Surviving historical examples of this technique include anatomy texts bound with the skin of dissected cadavers, volumes created as a bequest and bound with the skin of the testator, and copies of judicial proceedings bound in the skin of the murderer convicted in those proceedings, such as in the case of John Horwood in 1821 and the Red Barn Murder in 1828.
{ Wikipedia | Continue reading }
books, weirdos | March 23rd, 2015 4:38 pm
Doctor Tsun arrives at the house of Doctor Albert, who is deeply excited to have met a descendant of Ts’ui Pên. Doctor Albert reveals that he has himself been engaged in a longtime study of Ts’ui Pên’s novel. Albert explains excitedly that at one stroke he has solved both mysteries—the chaotic and jumbled nature of Ts’ui Pên’s unfinished book and the mystery of his lost labyrinth. Albert’s solution is that they are one and the same: the book is the labyrinth.
{ Plot summary of The Garden of Forking Paths/Wikipedia | Continue reading }
books | March 20th, 2015 1:20 pm

Many people spontaneously use the word (or sound) “Um” in conversation, a phenomenon which has prompted a considerable volume of academic attention. A question arises though, can someone be induced to say “Um” by chemical means – say with the use of a powerful anaesthetic? Like, for example Ketamine? […]
[V]olunteers who were given “low doses” and “high doses” of Ketamine tended to use the words “um” and “uh” significantly more than those who received a placebo only.
{ Improbable | Continue reading }
Linguistics, drugs | March 12th, 2015 3:07 pm

Most people who describe themselves as demisexual say they only rarely feel desire, and only in the context of a close relationship. Gray-asexuals (or gray-aces) roam the gray area between absolute asexuality and a more typical level of interest. […]
“Every single asexual I’ve met embraces fluidity—I might be gray or asexual or demisexual,” says Claudia, a 24-year-old student from Las Vegas. “Us aces are like: whatevs.”
{ Wired | Continue reading }
photo { Nate Walton }
Linguistics, experience, sex-oriented | March 10th, 2015 1:05 pm

Psychology journal bans P values
P values are widely used in science to test null hypotheses. For example, in a medical study looking at smoking and cancer, the null hypothesis could be that there is no link between the two. The closer to zero the P value gets, the greater the chance the null hypothesis is false; many researchers accept findings as ‘significant’ if the P value comes in at less than 0.05. But P values are slippery, and sometimes, significant P values vanish when experiments and statistical analyses are repeated. […]
“We believe that the p < .05 bar is too easy to pass and sometimes serves as an excuse for lower quality research,”
{ Nature | Continue reading }
ideas, psychology | March 1st, 2015 3:34 pm

Lacan said that there was surely something ironic about Christ’s injunction to love thy neighbour as thyself – because actually, of course, people hate themselves. […]
The way we hate people depends on the way we love them and vice versa. According to psychoanalysis these contradictory feelings enter into everything we do. We are ambivalent, in Freud’s view, about anything and everything that matters to us; indeed, ambivalence is the way we recognise that someone or something has become significant to us. […]
We are never as good as we should be; and neither, it seems, are other people.
{ London Review of Books | Continue reading }
art { Gisèle Vienne, Teenage Hallucination, 2012 }
ideas, relationships | February 25th, 2015 3:42 pm

Every 250m years the sun, with its entourage of planets, completes a circuit of the Milky Way. Its journey around its home galaxy, though, is no stately peregrination. Rather, its orbit oscillates up and down through the galactic disc. It passes through that disc, the place where most of the galaxy’s matter is concentrated, once every 30m years or so.
This fact has long interested Michael Rampino of New York University. He speculates that it could explain the mass extinctions, such as that of the dinosaurs and many other species 66m years ago, which life on Earth undergoes from time to time. Palaeontologists recognise five such humongous events, during each of which up to 90% of species have disappeared. But the fossil record is also littered with smaller but still significant blips in the continuity of life.
Many hypotheses have been put forward to explain these extinctions (and the events may, of course, not all have the same explanation). The two that have most support are collisions between Earth and an asteroid or comet, and extended periods of massive volcanic activity. Dr Rampino observed some time ago that cometary collisions might be triggered by gravitational disruptions of the Oort cloud, a repository of comets in the outermost part of the solar system. That would send a rain of them into the part of space occupied by Earth. This has come to be known as the Shiva hypothesis, after the Hindu god of destruction. […]
In his latest paper, Dr Rampino speculates that the real culprit may be not stars, but dark matter—and that this might explain the volcanism as well.
{ The Economist | Continue reading }
Physics, eschatology, flashback | February 25th, 2015 3:43 am

The asteroid landed in the ocean and would have caused megatsunamis, for which evidence has been found in several locations in the Caribbean and eastern United States—marine sand in locations that were then inland, and vegetation debris and terrestrial rocks in marine sediments dated to the time of the impact. […]
The asteroid landed in a bed of gypsum (calcium sulfate), which would have produced a vast sulfur dioxide aerosol. This would have further reduced the sunlight reaching the Earth’s surface and then precipitated as acid rain, killing vegetation, plankton, and organisms that build shells from calcium carbonate (coccolithophores and molluscs). […]
The impact may also have produced acid rain, depending on what type of rock the asteroid struck. However, recent research suggests this effect was relatively minor, lasting for approximately 12 years. […]
Such an impact would have inhibited photosynthesis by creating a dust cloud that blocked sunlight for up to a year, and by injecting sulfuric acid aerosols into the stratosphere, which might have reduced sunlight reaching the Earth’s surface by 10–20%. It has been argued that it would take at least ten years for such aerosols to dissipate, which would account for the extinction of plants and phytoplankton, and of organisms dependent on them (including predatory animals as well as herbivores). […]
The event appears to have hit all continents at the same time. […]
The event eliminated a vast number of species. Based on marine fossils, it is estimated that 75% or more of all species were wiped out by the K–Pg extinction. In terrestrial ecosystems all animals weighing more than a kilo disappeared.
The most well-known victims are the non-avian dinosaurs. […]
The fact that the extinctions occur at the same time as the Chicxulub asteroid impact strongly supports the impact hypothesis of extinction. […]
The Chicxulub crater is more than 180 kilometres (110 mi) in diameter and 20 km (12 mi) in depth, making the feature one of the largest confirmed impact structures on Earth; the impacting bolide that formed the crater was at least 10 km (6 mi) in diameter. […] Researchers dated rock and ash samples from the impact to roughly 66 million years ago. […]
Some scientists maintain the extinction was caused or exacerbated by other factors, such as volcanic eruptions, climate change, or sea level change, separately or together.
{ The Cretaceous–Paleogene (K–Pg) extinction event | Chicxulub crater }
related { Plants survive better through mass extinctions than animals }
related { Rising Sea Levels Are Already Making Miami’s Floods Worse }
related { 12 ways researchers think human civilisation is most likely to end }
eschatology, flashback | February 18th, 2015 2:15 pm

Dr. Yalom, I would like a consultation. I’ve read your novel “When Nietzsche Wept,” and wonder if you’d be willing to see a fellow writer with a writing block.
No doubt Paul sought to pique my interest with his email. […] Ten days later Paul arrived for his appointment. […]
“I was in philosophy at Princeton writing my doctorate on the incompatibility between Nietzsche’s ideas on determinism and his espousal of self-transformation. But I couldn’t finish. I kept getting distracted by such things as Nietzsche’s extraordinary correspondence, especially by his letters to his friends and fellow writers like Strindberg.
“Gradually I lost interest altogether in his philosophy and valued him more as an artist. I came to regard Nietzsche as a poet with the most powerful voice in history, a voice so majestic that it eclipsed his ideas, and soon there was nothing for me to do but to switch departments and do my doctorate in literature rather than philosophy.
“The years went by,” he continued, “my research progressed well, but I simply could not write. Finally I arrived at the position that it was only through art that an artist could be illuminated, and I abandoned the dissertation project entirely and decided instead to write a novel on Nietzsche. But the writing block was neither fooled nor deterred by my changing projects. It remained as powerful and unmovable as a granite mountain. And so it has continued until this very day.”
I was stunned. Paul was an old man now. He must have begun working on his dissertation well over a half-century ago. […]
“Tell me more,” I said. “Your family? The people in your life?”
“No siblings. Married twice. Divorced twice. Mercifully short marriages. No children, thank God.”
This is getting very odd, I thought. So talkative at first, Paul now seems intent on giving me as little information as possible. What’s going on?
{ NY Times | Continue reading }
experience, nietzsche, psychology | February 16th, 2015 1:27 pm

Does morality depend on the time of the day? The study “The Morning Morality Effect: The Influence of Time of Day on Unethical Behavior” published in October of 2013 by Maryam Kouchaki and Isaac Smith suggested that people are more honest in the mornings, and that their ability to resist the temptation of lying and cheating wears off as the day progresses. […]
One question not addressed by Kouchaki and Smith was whether the propensity to become dishonest in the afternoons or evenings could be generalized to all subjects or whether the internal time in the subjects was also a factor.
All humans have an internal body clock — the circadian clock — which runs with a period of approximately 24 hours. The circadian clock controls a wide variety of physical and mental functions such as our body temperature, the release of hormones or our levels of alertness. The internal clock can vary between individuals, but external cues such as sunlight or the social constraints of our society force our internal clocks to be synchronized to a pre-defined external time which may be quite distinct from what our internal clock would choose if it were to “run free”. Free-running internal clocks of individuals can differ in terms of their period (for example 23.5 hours versus 24.4 hours) as well as the phases of when individuals would preferably engage in certain behaviors. […]
Some people like to go to bed early, wake up at 5 am or 6 am on their own even without an alarm clock and they experience peak levels of alertness and energy before noon. In contrast to such “larks”, there are “owls” among us who prefer to go to bed late at night, wake up at 11 am, experience their peak energy levels and alertness in the evening hours and like to stay up way past midnight. […]
The researchers Brian Gunia, Christopher Barnes and Sunita Sah therefore decided to replicate the Kouchaki and Smith study with one major modification: They not only assessed the propensity to cheat at different times of the day, they also measured the chronotypes of the study participants. Their recent paper “The Morality of Larks and Owls: Unethical Behavior Depends on Chronotype as Well as Time of Day” confirms the Kouchaki and Smith finding that the time of day influences honesty, but the observed effects differ among chronotypes. […]
[M]orning morality effect and the idea of “moral exhaustion” towards the end of the day cannot be generalized to all. In fact, evening people (”owls”) are more honest in the evenings. […]
Gunia and colleagues only included morning and evening people in their analysis and excluded the participants who reported an intermediate chronotype, i.e. not quite early morning “larks” and not true “owls”. This is a valid criticism because newer research on chronotypes has shown that there is a Gaussian distribution of chronotypes. Few of us are extreme larks or extreme owls, most of us lie on a continuum.
{ Fragments of Truth | Continue reading }
photo { Steven Meisel }
hormones, psychology, time | February 10th, 2015 2:24 pm

No Big Bang? Quantum equation predicts universe has no beginning
The widely accepted age of the universe, as estimated by general relativity, is 13.8 billion years. In the beginning, everything in existence is thought to have occupied a single infinitely dense point, or singularity. Only after this point began to expand in a “Big Bang” did the universe officially begin.
Although the Big Bang singularity arises directly and unavoidably from the mathematics of general relativity, some scientists see it as problematic because the math can explain only what happened immediately after—not at or before—the singularity.
{ Phys.org | Continue reading }
Hilbert imagined a hotel with an infinite number of rooms, all of which are occupied.
Suppose a new guest arrives and wishes to be accommodated in the hotel. Because the hotel has an infinite number of rooms, we can move any guest occupying any room n to room n+1 (the occupant of room 1 moves to room 2, room 2 to room 3, and so on), then fit the newcomer into room 1.
Now suppose an infinite number of new guests arrives: just move any occupant of room n to room 2n (room 1 to room 2, room 2 to room 4, room 3 to room 6, and so on), and all the odd-numbered rooms (which are countably infinite) will be free for the new guests.
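The two moves are just functions on room numbers. A small sketch (mine, not from the Wikipedia entry) checks them on a finite window of the infinite hotel:

```python
def one_new_guest(room: int) -> int:
    """Move the occupant of room n to room n+1, freeing room 1."""
    return room + 1

def infinitely_many_new_guests(room: int) -> int:
    """Move the occupant of room n to room 2n, freeing every odd room."""
    return 2 * room

rooms = range(1, 11)  # a finite window onto the infinite hotel

moved = [one_new_guest(n) for n in rooms]
assert 1 not in moved  # room 1 is now free for the newcomer

doubled = [infinitely_many_new_guests(n) for n in rooms]
assert all(r % 2 == 0 for r in doubled)  # all odd rooms are now free
```

In both cases every current guest still has a room of their own (the maps are one-to-one), which is why the hotel can absorb the newcomers without evicting anyone.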
{ Wikipedia | Continue reading }
ideas, space, theory | February 9th, 2015 1:11 pm

Serendipity, the notion that research in one area often leads to advances in another, has been a central idea in the economics of innovation and science and technology policy. Claims about serendipity, and the futility of planning research, were central to the argument in Vannevar Bush’s Science–The Endless Frontier often considered the blueprint of post-World War II U.S. science policy. […]
The idea of serendipity has been influential not only in practice, but also in theory. Much of the economic work on the governance of research starts from the notion that basic research has economically valuable but unanticipated outcomes. Economic historians, most notably Nathan Rosenberg, have emphasized the uncertain nature of new innovations, and that many technologies (for example, the laser) have had important, but unanticipated, uses and markets. Like Vannevar Bush, prominent economists studying science policy have argued that research cannot and should not be targeted at specific goals but instead guided by the best scientific opportunities, as have influential philosophers of science. […]
[T]here is surprisingly little large-sample evidence on the magnitude of serendipity. This has contributed to perennial debate about the benefits of untargeted or fundamental research, relative to those from basic (or applied) research targeted at specific goals. […] [C]laims about serendipity have been important for defusing calls (from Congress and taxpayers) to shift funding from fundamental research to that targeted at specific outcomes. […]
I provide evidence on the serendipity hypothesis as it has typically been articulated in the context of NIH research: that progress against specific diseases often results from unplanned research, or unexpectedly from research oriented towards different diseases. […] If the magnitudes of serendipity reported here are real, this would pose real challenges for medical research funding. If disease is not the right organizing category for NIH research, then what might be? Is it possible to mobilize taxpayer and interest group support for science that cuts across diseases, or is the attachment of disease categories, however fictitious, required? Even more fundamentally, serendipity makes it hard to fine-tune policy to stimulate research areas that taxpayers care about (or even limit the growth of areas where there is too much innovation), and assess whether a funding agency is allocating its funds reasonably given what its patrons desire.
{ Bhaven N. Sampat/SSRN | Continue reading }
image { Radiograph of Brånemark’s initial rabbit specimen. While studying bone cells in a rabbit femur using a titanium chamber, Brånemark was unable to remove it from bone. His realization that bone would adhere to titanium led to the concept of osseointegration and the development of modern dental implants. | more }
ideas | February 9th, 2015 11:53 am

Facebook will soon be able to ID you in any photo
The intention is not to invade the privacy of Facebook’s more than 1.3 billion active users, insists Yann LeCun, a computer scientist at New York University in New York City who directs Facebook’s artificial intelligence research, but rather to protect it. Once DeepFace identifies your face in one of the 400 million new photos that users upload every day, “you will get an alert from Facebook telling you that you appear in the picture,” he explains. “You can then choose to blur out your face from the picture to protect your privacy.” Many people, however, are troubled by the prospect of being identified at all—especially in strangers’ photographs. Facebook is already using the system, although its face-tagging system only reveals to you the identities of your “friends.”
{ Science | Continue reading }
related { Bust detection algorithm }
photo { Rachel Roze }
Linguistics, faces, social networks | February 7th, 2015 3:03 pm

The idea that unconscious thought is sometimes more powerful than conscious thought is attractive, and echoes ideas popularized by books such as writer Malcolm Gladwell’s best-selling Blink.
But within the scientific community, ‘unconscious-thought advantage’ (UTA) has been controversial. Now Dutch psychologists have carried out the most rigorous study yet of UTA—and find no evidence for it. […] The report adds to broader concerns about the quality of psychology studies and to an ongoing controversy about the extent to which unconscious thought in general can influence behaviour.
{ Scientific American | Continue reading }
art { Bronzino, Portrait of Lucrezia Panciatichi, 1545 }
controversy, psychology | January 28th, 2015 2:15 pm

The philosopher Socrates remains, as he was in his lifetime (469–399 B.C.E.), an enigma, an inscrutable individual who, despite having written nothing, is considered one of the handful of philosophers who forever changed how philosophy itself was to be conceived. […]
The extant sources agree that Socrates was profoundly ugly, resembling a satyr more than a man—and resembling not at all the statues that turned up later in ancient times and now grace Internet sites and the covers of books. He had wide-set, bulging eyes that darted sideways and enabled him, like a crab, to see not only what was straight ahead, but what was beside him as well; a flat, upturned nose with flaring nostrils; and large fleshy lips like an ass. Socrates let his hair grow long, Spartan-style (even while Athens and Sparta were at war), and went about barefoot and unwashed, carrying a stick and looking arrogant. […] Something was peculiar about his gait as well, sometimes described as a swagger so intimidating that enemy soldiers kept their distance. He was impervious to the effects of alcohol and cold, but this made him an object of suspicion to his fellow soldiers on campaign. […]
What seemed strange about Socrates is that he neither labored to earn a living, nor participated voluntarily in affairs of state. Rather, he embraced poverty and, although youths of the city kept company with him and imitated him, Socrates adamantly insisted he was not a teacher and refused all his life to take money for what he did. […] Because Socrates was no transmitter of information that others were passively to receive, he resists the comparison to teachers. Rather, he helped others recognize on their own what is real, true, and good—a new, and thus suspect, approach to education. He was known for confusing, stinging and stunning his conversation partners into the unpleasant experience of realizing their own ignorance, a state sometimes superseded by genuine intellectual curiosity. […] Socrates was usually to be found in the marketplace and other public areas, conversing with a variety of different people—young and old, male and female, slave and free, rich and poor—that is, with virtually anyone he could persuade to join with him in his question-and-answer mode of probing serious matters. […]
It did not help matters that Socrates seemed to have a higher opinion of women than most of his companions had, speaking of “men and women,” “priests and priestesses,” and naming foreign women as his teachers: Socrates claimed to have learned rhetoric from Aspasia of Miletus, the lover of Pericles; and to have learned erotics from the priestess Diotima of Mantinea. […]
Athenian citizen males of the upper social classes did not marry until they were at least thirty, and Athenian females were poorly educated and kept sequestered until puberty, when they were given in marriage by their fathers. Thus the socialization and education of males often involved a relationship for which the English word ‘pederasty’ (though often used) is misleading, in which a youth approaching manhood, fifteen to seventeen, became the beloved of a male lover a few years older, under whose tutelage and through whose influence and gifts, the younger man would be guided and improved. It was assumed among Athenians that mature men would find youths sexually attractive, and such relationships were conventionally viewed as beneficial to both parties by family and friends alike. A degree of hypocrisy (or denial), however, was implied by the arrangement: “officially” it did not involve sexual relations between the lovers and, if it did, then the beloved was not supposed to derive pleasure from the act—but ancient evidence (comedies, vase paintings, et al.) shows that both restrictions were often violated. What was odd about Socrates is that, although he was no exception to the rule of finding youths attractive, he refused the physical advances of even his favorite.
{ Stanford Encyclopedia of Philosophy | Continue reading }
flashback, ideas | January 25th, 2015 5:54 am

The weather affects not only our mood but also our voice. An international research team has analysed the influence of humidity on the evolution of languages.
Their study has revealed that languages with a wide range of tone pitches are more prevalent in regions with high humidity levels. In contrast, languages with simpler tone pitches are mainly found in drier regions. This is explained by the fact that the vocal folds require a humid environment to produce the right tone.
Tone pitch is a key element of communication in all languages, but more so in some than others. German or English, for example, remain comprehensible even if a robot intones every word evenly. In Mandarin Chinese, however, tone pitch can completely change the meaning of a word.
{ EurekAlert | Continue reading }
Linguistics, noise and signals, water | January 25th, 2015 5:53 am

[TheJosh]
If I go every other day I will be at the gym 4-5 times a week, is that over training?
I typically work out for 60-90 minutes, I push my self and raise the weight each week.
[…]
steviekm3
That makes no sense. There are only 7 days in a week. If you go every other day that is 3.5 times a week.
TheJosh
Monday, Wednesday, Friday, Sunday. That is 4 days.
How do you go 3.5 times? Do a half workout or something? lol
Justin-27
7x in 2 weeks = 3.5 times a week, genius.
And yeah, 3x a week, full body workouts are good.
TheJosh
I never said anything about going exactly 7 times, like I said, if I go every other day, that is 4 DAYS A WEEK. How hard is that to comprehend?
Week 1 - Sunday, Tuesday, Thursday, Saturday
Week 2 - Monday, Wednesday, Friday, Sunday.
8 DAYS IN 2 WEEKS
In your terms,
8x in 2 weeks = 4 times a week, genius.
All Muscle and No Brains? lol
steviekm3
You double counted Sunday - that is 2 weeks plus 1 day.
Did you fail grade 2 math ?
PLUS your old post said 4 or 5 times a week. Now you just neglect to mention the 5.
Grow up and admit when you are wrong. Believe me you will get a a lot further in life this way.
TheJosh
Are you retarded?
Maybe you should look at a calander, I didn’t double count sunday, my two weeks started and ended on sunday, exactly 14 days.
What don’t you understand?
EDIT - Here is a Calender, I made little dots for each day so you could comprehend.
Justin-27
Dude THAT IS 15 DAYS!!!!!! You can’t have a week go Sun-Sat, then Sun-Sun. Look at the damn pic you posted, count the days what do they equal?!?!?! FIFTEEN!
I was right, 3.5 x a week, and so was the first guy to post it, and you my bright friend are el wrongo.
[…]
TheJosh
There is 7 days in a week, if you workout every other day, you work out 4 days a week, how hard is that to ****ing comprehend?!
Ill do it out in 4 weeks for you, maybe it will make more sense?
Week 1 - Sunday, Tuesday, Thursday, Saturday
Week 2 - Monday, Wednesday, Friday, Sunday
Week 3 - Tuesday, Thursday, Saturday, Monday
Week 4 - Wednesday, Friday, Sunday, Tuesday
Week 5 - Thursday, Saturday, Monday, Wednesday
Week 6 - Friday, Sunday, Tuesday, Thursday
No matter how you look at it, if you workout every other day, you work out 4 times a week.
[…]
A week is sunday-sunday.
I think you just don’t know how to count, it’s alright, I won’t tell anyone. lol
Sunday-Saturday is only 6 days, do you have 6 days weeks where you live?
Justin-27
Yes, you workout 4x the first week, then 3 the next.
I’m right, you’re a effing moron.
[…]
Sun-Sat is only 6 days?!
Sunday ONE
Monday TWO
Tuesday THREE
Wednesday FOUR
Thursday FIVE
Friday SIX
Saturday SEVEN
Arizona public schools=FAIL
{ Bodybuilding.com | Continue reading }
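The thread's arithmetic settles quickly in a few lines (a sketch; day 0 stands for an arbitrary starting Sunday):

```python
# Working out "every other day" over a fixed 14-day window
# (two Sunday-to-Saturday weeks).
workout_days = [d for d in range(14) if d % 2 == 0]
print(len(workout_days))       # 7 workouts in 14 days
print(len(workout_days) / 2)   # 3.5 per week on average

# Counting the same Sunday at both ends, as in the thread, spans 15 days:
print(len([d for d in range(15) if d % 2 == 0]))   # 8 -- the double count
```

Any 14-day window contains exactly 7 every-other-day sessions, so the long-run average is 3.5 per week; the "8 in 2 weeks" count only appears when a 15-day span is treated as two weeks.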
haha, social networks, time | January 6th, 2015 3:39 pm

DNA is generally regarded as the basic building block of life itself. In the most fundamental sense, DNA is nothing more than a chemical compound, albeit a very complex and peculiar one. DNA is an information-carrying molecule. The specific sequence of base pairs contained in a DNA molecule carries with it genetic information, and encodes for the creation of particular proteins. When taken as a whole, the DNA contained in a single human cell is a complete blueprint and instruction manual for the creation of that human being.
In this article we discuss myriad current and developing ways in which people are utilizing DNA to store or convey information of all kinds. For example, researchers have encoded the contents of a whole book in DNA, demonstrating the potential of DNA as a way of storing and transmitting information. In a different vein, some artists have begun to create living organisms with altered DNA as works of art. Hence, DNA is a medium for the communication of ideas. Because of the ability of DNA to store and convey information, its regulation must necessarily raise concerns associated with the First Amendment’s prohibition against the abridgment of freedom of speech.
New and developing technologies, and the contemporary and future social practices they will engender, necessitate the renewal of an approach towards First Amendment coverage that takes into account the purposes and values incarnated in the Free Speech Clause of the Constitution.
{ Charleston School of Law | Continue reading }
photo { Bruce Davidson }
Linguistics, genes, law | January 2nd, 2015 6:55 am

New theories suggest the big bang was not the beginning, and that we may live in the past of a parallel universe.
[…]
Time’s arrow may in a sense move in two directions, although any observer can only see and experience one.
{ Scientific American | Continue reading }
photo { Tania Shcheglova and Roman Noven }
Physics, photogs, theory, time | December 9th, 2014 3:05 pm

When I started life Hegelianism was the basis of everything: it was in the air, found expression in magazine and newspaper articles, in novels and essays, in art, in histories, in sermons, and in conversation. A man unacquainted with Hegel had no right to speak: he who wished to know the truth studied Hegel. Everything rested on him; and suddenly forty years have gone by and there is nothing left of him, he is not even mentioned - as though he had never existed. And what is most remarkable is that, like pseudo-Christianity, Hegelianism fell not because anyone refuted it, but because it suddenly became evident that neither the one nor the other was needed by our learned, educated world.
{ Leo Tolstoy, What Then Must We Do?, 1886 | PDF }
ideas | November 21st, 2014 10:47 am

O’Brian oversees America’s master clock. It’s one of the most accurate clocks on the planet: an atomic clock that uses oscillations in the element cesium to count out 0.0000000000000001 second at a time. If the clock had been started 300 million years ago, before the age of dinosaurs began, it would still be keeping time — down to the second. […]
At the nearby University of Colorado Boulder is a clock even more precise than the one O’Brian watches over. […] This new clock can keep perfect time for 5 billion years. “It’s about the whole, entire age of the earth,” says Jun Ye, the scientist here at JILA who built this clock. […]
But this new clock has run into a big problem: This thing we call time doesn’t tick at the same rate everywhere in the universe. Or even on our planet.
Right now, on the top of Mount Everest, time is passing just a little bit faster than it is in Death Valley. That’s because the speed at which time passes depends on the strength of gravity. Einstein himself discovered this dependence as part of his theory of relativity, and it is a very real effect.
The relative nature of time isn’t just something seen in the extreme. If you take a clock off the floor, and hang it on the wall, Ye says, “the time will speed up by about one part in 10¹⁶.” […] Time itself is flowing more quickly on the wall than on the floor. These differences didn’t really matter until now. But this new clock is so sensitive, little changes in height throw it way off. Lift it just a couple of centimeters, Ye says, “and you will start to see that difference.” […]
The world’s current time is coordinated between atomic clocks all over the planet. But that can’t happen with the new one.
{ NPR | Continue reading }
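The quoted figures can be checked against the weak-field approximation from general relativity, where the fractional rate difference between two clocks separated by height h is roughly gh/c² (a back-of-the-envelope sketch, not the JILA group's actual analysis):

```python
# Weak-field gravitational time dilation: the fractional rate difference
# between two clocks separated by height h is approximately g*h / c^2.
g = 9.81      # m/s^2, surface gravity
c = 2.998e8   # m/s, speed of light

print(f"shift per meter of height: {g * 1.0 / c**2:.2e}")   # ~1.1e-16
print(f"shift for a 2 cm lift:     {g * 0.02 / c**2:.2e}")  # ~2.2e-18

# Sanity check on the cesium clock claim: a clock accurate to one part
# in 10^16, running for 300 million years, drifts by about one second.
seconds_per_year = 365.25 * 24 * 3600
drift = 1e-16 * 300e6 * seconds_per_year
print(f"drift over 300 Myr: {drift:.2f} s")                 # ~0.95 s
```

The per-meter shift of about one part in 10¹⁶ is why a wall-mounted clock runs measurably faster than one on the floor, and why a 2 cm lift is near the threshold of what the new clock can resolve.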
photo { Petra Collins }
Physics, time | November 12th, 2014 4:41 pm

Crimes such as bribery require the cooperation of two or more criminals for mutual gain. Instead of deterring these crimes, the state should disrupt them by creating distrust among criminals so they cannot cooperate. In a cooperative crime with two criminals, the state should offer amnesty and a bounty to the criminal who first secures punishment of the other criminal. When the bounty exceeds the bribe, a bribed official gains less from keeping the bribe than from confessing and receiving the bounty. Consequently the person who pays the bribe cannot trust the person who takes it. The game’s unique equilibrium is non-cooperative and bribes disappear.
{ Review of Law & Economics }
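The equilibrium logic can be sketched with illustrative numbers (the payoffs below are assumptions for exposition, not the paper's formal model):

```python
# Stylized payoffs for the bounty mechanism described above.
# The numbers are illustrative assumptions, not taken from the paper.
bribe, bounty, fine = 100, 150, 200   # the mechanism requires bounty > bribe

# The bribed official compares staying quiet with confessing first:
keep_quiet = bribe               # keep the bribe, say nothing
confess_first = bounty           # amnesty plus the bounty for reporting first
print(confess_first > keep_quiet)   # True: confessing dominates

# Anticipating betrayal, the briber compares bribing with abstaining:
pay_bribe = -bribe - fine        # bribe is paid, then the briber is punished
abstain = 0
print(pay_bribe < abstain)       # True: no bribe is offered
```

With confession dominant for the official, the briber cannot trust the official to stay quiet, so the unique equilibrium is the non-cooperative one in which no bribe is paid.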
economics, law, theory | November 12th, 2014 4:26 pm

Findings from two experiments suggest that priming the passage of time through the sound of a ticking clock influenced various aspects of women’s (but not men’s) reproductive timing. Moreover, consistent with recent research from the domain of life history theory, those effects depended on women’s childhood socioeconomic status (SES). The subtle sound of a ticking clock led low (but not high) SES women to reduce the age at which they sought to get married and have their first child (Study 1), as well as the priority they placed on the social status and long-term earning potential of potential romantic partners (Study 2).
{ Human Nature | Continue reading }
photo { Aaron McElroy }
relationships, time | September 19th, 2014 11:13 am

This paper considers when a firm’s freely chosen name can signal meaningful information about its quality, and examines a setting in which it does.
Plumbing firms with names beginning with an “A” or a number receive five times more service complaints, on average. In addition, firms use names beginning with an “A” or a number more often in larger markets, and those that do have higher prices.
These results reflect consumers’ search decisions and extend to online position auctions: plumbing firms that advertise on Google receive more complaints, which contradicts prior theoretical predictions but fits the setting considered here.
{ Ryan C. McDevitt | PDF }
Linguistics, marketing | September 10th, 2014 3:12 pm

There seems to be wide support for the idea that we are living in an “age of complexity,” which implies that the world has never been more intricate. This idea is based on the rapid pace of technological changes, and the vast amount of information that we are generating (the two are related). Yet consider that philosophers like Leibniz (17th century) and Diderot (18th century) were already complaining about information overload. The “horrible mass of books” they referred to may have represented only a tiny portion of what we know today, but much of what we know today will be equally insignificant to future generations.
In any event, the relative complexity of different eras is of little matter to the person who is simply struggling to cope with it in everyday life. So perhaps the right question is not “Is this era more complex?” but “Why are some people more able to manage complexity?” Although complexity is context-dependent, it is also determined by a person’s disposition. In particular, there are three key psychological qualities that enhance our ability to manage complexity:
1. […] higher levels of IQ enable people to learn and solve novel problems faster […]
2. […] individuals with higher EQ [emotional quotient] are less susceptible to stress and anxiety […]
3. […] People with higher CQ [curiosity quotient] are more inquisitive and open to new experiences […] they are generally more tolerant of ambiguity.
{ Harvard Business Review | Continue reading }
photo { Never before seen Corinne Day shots }
ideas, photogs | August 28th, 2014 12:46 am

Who will guard the guards?
In posing the famous question, the Roman poet Juvenal was suggesting that wives cannot be trusted, and keeping them under guard is not a solution—because the guards cannot be trusted either.
Half a millennium or so earlier, Plato in The Republic expressed a more optimistic view regarding the guardians or rulers of the city-state, namely that one should be able to trust them to behave properly; that it was absurd to suppose that they should require oversight.
{ Wikipedia | Continue reading }
flashback, ideas | August 28th, 2014 12:12 am

People loved for their beauty and cheerfulness are not loved as irreplaceable, yet people loved for “what their souls are made of” are. Or so literary romance implies; leading philosophical accounts, however, deny the distinction, holding that reasons for love either do not exist or do not include the beloved’s distinguishing features. […]
I defend a model of agency on which people can love each other for identities still being created, through a kind of mutual improvisation. […]
I draw another analogy to jazz, this time relating the attraction and concern constitutive of interpersonal love to the reciprocal appreciation and responsiveness of musicians who improvise together as partners. Such musicians recognize each other to be trying to express the same musical idea, even though the contents of their ideas are still being worked out.
{ PhilPapers | PDF }
ideas, relationships | July 21st, 2014 3:01 pm

Since 1990, the Gerontology Research Group has assumed the role of record keepers for the world’s supercentenarians, or persons older than 110. […]
When it comes to age forgery, Coles has seen it all. He recently received a claim from India of an individual who is supposedly 179—a feat that is almost certainly physically impossible. The deceit can be harder to spot, such as the time a man in Turkey tried to pass himself off as his deceased brother, who was ten years older. And in one particularly challenging case, the government of Bolivia issued false documents to a man who was 106, stating that he was 112.
These problems are well known among those who study the very old. “Ninety-eight percent of ages claimed over 115 are false,” says Thomas Perls, a professor of medicine and geriatrics at Boston Medical Center, and director of the New England Centenarian Study. Based on a research paper he published on the topic, Perls says that “There’s a total of ten different major reasons why people do this.”
Sometimes, the motivation for lying is monetary. In the U.S., for example, a handful of people inflated their ages in order to claim to be Civil War veterans, giving them access to pensions. […] In other cases, a government or group might want to demonstrate that theirs is a “superior race.”
{ Smithsonian | Continue reading }
health, time | July 21st, 2014 3:01 pm

An extensive literature addresses citizen ignorance, but very little research focuses on misperceptions. Can these false or unsubstantiated beliefs about politics be corrected? […] Results indicate that corrections frequently fail to reduce misperceptions among the targeted ideological group. We also document several instances of a ‘‘backfire effect’’ in which corrections actually increase misperceptions among the group in question.
{ Springer Science+Business Media | PDF }
ideas, psychology | July 16th, 2014 2:22 pm

The term “stress” had none of its contemporary connotations before the 1920s. It is a form of the Middle English destresse, derived via Old French from the Latin stringere, “to draw tight.” The word had long been in use in physics to refer to the internal distribution of a force exerted on a material body, resulting in strain. In the 1920s and 1930s, biological and psychological circles occasionally used the term to refer to a mental strain or to a harmful environmental agent that could cause illness.
{ Wikipedia | Continue reading }
The modern idea of stress began on a rooftop in Canada, with a handful of rats freezing in the winter wind.
This was 1936 and by that point the owner of the rats, an endocrinologist named Hans Selye, had become expert at making rats suffer for science.
“Almost universally these rats showed a particular set of signs,” Jackson says. “There would be changes particularly in the adrenal gland. So Selye began to suggest that subjecting an animal to prolonged stress led to tissue changes and physiological changes with the release of certain hormones, that would then cause disease and ultimately the death of the animal.”
And so the idea of stress — and its potential costs to the body — was born.
But here’s the thing: The idea of stress wasn’t born to just any parent. It was born to Selye, a scientist absolutely determined to make the concept of stress an international sensation.
{ NPR | Continue reading }
art { Richard Phillips, Blauvelt, 2013 }
Linguistics, flashback, health | July 10th, 2014 9:58 am

Four experiments examined the interplay of memory and creative cognition, showing that attempting to think of new uses for an object can cause the forgetting of old uses. […] Additionally, the forgetting effect correlated with individual differences in creativity such that participants who exhibited more forgetting generated more creative uses than participants who exhibited less forgetting. These findings indicate that thinking can cause forgetting and that such forgetting may contribute to the ability to think creatively.
{ APA/Psycnet | Continue reading }
art { Kazumasa Nagai }
ideas, psychology | June 30th, 2014 8:18 am

“I looked up at the shower head, and it was as if the water droplets had stopped in mid-air” […]
Although Baker is perhaps the most dramatic case, a smattering of strikingly similar accounts can be found, intermittently, in medical literature. There are reports of time speeding up – so called “zeitraffer” phenomenon – and also more fragmentary experiences called “akinetopsia”, in which motion momentarily stops.
For instance, travelling home one day, one 61-year-old woman reported that the movement of the closing train doors, and fellow passengers, was in slow motion and “broken up”, as if in “freeze frames”. A 58-year-old Japanese man, meanwhile, seemed to be experiencing life like a badly dubbed movie; in conversation, he found that although others’ voices sounded normal, they were out of sync with their faces. […]
One explanation for this double-failure is that our motion perception system has its own stopwatch, recording how fast things are moving across our vision – and when this is disrupted by brain injury, the world stands still. For Baker, stepping into the shower might have exacerbated the problem, since the warm water would have drawn the blood away from the brain to the extremities of the body, further disturbing the brain’s processing.
Another explanation comes from the discovery that our brain records its perceptions in discrete “snapshots”, like the frames of a film reel. “The healthy brain reconstructs the experience and glues together the different frames,” says Rufin VanRullen at the French Centre for Brain and Cognition Research in Toulouse, “but if brain damage destroys the glue, you might only see the snapshots.”
{ BBC | Continue reading }
neurosciences, time | June 30th, 2014 7:11 am

Realism is a term that can be understood only by contrasting it with an opposite term, such as idealism or representationalism. But representationalism must presuppose something that is represented, in order for the representation to be possible at all. […]
Our grasp on reality is always determined by our own way of accessing it. A realism which can take hold of this presupposition is to be called phenomenological realism. In this sense, reality is always given only in representation, that is, mediated by our access to it, but is not itself representation.
It is an objectivity opposed to ourselves; it has a particular place and it appears, but its appearance does not belong to the subject: it is simply there. Therefore, appearances are spatial and have to be described as such.
{ Meta Journal | PDF }
ideas | June 25th, 2014 12:20 pm

It’s a question that has plagued philosophers and scientists for thousands of years: Is free will an illusion?
Now, a new study suggests that free will may arise from a hidden signal buried in the “background noise” of chaotic electrical activity in the brain, and that this activity occurs almost a second before people consciously decide to do something. […]
Experiments performed in the 1970s also raised doubts about human volition. Those studies, conducted by the late neuroscientist Benjamin Libet, revealed that the region of the brain that plans and executes movement, called the motor cortex, fired prior to people’s decision to press a button, suggesting this part of the brain “makes up its mind” before peoples’ conscious decision making kicks in.
To understand more about conscious decision making, Bengson’s team used electroencephalography (EEG) to measure the brain waves of 19 undergraduates as they looked at a screen and were cued to make a random decision about whether to look right or left.
When people made their decision, a characteristic signal registered that choice as a wave of electrical activity that spread across specific brain regions.
But in a fascinating twist, other electrical activity emanating from the back of the head predicted people’s decisions up to 800 milliseconds before the signature of conscious decision making emerged.
{ Live Science | Continue reading }
related { Searching for the “Free Will” Neuron }
ideas, neurosciences | June 24th, 2014 3:49 am

The fact that someone is generous is a reason to admire them. The fact that someone will pay you to admire them is also a reason to admire them. But there is a difference in kind between these two reasons: the former seems to be the `right’ kind of reason to admire, whereas the latter seems to be the `wrong’ kind of reason to admire. The Wrong Kind of Reasons Problem is the problem of explaining the difference between the `right’ and the `wrong’ kind of reasons wherever it appears. In this paper I argue that two recent proposals for solving the Wrong Kind of Reasons Problem do not work.
{ Nathaniel Sharadin/Pacific Philosophical Quarterly | Continue reading }
ideas | June 13th, 2014 2:39 pm

Just as we can design and install digital apps in our electronic devices, we can design and install mindapps in our minds. For philosophy the big problem is the hegemonic assumption that all good thinking takes place in our ordinary, default mindbody state—wakefulness. Because of this error, the vast extensions of our minds beyond our default state are neglected, and directions for future mind development are stunted, if not outright denied. Multistate theory releases that constriction. By reformulating our minds as variables for experimental philosophy, multistate theory re-asks philosophical questions, extends current issues, and engenders fun speculations. Because psychedelics are the most dramatic example of widely known mindbody psychotechnologies, we will illustrate multistate theory with psychedelics’ contributions.
{ Thomas B. Roberts | Continue reading }
drugs, ideas | June 13th, 2014 7:39 am

For centuries, scientists studied light to comprehend the visible world. […] But in the late 19th century all that changed […] the whole focus of physics—then still emerging as a distinct scientific discipline—shifted from the visible to the invisible. […] Today its theories and concepts are concerned largely with invisible entities: not only unseen force fields and insensible rays but particles too small to see even with the most advanced microscopes. […] Theories at the speculative forefront of physics flesh out this unseen universe with parallel worlds and with mysterious entities named for their very invisibility: dark matter and dark energy. […]
…the concept of “brane” (short for membrane) worlds. This arises from the most state-of-the-art variants of string theory, which attempt to explain all the known particles and forces in terms of ultra-tiny entities called strings, which can be envisioned as particles extended into little strands that vibrate. Most versions of the theory call for variables in the equations that seem to have the role of extra dimensions in space, so that string theory posits not four dimensions (of time and space) but 11. As physicist and writer Jim Baggott points out, “there is no experimental or observational basis for these assumptions”—the “extra dimensions” are just formal aspects of the equations. However, the latest versions of the theory suggest that these extra dimensions can be extremely large, constituting extra-dimensional branes that are potential repositories for alternative universes separated from our own like the stacked leaves of a book. Inevitably, there is an urge to imagine that these places too might be populated with sentient beings, although that’s optional. The point is that these brane worlds are nothing more than mathematical entities in speculative equations, incarnated, as it were, as invisible parallel universes. […]
Scientists, of course, are not just making things up, while leaning on the convenience of supposed invisibility. They are using dark matter and dark energy, and (if one is charitable) quantum many-worlds and branes, and other imperceptible and hypothetical realms, to perform an essential task: to plug gaps in their knowledge with notions they can grasp.
{ Nautilus | Continue reading }
related { How it works: An ultra-precise thermometer made from light }
Physics, theory | June 11th, 2014 2:10 pm

I understand by ‘God’ the perfect being, where a being is perfect just in case it has all perfections essentially and lacks all imperfections essentially. […]
Given that there are good reasons for thinking that the premises of the Compossibility Argument (CA) are true, it seems to me we have a good reason to think that God’s existence is possible. Of course, this does not, by itself, allow us to conclude to the much more important thesis that God exists, and so the atheist can consistently admit God’s possibility and maintain her atheism.
{ C’Zar Bernstein/Academia | Continue reading }
The omnipotence paradox states that: If a being can perform any action, then it should be able to create a task which this being is unable to perform; hence, this being cannot perform all actions. Yet, on the other hand, if this being cannot create a task that it is unable to perform, then there exists something it cannot do.
One version of the omnipotence paradox is the so-called paradox of the stone: “Could an omnipotent being create a stone so heavy that even he could not lift it?” If he could lift the stone, then it seems that the being would not have been omnipotent to begin with, in that he would have been incapable of creating a heavy enough stone; if he could not lift the stone, then it seems that the being either would never have been omnipotent to begin with or would have ceased to be omnipotent upon his creation of the stone.
The argument is medieval, dating at least to the 12th century, addressed by Averroës (1126–1198) and later by Thomas Aquinas. Pseudo-Dionysius the Areopagite (before 532) has a predecessor version of the paradox, asking whether it is possible for God to “deny himself”.
[…]
A common response from Christian philosophers, such as Norman Geisler or Richard Swinburne is that the paradox assumes a wrong definition of omnipotence. Omnipotence, they say, does not mean that God can do anything at all but, rather, that he can do anything that’s possible according to his nature.
{ Wikipedia | Continue reading }
related { Jesus and Virgin Mary spotted on Google Earth pic }
theory | June 2nd, 2014 11:51 am

It is just possible to discern some points beneath the heated rhetoric in which Patricia Churchland indulges. But none of these points is right. If you hold that “mental processes are actually processes in the brain,” to quote Churchland, then you are committed to the thesis that it is sufficient to understand the mind that one understands the brain, and not merely necessary. This is just the well-known “identity theory” of mind and brain: mental processes are identical to brain processes; and the identity of a with b entails the sufficiency of a for b. To hold the weaker thesis that knowledge of the brain is merely necessary for knowledge of the mind is consistent even with being a heavy-duty Cartesian dualist, since even such a dualist accepts that mind depends causally on brain.
{ Patricia Churchland vs. Colin McGinn/NY Review of Books | Continue reading }
brain, controversy | June 1st, 2014 6:59 am

While both tourism research and photography research have grown into substantial academic disciplines, little has been written about their point of intersection: tourist photography. In this paper, I argue that a number of philosophically oriented theories of photography may offer useful perspectives on tourist photography. […]
When I was observing photographing tourists on the Pont Neuf and in the Jardin des Tuileries in Paris, one of the things that struck me was the fact that some tourists, when they came across a sculpture, first took a picture of it, and only started looking after the picture had been taken. Perhaps Sontag is right to argue that the production of pictures serves to appease the tourist’s anxiety about not working; in any case, this type of predatory photographic behavior promotes the accumulation of images to a goal in itself rather than a means to produce meaning or memories.
{ Dennis Schep/Depth of Field | Continue reading }
ideas, photogs, within the world | May 21st, 2014 5:50 pm

The atomists held that there are two fundamentally different kinds of realities composing the natural world, atoms and void. Atoms, from the Greek adjective atomos or atomon, ‘indivisible,’ are infinite in number and various in size and shape, and perfectly solid, with no internal gaps. They move about in an infinite void, repelling one another when they collide or combining into clusters by means of tiny hooks and barbs on their surfaces, which become entangled. Other than changing place, they are unchangeable, ungenerated and indestructible. All changes in the visible objects of the world of appearance are brought about by relocations of these atoms: in Aristotelian terms, the atomists reduce all change to change of place. Macroscopic objects in the world that we experience are really clusters of these atoms; changes in the objects we see—qualitative changes or growth, say—are caused by rearrangements or additions to the atoms composing them. While the atoms are eternal, the objects compounded out of them are not.
In supposing that void exists, the atomists deliberately embraced an apparent contradiction, claiming that ‘what is not’ exists.
{ The Stanford Encyclopedia of Philosophy | Continue reading }
Scientists discover how to turn light into matter after 80-year quest.
Physics, ideas | May 20th, 2014 12:11 pm

We investigate the possibility that a decision-maker prefers to avoid making a decision and instead delegates it to an external device, e.g., a coin flip. In a series of experiments the participants often choose lotteries between allocations, which contradicts most theories of choice such as expected utility but is consistent with a theory of responsibility aversion that implies a preference for randomness. A large data set on university applications in Germany shows a choice pattern that is also consistent with this theory and entails substantial allocative consequences.
{ SSRN | Continue reading }
photo { Richard Sandler }
ideas | May 15th, 2014 1:55 pm

The term ‘perspective’ comes from the language of vision. We literally see things from and with a particular perspective. Our eyes are located at a particular point in space, from which some things are visible and others are not, e.g. the top of the table, but not its underneath. A scene looks different from different perspectives. […]
Nietzsche is saying that philosophical beliefs about truth and goodness are part of a particular perspective on the world, a short-sighted, distorting perspective. One of its most important distortions is that it denies that it is a perspective, that its truths are unconditional, that it represents the world as it truly is. But philosophers are wrong to think that it is possible to represent or hold beliefs about the world that are value-free, ‘objective’, ‘disinterested’. […]
We can support Nietzsche’s argument by an evolutionary account of human cognition. We can’t possibly take in everything around us. We must be selective in order to survive at all. So from the very beginning, our intellects are responsive to our interests, our biological instincts and all that develops from them – our emotions, desires and values. So we do not and cannot experience the world ‘as it is’, but always selectively, in a way that reflects our values. […]
If Nietzsche claims that all our knowledge is from a particular perspective, then his claims about perspectives and his theory of perspectivism must themselves be from a particular perspective. So is what he says about perspectives objectively true or not?
{ Michael Lacewing | PDF }
image { Camille Henrot, still from The Strife of Love in a Dream, 2011 }
nietzsche | May 9th, 2014 1:50 pm

What if someone had already figured out the answers to the world’s most pressing policy problems, but those solutions were buried deep in a PDF, somewhere nobody will ever read them?
According to a recent report by the World Bank, that scenario is not so far-fetched. The bank is one of those high-minded organizations — Washington is full of them — that release hundreds, maybe thousands, of reports a year on policy issues big and small. Many of these reports are long and highly technical, and just about all of them get released to the world as a PDF report posted to the organization’s Web site.
The World Bank recently decided to ask an important question: Is anyone actually reading these things? They dug into their Web site traffic data and came to the following conclusions: Nearly one-third of their PDF reports had never been downloaded, not even once. Another 40 percent of their reports had been downloaded fewer than 100 times. Only 13 percent had seen more than 250 downloads in their lifetimes. […]
And let’s not even get started on the situation in academia, where the country’s best and brightest compete for the honor of seeing their life’s work locked away behind some publisher’s paywall.
{ Washington Post | Continue reading }
ideas, science | May 9th, 2014 1:42 pm

As many theorists have noted, consciousness, while both familiar and intimate, remains deeply mysterious. The problem of explaining consciousness persists despite all attempts from the pre-Socratic Greeks to modern-day philosophers at illuminating this perplexing subject. Throughout history many great thinkers supported the notion that consciousness or some sort of spiritual reality is distinct from matter, and indeed might be the fundamental source of all reality. However, the dominant view in the twentieth century settled on a more materialistic argument: consciousness most likely emerges from complex biological processes, which in turn are based ultimately on complex interactions between subatomic particles.
This view remains unsatisfactory for some philosophers of mind. While advances in neuroscience have led to improvements in our understanding of how processes within the brain work, we still are no closer to understanding experience at the most basic level. This is what Chalmers (1995) has termed the “hard problem” of consciousness. According to Chalmers, materialistic explanations of consciousness would be consistent with a world populated by zombies acting like people in the world, yet devoid of interior experience. Tackling the hard problem of consciousness, Chalmers argues, likely requires abandoning a purely materialistic view of consciousness.
The various theories of consciousness can arguably be grouped into five categories: materialism, dualism, panpsychism, neutral monism, and idealism. As noted above, the current mainstream view looks for materialistic explanations. This typically takes the form of arguing that consciousness must be a higher level activity that has emerged from lower level processes, such as complex biological processes. […]
Material dualism holds that matter and consciousness are two substances that differ fundamentally in a number of ways. This and other differences lead to the perhaps unsolvable problem of how such fundamentally different substances can interact. Historically, support for dualism fits well with such religious notions as the soul or supernatural agency. Dualism has attracted fewer adherents, however, as philosophy gravitated toward more naturalistic explanations. […]
Two closely related alternatives are panpsychism and neutral monism. Panpsychism holds that matter and mind are joined as one. The usual view of panpsychism holds that all matter, even electrons, has some aspect of mind, albeit at a rudimentary level. While panpsychism has relatively few adherents today, this class of explanations has had a long history in philosophy, being a close relative to the animism that was common in early cultures (Skrbina, 2007). Neutral monism holds that matter and consciousness are aspects of some more neutral and fundamental reality. […]
One last alternative is idealism, which holds that the physical universe is composed of mind. […]
After a brief survey of the evidence, I conclude that the best explanation would probably be neutral monism. I then explore a framework for neutral monism, using well-known features of quantum mechanics, to develop a ground or bridge between consciousness and matter.
{ The Journal of Mind and Behavior | PDF }
art { Ellsworth Kelly, Black Forms, 1955 }
ideas | May 5th, 2014 1:02 pm

Can you ever be reasonably sure that something is random, in the same sense you can be reasonably sure something is not random (for example, because it consists of endless nines)? Even if a sequence looked random, how could you ever rule out the possibility that it had a hidden deterministic pattern? And what exactly do we mean by “random,” anyway?
These questions might sound metaphysical, but we don’t need to look far to find real-world consequences. In computer security, it’s crucial that the keys used for encryption be generated randomly—or at least, randomly enough that a potential eavesdropper can’t guess them. Day-to-day fluctuations in the stock market might look random—but to whatever extent they can be predicted, the practical implications are obvious. Casinos, lotteries, auditors, and polling firms all get challenged about whether their allegedly random choices are really random, and all might want ways to reassure skeptics that they are.
Then there’s quantum mechanics, which famously has declared for a century that “God plays dice,” that there’s irreducible randomness even in the behavior of subatomic particles.
{ American Scientist | Continue reading }
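The asymmetry in the article’s opening question can be made concrete with a toy statistical test (my own sketch, not anything from the article): such a test can rule randomness out, but passing it proves nothing, since plenty of perfectly predictable strings pass too.

```python
def looks_nonrandom(bits, z_threshold=4.0):
    """Naive monobit check: flag a bit string whose count of ones sits
    many standard deviations from n/2. Under a fair coin the count of
    ones has mean n/2 and standard deviation sqrt(n)/2."""
    n = len(bits)
    ones = bits.count("1")
    z = abs(ones - n / 2) / ((n ** 0.5) / 2)
    return z > z_threshold

print(looks_nonrandom("1" * 1000))   # True: all ones is ruled out
print(looks_nonrandom("01" * 500))   # False: passes, yet perfectly predictable
```

The second call is the whole point: a deterministic pattern sails through the check, so no finite battery of tests can certify that a hidden deterministic rule isn’t at work.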
image { Matt Waples }
ideas, mathematics | April 30th, 2014 12:45 pm

We tend to characterize art as “self-expression,” but that’s really more a description of bad art. The immature artist, as Eliot wrote, is constantly giving in to the urge to vent what’s inside, whereas the good artist seeks to escape that urge. […]
Social media turns us all into bad poets.
{ Rough Type | Continue reading | Thanks Rob }
ideas, social networks | April 29th, 2014 2:57 pm

Spinoza is quoted approvingly […] to the effect that the free man is the one who thinks about, or fears, death the least. Such fear he considers to be a passive emotion, or affection, which is a bondage to pain, symptomatic of our impotence and servitude. Spinoza writes,
Hope is nothing else but an inconstant pleasure, arising from the image of something future or past, whereof we do not yet know the issue. Fear, on the other hand, is an inconstant pain also arising from the image of something concerning which we are in doubt. If the element of doubt be removed from these emotions, hope becomes Confidence and fear becomes Despair. In other words, Pleasure or Pain arising from the image of something concerning which we have hoped or feared.
The free man, in this light, is one who has not only cultivated the stronger active emotion of acquiescence to the univocal chorus of necessity, but has also learned to disengage external factors which are coincident with such passive emotions.
{ James Luchte | Continue reading }
spinoza | April 11th, 2014 12:25 pm

Giving violators more punishment than they deserve can undermine the benefits of cooperative action. […] At the same time, imposing markedly less punishment than what a violator deserves creates disaffection and acrimony that also can subvert cooperation. In other words, it is not punishment that is needed to maintain social cooperation, but justice. […]
In 1848, the discovery of gold brought 300,000 men to California from all over the world. Yet this sudden mass of humanity lived without a functioning legal system. And if there had been a legal enforcement system, it was unclear what law it would enforce. […] Without a functional government, there were no licensing procedures, fees, or taxes to regulate gold prospecting. No miner worked land that he owned. Any prospector could join any mining camp at any time. Camp populations were heterogeneous: “Puritans and drunkards, clergymen and convict, honest and dishonest, rich and poor.” There was no common language, culture, or legal experience. […] The men shared a common set of needs, however. Each miner needed to be able to leave whatever he owned unguarded each day while he worked his claim. A miner who found gold needed to protect his find until he could convert it into cash or goods.
{ Paul H. Robinson/SSRN | Continue reading }
flashback, ideas, law | March 28th, 2014 7:19 am

Horses are the only species other than man transported around the world for competition purposes.
In humans, transport across several time zones can result in adverse symptoms commonly referred to as jetlag.
Can changes in the light/dark cycle, equivalent to those caused by transport across several time zones, affect daily biological rhythms, and performance in equine athletes?
[…]
We found that horses do feel a change in the light/dark cycle very acutely, but they also recover very quickly, and this resulted in an improvement in their performance rather than a decrease in their performance, which was exactly the opposite of what we thought was going to happen.
{ HBLB | PDF }
horse, time | March 24th, 2014 1:26 pm
music, nietzsche | March 21st, 2014 11:14 am

“Would you please take a selfie of my friend and I in front of this window?”
She was not aware that she had approached a linguist. […]
It would not be like him to snarl that of my friend and I should be of my friend and me (or perhaps better, of me and my friend). Nor did he remonstrate with the woman over her rather extraordinary misuse of the noun selfie.
{ Language Log | Continue reading }
unrelated { Photographer countersues Empire State Building for $5M over topless photos }
Linguistics, photogs | March 20th, 2014 3:15 pm

More than 400 years after Shakespeare wrote it, we can now say that “Romeo and Juliet” has the wrong name. Perhaps the play should be called “Juliet and Her Nurse,” which isn’t nearly as sexy, or “Romeo and Benvolio,” which has a whole different connotation.
I discovered this by writing a computer program to count how many lines each pair of characters in “Romeo and Juliet” spoke to each other, with the expectation that the lovers in the greatest love story of all time would speak more than any other pair.
{ FiveThirtyEight | Continue reading }
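The counting the author describes is simple enough to sketch. The speech list, pairing heuristic, and function name below are all my own invention, not FiveThirtyEight’s code; in particular, crediting each speech to the pair formed with the next speaker is only a crude proxy for “who is talking to whom.”

```python
from collections import Counter

def lines_between_pairs(speeches):
    """speeches: list of (speaker, line_count) tuples in play order.
    Heuristic: credit each speech's lines to the unordered pair formed
    by its speaker and the next speaker."""
    totals = Counter()
    for (a, n), (b, _) in zip(speeches, speeches[1:]):
        if a != b:  # skip consecutive speeches by the same character
            totals[frozenset((a, b))] += n
    return totals

# a fabricated mini-scene, not real Shakespeare data
speeches = [("Juliet", 4), ("Nurse", 6), ("Juliet", 10),
            ("Romeo", 3), ("Juliet", 2)]
totals = lines_between_pairs(speeches)
print(totals[frozenset(("Juliet", "Nurse"))])  # 10
print(totals[frozenset(("Juliet", "Romeo"))])  # 13
```

Run over a full parsed script, the pair with the largest total is the play’s most talkative relationship, which is how “Juliet and Her Nurse” can beat “Romeo and Juliet.”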
Linguistics, books | March 19th, 2014 7:40 am

In his story Sarrasine, Balzac, describing a castrato disguised as a woman, writes the following sentence: “This was woman herself, with her sudden fears, her irrational whims, her instinctive worries, her impetuous boldness, her fussings, and her delicious sensibility.” Who is speaking thus? Is it the hero of the story bent on remaining ignorant of the castrato hidden beneath the woman? Is it Balzac the individual, furnished by his personal experience with a philosophy of Woman? Is it Balzac the author professing ‘literary’ ideas on femininity? Is it universal wisdom? Romantic psychology? We shall never know, for the good reason that writing is the destruction of every voice, of every point of origin. Writing is that neutral, composite, oblique space where our subject slips away, the negative where all identity is lost, starting with the very identity of the body writing.
{ Roland Barthes, The Death of the Author, 1967 | Continue reading }
ideas, roland barthes | March 13th, 2014 11:42 am

These fictional examples suggest that creativity and dishonesty often go hand-in-hand. Is there an actual link? Is there something about the creative process that triggers unethical behavior? Or does behaving in dishonest ways spur creative thinking? My research suggests that both effects exist: Encouraging people to think outside the box can result in greater cheating, and crossing ethical boundaries can make people more creative in subsequent tasks.
{ Scientific American | Continue reading }
ideas, psychology | March 11th, 2014 11:59 am

The arguments for ditching notes and coins are numerous, and quite convincing. In the US, a study by Tufts University concluded that the cost of using cash amounts to around $200 billion per year – about $637 per person. These are primarily the costs associated with collecting, sorting and transporting all that money, but they also include trivial expenses like ATM fees. Incidentally, the study also found that the average American wastes five and a half hours per year withdrawing cash from ATMs; just one of the many inconvenient aspects of hard currency.
While coins last decades, or even centuries, paper currency is much less durable. A dollar bill has an average lifespan of six years, and the US Federal Reserve shreds somewhere in the region of 7,000 tons of defunct banknotes each year.
Physical currency is grossly unhealthy too. Researchers in Ohio spot-checked cash used in a supermarket and found 87% contained harmful bacteria. Only 6% of the bills were deemed “relatively clean.” […]
Stockholm’s homeless population recently began accepting card payments. […]
Cash transactions worldwide rose just 1.75% between 2008 and 2012, to $11.6 trillion. Meanwhile, non-traditional payment methods rose almost 14% to total $6.4 trillion.
{ TransferWise | Continue reading }
The anal stage is the second stage in Sigmund Freud’s theory of psychosexual development, lasting from age 18 months to three years. According to Freud, the anus is the primary erogenous zone and pleasure is derived from controlling bladder and bowel movement. […]
Negative reactions from parents, such as early or harsh toilet training, can lead the child to develop an anal-retentive personality. If the parents try to force the child to learn to control their bowel movements, the child may react by deliberately holding back in rebellion. Such children may grow into adults who hate mess and are obsessively tidy, punctual, and respectful of authority. These adults can sometimes be stubborn and very careful with their money.
{ Wikipedia | Continue reading }
related { Hackers Hit Mt. Gox Exchange’s CEO, Claim To Publish Evidence Of Fraud | Where are the 750k Bitcoins lost by Mt. Gox? }
economics, psychology, theory | March 10th, 2014 8:28 am

Two fields stand out as different within cognitive psychology. These are the study of reasoning, especially deductive reasoning and statistical inference, and the more broadly defined field of decision making. For simplicity I label these topics the study of reasoning and decision making (RDM). What makes RDM different from all other fields of cognitive psychology is that psychologists constantly argue with each other and with philosophers about whether the behavior of their participants is rational. The question I address here is: why? What is so different about RDM that it attracts the interest of philosophers and compulsively engages experimental psychologists in judging how good or bad the RDM they observe is?
Let us first consider the nature of cognitive psychology in general. It is a branch of cognitive science, concerned with the empirical and theoretical study of cognitive processes in humans. It covers a wide collection of processes connected with perception, attention, memory, language, and thinking. However, only in the RDM subset of the psychology of thinking is rationality an issue. For sure, accuracy measures are used throughout cognitive psychology. We can measure whether participants detect faint signals, make accurate judgments of distances, recall words read to them correctly and so on. The study of non-veridical functions is also a part of wider cognitive psychology, for example the study of visual illusions, memory lapses, and cognitive failures in normal people as well as various pathological conditions linked to brain damage, such as aphasia. But in none of these cases are inaccurate responses regarded as irrational. Visual illusions are attributed to normally adaptive cognitive mechanisms that can be tricked under special circumstances; memory errors reflect limited-capacity systems; and pathological cognition reflects brain damage or clinical disorders. In no case is the person held responsible and denounced as irrational.
{ Frontiers | Continue reading }
photo { Slim Aarons }
ideas, psychology | March 9th, 2014 4:52 am

Le pop art dépersonnalise, mais il ne rend pas anonyme : rien de plus identifiable que Marilyn, la chaise électrique, un pneu ou une robe, vus par le pop art ; ils ne sont même que cela : immédiatement et exhaustivement identifiables, nous enseignant par là que l’identité n’est pas la personne : le monde futur risque d’être un monde d’identités, mais non de personnes.
We must realize that if Pop Art depersonalizes, it does not anonymize: nothing is more identifiable than Marilyn, the electric chair, a tire, or a dress, as seen by Pop Art; they are in fact nothing but that: immediately and exhaustively identifiable, thereby teaching us that identity is not the person: the future world risks being a world of identities, but not of persons.
{ Roland Barthes, Cette vieille chose, l’art, 1980 }
art { Andy Warhol, Foot and Tire, 1963-64 }
related { David Cronenberg on Foot and Tire }
art, roland barthes, warhol | March 8th, 2014 4:56 am

Quantum physics is famously weird, counterintuitive and hard to understand; there’s just no getting around this. So it is very reassuring that many of the greatest physicists and mathematicians have also struggled with the subject. The legendary quantum physicist Richard Feynman famously said that if someone tells you that they understand quantum mechanics, then you can be sure that they are lying. And Conway too says that he didn’t understand the quantum physics lectures he took during his undergraduate degree at Cambridge.
The key to this confusion is that quantum physics is fundamentally different to any of the previous theories explaining how the physical world works. In the great rush of discoveries of new quantum theory in the 1920s, the most surprising was that quantum physics would never be able to exactly predict what was going to happen. In all previous physical theories, such as Newton’s classical mechanics or Einstein’s theories of special and general relativity, if you knew the current state of the physical system accurately enough, you could predict what would happen next. “Newtonian gravitation has this property,” says Conway. “If I take a ball and I throw it vertically upwards, and I know its mass and I know its velocity (suppose I’m a very good judge of speed!) then from Newton’s theories I know exactly how high it will go. And if it doesn’t do exactly as I expect then that’s because of some slight inaccuracy in my measurements.”
Instead quantum physics only offers probabilistic predictions: it can tell you that your quantum particle will behave in one way with a particular probability, but it could also behave in another way with another particular probability. “Suppose there’s this little particle and you’re going to put it in a magnetic field and it’s going to come out at A or come out at B,” says Conway, imagining an experiment, such as the Stern Gerlach experiment, where a magnetic field diverts an electron’s path. “Even if you knew exactly where the particles were and what the magnetic fields were and so on, you could only predict the probabilities. A particle could go along path A or path B, with perhaps 2/3 probability it will arrive at A and 1/3 at B. And if you don’t believe me then you could repeat the experiment 1000 times and you’ll find that 669 times, say, it will be at A and 331 times it will be at B.”
{ The Free Will Theorem, Part I | Continue reading | Part II | Part III }
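Conway’s imagined repetition of the experiment is easy to simulate classically. This sketch (the function name and seed are my own; it samples the stated 2/3–1/3 split, and says nothing about the underlying quantum mechanics) just runs the measurement 1000 times:

```python
import random

def two_outcome_trials(n, p_a=2/3, seed=42):
    """Simulate n runs of a measurement that yields outcome A with
    probability p_a and outcome B otherwise; return (count_a, count_b)."""
    rng = random.Random(seed)
    a = sum(rng.random() < p_a for _ in range(n))
    return a, n - a

a, b = two_outcome_trials(1000)
print(a, b)  # close to 667 and 333; the exact split varies with the seed
```

The point of Conway’s example survives the simulation: even with every initial condition fixed, only the frequencies over many runs are predictable, never the outcome of any single run.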
Physics, theory | March 4th, 2014 4:26 pm

Ultracrepidarian (n): “Somebody who gives opinions on subjects they know nothing about.”
Groke (v): “To gaze at somebody while they’re eating in the hope that they’ll give you some of their food.” My dog constantly grokes at me longingly while I eat dinner.
{ BI | Continue reading }
Linguistics, buffoons | March 3rd, 2014 6:35 am

Author profiling is a problem of growing importance in applications in forensics, security, and marketing. For example, from a forensic linguistics perspective one would like to be able to determine the linguistic profile of the author of a harassing text message (the language used by a certain type of person) and identify certain characteristics. Similarly, from a marketing viewpoint, companies may be interested in knowing, on the basis of the analysis of blogs and online product reviews, the demographics of people who like or dislike their products. The focus is on author profiling in social media, since we are mainly interested in everyday language and how it reflects basic social and personality processes.
{ PAN | Continue reading }
photos { Neal Barr, Texas Track Club, 1964 }
Linguistics, social networks, technology | February 21st, 2014 7:02 am
haha, ideas, social networks | February 19th, 2014 9:19 am

A set of mathematical laws that I call the Improbability Principle tells us that we should not be surprised by coincidences. In fact, we should expect coincidences to happen.
One of the key strands of the principle is the law of truly large numbers. This law says that given enough opportunities, we should expect a specified event to happen, no matter how unlikely it may be at each opportunity. Sometimes, though, when there are a great many opportunities, it can look as if there are only relatively few. This misperception leads us to grossly underestimate the probability of an event: we think something is incredibly unlikely when it is actually very likely, perhaps almost certain.
{ Scientific American | Continue reading }
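The law of truly large numbers is easy to make concrete: for a per-opportunity probability p and n independent opportunities, the chance of at least one occurrence is 1 − (1 − p)^n. A minimal sketch (function name and example figures are mine, not the article’s):

```python
def prob_at_least_once(p, n):
    """Probability that an event with per-trial probability p occurs
    at least once in n independent trials."""
    return 1 - (1 - p) ** n

# a "one in a million" event, given a million independent opportunities
print(round(prob_at_least_once(1e-6, 1_000_000), 3))  # 0.632
```

A one-in-a-million fluke becomes a near coin flip once a million chances accumulate, which is exactly the misperception the passage describes.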
ideas, mathematics | February 18th, 2014 1:08 pm

In recent years, numerous studies have shown how music hijacks our relationship with everyday time. For instance, more drinks are sold in bars when slow-tempo music is playing, which seems to make the bar a more enjoyable environment, one in which patrons want to linger—and order another round. Similarly, consumers spend 38 percent more time in the grocery store when the background music is slow. Familiarity is also a factor. Shoppers perceive longer shopping times when they are familiar with the background music in the store, but actually spend more time shopping when the music is novel. Novel music is perceived as more pleasurable, making the time seem to pass quicker, and so shoppers stay in the stores longer than they may imagine. […]
While music usurps our sensation of time, technology can play a role in altering music’s power to hijack our perception. The advent of audio recording not only changed the way music was disseminated, it changed time perception for generations. Thomas Edison’s cylinder recordings held about four minutes of music. This technological constraint set a standard that dictated the duration of popular music long after that constraint was surpassed. In fact, this average duration persists in popular music as the modus operandi today. […]
Neuroscience gives us insights into how music creates an alternate temporal universe. During periods of intense perceptual engagement, such as being enraptured by music, activity in the prefrontal cortex, which generally focuses on introspection, shuts down. The sensory cortex becomes the focal area of processing and the “self-related” cortex essentially switches off. As neuroscientist Ilan Goldberg describes, “the term ‘losing yourself’ receives here a clear neuronal correlate.” […]
But it is Schubert, more than any other composer, who succeeded in radically commandeering temporal perception. Nowhere is this powerful control of time perception more forceful than in the String Quintet. Schubert composed the four-movement work in 1828, during the feverish last two months of his life. (He died at age 31.) In the work, he turns contrasting distortions of perceptual time into musical structure. Following the opening melody in the first Allegro ma non troppo movement, the second Adagio movement seems to move slowly and be far longer than it really is, then hastens and shortens before returning to a perception of long and slow. The Scherzo that follows reverses the pattern, creating the perception of brevity and speed, followed by a section that feels longer and slower, before returning to a percept of short and fast. The conflict of objective and subjective time is so forcefully felt in the work that it ultimately becomes unified in terms of structural organization.
{ Nautilus | Continue reading }
music, neurosciences, time | February 3rd, 2014 3:56 pm