‘It ain’t what they call you… it’s what you answer to.’ –W.C. Fields


In the year 1930, John Maynard Keynes predicted that, by century’s end, technology would have advanced sufficiently that countries like Great Britain or the United States would have achieved a 15-hour work week. There’s every reason to believe he was right. In technological terms, we are quite capable of this. And yet it didn’t happen. Instead, technology has been marshaled, if anything, to figure out ways to make us all work more. In order to achieve this, jobs have had to be created that are, effectively, pointless. […]

productive jobs have, just as predicted, been largely automated away […] But rather than allowing a massive reduction of working hours to free the world’s population to pursue their own projects, pleasures, visions, and ideas […] It’s as if someone were out there making up pointless jobs just for the sake of keeping us all working. And here, precisely, lies the mystery. In capitalism, this is precisely what is not supposed to happen. Sure, in the old inefficient socialist states like the Soviet Union, where employment was considered both a right and a sacred duty, the system made up as many jobs as it had to (this is why in Soviet department stores it took three clerks to sell a piece of meat). But, of course, this is the very sort of problem market competition is supposed to fix. According to economic theory, at least, the last thing a profit-seeking firm is going to do is shell out money to workers they don’t really need to employ. Still, somehow, it happens.

{ David Graeber | Continue reading }

what I am calling “bullshit jobs” are jobs that are primarily or entirely made up of tasks that the person doing that job considers to be pointless, unnecessary, or even pernicious. Jobs that, were they to disappear, would make no difference whatsoever. Above all, these are jobs that the holders themselves feel should not exist.

Contemporary capitalism seems riddled with such jobs.

{ The Anarchist Library | Continue reading }

image { Alliander, ElaadNL, and The incredible Machine, Transparent Charging Station, 2017 }

The sun is there, the slender trees, the lemon houses

Moringa oleifera, an edible tree found worldwide in the dry tropics, is increasingly being used for nutritional supplementation. Its nutrient-dense leaves are high in protein quality, leading to its widespread use by doctors, healers, nutritionists, and community leaders to treat under-nutrition and a variety of illnesses. Despite the fact that no rigorous clinical trial has tested its efficacy for treating under-nutrition, the adoption of M. oleifera continues to increase. The “Diffusion of Innovations” theory describes well the evidence for growth and adoption of dietary M. oleifera leaves, and it highlights the need for a scientific consensus on the nutritional benefits. […]

The regions most burdened by under-nutrition (in Africa, Asia, Latin America, and the Caribbean) all share the ability to grow and utilize an edible plant, Moringa oleifera, commonly referred to as “The Miracle Tree.” For hundreds of years, traditional healers have prescribed different parts of M. oleifera for treatment of skin diseases, respiratory illnesses, ear and dental infections, hypertension, diabetes, and cancer, have used it for water purification, and have promoted its use as a nutrient-dense food source. The leaves of M. oleifera have been reported to be a valuable source of both macro- and micronutrients, and the tree is now found growing within tropical and subtropical regions worldwide, congruent with the geographies where its nutritional benefits are most needed.

Anecdotal evidence of benefits from M. oleifera has fueled a recent increase in adoption of and attention to its many healing benefits, specifically the high nutrient composition of the plant’s leaves and seeds. Trees for Life, an NGO based in the United States, has promoted the nutritional benefits of Moringa around the world, and their nutritional comparison has been widely copied and is now taken on faith by many: “Gram for gram fresh leaves of M. oleifera have 4 times the vitamin A of carrots, 7 times the vitamin C of oranges, 4 times the calcium of milk, 3 times the potassium of bananas, ¾ the iron of spinach, and 2 times the protein of yogurt” (Trees for Life, 2005).

Feeding animals M. oleifera leaves results in both weight gain and improved nutritional status. However, scientifically robust trials testing its efficacy for undernourished human beings have not yet been reported. If the wealth of anecdotal evidence (not cited herein) can be supported by robust clinical evidence, countries with a high prevalence of under-nutrition might have at their fingertips a sustainable solution to some of their nutritional challenges. […]

The “Diffusion of Innovations” theory explains the recent increase in M. oleifera adoption by various international organizations and certain constituencies within undernourished populations, just as it has explained the adoption of many innovative agricultural practices in the 1940s–1960s. […] A sigmoidal curve (Figure 1) illustrates the adoption process, starting with innovators (traditional healers, in the case of M. oleifera), who communicate with and influence early adopters (international organizations), who then broadcast new information on M. oleifera over time, in the wake of which the adoption rate steadily increases.

{ Ecology of Food and Nutrition | Continue reading }
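The sigmoidal adoption curve the article invokes (Figure 1) is the standard logistic function used in diffusion-of-innovations theory. A minimal sketch, with illustrative parameter values of my own choosing (not taken from the paper):

```python
import math

def adoption(t, ceiling=1.0, rate=0.8, midpoint=0.0):
    """Logistic (S-shaped) adoption curve: slow uptake among innovators,
    steep growth among the early/late majorities, then saturation."""
    return ceiling / (1 + math.exp(-rate * (t - midpoint)))

# Near zero well before the midpoint, half the ceiling at the midpoint,
# and approaching the ceiling (full adoption) long after.
for t in (-10, 0, 10):
    print(f"t={t:+d}  adoption={adoption(t):.3f}")
```

The `rate` parameter controls how steep the middle of the S is; the `midpoint` marks where half the eventual adopters have adopted.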

The operation was a success, but the patient died


Currently, we produce ∼10²¹ digital bits of information annually on Earth. Assuming a 20% annual growth rate, we estimate that after ∼350 years from now, the number of bits produced will exceed the number of all atoms on Earth, ∼10⁵⁰. After ∼300 years, the power required to sustain this digital production will exceed 18.5 × 10¹⁵ W, i.e., the total planetary power consumption today, and after ∼500 years from now, the digital content will account for more than half of Earth’s mass, according to the mass-energy-information equivalence principle. Besides the existing global challenges such as climate, environment, population, food, health, energy, and security, our estimates point to another singular event for our planet, called the information catastrophe.

{ AIP Advances | Continue reading }
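The headline figure is simple compound-growth arithmetic. A quick check, assuming exactly 10²¹ bits today and a flat 20% rate (round inputs of my own, so the result lands near, not on, the paper’s ∼350-year figure):

```python
import math

BITS_PER_YEAR = 1e21    # paper's estimate of current annual bit production
ATOMS_ON_EARTH = 1e50   # approximate number of atoms on Earth
GROWTH = 1.20           # 20% annual growth

# Solve BITS_PER_YEAR * GROWTH**t >= ATOMS_ON_EARTH for t
years = math.log10(ATOMS_ON_EARTH / BITS_PER_YEAR) / math.log10(GROWTH)
print(round(years))  # 366 -- same order as the paper's ~350 years
```

The small gap from 350 comes from the rounded starting values; the point is how fast a steady 20% rate compounds across centuries.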

It is estimated that a week’s worth of the New York Times contains more information than a person was likely to come across in a lifetime in the 18th century. […] The amount of new information is doubling every two years. By 2010, it’s predicted to double every 72 hours. […] The lunatic named Bobby Fischer “despised the media”: “They’re destroying reality, turning everything into media.” “News exceed reality,” writes Thomas Bernhard somewhere. The saturation and repetitions in Basquiat’s paintings. The high-frequency trading. “An immense accumulation of nothing” (Imp Kerr, 2009). An immense accumulation of ignorance. […]

From what precedes it necessarily follows that the inescapable future of knowledge is banality, falsehood, and overabundance, which sum is a form of ignorance.

{ The New Inquiry | Continue reading }

‘Tout vainqueur insolent à sa perte travaille.’ –Jean de La Fontaine


Parrondo’s paradox has been described as follows: a combination of losing strategies becomes a winning strategy. […]

Consider two games, Game A and Game B, with the following rules:

1. In Game A, you simply lose $1 every time you play.
2. In Game B, you count how much money you have left. If it is an even number, you win $3. Otherwise you lose $5.

Say you begin with $100 in your pocket. If you start playing Game A exclusively, you will obviously lose all your money in 100 rounds. Similarly, if you decide to play Game B exclusively, you will also lose all your money in 100 rounds.

However, consider playing the games alternately, starting with Game B, followed by A, then by B, and so on (BABABA…). It should be easy to see that you will steadily earn a total of $2 for every two games.

Thus, even though each game is a losing proposition if played alone, because the results of Game B are affected by Game A, the sequence in which the games are played can affect how often Game B earns you money, and subsequently the result is different from the case where either game is played by itself.

{ Wikipedia | Continue reading }
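The arithmetic in the excerpt is easy to verify by simulation. The sketch below implements only the simplified games as described above; the function and variable names are my own, not from the article:

```python
def play_a(money):
    # Game A: lose $1 every round
    return money - 1

def play_b(money):
    # Game B: win $3 if your balance is even, otherwise lose $5
    return money + 3 if money % 2 == 0 else money - 5

def simulate(strategy, money=100, max_rounds=200):
    """Cycle through the games in `strategy` until broke or max_rounds;
    return (final balance, rounds played)."""
    for r in range(1, max_rounds + 1):
        money = strategy[(r - 1) % len(strategy)](money)
        if money <= 0:
            return money, r
    return money, max_rounds

print(simulate([play_a]))          # (0, 100): broke after 100 rounds
print(simulate([play_b]))          # (0, 100): broke after 100 rounds
print(simulate([play_b, play_a]))  # (300, 200): BABA... gains $2 per pair
```

Starting from $100, each game alone hits $0 at round 100, while alternating BABA… nets $2 per pair of games, exactly as the excerpt claims: Game B only pays when the balance is even, and Game A’s $1 loss keeps flipping the parity back in B’s favor.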

Give me a ho, if you’ve got your funky bus fare, ho

What is the feasibility of surviving on another planet and becoming self-sustaining? […] I show here that a mathematical model can be used to determine the minimum number of settlers and the way of life for survival on another planet, using Mars as the example. […] The minimum number of settlers has been calculated, and the result is 110 individuals.

{ Nature | Continue reading }

‘“I am a Microsoft Word man.” Says the human dressed like Microsoft Word.’ –David A Banks


David Silver [the creator of AlphaZero] hasn’t answered my question about whether machines can set up their own goals. He talks about subgoals, but that’s not the same. That’s a certain gap in his definition of intelligence. We set up goals and look for ways to achieve them. A machine can only do the second part.

So far, we see very little evidence that machines can actually operate outside of these terms, which is clearly a sign of human intelligence. Let’s say a machine accumulated knowledge in one game. Can it transfer that knowledge to another game, which might be similar but not the same? Humans can. With computers, in most cases you have to start from scratch.

{ Gary Kasparov/Wired | Continue reading }

photo { Kelsey Bennett }

Arms apeal with larms


The madman theory is a political theory commonly associated with U.S. President Richard Nixon’s foreign policy. He and his administration tried to make the leaders of hostile Communist Bloc nations think Nixon was irrational and volatile. According to the theory, those leaders would then avoid provoking the United States, fearing an unpredictable American response.

{ Wikipedia | Continue reading }

The author finds that perceived madness is harmful to general deterrence and is sometimes also harmful in crisis bargaining, but may be helpful in crisis bargaining under certain conditions.

{ British Journal of Political Science | Continue reading }

black smoke shells fitted with computer chips { Cai Guo-Qiang, Wreath (Black Ceremony), 2011 }

‘And now it goes as it goes and where it ends is Fate.’ –Aeschylus


Founded in 1945 by University of Chicago scientists who had helped develop the first atomic weapons in the Manhattan Project, the Bulletin of the Atomic Scientists created the Doomsday Clock two years later, using the imagery of apocalypse (midnight) and the contemporary idiom of nuclear explosion (countdown to zero) to convey threats to humanity and the planet. The decision to move (or to leave in place) the minute hand of the Doomsday Clock is made every year by the Bulletin’s Science and Security Board in consultation with its Board of Sponsors, which includes 13 Nobel laureates. The Clock has become a universally recognized indicator of the world’s vulnerability to catastrophe from nuclear weapons, climate change, and disruptive technologies in other domains.

To: Leaders and citizens of the world
Re: Closer than ever: It is 100 seconds to midnight
Date: January 23, 2020

{ Bulletin of the Atomic Scientists | Continue reading }

my wife said I never listen to her, or something like that

[W]hile time moves forward in our universe, it may run backwards in another, mirror universe that was created on the “other side” of the Big Bang.

{ PBS (2014) | Continue reading }

‘Nous avons exagéré le superflu, nous n’avons plus le nécessaire.’ –Proudhon


{ Sergei Eisenstein, On Disney }

Olobobo, ye foxy theagues!


English speakers have been deprived of a truly functional, second person plural pronoun since we let “ye” fade away a few hundred years ago.

“You” may address one person or a bunch, but it can be imprecise and unsatisfying. “You all”—as in “I’m talking to you all,” or “Hey, you all!”—sounds wordy and stilted. “You folks” or “you gang” both feel self-conscious. Several more economical micro-regional varieties (youz, yinz) exist, but they lack wide appeal.

But here’s what’s hard to explain: The first, a gender-neutral option (“y’all”), mainly thrives in the American South and hasn’t been able to steal much linguistic market share outside of its native habitat. The second, an undeniable reference to a group of men (“you guys”), is the default everywhere else, even when the “guys” in question are women, or when the speaker is communicating to a mixed-gender group.

“You guys” rolls off the tongues of avowed feminists every day, as if everyone has agreed to let one androcentric pronoun pass, while others (the generic “he” or “men” as stand-ins for all people) belong to the before-we-knew-better past. […]

One common defense of “you guys” that Mallinson encounters in the classroom and elsewhere is that it is gender neutral, simply because we use it that way. This argument also appeared in the New Yorker recently, in a column about a new book, The Life of Guy: Guy Fawkes, the Gunpowder Plot, and the Unlikely History of an Indispensable Word by writer and educator Allan Metcalf.

“Guy” grew out of the British practice of burning effigies of the Catholic rebel Guy Fawkes, Metcalf explains in the book. The flaming likenesses, first paraded in the early 1600s, came to be called “guys,” which evolved to mean a group of male lowlifes, he wrote in a recent story for Time. Then, by the 18th century, “guys” simply meant “men” without any pejorative connotations. By the 1930s, according to the Washington Post, Americans had made the leap to calling all persons “guys.”

{ Quartz | Continue reading }

Tony: [to Lady and Tramp with an Italian accent] Now-a, first-a we fix the table-a.


related { Disney Plus warns users of ‘outdated cultural depictions’ in old movies }