The average human vocabulary consists of approximately 20,000 word families, yet only 6,000–7,000 word families are required to understand most communication.
One possible explanation for this level of redundancy is that vocabulary size is selected as a fitness indicator and is used for display. Human vocabulary size correlates highly with measurable intelligence, and when choosing potential mates, individuals actively prefer other correlates of intelligence, such as education.
Here we show that males used more low frequency words after an imaginary romantic encounter with a young female shown in a photograph relative to when they viewed photographs of older females. Females used fewer low frequency words when they imagined a romantic encounter with a young male shown in a photograph relative to when they viewed photographs of older males.
Anderson contends that the past 20 years have seen a total stagnation in the production of new cultural aesthetics. In other words, the end of the 50s looked nothing like the end of the 70s, but 1989 looks remarkably similar to 2009.
If you watch a person using the net, you see a kind of immersion: often they are oblivious to what is going on around them. But it is a very different kind of attentiveness from reading a book. In the case of a book, the technology of the printed page focuses our attention and encourages a linear type of thinking. In contrast, the internet seizes our attention only to scatter it. We are immersed because there’s a constant barrage of stimuli coming at us, and we seem to be very much seduced by those constantly changing patterns of visual and auditory stimuli. When we become immersed in our gadgets, we are immersed in a series of distractions rather than a sustained, focused type of thinking. (…)
It is important to realize that it is no longer just hyperlinks: you have to think of all aspects of using the internet. There are messages coming at us through email, instant messenger, SMS, tweets, etc. We are distracted by everything on the page, the various windows, the many applications running. You have to see the entire picture of how we are being stimulated. If you compare that to the placidity of a printed page, it doesn’t take long to notice that the experience of taking in information from a printed page is not only different but almost the opposite of taking in information from a network-connected screen. With a page, you are shielded from distraction. We underestimate how the page encourages focused thinking – which I don’t think is normal for human beings – whereas the screen indulges our desire to be constantly distracted.
The young can’t advance because everywhere they find my complacent generation is in situ. Thus the only way of solving the problem is to make everyone of a certain age, say over 50, walk the plank. (…)
The choice boils down to whether it’s better for people to have a decade at the beginning or at the end of their careers where they are demoralised and underemployed. The answer is easy: surely it is better to be more active at the beginning.
To have people idle at a time when they are full of energy and their grey-cell count is at a maximum is a shocking waste. And in any case, my generation has had it very good for much too long. We bought houses when they were still just about affordable. We had free education and pensions. It’s all been jolly nice, and I’ve enjoyed it a lot. Now is the time to start to pay.
In fact, it has become pretty clear that deciphering consciousness will be far more difficult than describing the dynamics of DNA. Crick himself spent more than two decades attempting to unravel the consciousness riddle, working on it doggedly until his death in 2004. His collaborator, neuroscientist Christof Koch of Caltech, continues their work even today, just as dozens of other scientists pursue a similar agenda — to identify the biological processes that constitute consciousness and to explain how and why those processes produce the subjective sense of persistent identity, the self-awareness and unity of experience, and the “awareness of self-awareness” that scientists and philosophers have long wondered about, debated and sometimes even claimed to explain.
So far, no one has succeeded to anyone else’s satisfaction. Yes, there have been advances: understanding how the brain processes information. Locating, within various parts of the brain, the neural activity that accompanies certain conscious perceptions. Appreciating the fine distinctions between awareness, attention and subjective impressions. Yet with all this progress, the consciousness problem remains popular on lists of problems that might never be solved.
Perhaps that’s because the consciousness problem is inherently similar to another famous problem that actually has been proved unsolvable: finding a self-consistent set of axioms for deducing all of mathematics. As the Austrian logician Kurt Gödel proved eight decades ago, no such axiomatic system is possible; any system as complicated as arithmetic contains true statements that cannot be proved within the system.
Gödel’s proof emerged from deep insights into the self-referential nature of mathematical statements. He showed how a system referring to itself creates paradoxes that cannot be logically resolved — and so certain questions cannot in principle be answered.
To determine “how public figures realize creative forms of apologetic speech in order to minimize their responsibility for misdeeds,” Kampf examined 354 conditional apologies made by Israeli public figures, organizations, or institutions between 1997 and 2004, breaking them down into specific categories and sub-categories. (…)
After making racist remarks about Ethiopian immigrants, writer Samuel Shnitzer replied with a classic “if” statement: “If someone was hurt by the column I wrote, I am very sorry about that.” Ariel Sharon’s 2002 statement concerning the deaths of Palestinian civilians during a military campaign managed to include both an “if” and a “but”: “The Israeli Defense Force is sorry if civilians were injured, but not for the successful operation.”
As Kampf pointed out, this delicate wordplay is important to politicians who want to keep their jobs. But it’s even more crucial to business executives who, if they truly accepted responsibility, might end up in jail. A research team led by the University of Ulster’s Owen Hargie analyzed the testimony of four CEOs of financial institutions before a committee of the British Parliament in 2009 and noted a similar pattern of obfuscation.
“The main type of apology used by the senior bankers fell into the ‘I’m sorry you’re sick’ category, where the person is in effect saying that he or she has no personal responsibility for what happened, but recognizes and expresses sympathy for the person’s predicament,” the researchers write.
{ The drawings of butterflies done by Vladimir Nabokov were intended for “family use.” He made these on title pages of various editions of his works as a gift to his wife and son, and sometimes to other relatives. None of these drawings portrays a real butterfly; both the images and the names he assigns to them are his invention. | Nabokov Museum | Continue reading }
The idea that our universe is embedded in a broader multidimensional space has captured the imagination of scientists and the general population alike.
This notion is not entirely science fiction. According to some theories, our cosmos may exist in parallel with other universes in other sets of dimensions. Cosmologists call these universes braneworlds. And among the many prospects that this raises is the idea that things from our Universe might somehow end up in another.
A couple of years ago, Michael Sarrazin at the University of Namur in Belgium and a few others showed how matter might make the leap in the presence of large magnetic potentials. That provided a theoretical basis for real matter swapping.
Today, Sarrazin and a few pals say that our galaxy might produce a magnetic potential large enough to make this happen for real. If so, we ought to be able to observe matter leaping back and forth between universes in the lab. In fact, such observations might already have been made in certain experiments.
I’d say that in about half of my business conversations, I have almost no idea what other people are saying to me. The language of internet business models has made the problem even worse. When I was younger, if I didn’t understand what people were saying, I thought I was stupid. Now I realize that if it’s to people’s benefit that I understand them but I don’t, then they’re the ones who are stupid.
Group settings can diminish expressions of intelligence, especially among women
Research led by scientists at the Virginia Tech Carilion Research Institute found that small-group dynamics — such as jury deliberations, collective bargaining sessions, and cocktail parties — can alter the expression of IQ in some susceptible people. “You may joke about how committee meetings make you feel brain dead, but our findings suggest that they may make you act brain dead as well,” said Read Montague, who led the study.
In fact, it was because of my feminism that I wanted to like Erotic Capital: Whether from nature or nurture, women have traditionally excelled at “soft skills” like taking the emotional temperature of others, listening, adjusting one’s behavior to any given situation, and cooperating. These all happen to be skills that, until fairly recently, have been undercompensated in the workplace. In Hakim’s book I anticipated a deftly written argument that would reclaim the value of women’s work so that maybe we’d eventually start paying people in the professions that make use of those skills — say, teaching and nursing — their true value.
That’s the book I wanted to read. The book I actually read was more like this: Men supposedly have higher sex drives than women, creating a “male sex deficit,” which means men are always in a state of wanting more of what women supply. (…) So women who are willing to address that deficit, by either having actual sex with men suffering from it or presenting themselves in an enchanting manner to exploit it, have erotic capital that can be traded for other forms of capital.
Erotic capital has many guises: from “trophy wives” whose skilled self-presentation becomes a part of a man’s public persona, to men or women who style themselves in such a way as to garner attention at their workplace, to women with otherwise limited means who sell their erotic capacity (whether forthrightly, as with sex workers and performers, or more covertly, as with sales jobs) to establish themselves. It’s “sell yourself” meets “sex sells.” What’s most surprising about all this is that Hakim seems to think she’s saying something new. (…)
That she fails to name a single feminist who has actually come out against presenting oneself well (as opposed to presenting oneself as stereotypically feminine) indicates that she’s attacking a straw feminist, not an actual one. Where are the radical feminists urging women to not use their people skills on the job? Who are these radical feminists who blame women for wearing makeup to work instead of directing their critiques at institutions that demand women do so? Hakim falsely asserts that feminists have been fighting for the eradication of charisma and charm instead of the eradication of coyness and the deployment of sex appeal as woman’s strongest — or only — weapons.
Popular discourse portrays marriage as a source of innumerable public and private benefits: happiness, companionship, financial security, and even good health. Complementing this view, our legal discourse frames the right to marry as a right of access, the exercise of which is an act of autonomy and free will.
However, a closer look at marriage’s past reveals a more complicated portrait. Marriage has been used - and importantly, continues to be used - as state-imposed sexual discipline.
Until the mid-twentieth century, marriage played an important role in the crime of seduction. Enacted in a majority of U.S. jurisdictions in the nineteenth century, seduction statutes punished those who ‘seduced and had sexual intercourse with an unmarried female of previously chaste character’ under a ‘promise of marriage.’ Seduction statutes routinely prescribed a bar to prosecution for the offense: marriage. The defendant could simply marry the victim and avoid liability for the crime. However, marriage did more than serve as a bar to prosecution. It also was understood as a punishment for the crime. Just as incarceration promoted the internalization of discipline and reform of the inmate, marriage’s attendant legal and social obligations imposed upon defendant and victim a new disciplined identity, transforming them from sexual outlaws into in-laws.
What occurred to Newton was that there was a force of gravity, which of course everybody knew about – it’s not like he actually discovered gravity; everybody knew there was such a thing as gravity. But if you go back into antiquity, the way that the celestial objects, the moon, the sun, and the planets, were treated by astronomy had nothing to do with the way things on earth were treated. These were entirely different realms, and what Newton realized was that there had to be a force holding the moon in orbit around the earth. This is not something that Aristotle or his predecessors thought, because they were treating the planets and the moon as though they just naturally went around in circles. Newton realized there had to be some force holding the moon in its orbit around the earth, to keep it from wandering off, and he knew also there was a force that was pulling the apple down to the earth. And so what suddenly struck him was that those could be one and the same thing, the same force.
(…)
I’m not sure it’s accurate to say that physicists want to hand time over to philosophers. Some physicists are very adamant about wanting to say things about it; Sean Carroll for example is very adamant about saying that time is real. You have others saying that time is just an illusion, that there isn’t really a direction of time, and so forth. I myself think that all of the reasons that lead people to say things like that have very little merit, and that people have just been misled, largely by mistaking the mathematics they use to describe reality for reality itself. If you think that mathematical objects are not in time, and mathematical objects don’t change — which is perfectly true — and then you’re always using mathematical objects to describe the world, you could easily fall into the idea that the world itself doesn’t change, because your representations of it don’t.
There are other, technical reasons that people have thought that you don’t need a direction of time, or that physics doesn’t postulate a direction of time. My own view is that none of those arguments are very good. To the question as to why a physicist would want to hand time over to philosophers, the answer would be that physicists for almost a hundred years have been dissuaded from trying to think about fundamental questions. I think most physicists would quite rightly say “I don’t have the tools to answer a question like ‘what is time?’ - I have the tools to solve a differential equation.” The asking of fundamental physical questions is just not part of the training of a physicist anymore.
(…)
On earth, of all the billions of species that have evolved, only one has developed intelligence to the level of producing technology. Which means that kind of intelligence is really not very useful. It’s not actually, in the general case, of much evolutionary value. We tend to think, because we love to think of ourselves, human beings, as the top of the evolutionary ladder, that the intelligence we have, that makes us human beings, is the thing that all of evolution is striving toward. But what we know is that that’s not true. Obviously it doesn’t matter that much if you’re a beetle, that you be really smart. If it were, evolution would have produced much more intelligent beetles. We have no empirical data to suggest that there’s a high probability that evolution on another planet would lead to technological intelligence. There is just too much we don’t know.
Many of us cling to the notion that memory is a reliable record, and that trawling through it can be like flipping through an old photo album. But what about the memories - sometimes vivid in nature - of things that never were?
Examining the false stories that we can create for ourselves is the aim of a new initiative led by artist Alasdair Hopwood. As part of a residency at the Anomalistic Psychology Research Unit led by Chris French at Goldsmiths College, University of London, Hopwood aims to explore what false memories reveal about our sense of identity.
To do this, he has created the False Memory Archive, a collection of people’s fabricated recollections either jotted down after talks he has given or submitted online at the project’s website. (…)
For Hopwood, examining the ways we deceive ourselves through memory is perhaps a natural progression. He has worked with fellow artists as part of the WITH Collective on projects that expose and poke fun at the many ways we style our public selves. “Identity is not fixed,” he says. Instead, it shifts depending on the company we are in, and even the format of the interaction - be it social media or in person. We’re extraordinarily preoccupied with sculpting our identities, as the glut of self-help books and pseudoscientific methods for personal development demonstrates.
Benford’s Law, also known as the rule of first digits, says that in data sets drawn from real life (perhaps sales of coffee or payments to a vendor), the number 1 should be the first digit approximately 30% of the time, rather than the roughly 11% you would expect if each leading digit from one to nine were equally likely.
The rule was first described by the astronomer Simon Newcomb, who noticed that in his logarithm books the first pages showed much greater signs of use than the pages at the end. Later the physicist Frank Benford, who had stumbled upon the same pattern independently, collected some 20,000 observations to test it.
Benford found that the first digits of a variety of quantities in nature, like elemental atomic weights, the areas of rivers, and the numbers that appeared on the front pages of newspapers, began with a one more often than with any other digit.
The reason for this pattern is the percentage difference between consecutive leading digits. Say a firm is valued at $1 billion. For the first digit to become a two (that is, for the market cap to reach $2 billion), the value of the firm needs to increase by 100%. However, once it reaches that $2 billion mark, it only needs to increase by 50% to get to $3 billion. That required percentage gain keeps shrinking as the value increases.
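The shrinking-percentage argument above corresponds to the logarithmic first-digit distribution P(d) = log10(1 + 1/d). A minimal Python sketch, using powers of 2 as an illustrative multiplicative data set (an assumption for demonstration, not an example from the article):

```python
import math
from collections import Counter

def benford_probability(d: int) -> float:
    """Expected frequency of d (1-9) as a leading digit under Benford's Law."""
    return math.log10(1 + 1 / d)

# P(first digit = 1) is about 30.1%, versus 1/9 (about 11.1%) for uniform digits.
expected = {d: benford_probability(d) for d in range(1, 10)}

# Empirical check: leading digits of a multiplicative series (powers of 2)
# are known to follow Benford's Law.
counts = Counter(str(2 ** n)[0] for n in range(1, 1001))
share_of_ones = counts["1"] / 1000  # close to 0.301
```

The nine expected frequencies sum to 1, since log10(2/1) + log10(3/2) + … + log10(10/9) telescopes to log10(10).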
Coordinated universal time (UTC) is the time scale used all over the world for time coordination. It was established in 1972, based on atomic clocks, as the successor of astronomical time, which is based on the Earth’s rotation. But since the atomic time scale drifts from the astronomical one, one-second steps were added whenever necessary. In 1972 everybody was happy with this decision. Today, most systems we use for telecommunications are not really happy with these “leap” seconds. So in January member states of the International Telecommunication Union will vote on dropping the leap second. (…)
We are using a system that breaks time. The quality of time is continuity. This is why a majority in the international community want to change the definition of UTC and drop the leap second. (…)
It was agreed some years ago that we should not think of any kind of adjustment in the near future, the next 100 or 200 years. In about the year 2600 we will have a half-hour divergence. However, we don’t know how time-keeping will be then, or how technology will be. So we cannot rule for the next six or seven generations.
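The half-hour-by-2600 figure can be roughly sanity-checked. The model below is an assumption for illustration, not from the article: the accumulated gap between atomic and astronomical time is commonly approximated as growing quadratically, roughly 31 seconds per century squared measured from an epoch near 1820 (a standard parabolic fit for Delta-T):

```python
def divergence_seconds(year: int, epoch: int = 1820, coeff: float = 31.0) -> float:
    """Rough accumulated atomic/astronomical divergence under a quadratic
    Delta-T model: coeff * (centuries since epoch)^2. Coefficient and epoch
    are assumed values from a common parabolic fit, not exact."""
    centuries = (year - epoch) / 100.0
    return coeff * centuries ** 2

minutes_by_2600 = divergence_seconds(2600) / 60.0  # on the order of half an hour
```

With these assumed parameters the estimate for 2600 comes out around 30 minutes, consistent with the divergence mentioned above.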
Unfortunately the critical ability – of both ideology and literature – is rather limited today. There is a certain type of commentary, thinking and writing which is both perceived and sees itself as critical thinking, without actually being either critical or thinking (literary critique in which the feelings of the reviewer are understood as the truth, academic research which merely restates an existing doxa, predictable moves in media debates which only confirm current positions). (…) But it is difficult to resist the demand for simple answers, clear stances, confirmation of identities and solutions which correspond to the current order. And if you do resist the demand for simplicity, there is always someone else who delivers the preferred answers and acceptable opinion.