technology

Cause nobody is that strong

A 17-year-old boy, caught sending text messages in class, was recently sent to the vice principal’s office at Millwood High School in Halifax, Nova Scotia.

The vice principal, Steve Gallagher, told the boy he needed to focus on the teacher, not his cellphone. The boy listened politely and nodded, and that’s when Mr. Gallagher noticed the student’s fingers moving on his lap.

He was texting while being reprimanded for texting.

“It was a subconscious act,” says Mr. Gallagher, who took the phone away. “Young people today are connected socially from the moment they open their eyes in the morning until they close their eyes at night. It’s compulsive.”

Because so many people in their teens and early 20s are in this constant whir of socializing—accessible to each other every minute of the day via cellphone, instant messaging and social-networking Web sites—there are a host of new questions that need to be addressed in schools, in the workplace and at home. Chief among them: How much work can “hyper-socializing” students or employees really accomplish if they are holding multiple conversations with friends via text-messaging, or are obsessively checking Facebook? (…)

While their older colleagues waste time holding meetings or engaging in long phone conversations, young people have an ability to sum things up in one-sentence text messages, Mr. Bajarin says. “They know how to optimize and prioritize. They will call or set up a meeting if it’s needed. If not, they text.” And given their vast network of online acquaintances, they discover people who can become true friends or valued business colleagues—people they wouldn’t have been able to find in the pre-Internet era.

{ Wall Street Journal | Continue reading }

In this era of media bombardment, the ability to multitask has been seen as an asset. But people who commonly have simultaneous input from several types of media—surfing the Web while texting and listening to music, for instance—may in fact find it harder to filter out extraneous information. “We embarked on the research thinking that people who multitasked must be good at it,” says Clifford Nass, a psychologist at Stanford University who studies human-computer interaction. “So we were enormously surprised.”

{ American Scientist | Continue reading }

illustration { Richard Wilkinson }

Walk into a drugstore, and the last thing you see is drugs

{ What if you saw the world with your ears? Gameplay footage from Devil’s Tuning Fork, a game created by the DePaul Game Elites team at DePaul University’s College of Computing & Digital Media in Chicago. | Continue reading }

Not that I loved Caesar less, but that I loved Rome more

The 400-year-old mystery of whether William Shakespeare was the author of an unattributed play about Edward III may have been solved by a computer program designed to detect plagiarism.

Sir Brian Vickers, an authority on Shakespeare at the Institute of English Studies at the University of London, believes that a comparison of phrases used in The Reign of King Edward III with Shakespeare’s early works proves conclusively that the Bard wrote the play in collaboration with Thomas Kyd, one of the most popular playwrights of his day.

The professor used software called Pl@giarism, developed by the University of Maastricht to detect cheating students, to compare language used in Edward III — published anonymously in 1596, when Shakespeare was 32 — with other plays of the period.

He discovered that individual playwrights tend to reuse the same patterns of speech, meaning that each has a linguistic fingerprint. The program identifies phrases of three words or more in an author’s known work and searches for them in unattributed plays. In tests where the authors are known to be different, there are up to 20 matches, because some phrases are in common usage. When Edward III was tested against Shakespeare’s works published before 1596, there were 200 matches.

{ Times | Continue reading }
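
For the curious, here is a minimal sketch (in Python, and emphatically not the actual Pl@giarism software) of the kind of matching described above: collect every three-word phrase in a known text and see how many also occur in the unattributed one. The snippets below are invented for illustration.

def three_word_phrases(text, n=3):
    """Every run of n consecutive words in a text, lowercased."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def shared_phrases(known_work, unattributed, n=3):
    """Phrases of n words that appear in both texts."""
    return three_word_phrases(known_work, n) & three_word_phrases(unattributed, n)

# A handful of matches is expected from common usage alone;
# hundreds of them suggest a common author.
known = "so shaken as we are so wan with care"
candidate = "so wan with care and yet so shaken"
print(shared_phrases(known, candidate))   # {'so wan with', 'wan with care'}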

photo { Shakespeare, The Cobbe Portrait, c. 1610 | More | The Cobbe portrait may be the only authentic image of Shakespeare made from life. }

‘To win the fame baby, it’s all the same baby.’ –Michael Jackson

So let’s look at Twitter in the context of Abraham Maslow’s concept of a hierarchy of needs, first presented in his 1943 paper “A Theory of Human Motivation.”

Maslow’s hierarchy of needs is most often displayed as a pyramid, with the lowest levels made up of the most basic needs and the more complex needs at the top. Needs at the bottom of the pyramid are basic physical requirements, including the need for food, water, sleep and warmth. Once these lower-level needs have been met, people can move on to higher levels of need, which become increasingly psychological and social. Soon, the need for love, friendship and intimacy becomes important. Further up the pyramid, the need for personal esteem and feelings of accomplishment become important. Finally, Maslow emphasized the importance of self-actualization, a process of growing and developing as a person in order to achieve individual potential.

Twitter aims primarily at social needs, like those for belonging, love, and affection. Relationships such as friendships, romantic attachments and families help fulfill this need for companionship and acceptance, as does involvement in social, community or religious groups. Clearly, feeling connected to people via Twitter helps to fulfill some of this need to belong and feel cared about.

An even higher level of need, related to self-esteem and social recognition, is also leveraged by Twitter.

{ PsychologyToday | Continue reading }

photo { Juliane Eirich }

The evening fell just like a star, left a trail behind

{ Erotika phone ad | via Copyranter }

Swarm intelligence, the collective behavior of decentralized, self-organized systems

We’re in the middle of a huge platform shift in computing and most of us don’t even know it. The transition is from desktop to mobile and is as real as earlier transitions from mainframes to minicomputers to personal computers to networked computers with graphical interfaces. And like those previous transitions, this one doesn’t mean the old platforms are going away, just that they’re being diminished somewhat in significance. All of those previous platforms still exist. And desktops, too, will remain in some form when the mobile conversion is complete, though we are probably no more than five years from seeing the peak global population of desktop computers. We’d be there right now if we’d just figured out the I/O problem of how to stash a big display in a tiny device. But we’re almost there. That’s what this column is largely about.

{ Robert X. Cringely | Continue reading }

Who would like to buy some aspirin?

{ A new technique for measuring the accessibility of a city shows why Paris is more accessible than London. }

That’s why I left this snow biz, and got into show biz

I just love this (…) even though it’s a year old… “Facebook co-founder Dustin Moskovitz is leaving ‘to build an extensible enterprise productivity suite, along with a high-level open-source software development toolkit, built for the Web from the ground up.’” In English, please?

{ AdScam | Continue reading }

illustration { Noma Bar }

Spy plane at 20,000 feet above. Roger.

{ Flight Simulator Game Chair | The V1 computer desk }

I have come 500 miles just to see a halo

We are apparently now in a situation where modern technology is changing the way people behave, people talk, people react, people think, and people remember. And you encounter this not only in a theoretical way, but when you meet people, when suddenly people start forgetting things, when suddenly people depend on their gadgets, and other stuff, to remember certain things. This is the beginning, it’s just an experience. But if you think about it and you think about your own behavior, you suddenly realize that something fundamental is going on. (…)

As we know, information is fed by attention, so we have not enough attention, not enough food for all this information. And, as we know — this is the old Darwinian thought, the moment when Darwin started reading Malthus — when you have a conflict between a population explosion and not enough food, then Darwinian selection starts. And Darwinian systems start to change situations. And so what interests me is that we are, because we have the Internet, now entering a phase where Darwinian structures, where Darwinian dynamics, Darwinian selection, apparently attacks ideas themselves: what to remember, what not to remember, which idea is stronger, which idea is weaker. (…)

It’s the question: what is important, what is not important, what is important to know? Is this information important? Can we still decide what is important? And it starts with this absolutely normal, everyday news. But now you encounter, at least in Europe, a lot of people who think, what in my life is important, what isn’t important, what is the information of my life. And some of them say, well, it’s in Facebook. And others say, well, it’s on my blog. And, apparently, for many people it’s very hard to say it’s somewhere in my life, in my lived life.

{ The Age of Informavore / A Talk With Frank Schirrmacher | Edge | Continue reading }

photo { Richard Kern }

Strength against strength; for he has the power of Zeus, and will not be checked till one of these two he has consumed.

{ The Year 2038 Problem: Example showing how the date would reset at 03:14:08 UTC on 19 January 2038. | Wikipedia | Continue reading }
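
The arithmetic behind that reset, sketched in Python (assuming the textbook case the article describes: a signed 32-bit time_t counting seconds since the Unix epoch):

from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

# A signed 32-bit counter of seconds since the epoch tops out at 2**31 - 1.
print(EPOCH + timedelta(seconds=2**31 - 1))   # 2038-01-19 03:14:07+00:00

# One second later the counter wraps, in two's complement, to -2**31,
# which a 32-bit system reads as a date back in 1901.
print(EPOCH + timedelta(seconds=-(2**31)))    # 1901-12-13 20:45:52+00:00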

Sidewalk sundae strawberry surprise

Rupert Murdoch said recently that he’s planning to stop Google News from indexing his publications including the Times of London and the Wall Street Journal. Murdoch’s idea is that Google News and the like make it too easy for Internet users to sample news for free rather than paying for it as God and Rupert intended. Mark Cuban, who is very clever but with whom I rarely agree, thinks this is smart on Murdoch’s part, because Twitter is changing the way people find news, effectively disintermediating Google, but not the News Corp. publications, themselves.

It’s funny how Murdoch’s statement made Cuban think of Twitter while it made me think immediately of the A&P.

The Great Atlantic and Pacific Tea Company, or A&P, was America’s first national chain of food markets. Hell, it was America’s first self-serve market, first to have store brands, first to advertise nationally, first to have a customer loyalty program (in 1912!), first to publish its own magazine (Woman’s Day, which is still around, though no longer owned by the A&P), and for most of my childhood back in Ohio, A&P was the big Kahuna of grocery chains. With $5.4 billion in sales in the mid-1960s, A&P was at least 20 percent bigger than any of its competitors.

But after 105 years of setting the pace for the grocery industry, A&P peaked in the mid-1960s and went into a decline that lasted for at least 15 years and, it can be argued, continues even to this day. A&P, which has had German owners (the Tengelmann Group) since the 1970s, is more of a super-regional chain today and doesn’t particularly vie for industry leadership on any measure. What happened in the mid-1960s to hurt A&P was that it opted out of being indexed by Google News.

Well, not literally, but close enough. A&P management, which back in the mid-’60s was still chosen from the founding Hartford family, decided at that time to abandon shopping centers, which are retail aggregators just as Google is a news aggregator. They reasoned that in most shopping centers the anchor store was an A&P. In their view, their supermarket was the main draw for a shopping center and didn’t need any of those other shops or stores to provide traffic. The rest of the shopping center was seen by A&P management as purely parasitic.

{ Robert Cringely | Continue reading }

related { Interview with Rupert Murdoch | video }

previously { For the next few decades, journalism will be made up of overlapping special cases. }

photos { The last days of Gourmet magazine }

Everyone knew that hotel was a goner, they broke all the windows, they took all the door knobs

Advances in technology have revealed that our brains are far more altered by experience or training than was thought possible. The memory-storing hippocampus region of the brain in London taxi drivers is bigger, and the auditory areas of musicians more developed, than average. Even learning to juggle can result in a certain amount of rewiring of the brain.

So the Lord Chief Justice’s suggestion that a lifetime spent on the internet will alter the way we think and process information is well founded. But whether these changes will enhance or degrade our powers of imagination, recall and decision-making has divided scientists.

Baroness Greenfield, director of the Royal Institution, was among the first to warn that today’s children may grow up with short attention spans and no imagination. Others suggest that the abandonment of books means people will lose the ability to follow a plot from start to finish. However, as yet little or no evidence has emerged to support these fears.

Short-term studies have, if anything, shown internet use to have a positive impact on our mental powers.

A study published this week, for instance, revealed that when “internet naive” adults carried out web searches every day for two weeks, it boosted the activity in brain areas linked to decision-making and working memory.

The reality is likely to be a trade-off: certain abilities will be enhanced at the expense of others. It could be that browsing through the vast quantities of information on the web leaves people better equipped to filter out the irrelevant and focus on the important.

Meanwhile, people may get worse at keeping the bigger picture in mind. Only a long-term psychological study will provide a definitive picture of how internet use affects cognition.

{ Times }

artwork { Peter Atkins, Last Fuel For 1000km, 2009 }

I got a 20, 6 O’Clock extra crystal, it’s only E-40

New cosmetic medical devices, including DIY lasers, are expected to explode into a $1.3 billion market by 2013, up from just $296 million in 2008, according to the analyst group Medical Insights. The growth in the market appears to be coming from light-based products that claim to either remove or grow hair on the human body. The Silk’n Hair was the first at-home laser device to be approved by the FDA, in 2006, although it didn’t come on the market until early 2008.

The laser hair removers damage the hair follicles that are in their growth phase, generally leading to some permanent reductions of body hair. DiBernardo questioned whether the lasers used in the home devices were powerful enough to get the kind of results that clinics achieve.

{ Wired | Continue reading }

photo { Viviane Sassen }

I got a color TV, so I can see the Knicks play basketball


YouTube may pay less to be online than you do, a new report on internet connectivity suggests, calling into question a recent analysis arguing that Google’s popular video service is bleeding money and demonstrating how the internet has continued to morph to fit users’ behavior.

In fact, with YouTube’s help, Google is now responsible for at least 6 percent of the internet’s traffic, and likely more — and may not be paying an ISP at all to serve up all that content and attached ads.

Credit Suisse made headlines this summer when it estimated that YouTube was binging on bandwidth, losing Google half a billion dollars in 2009 as it streams 75 billion videos. But a new report from Arbor Networks suggests that Google’s traffic is approaching 10 percent of the net’s traffic, and that it’s got so much fiber-optic cable that it is simply trading traffic, with no payment involved, with the net’s largest ISPs.

{ Wired | Continue reading }

Bass! How low can you go?

{ Highway to Russia, 1959 | Paleofuture }

It’sa me, Mario

{ New scientific techniques have uncovered evidence that this picture is a previously unrecognised work by Leonardo da Vinci. | Antiques Trade Gazette | Continue reading }

‘Power is everywhere; not because it embraces everything, but because it comes from everywhere.’ –Michel Foucault

In our contemporary ‘information age’, information and the body stand in a new, peculiar, and ambiguous relationship to one another. Information is plumbed from the body but treated as separate from it, facilitating, as Irma van der Ploeg has suggested, the creation of a separate virtual ‘body-as-information’ that has affected the very ontology of the body.

This ‘informatization of the body’ has been both spurred and enabled by surveillance techniques that create, depend upon, and manipulate virtual bodies for a variety of predictive purposes, including social control and marketing. While, as some feminist critics have suggested, there appears to be potential for information technologies to liberate us from oppressive ideological models, surveillance techniques, themselves so intimately tied to information systems, put normative pressure on non-normative bodies and practices, such as those of queer and genderqueer subjects. Ultimately, predictive surveillance is based in an innately conservative epistemology, and the intertwining of information systems with surveillance undermines any liberatory effect of the former.

{ Surveillance, Gender, and the Virtual Body in the Information Age | Full Text | More: Surveillance & Society, Vol 6, No 4 (2009) }

photo { Arno Bani }

MDMA got you feeling like a champion, the city never sleeps, better slip you an Ambien

Nuclear physics was once considered the pinnacle of man’s effort to know reality; its image was tarnished by association with the bomb’s destructive violence.

{ The New Atlantis | Continue reading }

photo { Taryn Simon | video of the explosion }

With a ‘Ring and Valve Special’

{ Dyson’s blade-free fan | full story | More: Gizmag, IT Media News }


