From photography to supercomputers: how we see ourselves in our inventions

Back in 2008, the technologist Ray Kurzweil estimated that the processing power of the human brain was in the region of 20 quadrillion calculations per second and that, as soon as we developed a supercomputer fast enough, simulating the brain would just be a problem of getting the software right. It was announced last month that the world’s fastest supercomputer, China’s Tianhe-2, can carry out almost 34 quadrillion calculations per second, meaning that, according to Kurzweil, we have the potential to simulate one and two-thirds of a human brain inside a single machine.
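Spelling out the arithmetic behind that “one and two-thirds” figure – taking both estimates at face value, as the comparison does:

$$\frac{34 \text{ quadrillion calculations per second}}{20 \text{ quadrillion calculations per second}} = 1.7 \approx 1\tfrac{2}{3}$$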

The idea that we could fit “one and two-thirds” of our brain function in a computer may seem a little flippant, but it is not an unreasonable conclusion if you think of the brain as primarily a calculating engine. However distant that may seem from everyday experience, the idea that the mind is “computation at work” is an assumption so embedded in modern neuroscience that it’s almost impossible to find anyone arguing for a non-computational science of the brain.

It’s not that this approach is necessarily wrong. Science has produced many useful and important advances based on exactly this assumption. The first kick of the World Cup was taken by a paralysed man in a robotic exoskeleton that he controlled through a brain-computer interface, all of which was based on exactly this mathematical view of the mind. The difficulty comes, however, when we assume that there is nothing more to explain in the mind and brain than calculations. What starts as a tool to help us understand ourselves begins to replace us in our understanding.

But avoiding this pitfall may be more difficult than we think. Historically, our theories of the brain have tended to be dominated by ideas we take from the technology of the day. The historian of psychology Douwe Draaisma has shown that while we often believe that we first learn about ourselves and then apply this knowledge to technology, it almost invariably happens the other way round. We tend to understand ourselves through our inventions.

The ancient Greek philosopher Plato had a theory that the mind was like a wax writing tablet. More than two millennia on, after seeing an alchemist demonstrate glow-in-the-dark phosphorescent liquid that had been synthesised for the first time, the 17th-century scientist Robert Hooke suggested that the mind stored memories just as this material seemed to “store” light. In the 1870s, when Thomas Edison first presented the phonograph to the world, scientists began discussing a theory of auditory memory as “an album containing phonographic sheets”. When photography was invented, it was used as a metaphor for memory partly because it captured information in a way that was a little blurry and had a tendency to fade. In an interesting twist, modern cognitive scientists have to remind their audience that “memory is not like taking a photograph” because modern cameras do their job too efficiently to be a good metaphor for remembering.

When computers arrived, we inevitably saw ourselves in our machines and the idea of the mind as an information processor became popular. Here, the mind is thought to consist of information processing networks where data is computed and transformed. One of the newest and most fashionable theories argues that the central function of the brain is to statistically predict new information. The idea is that the brain tries to minimise the errors it makes in its predictions by adjusting its expectations as it gets new information. It’s a theory that originated from the mathematical “predictive coding” model that was developed to help second world war gunners predict where moving enemy planes would be in the two seconds it took for an anti-aircraft shell to reach them. The statistical theory became widespread when it was realised that it could be used to guess missing audio when sound was sent over a telephone network, meaning that less information needed to be sent as the rest could be mathematically reconstructed. Similar ideas were first adopted to improve artificial intelligence – Google’s speech recognition is now based on them – and then taken up by neuroscientists to explain data from the brain.
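To make the error-minimisation idea concrete, here is a minimal toy sketch in Python – an illustration of the general principle only, not the actual predictive coding model, Google’s system, or anything the brain is known to do. A predictor keeps a running estimate of a signal and corrects it in proportion to each prediction error, so familiar input produces small corrections and surprising input produces large ones:

```python
# Toy sketch of error-driven prediction: keep a running estimate of a
# signal and nudge it toward each new observation in proportion to the
# prediction error. Illustrative only; real predictive coding models
# are hierarchical and probabilistic.

def track_signal(observations, learning_rate=0.3):
    """Return the final estimate and the prediction errors made along the way."""
    estimate = 0.0
    errors = []
    for observed in observations:
        error = observed - estimate        # prediction error ("surprise")
        errors.append(error)
        estimate += learning_rate * error  # adjust expectations toward the data
    return estimate, errors

# A signal hovering around 10: early errors are large, but they shrink
# as the predictor's expectations adapt to the incoming information.
estimate, errors = track_signal([9.5, 10.4, 9.8, 10.1, 10.2, 9.9, 10.0])
print(round(estimate, 2))             # converges towards ~10
print([round(e, 2) for e in errors])  # errors get steadily smaller
```

The telephone example above is the same principle run in reverse: if the receiver can predict most of the next sample itself, only the prediction errors need to be transmitted, which is why less information had to be sent.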

It could be that we’ve reached “the end of history” as far as neuroscience goes and that everything we’ll ever say about the brain will be based on our current “brain as calculation” metaphors. But if this is not the case, there is a danger that we’ll sideline aspects of human nature that don’t easily fit the concept. Our subjective experience, emotions and the constantly varying awareness of our own minds have traditionally been much harder to understand as forms of “information processing”. Importantly, these aspects of mental life are exactly where things tend to go awry in mental illness, and it may be that our main approach for understanding the mind and brain is insufficient for tackling problems such as depression and psychosis. It could be that we simply need more time with our current concepts, but if history is any guide, our next insight will come from another metaphor, perhaps drawn from a future technology.

From Terminator to Transcendence, popular culture is awash with fears about cyborgs but, in terms of understanding ourselves, we have been cyborgs for centuries. We’ve lived in a constantly evolving relationship with machines that has profoundly affected how we see human nature. Perhaps the question is not whether it is lazy to take ideas from machines in order to understand ourselves, but whether we can ever think beyond them.

Vogue Japan, September 2014

Photographed by Daniele + Iango

periodicult90s:

Levi’s, Elle magazine, September 1991.

In general, people are not drawn to perfection in others. People are drawn to shared interests, shared problems, and an individual’s life energy. Humans connect with humans. Hiding one’s humanity and trying to project an image of perfection makes a person vague, slippery, lifeless, and uninteresting.

Made green tea truffles today after my workout, as a treat and also for a party I’m attending tomorrow – if they last until then.

We were the first human beings who would never see anything for the first time. We stare at the wonders of the world, dull-eyed, underwhelmed. Mona Lisa, the Pyramids, the Empire State Building. Jungle animals on attack, ancient icebergs collapsing, volcanoes erupting. I can’t recall a single amazing thing I have seen firsthand that I didn’t immediately reference to a movie or TV show. A fucking commercial. You know the awful singsong of the blase: ‘Seeeen it.’ I’ve literally seen it all, and the worst thing, the thing that makes me want to blow my brains out, is: The secondhand experience is always better. The image is crisper, the view is keener, the camera angle and the soundtrack manipulate my emotions in a way reality can’t anymore. I don’t know that we are actually human at this point, those of us who are like most of us, who grew up with TV and movies and now the Internet. If we are betrayed, we know the words to say; when a loved one dies, we know the words to say. If we want to play the stud or the smart-ass or the fool, we know the words to say. We are all working from the same dog-eared script. It’s a very difficult era in which to be a person, just a real, actual person, instead of a collection of personality traits selected from an endless Automat of characters. It had gotten to the point where it seemed like nothing matters, because I’m not a real person and neither is anyone else. I would have done anything to feel real again.

Isamu Noguchi did not draw a distinction between formal sculpture and functional forms; he approached both in a similar way. This cast iron bowl is a deceptively simple example of the play between the silhouette and the void.

(Source: s-c-r-a-p-b-o-o-k)

About to be officially discharged from physical therapy! My meniscus healed surprisingly fast and I’m able to work at a higher intensity without any real pain, which has been an incredible feeling. I just have to remember not to push my body so hard all the time – something I always forget in the heat of the moment.

The power of lonely

You hear it all the time: We humans are social animals. We need to spend time together to be happy and functional, and we extract a vast array of benefits from maintaining intimate relationships and associating with groups. Collaborating on projects at work makes us smarter and more creative. Hanging out with friends makes us more emotionally mature and better able to deal with grief and stress.

Spending time alone, by contrast, can look a little suspect. In a world gone wild for wikis and interdisciplinary collaboration, those who prefer solitude and private noodling are seen as eccentric at best and defective at worst, and are often presumed to be suffering from social anxiety, boredom, and alienation.

But an emerging body of research is suggesting that spending time alone, if done right, can be good for us — that certain tasks and thought processes are best carried out without anyone else around, and that even the most socially motivated among us should regularly be taking time to ourselves if we want to have fully developed personalities, and be capable of focus and creative thinking. There is even research to suggest that blocking off enough alone time is an important component of a well-functioning social life — that if we want to get the most out of the time we spend with people, we should make sure we’re spending enough of it away from them. Just as regular exercise and healthy eating make our minds and bodies work better, solitude experts say, so can being alone.

One ongoing Harvard study indicates that people form more lasting and accurate memories if they believe they’re experiencing something alone. Another indicates that a certain amount of solitude can make a person more capable of empathy towards others. And while no one would dispute that too much isolation early in life can be unhealthy, a certain amount of solitude has been shown to help teenagers improve their moods and earn good grades in school.

“There’s so much cultural anxiety about isolation in our country that we often fail to appreciate the benefits of solitude,” said Eric Klinenberg, a sociologist at New York University whose book “Alone in America,” in which he argues for a reevaluation of solitude, will be published next year. “There is something very liberating for people about being on their own. They’re able to establish some control over the way they spend their time. They’re able to decompress at the end of a busy day in a city…and experience a feeling of freedom.”

Figuring out what solitude is and how it affects our thoughts and feelings has never been more crucial. The latest Census figures indicate there are some 31 million Americans living alone, which accounts for more than a quarter of all US households. And at the same time, the experience of being alone is being transformed dramatically, as more and more people spend their days and nights permanently connected to the outside world through cellphones and computers. In an age when no one is ever more than a text message or an e-mail away from other people, the distinction between “alone” and “together” has become hopelessly blurry, even as the potential benefits of true solitude are starting to become clearer.

Solitude has long been linked with creativity, spirituality, and intellectual might. The leaders of the world’s great religions — Jesus, Buddha, Mohammed, Moses — all had crucial revelations during periods of solitude. The poet James Russell Lowell identified solitude as “needful to the imagination;” in the 1988 book “Solitude: A Return to the Self,” the British psychiatrist Anthony Storr invoked Beethoven, Kafka, and Newton as examples of solitary genius.

But what actually happens to people’s minds when they are alone? As much as it’s been exalted, our understanding of how solitude actually works has remained rather abstract, and modern psychology — where you might expect the answers to lie — has tended to treat aloneness more as a problem than a solution. That was what Christopher Long found back in 1999, when as a graduate student at the University of Massachusetts Amherst he started working on a project to precisely define solitude and isolate ways in which it could be experienced constructively. The project’s funding came from, of all places, the US Forest Service, an agency with a deep interest in figuring out once and for all what is meant by “solitude” and how the concept could be used to promote America’s wilderness preserves.

With his graduate adviser and a researcher from the Forest Service at his side, Long identified a number of different ways a person might experience solitude and undertook a series of studies to measure how common they were and how much people valued them. A 2003 survey of 320 UMass undergraduates led Long and his coauthors to conclude that people felt good about being alone more often than they felt bad about it, and that psychology’s conventional approach to solitude — an “almost exclusive emphasis on loneliness” — represented an artificially narrow view of what being alone was all about.

“Aloneness doesn’t have to be bad,” Long said by phone recently from Ouachita Baptist University, where he is an assistant professor. “There’s all this research on solitary confinement and sensory deprivation and astronauts and people in Antarctica — and we wanted to say, look, it’s not just about loneliness!”

Today other researchers are eagerly diving into that gap. Robert Coplan of Carleton University, who studies children who play alone, is so bullish on the emergence of solitude studies that he’s hoping to collect the best contemporary research into a book. Harvard professor Daniel Gilbert, a leader in the world of positive psychology, has recently overseen an intriguing study that suggests memories are formed more effectively when people think they’re experiencing something individually.

That study, led by graduate student Bethany Burum, started with a simple experiment: Burum placed two individuals in a room and had them spend a few minutes getting to know each other. They then sat back to back, each facing a computer screen the other could not see. In some cases they were told they’d both be doing the same task; in other cases they were told they’d be doing different things. The computer screen scrolled through a set of drawings of common objects, such as a guitar, a clock, and a log. A few days later the participants returned and were asked to recall which drawings they’d been shown. Burum found that the participants who had been told the person behind them was doing a different task — namely, identifying sounds rather than looking at pictures — did a better job of remembering the pictures. In other words, they formed more solid memories when they believed they were the only ones doing the task.

The results, which Burum cautions are preliminary, are now part of a paper on “the coexperiencing mind” that was recently presented at the Society for Personality and Social Psychology conference. In the paper, Burum offers two possible theories to explain what she and Gilbert found in the study. The first invokes a well-known concept from social psychology called “social loafing,” which says that people tend not to try as hard if they think they can rely on others to pick up their slack. (If two people are pulling a rope, for example, neither will pull quite as hard as they would if they were pulling it alone.) But Burum leans toward a different explanation, which is that sharing an experience with someone is inherently distracting, because it compels us to expend energy on imagining what the other person is going through and how they’re reacting to it.

“People tend to engage quite automatically with thinking about the minds of other people,” Burum said in an interview. “We’re multitasking when we’re with other people in a way that we’re not when we just have an experience by ourselves.”

Perhaps this explains why seeing a movie alone feels so radically different than seeing it with friends: Sitting there in the theater with nobody next to you, you’re not wondering what anyone else thinks of it; you’re not anticipating the discussion that you’ll be having about it on the way home. All your mental energy can be directed at what’s happening on the screen. According to Greg Feist, an associate professor of psychology at San Jose State University who has written about the connection between creativity and solitude, some version of that principle may also be at work when we simply let our minds wander: When we let our focus shift away from the people and things around us, we are better able to engage in what’s called meta-cognition, or the process of thinking critically and reflectively about our own thoughts.

Other psychologists have looked at what happens when other people’s minds don’t just take up our bandwidth, but actually influence our judgment. It’s well known that we’re prone to absorb or mimic the opinions and body language of others in all sorts of situations, including those that might seem the most intensely individual, such as who we’re attracted to. While psychologists don’t necessarily think of that sort of influence as “clouding” one’s judgment — most would say it’s a mechanism for learning, allowing us to benefit from information other people have access to that we don’t — it’s easy to see how being surrounded by other people could hamper a person’s efforts to figure out what he or she really thinks of something.

Teenagers, especially, whose personalities have not yet fully formed, have been shown to benefit from time spent apart from others, in part because it allows for a kind of introspection — and freedom from self-consciousness — that strengthens their sense of identity. Reed Larson, a professor of human development at the University of Illinois, conducted a study in the 1990s in which adolescents outfitted with beepers were prompted at irregular intervals to write down answers to questions about who they were with, what they were doing, and how they were feeling. Perhaps not surprisingly, he found that when the teens in his sample were alone, they reported feeling a lot less self-conscious. “They want to be in their bedrooms because they want to get away from the gaze of other people,” he said.

The teenagers weren’t necessarily happier when they were alone; adolescence, after all, can be a particularly tough time to be separated from the group. But Larson found something interesting: On average, the kids in his sample felt better after they spent some time alone than they did before. Furthermore, he found that kids who spent between 25 and 45 percent of their nonclass time alone tended to have more positive emotions over the course of the weeklong study than their more socially active peers, were more successful in school and were less likely to self-report depression.

“The paradox was that being alone was not a particularly happy state,” Larson said. “But there seemed to be kind of a rebound effect. It’s kind of like a bitter medicine.”

The nice thing about medicine is it comes with instructions. Not so with solitude, which may be tremendously good for one’s health when taken in the right doses, but is about as user-friendly as an unmarked white pill. Too much solitude is unequivocally harmful and broadly debilitating, decades of research show. But one person’s “too much” might be someone else’s “just enough,” and eyeballing the difference with any precision is next to impossible.

Research is still far from offering any concrete guidelines. Insofar as there is a consensus among solitude researchers, it’s that in order to get anything positive out of spending time alone, solitude should be a choice: People must feel like they’ve actively decided to take time apart from people, rather than being forced into it against their will.

Overextended parents might not need any encouragement to see time alone as a desirable luxury; the question for them is only how to build it into their frenzied lives. But for the millions of people living by themselves, making time spent alone productive may require a different kind of effort. Sherry Turkle, director of the MIT Initiative on Technology and Self, argues in her new book, “Alone Together,” that people should be mindfully setting aside chunks of every day when they are not engaged in so-called social snacking activities like texting, g-chatting, and talking on the phone. For teenagers, it may help to understand that feeling a little lonely at times may simply be the price of forging a clearer identity.

John Cacioppo of the University of Chicago, whose 2008 book “Loneliness” with William Patrick summarized a career’s worth of research on all the negative things that happen to people who can’t establish connections with others, said recently that as long as it’s not motivated by fear or social anxiety, then spending time alone can be a crucially nourishing component of life. And it can have some counterintuitive effects: Adam Waytz in the Harvard psychology department, one of Cacioppo’s former students, recently completed a study indicating that people who are socially connected with others can have a hard time identifying with people who are more distant from them. Spending a certain amount of time alone, the study suggests, can make us less closed off from others and more capable of empathy — in other words, better social animals.

“People make this error, thinking that being alone means being lonely, and not being alone means being with other people,” Cacioppo said. “You need to be able to recharge on your own sometimes. Part of being able to connect is being available to other people, and no one can do that without a break.”

(Source: Boston.com)

A very rare documentary on Issey Miyake. It explores his design studio and factory and spotlights his garments on the runway. Miyake talks about his background, training, and some of his favorite projects, such as designing costumes for William Forsythe’s Frankfurt Ballet and outfits for Lithuania’s 1992 Olympic team. Originally broadcast on Wowow Japan Satellite Broadcasting in 1993.

nonnative 

2014 Autumn & Winter

LEISURE after LABOUR