Helmut Lang F/W 97
Wish my family wasn’t on the other side of the world so we could share some homemade meals together.
But the real difficulties, the real arts of survival, seem to lie in more subtle realms. There, what’s called for is a kind of resilience of the psyche, a readiness to deal with what comes next. These captives lay out in a stark and dramatic way what goes on in every life: the transitions whereby you cease to be who you were. Sometimes an old photograph, an old friend, an old letter, will remind you that you are not who you once were, for that person who dwelt among them, valued this, chose that, wrote thus, no longer exists. Without noticing it you traversed a great distance; the strange has become familiar and the familiar if not strange at least awkward or uncomfortable, an out-grown garment.
Some people inherit values and practices as a house they inhabit; some of us have to burn down that house, find our own ground, build from scratch, even as a psychological metamorphosis. As a cultural metamorphosis the transition is far more dramatic. — Rebecca Solnit, A Field Guide to Getting Lost
Still Life - Fast Moving, 1956
Oil on canvas. 49 1/2 x 63 in.
“Sociologists, it turns out, have studied these covert athletic biases. Carl Stempel, for example, writing in the International Review for the Sociology of Sport, argues that upper middle class Americans avoid “excessive displays of strength,” viewing the bodybuilder look as vulgar overcompensation for wounded manhood. The so-called dominant classes, Stempel writes—especially those like my friends and myself, richer in fancy degrees than in actual dollars—tend to express dominance through strenuous aerobic sports that display moral character, self-control, and self-development, rather than physical dominance.”
from "What Your Workout Says About Your Social Class," Pacific Standard
Tennis, squash, golf, yachting…
Although I know which category I fall under, I’ve always had positive experiences with the diverse range of people I’ve met through various sporting events. I say to each their own!
Helmut Lang wedding dress for Stella Tennant.
Back in 2008, the technologist Ray Kurzweil estimated that the processing power of the human brain was in the region of 20 quadrillion calculations per second and that, as soon as we developed a supercomputer fast enough, simulating the brain would just be a problem of getting the software right. It was announced last month that the world’s fastest supercomputer, China’s Tianhe-2, can carry out almost 34 quadrillion calculations per second, meaning that, according to Kurzweil, we have the potential to simulate one and two-thirds of a human brain inside a single machine.
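The “one and two-thirds of a human brain” figure is just the ratio of the two numbers above. A quick sketch of the arithmetic (the 33.86 quadrillion figure is an assumption standing in for the article’s “almost 34”):

```python
# Tianhe-2's reported peak, assumed here to be the figure behind
# "almost 34 quadrillion" calculations per second.
tianhe2_calcs_per_sec = 33.86e15

# Kurzweil's 2008 estimate for the human brain, per the article.
brain_calcs_per_sec = 20e15

# The ratio works out to roughly 1.69 -- about one and two-thirds.
ratio = tianhe2_calcs_per_sec / brain_calcs_per_sec
print(round(ratio, 2))
```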
The idea that we could fit “one and two-thirds” of our brain function in a computer may seem a little flippant but it is not an unreasonable conclusion if you think of the brain as primarily a calculating engine. If this seems a little distant from your everyday experience, the idea that the mind is this “computation at work” is an assumption so embedded in modern neuroscience that it’s almost impossible to find anyone arguing for a non-computational science of the brain.
It’s not that this approach is necessarily wrong. Science has produced many useful and important advances based on exactly this assumption. The first kick of the 2014 World Cup was taken by a paralysed man in a robotic exoskeleton that he controlled through a brain-computer interface, all of which was based on exactly this mathematical view of the mind. The difficulty comes, however, when we assume that there is nothing more to explain in the mind and brain than calculations. What starts as a tool to help us understand ourselves begins to replace us in our understanding.
But avoiding this pitfall may be more difficult than we think. Historically, our theories of the brain tend to be dominated by ideas we take from the technology of the day. The historian of psychology Douwe Draaisma has shown that while we often believe that we first learn about ourselves and then apply this knowledge to technology, it almost invariably happens the other way round. We tend to understand ourselves through our inventions.
The ancient Greek philosopher Plato had a theory that the mind was like a wax writing tablet. More than two millennia on, after seeing an alchemist demonstrate glow-in-the-dark phosphorescent liquid that had been synthesised for the first time, the 17th-century scientist Robert Hooke suggested that the mind stored memories just as this material seemed to “store” light. In the 1870s, when Thomas Edison first presented the phonograph to the world, scientists began discussing a theory of auditory memory as “an album containing phonographic sheets”. When photography was invented, it was used as a metaphor for memory partly because it captured information in a way that was a little blurry and had a tendency to fade. In an interesting twist, modern cognitive scientists have to remind their audience that “memory is not like taking a photograph” because modern cameras do their job too efficiently to be a good metaphor for remembering.
When computers arrived, we inevitably saw ourselves in our machines and the idea of the mind as an information processor became popular. Here, the mind is thought to consist of information processing networks where data is computed and transformed. One of the newest and most fashionable theories argues that the central function of the brain is to statistically predict new information. The idea is that the brain tries to minimise the errors it makes in its predictions by adjusting its expectations as it gets new information. It’s a theory that originated from the mathematical “predictive coding” model that was developed to help second world war gunners predict where moving enemy planes would be in the two seconds it took for an anti-aircraft shell to reach them. The statistical theory became widespread when it was realised that it could be used to guess missing audio when sound was sent over a telephone network, meaning that less information needed to be sent as the rest could be mathematically reconstructed. Similar ideas were first adopted to improve artificial intelligence – Google’s speech recognition is now based on it – and then taken up by neuroscientists to explain data from the brain.
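The core of the prediction-error idea can be sketched in a few lines: an estimator keeps a running prediction and nudges it toward each new observation in proportion to the error it just made. This is only a minimal illustration of the principle the essay describes; the signal, learning rate, and function names are my own assumptions, not anything from the article or the wartime “predictive coding” model itself.

```python
def predictive_update(prediction, observation, learning_rate=0.1):
    """Adjust the prediction by a fraction of the prediction error."""
    error = observation - prediction
    return prediction + learning_rate * error

def track(signal, learning_rate=0.1):
    """Run the update over a sequence of observations, starting from 0."""
    prediction = 0.0
    predictions = []
    for obs in signal:
        prediction = predictive_update(prediction, obs, learning_rate)
        predictions.append(prediction)
    return predictions

# On a steady signal the prediction errors shrink toward zero as the
# estimate converges on the true value.
preds = track([10.0] * 50)
print(round(preds[-1], 2))
```

The same shrinking-error logic is what lets a receiver reconstruct missing samples: once the prediction is good, only the (small) errors need to be transmitted.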
It could be that we’ve reached “the end of history” as far as neuroscience goes and that everything we’ll ever say about the brain will be based on our current “brain as calculation” metaphors. But if this is not the case, there is a danger that we’ll sideline aspects of human nature that don’t easily fit the concept. Our subjective experience, emotions and the constantly varying awareness of our own minds have traditionally been much harder to understand as forms of “information processing”. Importantly, these aspects of mental life are exactly where things tend to go awry in mental illness, and it may be that our main approach for understanding the mind and brain is insufficient for tackling problems such as depression and psychosis. It could be we simply need more time with our current concepts, but history might show us that our destiny lies in another metaphor, perhaps from a future technology.
From Terminator to Transcendence, popular culture is awash with fears about cyborgs, but, in terms of understanding ourselves, we have been cyborgs for centuries. We’ve lived in a constantly evolving relationship with machines that has profoundly affected how we see human nature. Perhaps the question is not whether it is lazy to take ideas from machines in order to understand ourselves, but whether we can ever think beyond them.
Levi’s, Elle magazine, September 1991.
In general, people are not drawn to perfection in others. People are drawn to shared interests, shared problems, and an individual’s life energy. Humans connect with humans. Hiding one’s humanity and trying to project an image of perfection makes a person vague, slippery, lifeless, and uninteresting. — Robert Glover
Made green tea truffles today after my workout as a treat and also for a party tomorrow that I’m attending, if they last until then.
We were the first human beings who would never see anything for the first time. We stare at the wonders of the world, dull-eyed, underwhelmed. Mona Lisa, the Pyramids, the Empire State Building. Jungle animals on attack, ancient icebergs collapsing, volcanoes erupting. I can’t recall a single amazing thing I have seen firsthand that I didn’t immediately reference to a movie or TV show. A fucking commercial. You know the awful singsong of the blasé: ‘Seeeen it.’ I’ve literally seen it all, and the worst thing, the thing that makes me want to blow my brains out, is: The secondhand experience is always better. The image is crisper, the view is keener, the camera angle and the soundtrack manipulate my emotions in a way reality can’t anymore. I don’t know that we are actually human at this point, those of us who are like most of us, who grew up with TV and movies and now the Internet. If we are betrayed, we know the words to say; when a loved one dies, we know the words to say. If we want to play the stud or the smart-ass or the fool, we know the words to say. We are all working from the same dog-eared script. It’s a very difficult era in which to be a person, just a real, actual person, instead of a collection of personality traits selected from an endless Automat of characters. It had gotten to the point where it seemed like nothing matters, because I’m not a real person and neither is anyone else. I would have done anything to feel real again. — Gillian Flynn, Gone Girl