On the side of angels or devils
Is it morally wrong to eat animals? The debate is rife with contradictions and hard-fought, passionate views.

A man and his dog waited at the lights. Tall man, black Staffie, no lead. The man crouched beside the animal, one hand on its glossy flank, the other gently tickling under its collar. Man-to-dog. The message was complex; communing, controlling, safekeeping. When the lights changed the two animals crossed together, but at the man’s behest, the canine gazing up at his master with evident adoration.

Was this relationship wrong? Was it immoral? Was it different in kind from eating animals? And is that also wrong?

Well, yes, yes, no and yes, if you follow animal rights arguments to their logical conclusion.
The anti-meat arguments are a compelling mix of logic and sentiment. Thoroughly applied, however, they would end virtually all our relationships with animals. No butchers, farm animals, lab-rats, hunting or zoos, but also no police dogs, guide dogs, horse-racing, pets. The only animals left in most people’s lives would be insects and bacteria.

Let me confess straight up that, as a largely unrepentant lifetime carnivore, I’m not proposing a definitive answer to the meat question. Indeed, it surprises me that most people are so comfortably ensconced as either qualmless carnivores or vociferous vegetarians, since my own attempts at understanding reveal only murk and more murk.

It’s no mere dietary thing, meat. To be sure, even dietary things can induce death by overthinking, given the environmental and political football food has become. But meat goes also to history, human rights and, if you will, ontology – our place and purpose in the universe.

There are over-simplifications on both sides. Vegos often assume their stance is supported by our latter-day understanding of humanity as embedded in (rather than master of) the animal kingdom. As though making animals kin automatically precludes our eating them (though it clearly doesn’t preclude their eating us). As though cooking spag bol were the moral equivalent of feasting on granny.

Carnivores tend to argue from biological primitivism – our canine dentition, our need for iron and our urge to hunt seeming to endow meat with a natural “rightness”. “If we kill for food, not for sport or cruelty,” they typically reason, “it’s natural, and therefore legitimate.”

Neither argument is exactly rigorous. On the one hand, being brothers in the animal kingdom never stopped lion eating gazelle. On the other, while primitivism appeals to our sense of deep purpose, it ignores that the point of civilisation – the great human task – is to transcend the primitive and find our inner angels.

But illogic has never inhibited popularity. Of the two viewpoints, meat-eating is certainly more widely held, but vegetarianism has a growing moral edge. Polemics on the evils of meat range from earnest documentary such as Anna Krien’s Quarterly Essay, “Us and Them”, to novels such as Ruth Ozeki’s wonderful, funny and bleak My Year of Meats, Mark Lewis’s wonderfully weird film The Natural History of the Chicken, and even, arguably, Chris Noonan’s Babe.

However you look at it – environmentally, financially, health-wise – meat is bad, destructive and gobsmackingly expensive. It takes more than 100,000 litres of water to produce a kilo of beef; 900 for a kilo of wheat. Animal protein requires eight times as much fossil fuel as plant protein and, through flatulence, releases almost half the global emissions of methane, which is more than 20 times as greenhouse-destructive as carbon dioxide.

Animal farming is also intensively polluting, producing, says the animal rights philosopher Gary Francione, 130 times the waste of the entire human population, most of it dumped straight into waterways, leading to nitrogen eutrophication. Then there’s the land question: per calorie, animals occupy 20 times as much land as plants, on top of which about 40 per cent of world grain goes to feed farm animals.

Then there’s health – the cancers and miscarriages, the hormones and plastics, the pesticides, the genetic engineering (theirs), the obesity (ours). According to the RSPCA, about half of all Australian beef cattle are still injected with synthetic growth hormones.
Which is to say, if store prices reflected the true cost of meat, the already stark price differential would increase stratospherically.

Yet these arguments, however persuasive, are still not core. They represent the practical, utilitarian wing of anti-meatism; sensible, demonstrable adages that can be at least partially satisfied by reducing (rather than ending) meat consumption. The core vegan argument is moral. This makes it far more hardline, far harder to appease; using animals for our own ends is, simply, wrong.

How does it work, this morality of meat? Even serious animal rights philosophers – Peter Singer, Robert Garner, Gary Francione, Tom Regan – are not above emotive manipulation. In fact, they revel in it. Regan tells an especially gruesome story of Chinese restaurant patrons selecting their caged cats and dogs, and the appalling live scalding-and-skinning that follows.

Emotivism is understandable. It’s an emotional subject. Our cruelties to animals are vast in scale and in their determination, duplicity and depravity. From shark fin soup (with finless sharks left to suffocate slowly, spiralling to the sea bottom) to routine chemical and cosmetic testing on the eyes of captive dogs and rabbits (detailed by Francione) and the frantic scrabbling of a thousand lobsters, as described in David Foster Wallace’s Consider the Lobster, habitual animal treatment undertaken in our name makes the notorious ABC footage of Australia’s live cattle trade look like play school. Francione and others even argue against dairy products on the grounds that cows are constantly “raped” and kept pregnant to sustain lactation, losing their calves moments after birth.

Yet, by themselves, these cruelties will not end meat- or cheese-eating, since most of us respond with the “can’t they just kill or milk animals humanely?” line. No, Francione would say, they can’t – not on anything like the scale or cost of demand. This industry requires animals be kept caged, drugged, beakless, clawless, enfeebled and out of sight for their entire existences.

Some Westerners have adopted the challenging position of eating only meat they have killed. But for most of us this is practically impossible, as well as distasteful. It might be possible – with a lot more regulation and vastly more money – to eat, infrequently, only small portions of meat that has been reared and killed decently. Yet the core question remains: is even humane, organic meat wrong?

We know it is possible for humans to survive without meat. Certain Asian, neolithic and pantheistic cultures have done it for centuries. The Jains use no animal products; others kill and eat meat but with sacred respect. Yet it is fair to say that human dominance over nature (the so-called Judeo-Christian view) is a core Western concept which, though it may yet precipitate our downfall, has also underpinned our success.

Many see this as sinful. They blame Descartes’ view that animals lacked inherent moral status. (Anna Krien wrongly interprets the Cartesian Cogito – “I think, therefore I am” – as an expression of human supremacy, when its real significance was as the residual, hard-core truth after Descartes’ exhaustive “universal doubt”.) It wasn’t until the 19th century that animals began to be considered as moral entities.
Jeremy Bentham is widely regarded as having founded the animal rights movement when he argued in 1823: “The question is not, Can they reason? nor, Can they talk? but, Can they suffer?”

Francione’s book offers an extensive catalogue of animal suffering at human hands – in farms, zoos, circuses, laboratories and puppy mills. But the entire argument rests on two principles: the Principle of Equal Consideration, and the Humane Treatment Principle. The first requires that the life of an animal be taken as the moral equivalent of a human life. The second holds that humans should inflict suffering on animals only when necessary.

When is animal suffering necessary? Well, never, really. Francione argues that it is seldom true – even in drug-testing, which he says is more degrading and less effective than is generally known – that our survival demands animal suffering.

It is hard to fault Francione’s argument if you accept his principles. But do we accept the principles? The Equal Consideration principle makes the other redundant, since if animals are human-equivalents, it is obvious that unnecessary suffering should be avoided (although equally obvious that we do not achieve this).

Francione and other animal-rights theorists typically use slavery as proof that neither tradition, nor culture, nor economics constitutes sufficient reason to perpetuate a wrong. Millions of us, they argue, were fine with slavery. We made owning people possible for ourselves by believing them our moral inferiors. This is what we do with animals. That was wrong then; this is wrong now.

But this is Francione’s mistake. The wrongness of owning humans does not necessarily imply the same about animals. It all hinges on the idea of moral equality. For the slavery analogy to be valid, the Principle of Equal Consideration must be in place. We must regard animals as our moral equivalents. Do we, though? Even in our finest moments, do we really think a grasshopper weighs as much – morally – as a human? Moreover, the slavery analogy would outlaw even pet-owning, pets being dependent on our whims and predilections.

But there are issues here, besides the frank hypocrisy with which Francione himself dedicates his book to his shaggy white dog Bonnie, who he says is “a person, a member of the moral community” but whose veganism is clearly his choice, not hers. It’s not simply that I can conceive of no happier creature than the brown Burmese curled up beside me right now, snoring, belly up. Undignified, yes; miserable, no. Or even that, without pets and farming, entire species would vanish. Children would grow up knowing nothing of animal life besides the ants they must treat as equals.

It’s more that, in the end, we are different. Animals clearly suffer. Some of them at least are clearly conscious, sentient, emotional. But you don’t see dogs having moral qualms about chewing on a chicken leg. The mere fact that we put ourselves through this anguish sets us, I believe, apart.

We are, all of us, angel and devil. Francione’s is perhaps the angel voice, and we need these voices. But we are also rampant, hunting carnivores. Devilment is part of our humanity, part of our species’ beauty. Our truth.

So where does it leave us? I’m going with balance, extending Michael Pollan’s “eat food, mostly plants” with “make meat a treat”, and holding my breath until they grow steak-and-kidney on passionfruit vines.
Spare the rod, spoil the planet
- THE ESSAY

A plague of unwhacked children.

Is it possible that, in an array of future threats that includes climate change, financial collapse, sprawl, greed, war, pestilence and famine, humanity’s primary problem will be none of these but, rather, the global generation of unwhacked children?
“You all did perfect today!” The nice young swimming tutor, waist-deep in water, farewells his preschool flock. It’s not true of course. They didn’t all do perfect, and none of the adults present, neither parents nor teachers, believes that they did.
It’s a typical bunch of kids – some athletic, some weedy, some plump, some hopelessly timid. None of them can swim. One or two did reasonably well, considering. Most were (by definition) average and a couple, including the one who screamed steadily throughout, were appalling. Yet everyone present colludes in the lie, believing it to be for the best. Best, that is, for the children.
This has become the prevailing educational dogma of our time. Pain of any kind must be avoided, at all times and at all costs.
High school science is taught via the baking of cookies, literature by watching movies and maths by playing animated video games. Many schools, many parents, believe it is “cruel” to reveal children’s marks to the class. Fine for everyone to know who’s the best swimmer or footballer, but when it comes to Latin verbs or differential equations, evident inadequacy could be scarring.
If it’s not altogether fun and feelgood for the kiddies, it’s not acceptable. In many ways this might be seen as democracy’s inevitable endgame, but the unspoken rationale goes something like this:
One. Happiness is more important than anything; more important than goodness, decency, wealth, duty, achievement or knowledge. More important even than friendship or love, which matter only insofar as they bring happiness.
Two. Our primary job, as parents, is to maximise our children’s future happiness.
Three. Future happiness builds on present happiness.
Four. Present happiness requires self-esteem, maximum pleasure and, as nearly as possible, the absence of pain.
Pain prevention therefore becomes the parent’s paramount task.
This is fascinating. To most of us, now, it will seem obvious that minimising our child’s pain is a primary parental role. Yet this only shows how complete, and how quiet, the revolution has been.
To some extent it’s territory explored by Christos Tsiolkas’s book (and TV series) The Slap. But Tsiolkas’s interest is primarily in the personal and interpersonal ramifications of the physical act; relationship stuff.
What if it’s bigger than that? Much, much bigger? Political, environmental, spiritual? What if the proliferation of the unwhacked affects the very future of civilisation?
(I should add at this point that I use the term “unwhacked” in part as a metaphorical shorthand to capture all those whose childhoods have been quilted by the various forms of pain-avoidance. I should note, too, that my own children – my own parenting – are by no means exempt. Indeed, it is my own experience of hedonic parenting that provokes my curiosity.)
Traditional modes of child-rearing have depended largely (and at times almost wholly) on the choreographed application of pain – humiliation, criticism, deprivation and actual, physical contact.
We are appalled. We see all infliction of pain, especially on children, as cruel and wrong.
In this we are influenced by two centuries of literature, from all the evil authority figures of Victorian reformist novels (Dickens’s frightful Mr Creakle and Charles Kingsley’s cruel schoolmasters and “doctors who give little children so much physic” they should themselves be bled, dosed with calomel and have their teeth pulled) to Roald Dahl’s Miss Trunchbull, who discus-throws kiddies by their plaits or shoves them in the “chokey”.
The inheritors and perpetuators of this save-the-children movement were, of course, the baby boomers, parents of today’s young teachers (including my swimming tutor). But to Victorian evangelism the hippie “just-add-water” attitude to child-rearing admixed a further, crucial ingredient.
In baby-boom peacenik philosophy – I use the word loosely – the anti-cruelty push of the 19th century evangelists was entwined with a Voltairean naturism; a noble savage philosophy that flipped nature from being humanity’s perfectible project to being, in its raw state, the supreme good.
Civilisation flipped conversely, from being the ultimate goal to becoming itself a liability, viz. Paul Klee’s desire to “unlearn” drawing, Ivan Illich’s best-selling Deschooling Society and so on. It was a Wordsworthian inversion of original sin: that we come to earth “trailing clouds of glory”, from which our birth is “but a sleep and a forgetting”. This puts appetite, not truth or goodness, on the tip of society’s arrows.
It’s an idea that the reformers would not countenance. Kingsley’s children’s tale The Water Babies doubled as an anti-child-labour tract, a critique of simplistic scientism and an argument for Darwin’s controversial evolutionary theories.
Yet Kingsley rejected outright the idea of gratification as a path to goodness, or even happiness. In The Water Babies, a group of humans called Doasyoulikes degenerate, precisely because they do whatever they wish, into gorillas, losing the power of speech and eventually being shot by explorers.
In rejecting both wanton cruelty and wanton gratification, the 19th century reformers drew an elegant distinction – between pain inflicted for the pleasure or betterment of the inflictor, and pain inflicted for the good of the subject (or victim).
I see your incredulous response. Most of us grew up ridiculing the idea of good pain, and the bottom-smacker’s adage, “this hurts me more than it hurts you”. In what world might pain benefit the victim?
Well, in our world. The real world.
As functioning adults we frequently distinguish between bad pain, good pain and pain that is morally neutral. Bad pain is destructive – as when you hold your hand to a flame. Morally neutral pain has no lasting effect, for better or worse – such as exercising muscles already stiff from overuse, or walking up a ladder barefoot, or even the pain of childbirth.
Good pain, however, is that pain necessary to some greater goal; yoga, for example. Study. Piano practice. Obeying the boss – or God, if you’re religiously inclined. Calisthenics. Dieting. Staying sober at lunch in order to pick the kids up from school. Any of the dozens of daily moments when we enact the unwished for some greater good.
Yet when it comes to disciplining our children, this is a distinction we often fail to draw. People presume that the refusal to smack – the determination to shape the child using carrot only, no stick – reflects, if anything, excessive love and concern for the child.
You see people in the local park, pleading with their pets, arguing with them, telling them that barking at the neighbouring poodle is not nice. The dog remains unrepentant, unreflective and untrained. Parenting, similarly, it seems to me, is a form of Stockholm syndrome, where the bond itself depends on benign but unwavering control issuing from the same hand.
The historian Barbara Tuchman famously blamed the Middle Ages’ tolerance of cruelty on the huge rate of infant mortality and parents’ consequently low emotional investment in childhood.
Many have disagreed, but Tuchman’s theory suggests that perhaps we have moved to the opposite extreme. Perhaps our extraordinarily low infant mortality has produced an over-investment in childhood and an extreme reluctance even to contemplate inflicting pain, even in the child’s interests.
Yet it is still possible that some pain – some level, some kinds, carefully applied – is in the best long-term interest of the kid.
Parental discussions of the smacking question are informative on this point. Many people’s biggest objection is not that smacking might hurt or damage the child but that to do it feels bad. To hit a child in anger feels brutalising. To hit a child coldly, without passion, is almost worse.
Force Majeure’s wonderful Never Did Me Any Harm at the Sydney Theatre Company this year wove real voices together into a side-splitting exposé of parenting neuroses.
“Whenever we did it we’d feel quite upset,” says one. “You actually find yourself disliking a three-year-old … but we had a lot of trouble disciplining Jimmy ’cause … he can really push our buttons … ”
So is the anti-pain, anti-danger push in fact pro-child, or pro-parent? Was the bottom-smacker’s old hurts-me-more-than-it-hurts-you adage actually accurate? Is smacking actually worse for the parent than the kid, and is this why we oppose it?
Is the urge to cosset our kids truly about their happiness, or our cowardice?
To some extent the answer depends on whether childhood pain, in any of its forms, can be a force for good.
Let’s be clear. Pain is not always good, for adults or children, even in the long-term educational or spiritual sense. Sufferers of chronic pain are often worn down or destroyed by it, with no pay-off enlightenment. But it is also true that what might seem intolerable suffering at the time can turn out, in retrospect, to have been profoundly formative and fortifying.
This is the substance of the old proverbs and clichés, our old scoffing-posts; what doesn’t kill you makes you stronger, all that. But what if they’re true?
Some experts argue that, for boys in particular, a short sharp smack is greatly preferable to an hour’s parental talk. But that’s still measuring pleasure, not benefit. Yet, even when the pain is more intense and prolonged, even when there is genuine damage, the outcome can be good.
Fiona Scott-Norman’s recent book, Don’t Peak in High School, documents the (to us) counterintuitive phenomenon whereby pained and difficult childhoods – flavoured by bullying, loneliness, misery and even abuse – often generate wildly creative and successful adults.
Of course it’s impossible to demonstrate the causality in these things. But it is at least plausible that early hardship might build spiritual or moral muscle – resilience, at the very least. That those evil teachers and doctors of history weren’t all sadists; some, at least, were genuine educators.
The converse is clear; sparing the rod can spoil not only the child – brattishness, obesity, overweening entitlement and depression – but also the planet.
Many psychiatrists say their waiting rooms are lately filled with young adults who suddenly realise they’re not in fact geniuses and the world is not their oyster. Shattered, thus, they want instead to die.
I sympathise. Truly. But the planetary consequences are bigger still: excess entitlement, widely spread, can only exacerbate appetite-driven climate change.
Children who grow up with unearned self-esteem will not only cope less well with crisis when it comes, but will by their behaviours increase its likelihood and intensity. Triple whammy.
Perhaps we’ll stop short of Jonathan Swift’s proposal for “preventing the children of poor people in Ireland, from being a burden on their parents or country … ”
“I have been assured,” says Swift, “that … a young healthy child, well nursed, is at a year old a most delicious, nourishing and wholesome food, whether stewed, roasted, baked or boiled … ”
But maybe the odd smack?
ESSAY: A view with a room
In private homes and public spaces, people need discrete places where they can think, imagine or just ‘hang’.

I once told an audience that I’d spent my life in search of the perfect room. At the end someone stood up and said she thought that was really sad. For a moment I feared she might offer me one of hers, like some stray needing a roof. But she had no such intent – and actually it’s not sad, unless the quest is itself pathetic.
Yet “perfect” isn’t really the point. The point is that a good room – a really good room – is a rare and extraordinary thing. This is what’s sad.
As students, we were taught that every house should have a magic room. And not just houses. Rooms are the currency of our lives. We’re born into them and we die in them. In between we feast, marry, toil and dream, by and large, in rooms. Their qualities intimately flavour our defining moments and mostly, we make or choose them for ourselves.
They are our elect containers. Yet few are very good.
Rooms that welcome you from the heart, making you feel neither exposed nor claustrophobic but immediately and profoundly at home; rooms as comfortable and comforting as a good conversation over dinner – such rooms are likely to be counted in single figures throughout a lifetime.
I’ve been there, but never owned one. This may be partly a scale issue. Rooms in which I feel instantly at home are, for example, St Paul’s Cathedral, London, and Palladio’s wonderful basilica – top floor, natch – in Vicenza. I’m also fond of his Villa Capra, outside Vicenza, with its great central dome and square enfilade of rooms: four sides, four great stairs, four white Ionic porticos.
The Palace of Westminster I also find quite snug. Ditto Mies van der Rohe’s sublime National Gallery in Berlin. I could live there, no trouble.
This sounds like delusions of grandeur. But I think it’s almost the opposite: a love of spaces that dignify and uplift the human soul. Mine, anyway. Some of the best rooms of my experience are imaginary, or at least collaged from life. There’s the rustic one, where for some reason I imagine my old age; a one-room timber shack perched on seaside rocks, with a rickety path from the orchard at rear and, at front, a pair of great storm shutters that lift to the ocean breeze.
There’s the walking-and-thinking room, which doubles as the writing-and-arguing room: a long gallery in the Jacobean manner with aged oak floor and stripy light made by an alternating pattern of solid and void (walls a metre thick and book-lined) along its length.
And there’s another that’s really a house of rooms, organised concentrically like Russian dolls but along a solidity gradient, with the cosiest and most secure room (the nook or library) in the centre, its walls increasingly flimsy and transparent as you move out towards a verandah and, finally, a walled garden. Entering or leaving the house is therefore an exercise in wrapping or unwrapping.
Real rooms are less enchanting. A good room needn’t be grand (although an eight-foot ceiling is a definite setback). A magic room must have good proportions, natural light from more than one direction, low sills, clean details and fresh air. It may be spatially complex or an exemplar of pared simplicity, but it must be defined – six planes, minimum. Those are the basics.
Then it gets more subtle. A good room must have the right balance of shelter and prospect. It must have enough connectivity – to other rooms, to views, to outside space – but not too much. If it is very simple, all emphasis is on light and proportion; if complex, it must not lose itself in its machinations. It must also have a good and apt acoustic; serene for a bedroom, bright for a kitchen.
It’s contingent, sure, but not really so hard. Yet almost all rooms you enter, for tea or talk, for meetings or lectures, for dinner or cabaret, are underlit or overexposed, badly proportioned, low-ceilinged, too-high windowed, too removed from the action or too close, claustrophobic, acoustically gaunt, over-decorated or just plain cruel. Such rooms are without exception depressing. Yet they are ubiquitous. Why do we tolerate it?
It’s not just the rooms of our private and professional lives that have been ruined. That’d be bad enough. But more emphatically destroyed by modernism’s stormtroopers – so emphatically that our expectations, too, have been crushed underfoot and we barely notice – are the rooms of our public selves: the rooms of the city.
In 1748 the Italian architect Giambattista Nolli drew for Pope Benedict XIV his Pianta Grande di Roma – now celebrated simply as the Nolli map of Rome. Drawn to gauge urban density, Nolli’s map was one of the earliest ichnographic representations of the city – which is to say, a flat, ground-parallel map, as opposed to the bird’s-eye view convention of the time.
But what made the Nolli map a cult object among urban designers 250 years later, when post-modernism brought the city back into intellectual vogue, was its drawing technique. In particular, the use of a technique called poche.
Poche simply meant the blocking-in, in pencil or ink, of the walls in a plan. Nolli, however, used poche to block-in all private space in the city, leaving the connective, public spaces blank. This let the city read as a systemic lacework of public space – streets and squares and alleys, but also porches and public interiors such as churches and temples. For the first time, the city could be understood as what it essentially was: a series of shared, intricate, interconnecting rooms.
Etymology can throw some light here, since poche as a noun – ”la poche” – also means pocket, giving us pocked, pouch and poke (as in pig). The connection is the sense of a container, in this case the space between two lines, being filled. But applying the idea of pocketedness to city fabric clarifies the stark typological difference between cities traditional and modern.
For the traditional city is indeed a pocking of the planet’s surface – a teasing and hummocking of its material to create a common nest of sufficient complexity and comfort to satisfy our immaterial needs as well as our physical ones. The familiar analogy between a city and an ants’ nest suggests itself here, as do images like those extraordinary underground cities of Cappadocia.
The traditional city, in this sense, is not just a habitation but a tool for infiltrating the planet, even as it protects us from her. Literally, the city makes the planet make room for us.
The modern city, by contrast, is made not of objects holding space – rooms – but of objects occupying space. Modern rhetoric was all about the flow and interpenetration of space; the reality was all about objects: individual, stand-alone buildings – skyscrapers for the city, bungalows for the burbs. This is why modernist planning always focuses, despite itself, not on what the city is like to be in, but on how it works, and what it looks like as an object.
The transformation was profound. This low-rise to high-rise shift was not just a change in degree, but a full 180-degree inversion. Instead of being linked and mixed, things – buildings, uses, genders – were now separated. Modernism sent both women and domesticity to the suburbs, leaving male-dominated business to “own” the city centre, now renamed the Central Business District.
Councils began aggressively selling laneways and pushing site-consolidation to facilitate the new towers. (Seidler’s Australia Square, for example, consolidated a dozen sites and several alleys.) Thus, although in people terms the 60-storey city is scarcely more dense than its 12-storey predecessor, the fine, pocketed grain of cities was obliterated.
That was bad enough. Suburbia, sold to women via images of happy children, convenience gadgets, double garages and gingham tablecloths, quickly revealed itself as a gilded cage. Still more destructive, because less reversible, was the effect on our valuing and making of space; a dramatic shift in the power balance from public to private.
Office buildings, once expected to define and decorate public space, even at the expense of their internal amenity, now expected the public realm to serve them.
The shift was not only from room to object, but from a female spatial model (female in the plumbing or electrical sense, as in concave, receptive) to a male one.
Where once the buildings had been seen as collectively concave, containers of city life, they were now convex, aggressively elbowing space away. Where once the public realm had been seen as the figure, it was now background; surplus and shapeless. Where once the feminine (concave, receptive) principle had been the model, now it was all male.
In 1925, Le Corbusier had taken up arms against the street (“Il faut tuer la rue!” – the street must be killed) and against every city activity that was slow, meandering or old. This was Jane Jacobs’s point, in 1961. But so thoroughly have we been schooled in the collective solipsism by which we see public space strictly as something to be driven through between home and work that, a century on, we are still suffering Corbusier’s toxic influence.
Public spaces used to be, genuinely, a culture’s living rooms, where people would simply hang. Not necessarily for a purpose; sometimes just to be, together. Even streets were rooms – rooms defined by rooms; narrow dog-leg lanes lined with shops and coffee houses. Female receptacles for feminine activities. In Rome and Vicenza they still do that, and the spaces still encourage it – which is why we love those towns. Here, though, in our own places, we regard anyone hanging around a public space without a purposeful activity as a dero and a threat.
Our streets are utilitarian thoroughfares. Our squares are filthy, unloved and unwalled. Our public green spaces are parks, open, grassed and windy, but never gardens. Never rooms.
We’re fighting back, with street cafes and restaurants on every corner. But our cafes swirl with fumes and noise and smoke and our “squares” are mostly traffic islands. Half the time this doesn’t even seem unreasonable: of course the city-centre is a nasty thoroughfare. Of course sprawl-and-mall-land is where we spend our lives. Obviously.
At a thinking level, this is breathtakingly old-fashioned. Saurian. Yet still it dominates our reality. Even our illustrious visitors seem slow to respond. Professor Jan Gehl, calling Sydney “a city in distress”, proposes a Town Hall Square that is simply a demolished half-block, flat, grey, unwalled, defined only by three main roads.
Professor Peter Walker’s Barangaroo headland park is green, but undefined and wholly convex. There’s no room there, no place to be. All park, no garden.
Will this change, as feminisation takes hold? Perhaps. Who knows. Feminism has also brought us the me-ring, the McMansion and the SUV revolution. There’s hope, but I’m still looking out for my room with a (small, intimate) view.
INVASION OF THE PERSONAL
Conveying the personal was once seen to be subjective and an inadequate vessel for truth. But today it appears paramount.

I love how mafia wise-guys, before they garrotte the guy or chop his fingers off with an axe, apologise, saying, “it’s not personal, it’s just business”. As though it’s in some way less dreadful to be killed for business reasons. It’s as though it’d be so much worse if they hated you personally, as well as sending you to sleep with the fishes. As though “business” cleans what “personal” sullies.
True, my experience of garrotting is limited. But the mafia model is really just a caricature of the broader tradition in which “personal” and “business” are separate walled gardens; business for him, personal for her. The personal garden is a subset of the other, set within the larger, business world and protected by it.
Yet each garden has its own set of rules, manners and morals, with the Big Man – the godfather – as the connecting door between the two.
In business, a clear might-is-right hierarchy applies. There are rules and, on occasion, ethics, but business life is expected to be tough, cut-throat and rational – or at least faux-rational. Behaviour is expected to be temperate and “professional”, all emotions robustly reason-wrapped.
The personal world, by contrast, has long maintained a protocol that is predominantly sensual and emotional, full of frills and cakes, laughter and tears, flowers and dresses.
Values despised by the business world as soft and unprofessional – compassion, generosity, prettiness – unabashedly hold sway, while talk of money and politics is banned.
That was then. Now, the separating walls have all but collapsed and the personal, like some rampant convolvulus fed by feminism, populism and casualism, has engulfed pretty much everything.
It’s not just the routine over-sharing of celebrities whose sole occupation is to stage their private lives for public consumption, or phone salesmen and call-centre droids calling you by first name from the other side of the planet. From academia to etiquette, from child-rearing to religion, the personal has become our main operative mode.
As Margaret Thatcher reflects in The Iron Lady, “one of the great problems of our age is that we’re governed by people who care more about feelings than they do about thoughts and ideas”. How did this happen, and what does it mean?
Take photos. A century ago people did not smile in photos. Ever. Photography was new and awesome, and a photo-session was a formal and serious occasion. Even wedding photos, even family photos – like that of my grandfather arriving as a ringleted and lace-collared three-year-old from Wales in 1903, or my grandmother as one of 13 children lined up along the verandah – show exclusively best-dressed, unsmiling faces.
We read this as sternness, the emotional whalebone of the times. More probably it was a sense, now lost, that one’s real and enduring self was one’s serious self. So thoroughly have we rejected this idea that even toddlers learn to crack a smile the instant they see a camera.
To judge by people’s self-representation on Facebook, you’d take the very word “serious” as an archaism, lacking current meaning. Looking back, we see history as a series of monarchs and battles, but quite probably the future will regard our era as a succession of crowded exuberances. In part, no doubt, this great show of ceaseless collective joy results from an apprehension that, absent God, our life’s main task is to be happy. Despite – or perhaps because of – the increasing occurrence of major depression, happiness has become the work.
This may be just fine. If Aristotle is to be believed, it will do us only good to pretend happiness, even if we’re not actually feeling it. But a side-effect of the constant smiling and social-media chattiness is to fix us permanently in the personal mode.
Look around. In our world every transport ticket is called “my train” or “my ferry” (lower case, natch – wouldn’t want those fascist caps creeping in) as though we think someone might filch them. Laws are published with explanations that, for intimacy of approach, are framed not as explanation but as a first-person dialectic. “What will happen if I …?” “How do I resolve it if I …?” “What does this mean for me …?”
Parents struggle to be not their child’s mentor, but mate. For many, “mate” is the standard term of parental endearment. Even academia, for so long a bastion of abstraction, privileges the personal and subjective as core. In literature, where not long ago the first person was all but verboten (outside autobiography and memoir), authors are now criticised for not being personal enough.
Everyone’s at it. Will Self writes in glorious detail of the rare blood disease that means he must have weekly venesections. Hilary Mantel about her endometriosis. There’s nothing wrong with this. Both write brilliantly and in ways likely to engage some and help others. But it now seems that, to be a writer, one must proffer one’s own pain, psychic or physical, for public delectation. When I wanted to write a book on pain – looking at the interplay between mind and body, spirit and chemistry, nature and culture through a series of personal stories – I was told by agents and publishers alike – “well, we’d be interested if you had a personal pain story to tell, but not otherwise.”
If you haven’t had a childhood busking on the streets of New York with your junkie father, or an old age being pack-raped by dogs – or if you have but don’t want to talk about it – forget it. Sydney writer Anna Funder was taken to task by The Guardian’s Rachel Cusk for not personalising her new novel, All That I Am, as she had her first book, Stasiland. This notwithstanding that Stasiland was a personal non-fiction narrative, a history, whereas the second book was an historical novel, and wasn’t about her.
A generation earlier, on the (modernist) presumption that the personal is irretrievably subjective and therefore an inadequate vessel for truth, Funder would have been ridiculed for interposing herself into her first, historical book, and praised for removing herself from the novel.
Cusk’s post-modern take is the direct opposite. In personalising Stasiland, she writes, Funder “gave it an irrevocable moral character” that is absent in a book not so personalised. (Cusk’s own first novel was a personal exploration of motherhood.)
Non-fiction was once the territory of truth – of historical record, intellectual and political argument, reasoned analysis. The Enlightenment, or Age of Reason, is usually seen as being triggered by the printing press. But it wasn’t just the fact of mass production; it was the content. People were hungry for ideas and analysis.
Kant, Bacon, Locke and Paine weren’t writing about their unhappy relationships with their mothers. Enlightenment didn’t mean finding your own path to a personal nirvana. It was about ideas; about finding – proposing, debating, strengthening – the common truths that would prove sufficiently resilient to found an entire civilisation.
“We hold these truths to be self-evident, that all men are created equal, endowed by their creator with certain inalienable rights …”
These days, even the Creator is conceived more as an old chum than an authority figure. Gradually all suggestions of hierarchy (darkness, poetry, obscure language, axiality and even the altar) have been replaced by an open, centred everyday-ness that runs through the architecture, the language, the ritual and the lighting.
An ecclesiastical tradition that once strove to lift humans to heaven now seems determined, rather, to bring God down to our level, in the round, on the floor, quotidian. Why? So that the relationship – yours and God’s – can be intimate, personal.
At Hillsong, where everything is relayed on huge video screens and accompanied by big-band riffs, I found myself counting the number of times “I” or “me” occurred through the service, and made it well into the double figures. This is a victory, of sorts, for the peacenik generation. Just as we all now wear jeans to the opera, post-modern Jahweh wears jeans to church.
But it’s not that simple. The shift is not limited to manners and aesthetics, and its implications are not superficial. It’s as though we are unable to think outside the personal and subjective. As though, once post-modernism, egalitarianism and feminism undermined our trust in objective truth, we lost first our respect, then our capacity for objective thought, slipping unaware into the warm collective solipsism of the universal me. This is a profound epistemological – as well as moral and metaphysical – revolution. Yet it has happened so quietly we barely notice. The thing about revolutions is, their results tend to be the direct opposite of their intentions (and yes, the Arab Spring looks like a possible instance).
So what would the world be like, if we really did conceive ourselves to be a collection of principally subjective entities? Tribal, in a word. Oh I imagine there would be upsides. Life would become increasingly Californian, strenuously enwrapped in sustaining the illusion that life’s a breeze.
But there are distinct downsides. We’ve all read Lord of the Flies. Life without abstraction begins with feel-good and ends with feel-bad. It starts with fun, friendship and compassion, ends in brutality and tears.
It’s one thing for newspapers and zines to feed our apparently insatiable appetite for the personal lives of strangers. But what chance is there for a big idea where feelings are the coin of the realm?
It’s like that fatuous T-shirt logo; “imagination is more important than knowledge – Albert Einstein”. If we really believe this, if feelings, dreams and fantasies are as “true” as facts, the Enlightenment is over, and we’re already lost in the misted highlands of fear, brutality and superstition.
For we forget. Civilisations can fall. And ours depends on our capacity to engage richly and collectively in abstract thought in two areas especially. One is law. It’s clear, despite Jefferson’s lovely wordage, that our rights are by no means inalienable.
They can be alienated on a tyrant’s whim without notice. The values we hold so dear – women’s rights, children’s rights, free speech, fair votes, the right to walk down the street in shorts and thongs without being raped, sliced or bombed – are protected only by our collective investment in law’s fundamental principle; do as you would be done to.
This was Christ’s genius as a moral philosopher, if not a prophet. For this principle turns the personal into the abstract, binding the individual inextricably into the lovely landscape of civilisation, and keeping the wilderness at bay.
Yet civilisation is no given. It is a construct, a garden of its own, as safe only as its walls are strong. Most of us – the small, the weak, the sick, the soft, the decent; everyone in fact except the chest-thumping silverbacks – depend wholly on its mural integrity. Outside, there be dragons.
Every attempt to fracture those walls – to admit pockets of indigenous or sharia law under the guise of “cultural sensitivity” when really it’s apartheid – weakens the walls and draws the dragon-breath closer. The other arena where impartiality is vital to life is, of course, science. More urgent than law, it is more problematic, since it depends on a skill that is counter-cultural; our capacity to recognise and revere authority. It is also bleeping red.
What’s at issue is not just whether we’re educating enough scientists to compete economically with China (we’re not, by a long way). It’s bigger than that. Given false courage by the idea that all opinions are created equal, the blogosphere threatens to become a mobosphere, where the truths of science – of climate change, for example – are shouted down by an ignorant and heedless rabble unable to distinguish genuine science from the bought or beholden sort.
Perhaps it was always going to come to this. Perhaps those very self-evident truths – liberté, égalité, fraternité – bore within, as Benjamin Franklin once noted, the seeds of their own destruction. But perhaps not. Perhaps, if we’re conscious enough, we can still rescue abstraction from the toothless maw of the subjective.
A glimpse of what is possible
- ESSAY
Modern architecture was fuelled by altruism but, far from delivering a new world order, it resembles a failed experiment.

I have two friends. In fact (I like to think), there are more than two but these two make a pair, in the parliamentary sense. Both are men in their 70s, cultivated of habit, warm of heart and distinguished of mind, yet their points of view on one particular subject are diametrically opposed.
That subject, which is still causing divisions a century on, is modern architecture. One of these friends is an architect, something of a grandee within the profession, author of a high modern, Corbusier-inspired building that was much feted in Britain in the ’50s and much reviled thereafter.
A kind man, as well as a clever one, he is nonetheless entirely unrepentant and recently wrote of the building: “On the inside, in response to the light, the space between the buildings becomes bigger as the building increases in height down the hill. Secondly, the ‘streets in the air’, the decks as they became known, have worked well, designed as they were for children’s play, for neighbourly chat and for deliveries.”
The other, a lifelong anarchist, intellectual and ladies’ man, a clinger to high principle, told me recently over coffee that, quite simply, Le Corbusier had ruined the world.
This difference could be, simply, the architect versus the rest, evidence of the stubborn chasm between the profession and its clientele. But the truth is more interesting.
The story of modern architecture mimics, in many ways, Aristotle’s guiding principle of tragedy. Like Macbeth, say, or Lear, the horror of its fall was proportionate to and caused by the loftiness of its ambition. The higher the lower, you might say. Modern architecture is often described as arrogant but it wasn’t just that, not in the normal sense. Modern architecture genuinely wanted to save the world – genuinely thought it could.
No doubt this was its own hubris, an open door (in Aristotelian terms) to catastrophic failure. But the altruism was genuine. A century ago, give or take, modernism wanted passionately to end the squalor, inequality and outright ugliness of ordinary lives and ended by just as emphatically making most of those things worse.
Modernism was a seductive paradox – seductive, both aesthetically and morally, and paradoxical because, like so many revolutions, it achieved the very inverse of its goals. Yet in all of this, altruism was key – which is what makes this particular tragedy so compelling.
The recent Mad Square show of early 20th century German art offered a clue. Arranged chronologically, the show opened with Ludwig Meidner’s Apocalyptic Landscape of 1913, depicting, with terrifying prescience, a world blown to smithereens.
The next room distilled what can only be called “the horror”. It was a collection of Max Beckmann, Otto Dix and George Grosz, all returned from World War I. With ruthless and satirical compassion they painted what they saw: maimed soldiers, cankered whores, fat capitalists – violated lives, futile desires, pointless sacrifice. For their courage alone, they command love.
Then, as the horror tipped over into hysteria, Dada. No longer content to paint the futility, Dada enacted, re-enacted and reconstituted it in a way that was funny, sad, savage, heartbroken, intelligent and bitter.
And that was it, really, the high point not only of the show, but of modern art. Suddenly … whammo! The elevator hits the deck. Room after room and year after year of abstraction. In architecture, it was Gropius and the Bauhaus. In art, abstraction, cubism, minimalism.
As if artists everywhere were saying: “Enough. We’ve had it with that feelings stuff. No more emotion, no more dank and bloody history. All too hard. We’ll create a new world order in which men are machines. To feel nothing is to risk nothing.”
Utility became god. For the Bauhaus, houses that looked like machines and schools that looked like factories were glorious goals. Le Corbusier wanted to destroy the street, separate fast from slow traffic and all traffic from pedestrians, who would happily inhabit those very streets in the sky that my friend is still eulogising.
Architects started to design houses without walls, floors without support, cities without location. The cantilever was cool. Abstraction was out of its box, stalking us, and its grip was ferocious. In architecture especially, abstraction was without pity, charm, tradition or humour.
The Prince of Wales’s view that modern architecture destroyed more of London than the Luftwaffe is well-known. Less known is that this destruction was undertaken with the very best of intentions – intentions that, examined in themselves, would look scarcely different from the Prince’s own.
And this is the paradox. Measured by its effects, modernism looks greedy, threadbare and arrogant – the capitalist’s plaything. But measured from the inside, by what was in its heart, modernism was realising the Enlightenment freedoms – openness, honesty, fairness and suffrage. But the rhetoric-reality chasm worked the other way as well. Modernism pretended to be anti-aesthetic, constantly proclaiming that “architecture is not a style”. But at its best – to wit, Mies van der Rohe’s Barcelona Pavilion – it had a transcendent beauty that few other buildings match. When it’s good, it’s very good. Indeed the modern aesthetic has an almost spiritual quality – the modern sublime, you might say.
Tom Cordell’s wonderful documentary, Utopia London, isn’t interested in spiritual qualities. In true London fashion, it looks solely at the social and political meaning of the modern project, in order (one suspects) to counter the Prince’s outrage.
Utopia London traces the history of London modernism from its 1930s beginnings: from Berthold Lubetkin’s 1938 Finsbury Health Centre, through the 1951 Festival of Britain – a deliberate postwar tonic for a depressed nation – to George Finch’s famous Lambeth Towers and public housing in Brixton, which later became the scene, and some would say the source, of riots.
Modernism did for architecture what democracy attempted to do for economics – even out the differences. Classicism had allowed, and indeed required, a hierarchy of significance (the privileging of front over back, centre over sides, and the very idea of “the orders”). Modernism deliberately rejected this, instead adopting as its ideal form that diagram of equality, the grid.
Later reviled as a symbol of mind-numbing sameness, the grid became, for a time, modernism’s symbol of a society of equal citizens. According to the grid, all flats in a block, all buildings in an estate, all roads in a precinct would be equally privileged. Modernism was the built form of socialism and the grid was its ultimate expression of equality, freedom and choice.
It is Cordell’s thesis that this, “modernism’s Marxist underwear”, was the reason for its rejection – first by Churchill and later by Thatcher. Utopia London even includes an interview with the redoubtable Alice Coleman, whose work in the 1980s purported to prove causal links between bad design and antisocial behaviour and was seen to provide the rationale for Thatcher’s sell-off of council housing across Britain.
But Tory politicians weren’t the only ones who rejected modern architecture. By the 1960s it was also extremely unpopular. It may be that poor people, public housing tenants, would have disliked any buildings in which they were housed – certainly the ultra-brutalist but middle-class Barbican has not suffered the same rejection.
But there was also an aesthetic element to the revilement. Housing choiceless people in bare concrete, with eight-foot ceilings, lightless corridors and mean aluminium windows was always likely to end in tears.
Modern architecture’s commitment to abstraction was, and is, hard to love. It wasn’t just the grid. The entire aesthetic became indivisible from its moral undertow. This involved “truth to materials” (bare brick), structural integrity (exposed steel girders), absence of decoration, mass production (regimented sameness) and open planning (everyone can hear what you say).
For architects and students, the moral stuff made modernism thrilling. For the first time, architecture had a mission – a core goodness comparable with the other professions. By the time I came to architecture as a student in the mid-1970s, the Pruitt-Igoe flats in St Louis had already been dynamited and the seminal postmodern tomes by Venturi and Jencks were already classics.
We were postmodern creatures yet, though we knew its failures, modernism glowed like a light on the hill. The rest of architectural history, from the Egyptian to the baroque, seemed sepia-toned by comparison, a recitation of styles and technologies. Only modernism had purpose – this was both its strength and its undoing, its fatal flaw.
The modern dream was also, of course, impossible. That, perhaps, was its core appeal.
Those lovely houses, glass-walled platforms that cantilever out over the San Fernando Valley and are so beloved of filmmakers, are like the anti-matter – the less is more version – of tradition’s royal palace. They eat money. The vast concrete-and-glass living room will be breathtakingly beautiful if, and only if, the glass is permanently sparkling, the concrete clean and warmed, the furniture squashy and the martinis discreetly shaken.
But it wasn’t just economically impossible. It was also biologically and psychologically impossible, requiring us all to be perfect. And this, too, was why we loved it. Modernism didn’t just promise a new world order. It also promised a new kind of human to inhabit it, the kind that could live in a completely open, hard-surfaced plan, a glass-walled house, or a walking city.
Mess-less, needless, rootless, the future human would be a cheerful, efficient, super-productive, self-oiling, no-waste, time-and-motion-perfect machine, like El Lissitzky’s New Man (1923), the denizens of Fritz Lang’s Metropolis (1927) or the generations yet unborn who Le Corbusier believed would learn to live in his “Radiant City”.
This in itself was exciting. We ourselves were to be remade. Reborn, even. This was the New Jerusalem, the transcendent, crystalline idea of which the moderns spoke – the seductive paradox. So our disappointment with modernism was all the more vehement for being underpinned by a massive, irredeemable disappointment in ourselves.
And there are ironies, rippling layers of irony. Modernism’s turfing out, in the name of utility, of all practical wisdom about dealing with rain, light, temperature and ventilation in a graceful manner. The ease with which this socialist vision became the plaything of morally threadbare capitalism. And, most urgently, the fact that – though we see its failures all round us and bewail them constantly – modernism is still with us.
We’re still doing it, as any lowbrow real estate rag or glamorous interiors mag reveals. Shorn of all belief systems it may be, but the modern look – the yearning for (at least the appearance of) a minimalist life – is still with us, trapped like some abused child on an infinite loop of continuous present.
Current architecture is lost, torn between eco-ineffectuality and laser-cut frivolity, so that – or perhaps because – the crucial question remains unexamined: should architecture abandon or reinvest in modernism?
Was modernism a blip, a bad joke, a failed experiment? Should architecture now jettison all pretension to moral purpose and recognise itself as a space-making craft available, in the traditional way, mainly to the rich?
Or was modernism truly a paradigm shift, of which we have lived only the first tiny part? Should architecture reclaim modernism’s elusive vision, attempting for the first time to connect it deeply to its roots in ancient precedent so that it might flower wildly in the future?
ESSAY: Gender and the city
Alliances between urbanism, feminism and the new connectivity may spell the end of a man’s world.

The smartphone is the new smoking. It’s the cigarette-packet-sized thing you take with you everywhere. It’s what you do if you’re waiting for a bus or nervous at a party; what you light up the minute you emerge from the cinema. The smartphone is the marginally antisocial crutch you cannot live without.
But, if it is the new smoking, what are the dangers? I’m not just thinking brain tumours here. If it does, as some say, signify a whole new social paradigm, does the new connectivity come with hidden costs?
You’ve been there. The teens who persist, glazed, in playing phone games through the meal; the couples at brunch together, both silently tweeting, texting and emailing; the student who forks out hundreds of dollars for a replacement because it will take at least a day to get the “old” phone fixed, and 24 hours without one is unthinkable.
Research from Britain’s communications regulator, Ofcom, confirms what we already knew. The addiction is real: 61 per cent of teenagers and 37 per cent of adults describe themselves as “highly addicted”. Nearly half of teenagers surveyed and 22 per cent of adults admit using their phone in the toilet or bathroom. Britons send an average of five texts each a day; in some families it is normal for all members to have their phones at hand all the time, even during meals.
For the phone, clearly, is not just a tool. Like the printing press, the steam engine or the machine gun, it shapes the very wave it rides, sometimes in unforeseen directions. It’s not as if the human primate has always felt a need to be in continual contact with 700 Facebook friends and 3000 followers on Twitter and is only now able to assuage that need. The need itself is new.
For young people in particular, it’s as if the unwatched life is something to fear; unscrutinised, like Bishop Berkeley’s tree, they will simply cease to exist. In this sense, the cybersphere has taken the place of God. If the photos of an event are not on Facebook within the hour, the event itself might as well not have happened. Didn’t happen, as far as the hive mind is concerned. And the hive is everything; infallible, ineradicable, all-seeing, all-powerful. What does it mean?
You could, of course, regard this as a collective neurosis, the need for constant crowd support and reassurance. It could be a mass withdrawal, in the face of global insecurity, into the herd, people huddling like emperor penguins against the Antarctic cold; a mass withdrawal into defensive mediocrity.
On the other hand, it could be a good thing. Perhaps the new connectivity flags the evolution of new interpersonal skills, a mass flowering of co-operation and collaboration. After the London riots, as after the global financial crisis (if “after” is the right word), we are seeing socialist theories being dusted off in various guises. But perhaps it’s socialism with a difference. Perhaps we’re witnessing the human manifestation of the phenomenon known as “emergence”, whereby the collective is able to harness a group intelligence that is more – not less – than the sum of its parts and act, without centralised leadership, for the good of the whole.
History may show that the global financial crisis, climate change and associated social unrest are the last gasps of individualism. The pain of these events may prove to be the birthing of their remedy, the New Connectivity (just as the horrors of World War I pushed modern abstraction and suburbia into existence).
If so, this shift may be allied to a parallel shift in gender dominance; a replacement of the aggressive, individualist male paradigm with female principles of networking, connectivity and relationship. A replacement of thrust, you might say, by chatter. An emergence.
Let’s say, for the sake of argument, that it’s true. That the 20th century represented the apotheosis of “male” values such as speed, aggression, individualism, silo thinking and what you might call thrust, manifesting in cities of high-rise cores, an “incontinent puddle” of freestanding houses and a spaghetti of motorways.
And say we’ve passed “peak male”, just as we’re said to have passed peak oil, peak water and peak food. Assume also that Camille Paglia’s immortal quip that if women ran the world we’d still be living in grass huts is something of a caricature. What, then, might 21st- or 22nd-century cities look like, formed according to the more collective, collaborative values of the female hive mind?
The most familiar example of emergence is slime mould, where single-celled amoebae can sometimes coalesce into what is effectively a single organism. In this form, “a bag of amoebae in a thin slime sheath”, as Princeton biologist John Bonner puts it, billions of separate creatures somehow act co-operatively, as if with a collective intelligence. They can achieve tasks and solve problems, such as finding the shortest route through a maze, without any evident communication system, much less any centralised command or “brain”.
More sophisticated examples include ant colonies. Popular culture, such as the film Antz, tends to portray ant colonies as totalitarian command economies, with the queen on top of a primate-type hierarchy. Nothing could be further from the truth.
In ant life, there is no command structure. No one tells the workers what task to perform or how to deal with an emergency, yet the babies are nursed, the aphids milked, the eggs moved out of danger, the midden located just far enough from the town and the ant cemetery, with a collective intelligence that is astonishing.
Human nesting structures, or cities, are similarly complex yet self-organising systems. Broadly speaking, decisions as to who does what, where – as to which strip becomes the fruit market or the gay precinct – are not handed down, and when such attempts are made they generally fail. These things in cities “just happen” because there are things that work – behavioural rules we share.
Many of these are the rules of shopping. We’ll shop more happily, for instance, when there are two sides to the street, in a street that clearly connects to other streets, and when no level change is required. These “rules”, made by no one but based on our primate nature, are analogous to the rules by which ants operate, and together they shape our shared behaviours and spaces.
The idea that you can plan cities, top down, is a modern one. Previous centuries produced the odd Baron Haussmann, who “planned” – or, more accurately, carved – by fiat. But before World War II few cities had planners or laws that would give them power. The first NSW planning legislation, an amendment to the Local Government Act, came into force in July 1945, and gave three years for the preparation of Cumberland County’s first plan.
Traditionally, cities were seen as reflections and even shapers of human minds and cultures but not in themselves as things to be shaped. It was the horror expressed by those such as Dickens and Engels, and the reforming zeal of the likes of Disraeli and Carlyle – underlined by the sheer dreadfulness of World War I – that made people think they must do something about those teeming, infested, industrial cities.
And do something they did. As the Prince of Wales so famously said, modernism’s efforts to clean up the human act destroyed more of London than the Luftwaffe.
Modernism’s take on connectivity was linear and utilitarian, its effect as much separative as connective. Motorways allowed fast traffic but severed it from all other movement, becoming barriers in the landscape; the consolidation of land for skyscrapers obliterated laneways, the capillary system of cities; use-zoning deliberately separated nesting from working and male from female, while the widespread deployment of cul-de-sacs, cloverleafs and one-way traffic systems coarsened city texture, trampling the connectivity that was the city’s essence and consistently preferring the car-encapsulated human to the human on the street.
A classic example is Parramatta, where fine historic bones are all but obscured beneath a plethora of heavy-handed interventions, but which would instantly revert to some semblance of life were the yoke of modernism lifted. But contemporary examples abound as well, like Melbourne’s Docklands and, I fear, Barangaroo.
The alliances between urbanism, feminism and the new connectivity are little explored. Yet it is clear that, just as modernism deliberately banished women to the suburbs as bait to draw men from the evils of the urban night, so postmodern urbanism, in rediscovering the traditional city, has striven both to repopulate the city with non-corporate human life and to reassert feminine values.
The principles of urbanism – intricate connectivity, engagement with history, a focus on the slow, the pedestrian, the being there (as opposed to the getting there) – are feminine values. So it should be no surprise that many of the voices of this “new” urban movement, from Jane Jacobs onward, are female.
Of course, actual genders are irrelevant; the point is the shift of flavour and principle. Yet it is noticeable that on the Barangaroo development, for example, almost all the proponents of what is now a very old-style modernist scheme are male, while many of the opposing voices are female.
Feminism has always been torn between arguing that women are not essentially different from men and need only be free to “be” men, and arguing, on the contrary, that women and men are fundamentally different and that “equality” requires, rather, the granting of equal respect to male and female values. The first, clearly, would change city form but little. The second view is therefore more interesting, speculatively, since it allows us to ask how different a female-formed city would be.
Not tall, for a start. One imagines a city of maybe six or eight storeys, perhaps quite dense but crazed with interconnecting laneways and gardens and courtyards, lots of shared spaces, public transport, trees and street furniture. A city of quiet orderliness, more focused on the spaces and relationships between things than on the things themselves.
Something, in fact, very like the pre-industrial city which, while hardly matriarchal, was built as a human nesting structure, with women at its core.
The Transition Towns movement in Britain and Australia may prove an exemplar. Proposing not a regression to traditionalism but a rediscovery of local economy – including the now-familiar community gardens and farmers markets – the Transition movement aims to help communities brace for the potentially dramatic effects of climate change and peak oil.
There are, of course, downsides to such a city, which will kick in at the point where connectivity drifts into conformism, safety into wowserism and niceness into timidity. So, the struggle must be to maintain the tension between collectivity and freedom, so that cities and cultures benefit from the drive to recolonise their centres without losing the essential edge of the mysterious, the unpredictable, the untame.
ESSAY: Canberra: A hole in the heart
With its centenary approaching, Canberra must cast off the plump country town aura and realise its prize-winning vision, writes Elizabeth Farrelly.
The people in white coats and gloves operating the pliers, snipping ligatures with agonising unhaste in the revolving pod atop Black Mountain Tower, are not surgeons, though for a moment the tension seems comparable. They’re archivists, and the narrow wooden crate at the centre of their attention holds our equivalent of sacred drawings.
The da Vinci moment, jokes a dignitary – US Ambassador Jeffrey Bleich, as it happens – and we titter, seeing the irony. (Are we so mystery-deprived that even bureaucratic boxes seem sacred?) Yet we crane forward as one to see the fine crate dismantled and the long roll removed. For this is the last unopened box of base-drawings from the Canberra design competition, a hundred years ago.
They’re lovely things, these turn-of-the-century Coulter watercolours; cycloramas a couple of metres long and a couple of handspans high, breathtaking in their sweet simplicity.
I imagine the Griffins in their 1911 Chicago studio, feel their excitement as, even before the images of this strange and far-flung place can be digested, ideas for its future city start to form in their minds.
And this – just what was in the combined Griffin mind – has become something of an issue. But it’s not the only issue facing Canberra.
It may not yet have impinged on your consciousness, but it will. Our capital is turning 100. It’s a gradual birthday. The competition took almost two years to fruit, so in 2013 Canberra can expect its telegram from the Queen.
It is therefore timely to ask, what does our national capital mean to us? And what does it say about us?
Ross Garnaut’s recent climate change address to the National Press Club was one of the most compelling political speeches this country has produced. Yet knowing that it came direct from one of the most car-dependent cities on Earth added a level of surrealism that goes some way to undermining any commitment to, in Garnaut’s term, “do our fair share” for the planet.
For although Canberra is far more dispersed than even Griffin intended, the suburban ideal – aka sprawl – was always its core promise. Canberra, with its bush-burbs and satellite towns, is sprawl squared. Yet it’s also, inescapably, our symbol: how do we reconcile this with a clean, green Australian future?
A recent flurry of Canberra-type activity has signposted the gradual build-up – the foothills, if you will, to the official Canberra centenary. Simultaneous with last month’s box-opening was the launch by creative director Robyn Archer of Capithetical, an ideas competition for a contemporary capital.
A few days later, Alasdair McGregor’s biography Grand Obsessions: The Life and Work of Walter Burley Griffin and Marion Mahony Griffin won the National Biography Prize. That was followed, a few days later, by the discovery of a lost Griffin drawing – part of the original competition entry – rolled up in the back seat of the car of Canberra historian Dr Dave Headon. What does it all signify, if anything?
Admittedly, Canberra can hardly be considered complete. In the life of a city, a hundred years is barely gestatory. But still, a hundred years is a hundred years; time, surely, for some meaning to accumulate.
Admittedly, people who live there love it. But for the rest of us, Canberra can feel a lot like an airy outer suburb or plump country town. All grass and trees and lakeside bike paths, it’s not much more than a nice place to tootle round for the occasional weekend.
But when you consider it as our national capital – when you attend an important government meeting, say, and sink up to your stilettos in mud on the way there because there’s no footpath across the paddock, or when you enter Parliament House to find the reek of chlorine suggests you should have brought your goggles and flippers, or when you exit face to face with the dilapidated “tent embassy”, or attempt to navigate your way round the circular roads that all look the same, or try to find anything that might pass for a city centre at the heart of all this sprawl – it’s hard not to feel that Canberra is under-designed, under-formed and under-occupied. It feels, in short, empty.
We ridicule the Kiwis for making a flightless bird their mascot but what kind of nation takes an absent presence for its symbolic city?
Emptiness, Griffinologists are reliably quick to point out, is not what they intended. But Canberra is the built form of the gap – nay, the yawning chasm – between word and deed. And in both these aspects, the hole in the donut might seem an apt symbol for Australian politics, just as it seems an apt symbol for our big empty continent.
If Burley Griffin had intended such satire, secreting such subversive symbolism in his design for Canberra the way Leonardo is said to have secreted heresies within his Last Supper, I’d love him more. But as it is, I cannot escape the feeling that Griffin simply mistook the scale by a factor of 10.
Grand Obsessions may be the book title, but there’s an irony here, because the outsize grandeur of Griffin’s concept meant that, on the ground, Canberra is un-grand in the extreme.
Canberra scholars – and it can be something of a surprise to realise that there is among urbanists a global Canberra conversation – argue over the real meaning of the Griffin plan. There is even an entire school of esoterica, well worthy of Dan Brown, devoted to reading vesica piscis and other anthroposophic symbolism into the plan.
But much of the conversation is more practical. To what extent, ask planners, does (or could, or should) Canberra the city resemble that plan? It is generally agreed that Griffin’s idea was of a more dense and urban centre, with building- and tram-lined streets, than what was built. But as to the future, and whether Canberra can ever be made to resemble a proper city, views divide. And the minute a blade of grass is threatened, politics steps in.
I was a student when I first gave Canberra any thought. Romaldo Giurgola’s new Parliament House, then under construction, was an object of reverence and pilgrimage. It was somehow thrilling that Australia should build something so vast and still so shamelessly symbolic, bespeaking (we thought) a reverence for design that students share, but for which, on graduating, quickly learn not to hope.
With the Parliament’s completion, though, the thrill vanished, leaving a building with too much pomp and too little dignity; too much careful handicraft inside – it struck us as the official equivalent of the Country Women’s Association crochet stall – and sheep on the roof.
For me, the main Canberra question is always this: does the same disappointment, the sense of unfulfilled promise, pertain to the city as a whole?
I’ve always had a soft spot for Canberra the idea; for its triangular geometry (Civic, Russell and Capital Hill) drawn via the City Beautiful Movement from L’Enfant’s Washington and Le Nôtre’s Versailles.
I like the dramatic potential inherent in a city whose centre is of public and international significance, while its skirts are personal and habitable in nature.
I’m even sympathetic with Griffin’s attempts to soften and modernise this old classical stiffness, replacing the God-fearing, bilateral symmetries of French classicism with a looser, more democratic weave; a move from lacework, if you will, to crochet.
The sad truth is that, although the idea is definitely interesting, Canberra the city is much less charming, partly because the intellectual content has been strained out of it by successive amendments, like flavour out of an old teabag. And partly because the idea itself was inherently weak.
Robyn Archer, in launching Capithetical, was at pains to insist that “Canberra has a fascinating subculture”. But I say: how can it have a subculture when there’s no evidence even of a culture? Archer says: “Canberra is a shy city, like Kyoto, with a fan in front of its face.” I say, shy? Fiddlesticks.
It’s downright private. For anyone outside the cabals – the pollies, the bureaucrats, the legions of gravytrainers – Canberra is a F*** Off city. There’s just no other way to say it, or see it. And this FO-ness is built into the very geometry of the place.
Every indistinguishable circle or parkway, every indistinguishable turn-off, every outscale landscape gesture and unwalkable museum; it’s all designed for private knowledge, private access and private transport.
The fact that you can be stranded for an hour within the parliamentary triangle and miss your flight because there are no cabs and no other means of transport is a disgrace, and it’s not about the transport provision, but about the city plan that makes it, simply, impossible.
This privateness is unforgivable in any city, since cities are, in their essence, public creatures, but it is especially unforgivable in a capital city.
And although Burley Griffin may not have intended this quality, it was always inherent in his geometries.
Comparing Versailles, Washington and Canberra makes it clear. Not only are the similarities immediately apparent, but also the differences. For where Versailles and Washington are essentially rectilinear grids with the radial geometry overlaid (giving the diagonal avenues and étoiles so favoured by Beaux-Arts designers), Canberra reversed this.
Griffin achieved his “democratisation” by switching the geometric priorities; making the radial geometry dominant, and relegating the grid to secondary infill. This allowed the limitless panoramic expansion – or sprawl – that seemed so “democratic”, but also ensured that Canberra became the shapeless, dull, unnavigable and profligate city we see today.
What’s to be done, and who’s going to do it?
Not the Capithetical competition, for a start. Capithetical shows every sign of having been set up by Canberra bureaucrats, devoted to keeping Canberra as is.
The competition brief precludes any response to the existing city, requiring instead a completely hypothetical take on a new capital – including a strong hint that cities will be virtual henceforth – along with a two-page report that shows you understand Canberra’s history.
If you think that sounds more like a school project than an international design competition, you’d be right – only it won’t be done by schools because you have to register before you even get the brief. There’s that privacy thing again. They do like to keep it in the family.
Capithetical is carefully designed to look creative while leaving the Canberra status quo unthreatened, which sounds just perfect as a government strategy. But what Canberra desperately needs, and what it should have before Garnaut delivers his next climate change bromide from there, is the dramatic densification that would achieve three things at once: congruence with Griffin’s vision, a genuine urban grandeur, and sustainability. Now there’s a triangular plan for you.