If I had a dollar for every time someone complained about wage discrimination against women, I’d be a millionaire.
Unless I were a woman, in which case I’d have $770,000.
Stephen Fry, with a hat tip to Kylie Sturgess:
A concerto is an argument between an individual and the state. Between an individual and society. It is an individual voice crying out and trying to make a statement of some kind. And it’s often drowned out by the orchestra, and it fights back. And the orchestra fights back. And it fights back. And the dynamic of listening to that is like nothing on Earth.
I am highly wary of anyone who would write a book entitled You Are Not a Gadget. Oh, here we go, I thought: a Luddite screed about how Goog-Face-Pads are making us lonely/stupid/lazy. So it was with trepidation that I delved into a lengthy interview at The Edge from last year with Jaron Lanier on what the hell it is we’re all supposed to do with this whole Internet thing we’ve found ourselves swimming in.
I’m glad I did, particularly as someone who has dreams (of the metallic cylinder variety) of becoming self-sustaining through my mainly-online writing and creative work. The interview itself is extremely wide-ranging, but what caught my attention was Lanier’s wrestling with the implications of an information economy trying to emerge in the context of global recession:
I’m astonished at how readily a great many people I know, young people, have accepted a reduced economic prospect and limited freedoms in any substantial sense, and basically traded them for being able to screw around online. There are just a lot of people who feel that being able to get their video or their tweet seen by somebody once in a while gets them enough ego gratification that it’s okay with them to still be living with their parents in their 30s, and that’s such a strange tradeoff….
To me, a lot of the culture of youth seems to be using the Internet as a form of denialism about their reduced prospects. They’re like, “Well, sure we can’t get a job and we need to live with our parents, but we can tweet”, or something. “Let us tweet!”
Now, for the record, I am 34, have a job (for now), and do not live with either of my parents. I have a wife and a kid and another on the way, and as tough as things are (and they are tough), we are not living by tweets alone. But Lanier’s somewhat derisively expressed concern still rings true to me. I might swap job prospects for real-world relationships in Lanier’s scenario, but it remains true that I take a lot of solace in sufficiently “buzzy” online work of mine, invest a lot of emotion and energy into its production, and let way too much of my ego and sense of self ride on how it fares. So, but for the grace of Jebus, there go I.
So what’s the alternative? What else could I or Lanier’s 30-something washout do? Particularly since so many more real-world gigs are vaporizing, like God in the Hitchhiker’s Guide, into a puff of logic.
The thing that I’m thinking about is the Ted Nelson [early Internet pioneer] approach … where people buy and sell each other information, and can live off of what they do with their hearts and minds as the machines get good enough to do what they would have done with their hands.
This model doesn’t really exist yet, and Lanier laments it. Part of why it doesn’t exist is that you either can already afford the “Apple route,” paying into, and then hopefully subsisting on, a high-end but closed system, or you take the “Google model,” in which you’ve already given up your intellectual property in exchange for free access to its low-end computational power. That breeds a turbulent and (Lanier doesn’t use this word, but I will) ghettoized Internet.
And so when all you can expect is free stuff, you don’t respect it, it doesn’t offer you enough to give you a social contract. What you can seek on the Internet is you can seek some fine things, you can seek friendship and connection, you can seek reputation and all these things that are always talked about, you just can’t seek cash. And it tends to create a lot of vandalism and mob-like behavior. That’s what happens in the real world when people feel hopeless, and don’t feel that they’re getting enough from society. It happens online.
To avoid the ugly, people universally need to recognize the value of their own bits; to understand that what they offer to the Internet, usually for nothing, does have monetary value and should be treated as such. I don’t at all pretend to know how Lanier would have this actually manifest; just because I can’t get a job driving a bus now that they’re all automated, it doesn’t follow that I have something to blog about that people will pay me a living wage for the privilege of reading.
But on the institutional level, we do see a version of this doggy-paddling toward viability: the New York Times paywall. The Times takes a gamble that its bits are worth paying for, that they are not simply Google fodder for bottom dwellers. Results so far are mixed, as far as its profitability is concerned, but I think the philosophy is noble and pretty rock solid. It just may be too late.
Is it too late for me and my fellow blog-post slingers? Is the output of our brains something we can turn into sustenance on the Wild, Wild Web? I hope so. I spend an awful lot of time here. Let us tweet!
It’s a long story, so I won’t bore you with it, but suffice it to say, I wanted to spread my own bloggy wings and fly out of the nest so generously provided me by Dawnne. I thought Squarespace would be able to handle this simple move from one platform to theirs, but they botched it severely, and I gave up. They meant well, but they just couldn’t get their shit together for me.
What’s that mean? It means I’m setting up shop here at Tumblr to make it all as simple as possible. What I wasn’t able to do was bring the entire archive of Near Earth Object with me, so instead I’ve exported it to a free WordPress.com site, Near Earth Archive, and it’s all there. In coming days, I will likely move some of the more important posts over to this site so they can live with the current stuff, but the vast majority of it will simply live at the archive site.
This is Near Earth Object’s new home, and in honor of this change, and also to make it square with the actual term I’ve been using as a title, I’m putting the damn hyphen in. Welcome to Near-Earth Object.
Old links to the old site will no longer work, unfortunately. Luckily (?) no one really reads this stuff, so the number of folks affected by that hiccup will be rather small. The domain nearearthobject.net, of course, remains with me and will point to this site.
This has been such a mess for me I almost feel guilty. But my plan now is to take my friend/enemy Justin Sapp’s advice and pick one place and stick with it. And so I shall.
(And yeah, yeah. I know I said I’d stop tweaking. Gimme a break.)
The IceCube Neutrino Observatory just figured out that gamma-ray bursts have nothing to do with cosmic rays, which means no one knows where cosmic rays come from. Via io9:
…the telescope was able to conclusively contradict 15 years worth of previous predictions while still under construction, and now it’s pretty much demolished one of the leading theories of extra-galactic physics. Really, all the IceCube data serve as a good reminder of how much science relies on disappointing non-results just as much as major breakthroughs – without the former to show the limits of our current understanding, we’d risk finding ourselves awash in a sea of indistinguishable false positives.
Oh snap! Science was wrong! How do we fix it?
That’s right. More science.
And I didn’t even have to use my AK.
This worries me a little (by Toby Litt in Granta):
A couple of years ago, I spent three months playing World of Warcraft – partly as research for a short story I was writing, mostly because I became addicted to it. This convinced me of one thing: If the computer games which exist now had existed back in 1979 I would not have read any books, I think; I would not have seen writing as an adequate entertainment; I would not have seen going outdoors as sufficiently interesting to bother with.
Similarly, I find it difficult to understand why any eleven-year-old of today would be sufficiently bored to turn inward for entertainment.
This raises the question as to how future writers will come about, without ‘silence, exile and cunning’ – without the need for these things?
I was formed, as a writer, by the boredom of the place in which I lived.
Now, I did have video games when I was eleven (nothing of the scale or complexity of WoW, but I had the NES and the Sega Genesis), and I think they are a big reason (second only to cable TV) why I almost never read books at that age, despite being “bookish” in all other respects. With rare exceptions, I allowed the television screen to use up almost every single waking minute of my life. I can’t tell you how much I regret that.
Eventually, I got bored. In my boredom, I learned to play — just barely — some guitar, and wrote songs. Or wrote in my journal. As a young adult, particularly when I was a working actor without television available to me, I got really bored, and dove head first into my songwriting, and other reading and writing as well, for a good stretch of about five years.
But in the thick of the social web today, along with the rigors of parenthood, I am once again rarely bored. I loathe television now, to the point where even high-quality programming makes me impatient and anxious for the time I lose to viewing it. But my iPad and Mac and iPhone ensure that I never need be without distraction once the kid is asleep.
A happy difference now from my days of TV-cured boredom is that I spend a huge amount of time on my devices reading, far more than I did as a child or teenager. I am not delving into genuine books as much as I would like (and not nearly as much as I did when I was essentially bereft of television and Internet access), but my iPad serves primarily as a reader for long- and medium-form written content. I almost never visit YouTube, I play almost no games, etc.
But that’s me, a nerd who never fully embraced his nerddom in his teens and is now trying to catch up intellectually and culturally. Will such an endeavor even occur to today’s eleven-year-olds? I’d like to think so. I’d like to think, at least, that they’ll do better than me. On a hopeful note, my two-year-old son Toby, although he does love his episodes of Dinosaur Train, absolutely loves books and being read to. I will do all I can to keep him loving books. He’ll be a better man for it.
Advice I could stand to take, from Rob Beschizza, editor of BoingBoing:
Getting snared by technology-tweaking, especially design, is the fastest and easiest way to waste time to no good end as an indie blogger type. There’s only one thing that brings in readers, and marketing people call it “content”. Writing. Artwork. Games. Whatever it is that you do that other people care about.
The confusion between the technology of blogging and the art of it is natural, because we’re still close to the dawn of the medium.
This has definitely been one of my weak points, as my three or four longtime readers can attest. I’ve hopped platforms and gnashed my teeth over silly design conundrums more times than is defensible. I’m only recently waking up to the idea that I’ve got to stop worrying about the packaging, particularly considering the relatively tiny audience I have. A nifty logo, while nifty, will not draw an audience.
Spotted on Indeed.com, a “job listing” from a Texas mother looking for someone to help her transport her teenage daughters to school and activities, emphasis mine:
I am a mother who is 38 years old, I am a teacher in Tomball ISD, my husband is American and I am Mexican. I need to find a woman or girl that is nice, kind, and has good manners because you would be a role model for my daughters too. Christian or Catholic would be best. If you think you are atheist, please don’t take the job, I do not want those ideas in my daughters’ heads. We are a very kind and positive and affectionate family.
Just stretch your imagination and think about what folks might say if instead the ad feared for the effect of Christian “ideas in my daughters’ heads.”
You know what? I’ll bet she’d be “kind and positive” toward an atheist applicant before she called the police.
Listening tonight to the nearly unbearable “Retraction” edition of “This American Life,” in which Mike Daisey is taken to task for his fabrication of details about his experiences in China, I kept waiting for Daisey to more effectively counter Ira Glass’s assertion that people who come to see a monologue expect that every word of it is true.
Perhaps it’s because Glass and the myriad bloggers and reporters feasting on this story are themselves journalists, and therefore can’t help but expect something like this to be akin to what they do, a retelling of actual events. And perhaps it’s because my roots are in theatre that I feel like Glass is wrong; one may not even think about it consciously while watching a show, but I feel that people on the whole do understand that a show is a show. I know that when I saw Daisey perform his excellent How Theatre Failed America in DC a few years ago, I certainly had no illusions that he was giving a 100% factual account of his life in theatre. Of course he was going to embellish, exaggerate, and invent. Why? Because he was spinning a tale, based on facts but not relying on them, that told a larger truth.
I understand that at least as far as “This American Life” and, perhaps even more damning, his op-ed in the New York Times are concerned, it’s the packaging of his story that matters. It does indeed sound as though Daisey offered his play as an entirely factual retelling and therefore worthy of being used as such on the show (and that his manufactured experiences could be written as though they were actual reportage for his New York Times piece). There’s no excusing the presentation of fiction as fact to news outlets.
But I have to wonder at “This American Life” for even wishing to do so with Daisey’s play. If they wanted to use his piece as a springboard, why not simply excerpt some pieces of a performance, make clear that what we’re hearing is a story told by an actor in a play, and then delve deeper into the very real, no less serious issue at hand? Why hand essentially an entire episode over to what they know is a piece of theatre? Glass says their big mistake was not killing the show after being thwarted in their attempts to contact Daisey’s translator. I think their big mistake was in thinking that a play might be, not just the inspiration, but the substance of one of their reports. I find it hard to believe, but am forced to believe, that Glass and company are as credulous about the veracity of performance art as he claims they are.
I don’t know what Mike Daisey was thinking. He’s such a brilliant writer and performer, and I think it would be a genuine, substantive loss to the culture if we were to lose what he does because of this — particularly since his larger motive was so crucial, so real. I can only presume that the idea of getting his show on “This American Life” and of being treated with a kind of reverence by the media became confused with that larger motive. He is an actor, after all, and we are nothing if not attention whores of the worst kind. (Hey! Go download my music!!!!) I wish so badly that he had handled this all differently. All he had to do was say to Glass, to the media, to his audience, in any subtle form he wished, that his play is just that, a play, but one based on many true events and reports. Done.
I also wish that when Ira Glass pressed him on whether it was acceptable for his play to be constructed in part of fictions, he had said, proudly, that the art of storytelling has a different goal than journalism, and that his job is to get his audience to think and to feel something. Daisey does that extremely well, and the things he wants us to care about remain worth caring about.
Side note: I am more than a little sickened by many of the tech bloggers and journalists whose work I usually think extremely highly of, but who are now dancing on the grave of Daisey’s reputation, almost delighted that he is facing this new firestorm. This seems to me born of nothing other than their own desire not to have to feel anything about the source of the gadgets on which they base their careers. Now they’re off the hook, so they believe, and they have someone to put in the stocks for his heresy. It’s deeply disappointing.
The so-called post-PC revolution came to my house. This has been my first weekend off in a couple of weeks, and though I’ve been browsing the Web, tweeting, and now blogging, I didn’t even turn on my computer yesterday, and if you know me at all, you know that the only way this could be so is if I were hospitalized and unconscious.
Look, I know. For millions of people, this is already how they live their digital lives (not unconscious, the whole post-PC thing) — without a “computer.” But I didn’t get it before. And I assure you, it’s not because of the new iPad announcement (lord if I could afford one), but rather because I began to boil down what I actually wanted to do on a device.
Let’s face it: even for so-called “power users,” which I sometimes fancy myself, 95% of what we do on our devices is low-intensity, passive consumption. We browse the Web, we check email, we screw around on the social networks. Put apps and games and the like aside. Mostly, we’re sitting and staring at pictures and text. This does not require a 12-core Mac Pro and a Thunderbolt Display. But for a while, I was saying that it did at least require a MacBook Air; though underpowered in terms of processing heft, its physical slightness and speedy solid-state drive are more than enough for, again, 95% of what people actually do, save for software developers and feature-film editors. Indeed, my contention has been that an 11” MacBook Air, which I have, entirely obviates the need for a tablet — it’s so small and light that something like an iPad would be redundant. A tablet, I thought, was a powerful-yet-mostly-unnecessary toy.
Especially if one already has an iPhone/smartphone — and an e-ink Kindle too! Tablets? Please.
But let me tell you what I think affects all of this, something that may still be true in some contexts: not living alone.
Think of this. You’re sitting in the living room with your spouse or significant other. Nothing much is going on, there’s some light chatting, perhaps your partner is watching something on TV, but only half paying attention. You want to do your Web browsing (or as I usually refer to it, your dicking-around), and you have either your iPhone or your MacBook. If you’re on your iPhone, admit it, even in this day and age of Utter iPhone Ubiquity (UiU), you still look like a closed-off, toy-obsessed douche when your attention is focused on a 2.3 by 4.5-inch rectangle in your hand. You know it’s true. Even while you’re using your own smartphone in the most noble, serious, and non-douchey of ways, you still think other people on their phones look like douches. Or crazy.
But let’s say instead you opt for the MacBook (if you’re opting for a Windows PC, I don’t even know what to say to you). Well then, even with the smallest laptop, you’re opening up a two-paned device, the display of which instantly takes dominance within the horizon of one’s vision. Once you’ve opened up the laptop, even if you are doing nothing more than clicking a few likes on Facebook, to anyone else in the room you are now “on the computer,” and that’s it, you’re effectively not there. How many times have I heard from my lovely wife, while doing nothing that commands my attention, “Do you have to be on the computer now?”
(And a Kindle? That’s a book. Different territory altogether. When you’ve got a book open, you’re reading a book, and people get that, and that’s what you’re doing on a Kindle, so this is not terribly relevant to this overall thesis.)
So: I received a Kindle Fire as a gift, and being a tablet doubter, I mainly disregarded it — not because I didn’t like it (it was obviously quite neat), but because I figured it was, as I said, redundant. But the more of my day-to-day work I’ve done on my personal computer, and the more cognizant I’ve been of being present for my family without crunching my attention into my wee iPhone or disappearing behind a laptop display, the more I’ve wanted something that allowed me both my dicking-around and an acceptable level of environmental and social awareness.
And there was that Kindle Fire.
It’s not a perfect device by any means. Both my iPhone and my MacBook are an order of magnitude better than the Fire at pretty much anything I’d want to do with it (save for reading books, which I do on my e-ink Kindle anyway). This is a big reason I neglected it. But in the contexts I’ve described above, it suddenly became the perfect device. A tablet allows for the casual, passive consumption of content without leaving the appearance of being absorbed in something that either looks like “work” (on a computer) or a “gadget” (on a phone).
In many ways (and may The Steve forgive me for saying so) it’s better than an iPad because of its size; at 7 inches it is close in size to an unintrusive-looking trade paperback, whereas the iPad is more physically prominent, nearer to a clipboard. (I’m not crazy. A Retina Display and Apple’s seamless merging of hardware and software are much preferable to Amazon’s otherwise-laudable efforts in this arena. But I have what I have and can afford what I can afford.)
But more to the point, it’s become my primary device for that aforementioned 95% of stuff that I actually do. When I’m working on a piece of long-form writing (like this post), doing more complex creative work like videos or presentations, or recording and mixing music, I absolutely want my completely awesome 11” MacBook Air, probably attached to my 24-inch display. Taking pictures, making quick Twitter observations, or getting information wherever I am, I want my crazy-amazing iPhone 4S. But for just about everything else, I get it now. The tablet is the way to go.
I thought I’d miss heavy multitasking, quickly jumping between windows and applications, and having several things in my field of attention at once, but I don’t — unless, again, I’m working on something. In almost all other cases, though, holding a medium-sized slate of glass and pixels makes by far the most sense.
I think my wife agrees. But I haven’t asked her, because I’ve been on the computer.
Within the next generation I believe that the world’s rulers will discover that infant conditioning and narco-hypnosis are more efficient, as instruments of government, than clubs and prisons, and that the lust for power can be just as completely satisfied by suggesting people into loving their servitude as by flogging and kicking them into obedience. In other words, I feel that the nightmare of Nineteen Eighty-Four is destined to modulate into the nightmare of a world having more resemblance to that which I imagined in Brave New World. The change will be brought about as a result of a felt need for increased efficiency. Meanwhile, of course, there may be a large scale biological and atomic war — in which case we shall have nightmares of other and scarcely imaginable kinds.
Thank you once again for the book.
— Letter from Aldous Huxley to George Orwell, found via Teleread.
Architecture professor Thomas De Monchaux, in a piece that has almost nothing to do with economic policy, helps to clarify thinking about the concept of austerity:
Economically, austerity — which the Germans, among others, are intent on forcing upon their southern brethren — can sound like a good idea, but might actually exacerbate the conditions it ostensibly ameliorates. One day, we might look back on cuts in public services and infrastructure during a downturn with the same disbelief with which today’s doctors recall the medieval medicine of deliberately cutting and bleeding the sick.
Austerity has costs, De Monchaux tells us, in art, in production, and yes, in economics, costs we may not suspect. But think about Apple, which well represents the kind of austerity-in-design I believe De Monchaux has in mind. In boiling everything down to raw necessities, Apple makes objects that are (often) more expensive and that require greater sacrifices from those who build them (see Shenzhen).
People in desperate economic times, however, are not consumer products or art. Their lives and livelihoods ought not be “boiled down” to achieve some idealized notion of frugality, some platonic ideal that exists mainly in the minds of those who can always afford austere Apple objects.
Richard Holmes’ 2009 tome is aptly titled: it’s a wonder, and it takes an age to read. Right. I wanted to get that out of the way, as its sheer length weighs on me as I consider penning a reaction to its substance. It feels really long.
But, as with many efforts, it is worth it. The Age of Wonder is an exhaustive chronicle of the Romantic era of science — indeed, of the dawn of the very term. It focuses primarily on a small cluster of main “characters,” beginning with the intrepid Joseph Banks (and his utterly fascinating adventures in Tahiti), running through the Herschel lineage (William, his sister Caroline, and William’s son John), and ending just before Charles Darwin takes his voyage on the Beagle. It is a tale of presumptions shattered, egos inflated and exploded, and orthodoxies forever upended — and not just those of stodgy religionists, but those of even the most open-minded of explorers and philosophers. As Humphry Davy, perhaps the most prominent of Holmes’ subjects, said, “The first step towards the attainment of real discovery was the humiliating confession of ignorance.” There is a lot of that documented here.
Perhaps the most prominent theme throughout the book, with all of its detailed (often to a fault) recountings of experiments, arguments, and internal struggles, is the development of a professional discipline whose aim is more than the sum of its parts. What would eventually be known as science would become a practice not simply of confirming or denying the veracity of hypotheses, but perhaps the one great force that ushers humanity beyond its terrestrial and provincial understanding of itself. Holmes summarizes the thinking of Samuel Taylor Coleridge on this subject:
… Coleridge was defending the intellectual discipline of science as a force for clarity and good. He then added one of his most inspired perceptions. He thought that science, as a human activity, ‘being necessarily performed with the passion of Hope, it was poetical’. Science, like poetry, was not merely ‘progressive’. It directed a particular kind of moral energy and imaginative longing into the future. It enshrined the implicit belief that mankind could achieve a better, happier world.
This was not so simple, of course. Even the previously noted Davy faced his own crisis of conscience, as the idea of reason as a force behind a moral, and not just practical, philosophy challenged even the least superstitious of minds. In 1828 Davy wrote,
The art of living happy is, I believe, the art of being agreeably deluded; and faith in all things is superior to Reason, which, after all, is but a dead weight in advanced life, though as the pendulum to the clock in youth.
But “living happy” is not the same as living well, not the same as progress, not the same as advancing overall well-being.
There were those of this time who began to see something more than a happy illusion being stripped away: rather, a means to the liberation of the species, a new reigniting of the Enlightenment’s flame. Holmes offers the words of Percy Shelley, written as the technology of ballooning became the center of international awe and controversy.
Yet it ought not to be altogether condemned. It promises prodigious faculties for locomotion, and will allow us to traverse vast tracts with ease and rapidity, and to explore unknown countries without difficulty. Why are we so ignorant of the interior of Africa? — Why do we not despatch intrepid aeronauts to cross it in every direction, and to survey the whole peninsula in a few weeks? The shadow of the first balloon, which a vertical sun would project precisely underneath it, as it glided over that hitherto unhappy country, would virtually emancipate every slave, and would annihilate slavery forever.
This did not happen literally, of course, but it reminds us that within genuine understanding of all things lies the potential to transcend them.
Side note:
Also rife within The Age of Wonder are examples of the seemingly timeless wars between religion and science, and science’s struggle to be seen as something other than raw atheism. Holmes tells of the profession’s coming to terms, as it were, with its own moniker, and the old demons are ever-present:
There was no general term by which these gentlemen could describe themselves with reference to their pursuits.
‘Philosophers’ was felt to be too wide and lofty a term, and was very properly forbidden them by Mr. Coleridge, both in his capacity as philologer and metaphysician. ‘Savans’ was rather assuming and besides too French; but some ingenious gentleman [in fact Whewell himself] proposed that, by analogy with ‘artist’, they might form ‘scientist’ — and added that there could be no scruple to this term since we already have such words as ‘economist’ and ‘atheist’ — but this was not generally palatable.
The analogy with ‘atheist’ was of course fatal. Adam Sedgwick exploded: ‘Better die of this want [of a term] than bestialize our tongue by such a barbarism.’ But in fact ‘scientist’ came rapidly into general use from this date, and was recognised in the OED by 1840. Sedgwick later reflected more calmly, and made up for his outburst by producing a memorable image. ‘Such a coinage has always taken place at the great epochs of discovery: like the medals that are struck at the beginning of a new reign.’
This argument over a single word — ‘scientists’ — gave a clue to the much larger debate that was steadily surfacing in Britain at this crucial period of transition 1830-34. Lurking beneath the semantics lay the whole question of whether the new generation of professional ‘scientists’ would promote safe religious belief or a dangerous secular materialism.
Same as it ever was.