Sunday, May 29, 2011

Pretty Soon You're Talking Real Money, Part II

In The Pleasure of Finding Things Out, Richard Feynman was asked his opinion on whether another Michael Faraday could emerge in today's (a "today" of twenty-odd years ago, that is) scientific milieu. Faraday was a rare and spectacular sort of genius, one of the fathers of electromagnetic theory. He never had formal training, reportedly found the mathematical basis of the early theorists incomprehensible, and yet he managed to piece together the basic relationship of electricity and magnetism, figured out some of the subtle business of electrical polarization of materials, describing qualitatively (as I write in the occasional proposal) physics that it would take another sixty or seventy years to supply the mathematics for. He did this on a combination of pure intuition, language, and a facility for cobbling together chewing gum and baling wire experiments. That he managed this before the invention of duct tape is no doubt equally remarkable to experimental physicists. As for me, my admiration for Faraday is only enhanced by the fact that he looked like some plausible combination of a sideshow barker and garage-tinkering lunatic, but then nearly everyone in the early 19th century looked like that.

Anyway, Feynman replied that a late twentieth-century Faraday was unlikely. Physics had evolved, he thought, to the point where it was necessary to understand the current mathematics to really make a new innovation in the field. Naturally, he allowed that it wasn't impossible, but he didn't see enough fresh territory where even the basics remained to be worked out. He believed that the questions being asked in the 1980s were on the forward edge of theory, or outside of the easily measurable.

This morning on NPR, as part of a series this week that is evidently sponsored by the damn Chamber of Commerce, I learned that Peter Thiel, the co-founder (wait, which half did he found?) of PayPal, is offering students $100,000 to drop out of college for a couple of years and become entrepreneurs. Now, on one hand, I get it. With some colleges topping $50k per year these days, it's a hell of an investment, and if you're nineteen and can get into the grind without first sinking that cost, then you're ahead of the game. I'd tell you college is a pure scam if I didn't personally value it so much, and if my engineer's training (mostly trained judgment, but no doubt some people are born with that) weren't so helpful. But if you're the kind of kid that can create high tech with twist ties and duct tape, then those two extra years of theory are probably not going to make the difference in your career. For the right kind of kid, this is a good deal: high-risk, sure, and the remuneration isn't competitive with full-time employment with benefits and existing capital equipment, but you'll be ahead of your peers looking for that deal in two years when your venture fails. But is it a good way to be looking at engineering in society?

I mean, fucking PayPal, anyway. There was a short window where the acceptance of the internet, as a new medium, supported innovation that could get by on concepts (that is, without fucking doing or making anything), an arena freshly sodden enough that any goddamn thing had a chance of taking root. You idea men flourished precisely because you were in a unique moment when there were no established competitors, or because your particular branding took a little better than C2it or CertaPay or whatthefuckever other unremembered version of pets.com that failed to find utility, and if I remember 1999 at all, about 99.4% of those conceptual masterpieces still managed to blow other people's fortunes, thanks to about as much actual technical or business savvy as your typical 1830s peddler of miracle tonic. But yeah Pete, your confirmation bias tells you you're a genius. Let's ask for your next business opinion.

As a professional bullshitter in the field of applied research, I have a good idea how far the low six figures are going to get you. A hundred grand is exactly the business I'm in. To identify a problem, to propose a solution, and to work it out is hard enough. Often you find it's for marginal improvement (or marginal loss) that requires a detailed cost analysis, and that's on the off-chance it works at all. There's a question of how far you can get in your garage, a question of how far you can get without infrastructure. To do technical research you need to measure things. You need a laboratory, tools, and, at a minimum, materials to build things out of, and while there is room for innovation in the area, even in the basic areas, it's not so virgin a field as it once was. As I started to write this up, CNN was broadcasting an excited news piece on the latest X Prize, which rewards complicated high-tech ventures after they demonstrate success. How much do you think you have to invest for a 10% chance to win $1.4 million for a mechanical oil separator? If we need 'em so badly (and we do), then why are we doing it on people's own thin dimes? Fucking cheapskates.
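The expected-value arithmetic on that prize, for the record, is brutal. A minimal back-of-the-envelope sketch (the 10% odds are my own rhetorical figure from above, and the development budget is a hypothetical of mine, not anybody's real number):

```python
# Expected value of chasing a prize. The odds and the budget are
# hypothetical figures for illustration, not real statistics.
prize = 1_400_000   # the $1.4M purse
p_win = 0.10        # assumed chance of winning

expected_payout = p_win * prize
print(f"expected payout: ${expected_payout:,.0f}")        # $140,000

budget = 500_000    # hypothetical development cost
print(f"expected net: ${expected_payout - budget:,.0f}")  # $-360,000
# The sponsor gets a shot at the R&D for pennies; the entrants eat the risk.
```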

Now, we're not quite in the place with engineering as we are with fundamental physics: there's room for tinkerers, and the entrepreneurial model isn't completely broken. I don't intend to discourage the effort by any means, and I think that finding these people and supporting them is wise. But spotting a hundred grand to spark a high-risk research program is chump change, and doing it for the equivalent expected value is even worse. Baiting kids with dreams of Mark Zuckerberg or Steve Wozniak to fabricate shit in their dorm rooms and garages is probably not the best alternative to more comprehensively funding research in the sciences, and it's not as if you can count on the paradigm shifting every generation, especially when you leave it to revolutionize itself, while fluffing the egos and fortunes of the people who recognize talent instead of applying it. Democratization of innovation seems to correlate with the speed of its progress (moving from the monasteries to rich patrons and then universities was good for progress, letting women into the academy was a plus, that sort of thing), although it's hard to generalize across the slow sweep of history. Twenty Under Twenty and the X-Prizes are not bad ideas, and it's great to have something like that in the suite of science investment. But relying on them over straight-up funding seems like a giant step backward.

Tuesday, May 24, 2011

Review: The Poisonwood Bible, by Barbara Kingsolver

Browsing randomly through the non-genre aisles is just a terrible way to shop for books, no matter what your dotty strollers and romantically-inclined geeks would prefer to tell you. It's great to get drawn in by something stacked nearby and all (I might have been rewarded to find my way here if, for example, I happened to be mired in the Stephen King wing of the store) but if you don't have much free time, it's a lot better to have a list. This time I didn't, and so when I made a trip to the local Barnes and Noble to support one of my daughter's school programs, I was encouraged to buy something quickly, but also listlessly. The Barbara Kingsolver book I had intended to buy, had I remembered to write it down, was her charming manifesto about gardening, not her Important Novel from the fiction section. I'm not angry about it. It's more of a segue than a complaint.

The Poisonwood Bible was very enjoyable—I tore right through it—although if you ever catch me without some criticisms, then look for signs of encroaching senility. I hate to be pushed that hard by the display, and a novel marketed as significant has got to face some high standards. In that light, I'll try and reveal my usual assortment of faint damns as quickly as I can; there were definitely some small contentions that kept creeping in. The Poisonwood Bible is the story of a missionary family's attempt to evangelize a village in the Congo in the early 1960s, told in a sequence of rotating (all-female) character points of view. We are introduced to Orleanna first, the mother, who in the opening sequence appears to be addressing the reader (she actually is not; there is a nice symmetry to the Orleanna pieces that is not obvious at the outset, and they do well under a re-read) and introducing the story as a stand-in for the author herself. The point of view next shifts to Leah, one of the middle children, and her voice is similar enough to her mother's, and so damnably precocious for a 14-year-old, that she sounds a lot like the author too. Two of the other sisters (Ruth May and Rachel, the youngest and oldest) feel similar again, but now straitjacketed respectively by childhood and by general dimwittedness, which leaves Adah as the odd girl out, physically handicapped, sly, secretive, and cynical, and of course I liked her best from the get-go. For the other four, it takes a while for their individual natures to be drawn out. To be fair, they're family, facing the same immovable obstacles, and I am sure that Kingsolver realizes that it's not uncommon to get to know a bunch of sisters this way.

The bulk of the novel covers the 18 months or so they lived in the Congo village, a period which takes them through the nation's independence from Belgium, and quick subjugation under the Mobutu regime. A frame story for revealing the alternating anecdotes would have helped this book a great deal. The individual sequences come off a little like the indistinctly-timed interview portions of your lazier television mockumentary. It is unclear how their composition fits in alongside the contemporaneously occurring drama. It's as if the characters are being deposed in some neutral purgatory space by the omniscient narrator. It would be a plausible explanation if Nathan had ordered the kids to write about their experiences--it would have been within his character, and they had plenty of downtime--but if the parents had been aware of the children's diaries, then some of the challenges in the book could have been overcome by reading them. And only Adah is ever portrayed as keeping a journal (a coded one). Touching on this framing issue could have helped some other things too. Leah (who I continue to see as the author's stand-in) is the only character that really seems to grow and change much in that long real-time section. If they wrote them all at once (as hinted by the section headings), then that would explain the stasis in tone, but in that case, the voices still don't change in the next sections, months or years later. Maybe we expect this from Rachel, who only grows into a bigger nitwit during this stretch, but here's Ruth May, who's somewhere in the neighborhood of 8 years old at the beginning (I missed her revealed age, but old enough to write?), and in almost two years, we'd expect her especially to evolve a great deal.

A few throwaway lines and Kingsolver could have settled the minds of certain kinds of continuity-minded geeks, but the stasis goes even a little beyond that. The conflict in the story is very well laid out, and the set pieces are well-positioned to proceed to a logical and probably inevitable conclusion, which they do. And to an extent the conflicts are revealed, or they intensify (I don't want to give the impression that the plot is poorly written), but they do not develop. It's stated at the outset that Orleanna resents her husband; she doesn't grow to that point. Nathan doesn't become a tyrant; he starts as one. The only one that moves away from Nathan (and that needs to) is Leah, and she moves to someone else, a(n improbably appropriate) romantic interest, but that one's telegraphed from miles away too. The sections play out as examples of the known difficulties, but those misunderstandings were always there. And it's weird, because in the last third of the book, the long epilogue, the characters age in great leaps, and get a chance to look back to understand how their experience in Africa has defined them. Here the evolution of their characters is suddenly wholly plausible and highly persuasive. Now Adah is challenged with the selfishness of her conception of things. (Is it plausible that her handicap was merely learned behavior, incidentally? It gives her an interesting vehicle for self-reflection, but I'm not sure how realistic that is.) Now Leah develops depth to her cultural understanding. Hell, even Rachel evolves postscripturally into the true mode of her uselessness, and Kingsolver is able to subtly put an iota of wisdom in her head too. Certainly she's grown beyond a Georgia debutante, despite her disinclination to.

A modern reader might find it hard to buy into Nathan. He's a smart and motivated guy, but how could he maintain his will over a family of wiser, cleverer, more dynamic, and more interesting women by the mere force of patriarchy? If he didn't resemble so many of that generation, and even of the children's (my parents') generation, if I didn't see my own family members so clearly right in there, then I might think him a caricature. Instead, I see him as an accurate (if extreme) portrayal of how people can oppress and subdue the families they imagine they nurture. His religious inflexibility is ironic--as if Christianity hadn't evolved to accommodate any number of societies, including his distinctively American Baptist take on it--but Nathan isn't a man with the slightest dose of irony, nor one to question the singularity of the American experience. If we learn anything more about Nathan, it's the perfect depth of his contemptibility.

The parallel of the Price family with America's treatment of the Congo isn't very subtle, and it gets a mention, but admirably, Kingsolver doesn't really harp on it. Viewing (patriarchal) politics and society from the angle of motherhood and womanhood is useful. The author shows, in the context of an interesting story, how power can be willfully blind and self-interested, as well as how its use can extend from the powerful, or fail to, instead extending from the setting. (To that running interest of mine, she makes a good anthropological case, intentionally or no, about calorie (and protein) availability as it dictates certain modes of civilization.) The African women are focused mostly on the basic dynamics of life; the forces from above impinge on it, but change things only with difficulty, more by altering the conditions of life than by imposing rules and ideas. I thought Kingsolver did an excellent job of positioning that observational understanding against the larger relationships in the world theater, giving the modern corporate state the indictment it deserves, although you do occasionally get a whiff of credulity when it comes to the prospect of any better proposals (Could Lumumba really have been so benevolent? Could the pre-contact Congolese society really have been so well balanced?), but it's smart to pose them through Leah, who is given to a bit of ideal-worship in spite of moving past her old man, and anyway, it's understood that these are lost questions, worth regretting.


[There's probably a good case to be made that the Enlightenment- or colonial-era Europeans did not understand primitivism very well, or were at least ill-equipped to study it. (I say "probably" because I'm still not very willing to delve into the source literature.) It seems like a related issue, although I think it'd be better to say that your typical citizen these days, with more available information but lots of his own society's structure and benefits in front of his eyes, simply declines to make a validating comparison. I think that Kingsolver (based as much on Animal, Vegetable, Miracle as on this one) and I, as well as other writers on the theme (Wendell Berry or Eduardo Galeano come to mind), and a couple readers of this blog, will agree that we've ignored some valuable lessons from the throwback days that would be worth re-examining in a modern context. A community of interconnected but more deeply rooted localities, each appropriate to its own environment, is probably a better one, and possibly an end-point of our own arc anyway. I don't know if it's realistic to think we can up and go all Iroquois Confederation or anything, but it's interesting how radical that would really be. Western history has been a long story of consolidation and subjugation over the last couple of millennia (and of course people got the empire bug in Asia, Africa and the Americas from time to time too). Localizing like that would certainly throw the European-style bordered nation-state right on its ass, but I have my usual urge to caveat the hell out of that sort of thing: (1) we'd need a lot fewer people; (2) it's a better land-use and community support model, but as a society model it has a lot of room to righteously suck. Ample interconnections between the nodes have got to be an improvement over a more primitive form, as do technology and exchange. Imagine limited but versatile travel, easy communication, ready access to ideas, science and history. Also, (3) centralization works for some things, although it seems very difficult to pick and choose what we employ it for. Public insurance and resource management without ruling classes and wars between them? Good luck with that. Maybe I should call all this out as a longer post, but it fits in with Kingsolver's themes, and frankly, I desperately need some new headlines.]

Thursday, May 19, 2011

Pretty Soon You're Talking Real Money

So, I saw on TV last night that Obama is looking to raise a billion dollars for his 2012 presidential campaign. There's not a candidate alive who doesn't think his stewardship is worth the effort and manpower required to put him in that capacity, but still, a billion bucks? Wow.

I'll admit that a billion doesn't buy what it used to. If so inclined, candidate Obama could bankroll both Feeding America and Doctors without Borders for a year ($400M and $600M respectively) with that kind of scratch. If he had the gall, he could also front the entire budget of federal alternative energy research for a year too. Or pick up the tab for three days (more or less) of blowing people up in Iraq and Afghanistan.

A billion dollars spread across every hospital in the country (about 6,000 of them) comes to about $170,000 each, about a doctor for a year. Spread to every town (about 25,000), it's about $40,000 each, employing yet another sorely underpaid teacher for twelve months, although probably not with benefits. Spread to every household (about 100M), it's a crisp, clean ten-dollar bill. Not much, but we can certainly appreciate the thought. Once I got past my justifiable suspicions, I would at least drink the sixpack the president bought for my family. Chump change I can believe in.
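The division, sanity-checked (using the same round counts quoted above):

```python
# Splitting $1B across the country, using the round counts quoted above.
billion = 1_000_000_000
for name, count in [("hospital", 6_000),
                    ("town", 25_000),
                    ("household", 100_000_000)]:
    print(f"per {name}: ${billion / count:,.0f}")
# per hospital: $166,667   (call it $170k: one doctor for a year)
# per town: $40,000        (one more underpaid teacher)
# per household: $10       (the aforementioned sixpack)
```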

(Parenthetically, I suppose this implies that if everyone and their spouse checked off the $3 box on their 1040s, then these fuckers would have more than enough money for campaigning and related graft.)

It's not like we don't know who the guy is, but let's assume he still needs to extract a couple hundred thousand bucks to attend the inevitable debates. He can explain the other details of the stunt to the incredulous press during the usual briefings. If President Obama decided to forgo the rest of the business (and making the further improbable assumption that one can fundraise a billion without returning a significant chunk of it to the fundraising activities)--no babykissing, no ad buys, no rubber chicken meet-n-greets--would you vote for Barack Obama if he gave you ten bucks?

Sure you'd just be getting bought off by the donors directly, but at least you'd be in the loop for once. And it'd be a lot fucking quieter. And who knows, maybe the pundit goobers could find something useful to talk about besides the campaign.

Review: The Pleasure of Finding Things Out, by Richard Feynman

By Richard Feynman? Well, it's a series of presentations and interviews given by him, so the byline is mostly correct. Although it includes some of his famous technical predictions (on the future of computing and nanotechnology) and indictments (his report on the Challenger disaster), it's basically non-technical, filled out with anecdotes and the wider variety of his thoughts. It contains most of what I had actually read or heard of Feynman before I picked up some of his physics lectures the year before last. I saw the video following his report on the space shuttle in materials science class back when I was a freshman—the one where he dunks the o-ring into the ice water—and thought it was a bit of grandstanding, actually. I was more impressed with the report statement this time around, which seems less out to impress and more to rather boldly condemn the deserving. I'd read the same snippet of Cargo Cult Science on the internet a half dozen times, which seems to get at something profound, and I remain ambivalent about the "plenty of room at the bottom" speech, whose pull-quotes have appeared in front of approximately forty bazillion talks and review papers in the past 20 years. And, right, I am sure that Feynman diagrams got mentioned in passing somewhere near the end of physics III, where they wrapped up a survey of the stuff that was part of the field but that you probably wouldn't need unless you chose to study it. And that's it. I was aware of who he was, knew something of his general contributions, and had heard of his mercurial approach to life. The influence on scientists and rationalists of my acquaintance has tended to sneak in here and there.

So let me tell you why I am disappointed. It just makes my own quasi-public bloviating seem so pointless. Maybe it should be validating, but the science philosophy you'll find here is of a piece with what I've been occasionally wailing about for more or less the entirety of time I've held this blog: accepting doubt as part of an honest worldview; evidence-based thinking; the dynamism between theory and measurement; approximations and representations pitted against objective reality; inquiry as a sort of moral imperative; informed wiseassery. What more does that leave me to comment about? And I am forced to ask: how much debt does my own struggling worldview owe to this guy? Obviously I've read and interacted with many folks who were influenced by him. How corrupted am I by the company I've kept? Do we all think the same? What a depressing thought. (I don't actually think that's quite right: he's only one of many giants who asked these questions, and I, and you, happened to land in the middle of one public answering of them.)

Feynman's science philosophy was clearly important to him, but from these interviews, espousing it was more of a byproduct of his life than a motivation. (He always had quantum electrodynamics to fall back on, after all, not to mention rhythm.) He didn't read a lot of the stuff (and I still wonder how much it shows, really), dismissed what little he did read as an elaborate exercise in simplistic thinking, but for all that, he did do a lot of philosophizing. If this collection is representative of all his interviews, then it's a big part of what the public wanted to hear from him, and what we took away, and so he gets the role by default. It's a shame, almost, that he never really took it as far as he took his science, and while he was willing to march up and acknowledge the big moral questions of his career, I think he chose to leave some of the difficult ones hanging. Was he haunted by his role in the Manhattan project? He has a story about it that he liked to tell (he must have been asked about it a lot), and from it, I think the answer was yes. He tells us of the distinction between the thrill of the intellectual work, and of being a part of a community of exceptional and quirky scientists, and the late-dawning realization of the bomb's implications, which might be the end of all things. But acknowledgement is not judgment. Was there shame, regret, disillusionment? I can't really tell from the writing. In various of these interviews, Feynman would rather set judgment and decisions apart from the scientific process, and I think that's a fair peace, if it's a valid one. But if your research is driven with an intent to do massive harm, then should you do it? He doesn't seem to be the guy to indulge in very much self-recrimination, and what the hell, he was really young at the time. By the time he challenged the NASA higher-ups later in life, his view of managerial competence had obviously dimmed. In the last segment of the book, he makes some similar observations on religion, noting, diplomatically, that its matters of spiritual fulfillment are, and should probably remain, unchallenged by science, but that faith exceeded its power of natural explanation some centuries ago. He avoids reaching a deeper conclusion about this, but maybe he's only offering a properly skeptical interpretation, and leaving the actual judgment to the audience. Maybe that's the best thing an honest thinker can do.

It must have been a big score to interview Feynman about religion. But for a few vestigial cultural trappings, I don't get Feynman as any kind of deist, but in ways he thinks like one, really one of these wonder-in-the-miracle-of-god's-creation types. He is infectious when he's talking about the surprising elegance of the universe, and the surprisingly deep logical reach of mathematics, and the underappreciated poetry of these things. He talks about his early mentors, especially his father, who taught him to approach an understanding of the world with appreciation, playfulness and creativity. And speaking of cargo cults, you could do all the things his dad did with little Richard, and you still wouldn't get a Feynman--any more than freezing over your yard and strapping the skates onto Junior gets you a little Gretzky--there was no doubt some outstanding nature that came together with the outstanding nurture. You could see where it grew from: here's the rare scientist who you'd imagine could get himself to devalue his own beliefs or theories with pure objectivity, given the proper evidence, possibly because he was humble enough, or had a knack to see things clearly from several different approaches, or because it was easy and exciting for him to reformulate his understanding of things.

I've mentioned that I picked this to read paired against that David Foster Wallace romp, Brief Interviews with Hideous Men, which looks at childhood and other power structures with every intention of holding the reader down, and biting him back, a wit which was the definition of mordant. Since both were broken up into shorts, I basically shuffled them together, like an angel/devil sort of thing on each shoulder. One guy felt paralyzed by doubt, the other energized by it. Feynman's wit was more the force of inspiration, clear thinking, and optimism in new discoveries, which he retained even after catching a glimpse of how people and nature really work.

Monday, May 09, 2011

And I'm Over Getting Older

Well, it's obvious that thinking about the state and trajectory of the species couldn't depress me very much more, so maybe it's time to change the subject to something that is a madcap buzz of positivity and optimism, you know, like getting older. Thirty-eight and counting, and somehow, without realizing it, I've crossed over into crinkled-forehead, responsible adulthood. This sucks! I mean, what the fuck, how did this happen? How did it happen to me?

I was never huge into trendy music movements, but there were a few bands I'd try and see if I could, and I generally liked the experience of a good crowd enough to try and get out once in a while. I haven't gone and seen a big rock show for ten years now, and I'm not sure how I got permission even that time, but the last experience was typical. Waiting in traffic, watching everyone toke weed in the parking lot, a run to the beer tent to pleasantly remove the edge, sweat, noise, screaming, darkness, dancing with minimal rhythm. It was almost exactly ten years ago, and I tell you, there's nothing like being outdoors with a beer in a crowd on a summery evening. It's exhilarating. I've never seen a professional baseball game, but last weekend's concert had me walking the kids on the sidewalk outside of Fenway Park right before the Red Sox game, and there was something similar. Since the last show, though, my live music experience has generally been limited to bar blues (meaning the setting rather than the musical structure), and your more ecumenical sort of outdoor event (a number of bluegrass festivals in that last category). I've really come to appreciate the summertime show that can get multiple generations up there and dancing around. Somehow, I've drifted away from any venue where you might readily spot a defiant youth.

I went to my first "real" concert when I was 15, when a friend dragged me along to go see Cheap Trick at a small outdoor venue. Good times, enjoyable show, but there weren't a lot of 'em I needed to see at that age. At 13, safe to say that I had no friggin' clue at all about the music scene. My daughter, well, hasn't been quite the same. She's been following a Canadian band for about a year, and in February not only got her chance to see them, but actually got her picture taken with her favorite singer (supervised, thank goodness). Since then, she has become totally insufferable, branching out into an appreciation of the general scene, blasting the radio or plugging herself into it, and coming up with all these annoying expectations.

What's her music like? I'm googling up some descriptors of the genre, including punk, rock, pop, and emo, but I probably would have used none of those words. Or maybe you need to use all of them, either as some kind of post-generational fusion, or (depending on the artist) the usual approach to the lowest common denominator. I've found it to be musically competent (even if it failed to melt my face off, dude), and it doesn't suck out of the gate. I'd describe the sound as something resembling dance tracks that someone finally decided would be more worthwhile with actual instrumentalists playing them, and with songwriting that actually aspired to care what the lyrics said. I'd hazard a guess that the fan community occupies some transitional ground between the factory-made teenybopper garbage and whatever the college kids are into nowadays. Or maybe it's the legitimate big thing--who can tell now that there's no radio anymore?

Suggestively, it rises to about an entendre and a half, which appears to fly right over the heads of the fangirls (and it's just as well). I can't tell if the whole thing has been strangled down to semi-authenticity by what's left of the music industry, or if they're all just resigned to not out-doing their parents and grandparents. I mean, our rock icons have already got the sexual ambiguity, religious affront, tuneless shouting, drug culture, music composed entirely of sampling, death iconography, creepy body art, and angry rebellion covered, so what's left to piss off the parents? All that hasn't gone out of style is the stuff that never will: sex and youth. I mean, if these guys were to pump their fists and scream how it's all bullshit anyway and fuck The Man, then Mom and Dad are going to be cheering louder than the kids are.

Anyway, I digress. Here's the scene from a couple weeks ago. "Please Daddy, my friend already bought tickets. I love this band, you can't say no."

Now, I am a pretty permissive parent, but telling me that I can't say no is right up there with telling me that that's all you can eat--I'll show her what I can or can't do. Also, I really didn't want to have to deal with it, so I thought that saying the dad thing played up nicely to the family gift for contrarianism and reverse psychology. "No way are you going to a concert without an adult. Thirteen years old? You must be joking. Hell no." (Yes! Triumph!)

Which is how I ended up with a ticket to go see All Time Low on Friday, along with a couple of other bands that fanned out a little bit in either direction on the teenager/adult spectrum. (The headlining band collected bras, which is a bit creepy given the fans' age, but I suspect they weren't removed at the scene. My daughter's friend brought some pajamas in her bag to throw, but we were too far away to reach, and had a better chance of hitting the sound guys, and so refrained.) As mentioned above, it wasn't the music that was so bad, but the crowd was definitely ...offputting. A two-hour drive with the Fenway traffic, and the beer was overpriced and crappy, but the line to the bar, as well as to the men's room, was non-existent. There were 13-14 year old girls as far as the eye could see, and I failed to pass myself off as a teenager, even though I tried to dress down. On the second-tier section, where we were, the kids all lined up along the balcony, and there was enough space behind them for the straggling minority of parents to mill around and look bored. The cheering was decidedly high-pitched.

(Late in the show, they put the Bruins game on in the bars, and sometimes male cheers would sound up out of nowhere, drowning out the kids for a few seconds. I think it pissed off the band a little, but you know, welcome to Boston.)

When I used to go to rock concerts, people would hold up lighters during the inevitable ballads. (At the one Grateful Dead show I watched, the place looked like a Christmas tree the second the lights went down, as the lighters were put to more normal use as well.) Now it's constantly-waved iPhones and cameras, to similar audience effect. It's weird without the (absence of) smells, but smoking is now outlawed here in Massachusetts in just about all public places (the single most benevolent accomplishment of the nanny state), and it's weird without the general intoxication, but most of the audience was too young to drink. There wasn't much press of crowd up in the balcony, as I said, but it looked somewhat energetic down below. Kids still mosh, evidently, to the varying approval of the bands.

My daughter and her friend snapped about ten dozen pictures, and at least one bedroom now has a new All Time Low shrine. And of course I'm curious what it will grow into. But what's the rush? Older comes before you're ready anyway.


Close it out, kiddos:
Maybe it's not my weekend, but it's gonna be my year,
And I've been going crazy, I'm stuck in here...


Tuesday, May 03, 2011

Perspective

What usually jars is the racism. You're reading along, and suddenly some horrible slur leaps from a dead author's pen. Ethnic characters are confined to minor roles, and generally reduced to accommodate popular caricatures. Hook-nosed and oleaginous Jews haunt the boundaries of European literature; pickaninnies and injuns pop up to offend from American books (and hell, from American television in living memory); British novels are populated with innumerable demeaning extras from the various colonies. Examples are so trivial and common that it's hardly necessary to hunt for them. When race is addressed consciously, there's some threshold of skill above which the thinking of the time could be exposed and subverted, but even that was still done from a worldview that included the racism. I am thinking of your Faulkners or Twains or Conrads in that second category, who showed us whiteness with its scars and its travesties, and were observant and capable enough to complicate identities and entertain true personhood, but still used racial characters to tell stories about what it meant to be white in those times, even with the knowing gleam that it meant being a monster. Looking back, you almost wonder why those great minds tortured themselves around a now-obvious empathy, why they instead elected to develop complexities which didn't fail to include the simplified foreignness, but of course it's how they, or everyone around them, were used to thinking about darker people. More than that: it's how they were used to observing them. We underappreciate how difficult it was to look past the prejudices that their society was built around, as well as how thoroughly we fail to see the ones which inform our own. Writing character fiction is so much extrapolation of ourselves into alien minds (and they all are alien), and even extending the map as far as possible still communicates something to readers about you and the worldview you inhabit. It had to be hard for a 19th century white man to write realistically about black identity and experience, especially when it was not customary to sit and have a conversation on an equal footing. (There is plenty of historical literary tendency to dismiss women too, but at least there, a fella had incentive and excuse to occasionally talk to them.) Can we judge a writer for being part of his times? Maybe and maybe not, but I think we can judge their times. Ours too.

[Jews may have gotten a head start on rehabilitation in the western canon. I was interested to read, for example, how Charles Dickens revised Fagin after the original publication of Oliver Twist, following the feedback of Jewish friends. But his subsequent efforts to create empathetic Jews still seem a little patronizing, don't they? Fifty years later, and I thought that James Joyce was a smidge patronizing to Leopold Bloom too, despite all the effort at a realistic in-the-head representation of him.]

There are a couple of associations I've had in my life that, while I don't really approve of the traditional view, were nonetheless wonderful experiences. As an adult, this fills me with fondness and apology, tortures me with ambivalence, and presents a lot of conflict about institutions and individuality, thanks to good personal experiences in them and the quality of people that inhabit them. (Sound familiar?) A big one of those was the boy scouts, which I loved for some of the reasons I was supposed to, and also for the aspects our little band of losers, misfits, and assholes refused to take seriously. Recently, I was reminded of Boy's Life. (Can I now justify leaving that perplexing comment?) When I was a kid, I used to go to the library and pore over that magazine, skipping to the comic serializations of John Christopher's Tripod stories (since I stole this person's thumbnail, go ahead and check the thing out at length on their blog; I get a kick out of the dangly schlonginess of that lone Tripod tentacle), and staying for the boy's outdoor adventure porn. The Tripods would put a little mesh hat on you, and you'd go through life hypnotized, oblivious to their nefarious alien schemes (which of course I no longer remember; probably they were stealing our precious resources, aliens always do that). Science fiction likes to invent reasons to blind people to the horrifying truths, but I think the reality is more banal.

For example, the boy scouts. Setting aside the recent blowups the organization has had over gay members and over the point at which it discards moms, the weird thing about the boy scouts is that they are, at heart, a military-friendly organization. We've got the uniforms, the regimentation, the pledges and purity oaths. (About half of my leaders were veterans too, and, I should note, good people, but I don't want to confuse anecdote and data here.) More than that, there's the history. I mean, scouting is a combat job. Lord Robert Baden-Powell, the revered founder, looms over the movement like some benevolent spirit, cast in fading colors in a Stetson and fatherly mustache, more symbolic than real, like the George Washington of self-reliant boys. In life, he was a career military man, advancing Britain in its imperial heyday, forging his scoutcraft in Africa and melding it with a naturalist appreciation there, fighting in India and the Mediterranean, rising rapidly through the ranks to cement his reputation leading a miraculous resistance at the siege of Mafeking in the second Boer War. It was a decidedly odd sensation to roll across a boyhood icon in one of Churchill's histories, and you know, it's a far different perspective than what I got when I was 10. The Boer War is remembered for the Brits' innovative use of pestilential concentration camps to their military advantage, and as Wikipedia notes, the boy soldiers (participating in a civilized junior capacity) at Mafeking contributed to his ideas to promote military scouting skills to kids too. And look, I don't want to demolish the man's reputation so much as I want to develop ambiguity and complexity about it. He was an important figure in a monstrous enterprise as well as an educational one. Baden-Powell may well be a shining example of personal discipline, a Kipling-esque model of integrity, a genuine survivalist and naturalist. On the other hand, what reason is there to think he didn't order improper executions, or send legions of expedient locals to their doom? Any cause beyond revisionism to believe he wasn't impressed, however naively, with fascist ideals later in life? I mean, the overlap in style is a little discomfiting. Was he not also a propagandist, a guiltless and decent face to paste on Britain's foul imperial reach, monarchical infestation, and heartless butchery of the dusky hordes? I don't remember any of those things getting much attention when I combed the back issues of Boy's Life.

We might call the man a product of his times too, and aren't we all. Baden-Powell shot people that, to his understanding, it was okay to shoot. He operated nobly within his idiom, which is the usual and understandable approach to the human experience, but the legacy of that worldview is still actively fucking up the globe. And things like imperialism and peonage, violence and exploitation, deforestation and extinction, persist because people are inclined to make the best of their various situations, and not push much against the bounds they're born into. (Should they? How should they? Isn't revolution its own evil?) It's hard to bust out of the paradigm. It's hard even to identify it. I am certainly doing no better, and I admit that paradigms can come with some redeeming features too.

Do you remember how you felt in 2001 when you saw this?

Disgusted, angry is how I felt. How does it compare to these assholes?

In 2011, it's hard to find a picture of those cheering Palestinians (of which there were evidently as many as a couple dozen) that isn't linked to some really noxious blog. The cheering Americans are all over the quality outlets, but hey, it's more recent. (It took me till the ride home to find the right expression of my distaste, and of course I only found that I'd been beaten to it, but at least that was one less photo I had to look up.) Similarly, it's almost been enough to make me swear off Facebook.

I don't read books for the lessons and I hate to preach (really, what the hell has happened to me?), and by no means do I suggest giving up on the literary canon. Beauty, insight, and entertainment are justification enough, and the apologies get easier the farther you go back in the past, or the more disconnected you let yourself feel. (Parenthetically, it took a while to understand how lucky it is to be so removed. I remember I took a class in college where we were instructed to read Jane Eyre and Wide Sargasso Sea back to back. It was an interesting contrast, I thought, but I felt like an outside observer to both narratives, and no doubt still would. For some of us doofi, this empathy thing takes years.) But the act of working out context, of mapping our own worldview onto the alien mind and strange times of a great writer, is a project with some nice side benefits, not a bad tool when it comes to building understanding and perspective.

They'll judge us for our times too, possibly by some distasteful or unanticipated standard, or maybe on standards we just prefer not to admit. 19th century racism and imperialism didn't exactly go unopposed. Maybe it's worth asking what we are ignoring in our bliss. As for me, I tell myself that I am at least struggling to an awareness of the paradigm, that at least I won't celebrate my cognitive dissonance. It's not like history will view me any better.

Wednesday, April 20, 2011

Review: Brief Interviews with Hideous Men, by David Foster Wallace

In 1985, Richard Feynman presented a suite of ideas to a Japanese audience about the future of computing, the text of which was included in The Pleasure of Finding Things Out, the second half of this paired review, coming to a blog near you, um, real soon now. Feynman, of course, was a remarkably clear thinker, and is remembered for being rather accurate in his predictions about these things (or at least enough prognostications hit the mark that lectures could be plucked 15 years later to support a reputation of prescience). He talked about miniaturization, which was obviously an early trend (and a safe prediction in '85!), and parallel computing is also something we adopted before very long, although scientists more interested than me can tell me how closely and how well he called that one. One of the ideas he threw around that was new to me was about using (more) reversible processes for low-energy computing. Imagine the inside of your computer shredded down to the very cells: at the component level, the basic function of almost all the devices--transistors and diodes--is to control the flow and direction of an electrical current, opening and shutting like little informational ratchets. When they are powered, they're meant to be irreversible. As you wire these up together into little logic structures and present inputs, as shown in the NAND gate from his lecture (it could have been any of them: NAND returns 0 if both inputs are 1, and returns 1 otherwise), the gate presents a new output and then neglects what brought it there. If the output were allowed to roll back through the gate, then it would become meaningless--you have to run each device at energies many orders of magnitude higher than thermal diffusion so that the gate does not do that. You have to supply enough power to make sure your computation rolls down to the very end.
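To make the information loss concrete, here's a minimal sketch of my own (not from the lecture). Three of the four input pairs to an ordinary NAND collapse onto the same output, so there is nothing to run backwards:

```python
# An ordinary NAND gate: returns 0 only when both inputs are 1.
def nand(a, b):
    return 0 if (a, b) == (1, 1) else 1

# Three distinct input pairs map to the same output, 1. Seeing a 1 come
# out, you cannot say which pair went in -- that's the lost information.
for a in (0, 1):
    for b in (0, 1):
        print(f"{a} NAND {b} -> {nand(a, b)}")
# 0 NAND 0 -> 1
# 0 NAND 1 -> 1
# 1 NAND 0 -> 1
# 1 NAND 1 -> 0
```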

If, on the other hand, your output function preserves the information which produced the decision, then, said Feynman, rolling back over the little hills wouldn't be a big deal. So shown here is his picture of a reversible NAND logic gate. Letting the system diffuse backwards is no longer so disconcerting: it can slosh back and forth at the local scale, so long as there is a net energy gradient pushing the whole thing forward. It may well end up being slower than CMOS, but it'll use far less energy.
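I won't reproduce his figure, but the textbook stand-in for a reversible NAND is the Toffoli (controlled-controlled-NOT) gate: pass both inputs through untouched, and flip a third wire when both are 1. Feed that third wire a 1 and its output is exactly NAND, and since the map is a bijection on three bits (its own inverse, in fact), running it backwards destroys nothing. A sketch, using the Toffoli construction rather than Feynman's particular gate:

```python
from itertools import product

# Toffoli (CCNOT) gate: a and b pass through; c flips when a AND b = 1.
def toffoli(a, b, c):
    return (a, b, c ^ (a & b))

# With c fixed at 1, the third output is NOT(a AND b), i.e. NAND --
# and the inputs ride along with it, so nothing is forgotten.
for a, b in product((0, 1), repeat=2):
    print(f"in=({a},{b},1) -> out={toffoli(a, b, 1)}")

# Reversibility: applying the gate twice restores any input. It's a
# bijection on three bits, free to slosh backwards without garbling.
assert all(toffoli(*toffoli(a, b, c)) == (a, b, c)
           for a, b, c in product((0, 1), repeat=3))
```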

[And let's do have fun thinking about that a sec. Imagining a characteristic clock cycle or distance between gates, could computation still occur very well as we encroached on that spatial or temporal period? Couldn't we have a low-activation energy irreversible computer? Presumably that would run along a cascade of chemical reactions—would that DNA computer I talked about a few years ago qualify? Would you bother to make a reversible logic gate out of transistors, or would you find some other element?]

I suppose that's the way that statistical (thermodynamic) processes tend to work in nature, cumulatively, but since we're talking computers, doesn't that seem like how we think too? I mean that as a personal observation and not a scientific statement, that is, I don't know if it amounts to a serviceable model of neural behavior, but as far as the way thoughts roll along in my tiny brain--surging forward and racing back like waves, caught in loops as they chase their imaginary tails, maybe making some forward progress but only with effort and a great deal of redundancy, reaching conclusions but only after gently wearing in the path and trucking the assumptions along--it seems to be right on. I'm happy that it does go forward, at least sometimes, and maybe it's as useful a scale of intelligence as we'll get. I can almost hear the engine roaring along for some people while the clutch fails to engage, and it may be Feynman's genius that he was better able than most people to keep it clicking forward (and was fearless enough to let the path take him where it led).

And this may well be one of the pleasures of fiction too, to extract the linear trends out of the highly recursive subjective experience. I think it's good writing to engage a deeper understanding of what cognition and communication really feel like in the extreme close-up, but on the other hand, there's a reason we've been lying to ourselves about it for 6000+ years. We humans seem to find this intellectual muddling forward to be just a touch unpleasant, and like to make our stories about sequences and decisions. And then let's throw into that bubbling stew some other human frailties--depression, negative comparisons, failed standards--and a deep awareness of the process can start to be a real bitch. Self-loathing can achieve a very special circularity among smart people, where it leads to an analysis of your own character, which leads to an understanding of why you're miserable and how you've failed to change it, which is loathsome. It may be astute, but how much of this does anyone really want to read?

This is the spirit that suffuses almost everything in Brief Interviews with Hideous Men. (There are many such brief interviews, and the whole book includes those among a long series of other vignettes and shorts.) The title of the book is the punchline: the many depressed or disturbed characters are exposed by the text as, in fact, horrible people, despite their lies and notwithstanding their occasional awareness. I hate that I got sick of them halfway through, and after a while I stopped distinguishing them so well, particularly in the "hideous men" sections, where everyone shared the same mannerisms (unpunctuated quote-unquote speech qualifiers and so forth) and a universally easy intellectual access to the world of the therapist's couch, whatever the actual varied settings of the interviews. Too much self-help-styled recrimination hanging under a very affected textual experiment. Eventually it all started to sound in my mind like the relentless thumps and scratches of David Foster Wallace flogging himself.

I don't think it had to connect in such a leaden fashion with me. There are points of this book that are funny, which would have worked even better if he had made his jokes and kept things short. I wish there were more of them. I remarked to Schmutzie that some of the sections could read a little like a comedy routine, and a feeling of external context would have done wonders for that, but I didn't often have enough in the text itself to spot Wallace's humor cues (and it wasn't quite satisfying enough to run the experiment assuming it's humor). Some of the situations are sufficiently absurd (there's an episode of intimidation by literal dick-waving) that they're worth a bemused chuckle, and some of the characters attain a self-negating Humbertish pomposity that is entertainment gold (god help me, I thought the episode where the old man elaborates on his deathbed his resentment of his offspring—babies are such users—was hilarious). But that's not an author comparison I'd have preferred to make, because while I can accept the discomforting marriage of felicity and nastiness, it's got to balance, and Nabokov comes off as a far better writer in the matchup, while Wallace leaves me heavy on the hideousness. Likewise, the contrast of their stated equanimity to everything else they reveal in their interviews was well-positioned and entertaining when it was covered the first dozen times, but eventually the mode got too well-used. I do wonder how those parts would have grabbed me as a teenager, when I was more willing to entertain that "what women want" could be compartmentalized from their existence as human beings.

Wallace tried to break the short story form often and consciously in this book. Breaking the fourth wall and addressing the reader worked much less well than taking apart the language and monkeying with it. There is a story—told disjointed, tinted, and almost poetic—that elevates a personal tragedy into something almost beautiful and dreamlike. I also enjoyed the freewheeling retelling of a California drama in the style of some futuristic mythic versificator. The skills are there, and more playfulness and less hideousness would have gone a long way toward my enjoyment. I don't want the self-loathing spiral to seem even more shallow and yet inescapable.

Wednesday, March 30, 2011

Yes Asshole, the Rich Are Getting Richer


I visited my parents last weekend, and as I've mentioned before, these excursions usually include a lazy Sunday morning with the old home-town news rag. The city of Waterbury is most recently famous for the frightening habits of former mayors, but many years before that (maybe stretching into my early youth) the region I grew up in was part of an important industrial hub. It's an urban mix peculiar to the northeastern United States: decades of corporate flight that should, you'd think, have given them some perspective by now on how difficult it is to run a city—whipping up its economy or providing services, depending on which church of ideas you attend—when the local hiring firms keep disappearing and abandoning the tax base. Despite this long trend, the Waterbury paper remains a bastion of conservative opinion, dancing with the one that brought 'em (down) over this timespan, and is currently and constantly worked up about Big Government as well as the scarier, swarthier immigrant population which is no longer from European countries that begin with the letter "I" (and which lacks those erstwhile job prospects). [Although maybe there's something to the Big Government points: no doubt any incentive the city can now offer—tax breaks, loans, industrial sites with all the hookups—is piddling consolation for taking away the freedom to dump all of the tailings you can directly into the Naugatuck river, and that pesky Superfund law clearly did slow down that mall project a few years ago, but those fine, fine minimum wage retail jobs got there anyway, they did.]

Which is not really what I'm getting at, beyond saying that the old local paper grants me a special annoyance. When my wife turns on the news at home or when I drive back listening to All Things Considered, then I only need to marvel at how extreme the mainstream has become, and the distress doesn't last. When I steal time online, I see the accepted conservatives actually subjected to the comments they so richly invite, which sates me enough to not have to write anything about it myself. But when I go to visit Mom and Dad, and the op-eds are framed in official-looking black-and-white, and what letters filter in through the editors are as supportive as they are illiterate, when, editorial policy aside, it's a pretty decent paper for its market size, then I find that no one is getting livid here but me. Among the usual inveterate assembly of Malkins and Wills and Krauthammers, the Republican American trucks in your more shameless (and artless) variety of deniers and class warriors for its editorial page. On Sunday, it was some guy named Steven R. Cunningham of the American Institute for Economic Research. (The version I read is behind a subscription wall, but you can find plenty of copies online from other venues.) I don't really know anything about the AIER, other than that it's in the most beautiful part of Massachusetts, but if this guy is an example of its alleged mission of objective economics education, unbeholden to concentrations of wealth, then its founders are surely spinning in their graves. More likely it's just your standard pro-power think tank. Cunningham is out to skewer "one of the most enduring economic myths": that the rich are getting richer.

"It isn't true. When most people think of the rich, they probably are thinking of people with great wealth. When they think of the poor, they probably are thinking of people with low incomes. While there's obviously a correlation between wealth and income, they're not the same. And we shouldn't confuse them."
I'm going to limit the line-by-lines for this guy—the full FJM treatment is not my bag, and I'm not quite making that the central point either—and excessive charts are boring (I'll link for you though), but that article should not pass without comment. From the opening graf, he proceeds to thrash, not this alleged myth, but a fairly irrelevant straw person, noting that people who have the most wealth are generally older, which may or may not mean something or other, but certainly leads the reader away from the important distinctions he didn't make between wealth and income, and rich and poor.

Most people consider "rich" or "poor" to describe a state of concern about meeting basic necessities and extraneous pleasures, and, once those things are taken care of, about attaining status. I suppose it's nice that we weren't treated to the usual false equivalence between modern pleasures (like iPods and TVs) and necessities (like cost of living and, if we wish to outlive our pre-industrial counterparts, medical care, and, of course, the indenture that most people accept to attain those things), but since richness is in part something you feel, it's not surprising there's some subjectivity in the definitions.

But income and wealth are more quantitative. And yes, the distinction is important, but as an economics educator (yar har), Mr. Cunningham might understand that the reason people prefer to discuss income distribution is that it's just easier to come by. Most countries keep statistics on this sort of thing, which is handy if you try to make informed economic arguments, comparisons between countries, and other stuff you'd think would be important to economics educators. Wealth assets, meanwhile, are more varied in form, and more likely to be undisclosed and private. Wealth can mean less liquidity than the numbers on your paycheck represent, but let's not kid ourselves that the "wealthy" are exemplified by the old people receiving fixed annuity payments in Cunningham's hypothetical anecdote, or that the very wealthy are in any way not rich. Steve-O is counting on his readers to neglect looking very closely at the wealth distribution he mentions, which in Waterbury is evidently a good bet. When you do look at those numbers, they're far more damning than income distribution when it comes to inequality: the top 1% own about 35% of the wealth, and the rest of the top 20% own about another half. It's slightly less unequal in terms of net worth (because lots of people have home equity) than it is for financial wealth, but either one is sufficient to roundfile his whole thesis. Yes, based on analysis of wealth, more of it is concentrating in the upper levels, and yes, there is less distributed among us proles in the lower 80% as time goes by. The rich are getting richer, and the wealthy are getting wealthier.
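(If you want the leftover share made explicit, here's the back-of-the-envelope version; the two input shares are just the approximate net-worth figures cited above, nothing fancier.)

```python
# What's left for the bottom 80%, using the approximate net-worth shares
# cited above (top 1% ~35%, next 19% ~50%).
top_1_share = 0.35
next_19_share = 0.50

bottom_80_share = 1 - top_1_share - next_19_share
print(f"Bottom 80% of households hold about {bottom_80_share:.0%} of total wealth")
# -> about 15%, which is the "less distributed among us proles" part.
```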
"For example, from 2000 to 2009, inflation-adjusted household income fell 4.5 percent, but consumer spending increased 22.4 percent. This raises an obvious question: How did people dramatically increase spending on shrinking paychecks? The answer is: They didn't."
Hey, I wonder if anything else changed in 2000-2009! I won't keep you in suspense. Among other things, household debt increased in this timeframe by about 12% per year, while income fell as stated. I'm sure there's a relationship to spending here somewhere.
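To put a number on how viciously that compounds, here's the arithmetic as a sketch; the only input is the roughly-12%-per-year figure above, compounded over the nine years from 2000 to 2009.

```python
# How much does household debt grow at roughly 12% per year, compounded
# over the nine years between 2000 and 2009? (Rate per the figure above.)
annual_growth = 0.12
years = 9

multiplier = (1 + annual_growth) ** years
print(f"Debt multiplier: {multiplier:.2f}x")
# -> about 2.8x: debt nearly tripling while real household income fell 4.5%.
```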
"They did increase spending. But paychecks weren't shrinking. Instead, the number of individuals per U.S. household was shrinking, which lowered the average. Real disposable income, which is essentially total after-tax income, rose 25.2 percent from 2000 to 2009. At the same time, however, households got smaller, as more people divorced, or rejected or delayed marriage. So total spending went up, while average household income - due to the larger number of households - went down."
I'm the last person to buy into the idea that economics is a field with engineering precision and scientific understanding, but we can still endeavor to put useful numbers to this sort of thing for the purposes of estimates, and some of these institutional numbers are publicly recorded and pretty easy to come by.

Households only shrank a little in this time period, but there's definitely been a downward trend since the 1960s. Cunningham is probably tooting some social dogwhistles here, but the down-slope has correlated pretty strongly with decreased fertility rate, and the smaller households are largely a result of there being fewer children in all, and more people living alone (e.g.). We can look at this a little more objectively using useful variables such as the dependency ratio—the ratio of the too-young and/or too-old (depending on how it's broken down; the latter is usually trotted out for Social Security scare stories, but the former is more relevant to household size) to people of working age—and this has also declined in the cited time frame, with most of the decline coming from, again, fewer children. We can also look at the participation rate, which is the fraction of working-age people who are actually working. Eyeballing the graphs and applying some simple math gives relevant ratios:

2000: 1.1 children per worker
2010: 1.3 children per worker

So people are, on average, supporting more kids, contrary to Cunningham's statement, but hold the phone for a minute here... The trend since the sixties has been fewer children. If you click on the chart for participation rate, you'll note another awesome economic revolution that happened in about 2000, when the bubble burst: all of a sudden there turned out to be a lot more people than jobs. Household size shrank slightly, but the drying-up of available jobs affected the number far more.
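For anyone who wants to replay the eyeball math, here's my guess at its shape. A sketch only: it reads "children" loosely as "non-workers," and the employed shares below are illustrative stand-ins for whatever you take off the linked charts, not official figures.

```python
# My guess at the arithmetic behind the ratios above: treat everyone who
# isn't working as a dependent, so dependents-per-worker is
# (population - workers) / workers. The employed shares are illustrative
# stand-ins eyeballed from the charts, not official statistics.

def dependents_per_worker(employed_share):
    """employed_share: employed persons as a fraction of total population."""
    return (1 - employed_share) / employed_share

for year, share in [(2000, 0.48), (2010, 0.43)]:  # hypothetical chart readings
    print(year, round(dependents_per_worker(share), 1))
# 2000 -> ~1.1, 2010 -> ~1.3: more mouths per worker, as stated above,
# driven mostly by the post-2000 drop in the employed share.
```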

[Most of the post-1960 growth in the participation rate is due to women entering the workforce, which no doubt has contributed to the decreasing fertility rate as well. But the overall effect of these two trends before 2000 (not to mention extant retirement schemes since 1935 or so) has been to drastically reduce the number of dependents per worker. Cunningham is, of course, prevaricating here in a general sense, even if he's picked out a little patch on which he can daub some bullshit. Fewer dependents from 1960-2000 probably did help people feel richer, though, but I don't think the increased participation rate did. Are you richer when you need two incomes to do what your dad managed with one? (If you are a woman, you may indeed be freer.) This is a reason that household income is relevant.]
"The problem is that we are not told that the top 20 percent of households includes four times as many workers as the bottom 20 percent, and nearly six times as many full-time, year-round workers. Knowing this makes a lot of difference in interpreting the original statement."
People in lower quintiles have fewer earners per household, but they also have fewer children to support. The ratios of earners per household are pretty shocking, really. The summaries in the Wiki article are consistent with all of the above figures. And what exactly do they prove? Rich households, we are pretty sure from experience and data, do not tend to have six earners in them, not without some hefty violations of child labor laws, nor are they sprawling complexes filled with in-laws and cousins, or at least that's not the sort of arrangement that pops up on the lifestyle shows. Rich households (obviously) top out at a little less than two earners per home. You have to conclude not only that poorer households include a significant fraction of zero-income people, but that, to approach four-to-one, over half of them must have no earners in them at all. And among those working in the bottom 20%, most are only doing it part time. This seems to be a horrifying reason that they're poor, and not really a point in favor of the awesomeness of the rich. I mean, it's another way of looking at unemployment—of course the lowest income group is going to include all the people who have zero income—but these numbers are telling us that that comprises a hell of a lot of people. Is this sinecured fucker really ginning up contempt for all those lucky duckies with no jobs at all? (Yes.) And it's clear from the same data that working part time isn't going to do a damned thing for you either.
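That over-half conclusion falls straight out of the arithmetic. A quick sanity-check sketch; the 1.9 figure is my stand-in for the "a little less than two" observation above.

```python
# Sanity check on Cunningham's four-to-one earners figure. If top-quintile
# households average a little under two earners apiece, the implied
# bottom-quintile average and the minimum share of zero-earner households
# follow directly.
top_quintile_avg = 1.9   # assumed: "a little less than two" earners/household
ratio = 4.0              # the cited four-to-one top-to-bottom ratio

bottom_quintile_avg = top_quintile_avg / ratio
print(f"Implied bottom-quintile average: {bottom_quintile_avg:.2f} earners/household")

# Earner counts are whole numbers, so even if every earning household had
# exactly one earner, the share of households with any earner at all
# cannot exceed the average itself.
min_zero_earner_share = 1 - bottom_quintile_avg
print(f"At least {min_zero_earner_share:.0%} of bottom-quintile households have no earners")
# -> average ~0.48 earners/household, so over half the bottom quintile
#    earns nothing at all.
```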

And needless to say, the income that the quintiles receive is well-publicized, and only the top two have really made gains, before or after taxes, in forty years. The fact that the lowest quintile is largely unemployed does not refute this.
"Yet, economic mobility is a characteristic that helps differentiate the United States from many other countries. Between 2004 and 2007, for example, roughly a third of the households in the lowest income group moved up to a higher income group, according to the Census Bureau, while roughly a third of the households in the highest income group moved down."
Sure, income mobility is a great thing in this country. The fact that it's less great than it used to be, or is less great than in other countries (even historically aristocratic ones), well, we peasants should shut up and be thankful for what we've got.

#

And look, I'm sorry, but that had to be exorcised. It used to be even longer. If I am so motivated, I'll take out the hyperbole and send it as a letter to the editor to be unpublished. Here's the part that's getting me, though, the actual point if you want to call it that. I can understand why people employed by the AIER write and publish this sort of crap—they're paid to, directly, by people who have the wealth and power they're apologizing for—but I'm disgusted by people who continue, despite evidence, to lap it up and sell it on the retail market. You'd think that anyone above drinking age might have noticed some general economic trends by this point, and yet the entire News-o-verse has already let go of Two Thousand Eight. I don't expect Truth to be folded up and handed to me, but a little more than a thin gruel of shallow marketing disguised as evidence would be okay. Hell, just losing the certitude would be a plus, especially when you're peddling the same crap you were two decades ago, during which time the wealth community has gotten pretty much everything it's asked for. At least this Cunningham guy's an obvious whore, shilling for interests that aren't mine. What the hell is the editor's excuse? What power is he speaking the truth to?

The game I usually see played with things like this (and that I am playing here too) is one of competing narratives, of different takes on the same data. Commenters like me don't usually angle straight for the lie and call it. We like to see the twists of truth instead, different takes on it, and target a rebuttal that harvests the seeds of refutation that the writer himself sowed, and there's plenty of that here. (Maybe this thought would be better spared for the next someone who is inclined to fact-check a Megan McArdle column or something, but whatever.) But Steven Cunningham is doing more than misleading with statistics. When he poses the idea that the rich are getting richer and leads with "it isn't true," it's a baldfaced lie.

Thursday, March 24, 2011

Review: Axis, by Robert Charles Wilson

Axis is the sequel to Spin, which I (much more succinctly than usual) enjoyed. A worthy followup, I suppose, but the enjoyment this time around was somewhat less. (Spin was well-received and won some awards, and maybe Wilson worked fast to sell a few books while his brand was hot.) I found the characters likeable, but not especially compelling. Or rather, I found that the interesting people were the ones who spent most of the novel offscreen, while protagonist Lise Adams, in her effort to find out about the disappearance of her father, gets turned through too many expository circles for the purposes of info-dumping. Wilson does a lot of telling of the background, stuff which, in other novels I remember, he was decent enough to fold into a story of discovery or else just leave decently unanswered. It might be because this installment has a substantial backstory that it needs to deliver to the reader, in order to get to the questions of What's Really Happening down there with the deep magic.

He gave himself a lot of good stuff to work with. There's a sudden injection of the world into a universe a couple billion years older, populated a bit more completely by a thin, slowly expanding skein of self-replicating hardware. There's the big puzzle of its baffling, disconnected attention to human society. There's humanity's brash attempt to understand it, a sort of self-exiled biochemical Manhattan Project with human subjects. There's the boy Isaac (said subject) and Sulean Moi, an unwelcome observer in the compound, getting on as outcasts' outcasts. There are competing ideas of human anthropological development in different circumstances. The polyglot colonial landscape that sets the detective and chase scenes is well-conceived too, but, while not especially horrible, that particular plot was only just enough to keep the pages turning until the last quarter of the book, where the characters finally come face to face with the strangeness. I could have done with more Sulean and Isaac, more evidence of crazed obsession among the true believers. More internal conflict needed here please, and less 'splainin.

And while I liked the large questions that Wilson plays with, a little more philosophical meandering on the ideas wouldn't have upset me too much either. I mean, he's basically, and later explicitly, offered a plausible--on a science-fiction level--conception of what a powerful and indifferent god might actually look like. And it's just a damn cool idea: a universe that's full of designed machines that (very slowly, and with the aid of some fourth-dimensional physics without which they couldn't cover much volume (these don't even get a handwave, which is just as well)) reproduce and expand and communicate with each other among the cold vast reaches of space. Is it evolutionary and insensate, the characters ask, or is it thinking out there, mimicking meat heads on an impossible scale? Does it live and die too; is it finite? The manifestations of the big celestial mind, the behavior of its "cells", are pretty cool too: machine-like, life-like, weird, and pretty innovative when seen from the ground. Its use for civilization, we learn, is to grow itself. Biological societies at a certain state of development will eventually launch hardware into the void, and when they encounter evidence of that network, they will want to swap collected information on that scale too. Maybe, like our jelly life, the cosmic mind is out to create pockets of information in eternal defiance of the second law.

It's likely that many of Wilson's novels (well, the ones I've read) could get retconned back into this same universe. The themes presented here are definitely his usual schtick, which I've always liked. The couple of requisite moments of sentimentality are not forgotten, finding compassion in that weird juxtaposition of the cosmic against the human.

Sunday, March 06, 2011

In between the bright lights and the far unlit unknown...

You know, there's a pretty good incentive to get that current turgid blob of a post off of the top of the queue and try to write something which, at least for me, passes as entertaining. Well, better luck next post. That I'm feeling somewhat less than entertaining is obviously part of the problem, and my mood these last few weeks has been anchored by the disheartening realization of the mutually exclusive financial realities of cultivating my own damn garden and giving my children future opportunities to do the same. If only...blah blah blah, it's not like I haven't been over it before. And anyway, gardening has its own share of commitment and frustration. Even in an ideal world of unfettered self-actualization, it's hard to figure out what your passions and skills are, and in this bizarre world where livings have to be made, good luck on that passion, if you have one, keeping you fed. At thirteen years old? Did you know what you wanted to do at thirteen?

I'm not unaware that I'm talking about problems of relative privilege. What I'm really doing is pissing and moaning about the governing social paradigm (handy concept, that) which for all of its papered-over inequity, evil, structural inequality, and destructiveness has in this country at least managed to foster a middle class full of crackers like me for almost 80 years now. Lots of different treadmills, many of them decently upholstered, and even those of us without connections have some options about which one we hop on, and the sooner we decide the sooner we can put in a down payment.

Okay, at thirteen I did have a vague idea, if not a passion. I'm thinking that I must have seemed like a promising kid, and god knows that my parents tried to keep doors open and encourage things. When I was little, I alternately wanted to be an astronomer or a chef (I was joking to a friend last month that I split the difference and went into chemistry). Mom cooked a lot at home, so that makes sense, but I don't know where the science bug came from. And all these years later, I am acutely aware that there's something that keeps me apart from passionate scientists too, and that I'm a mediocre performer, and I have reservations about the role of the field and its future, but I don't know what the hell else I'd do (although doing honest work with a science hobby seems like it could be more rewarding than the current arrangement). That general orientation helped my parents do what they could to get things started in my life. Strange to think of it that way, but I was pretty lucky for that.

My daughter will be entering high school next year. She's already presented with a choice between technical and academic programs, and these too are mutually exclusive. The tech ed seems like a good program, but it definitely takes her off the academic path. If she takes the culinary arts training, there aren't, at a minimum, any advanced placement courses available (I think AP is a scam, but as a synonym for "more challenging classes," this is annoying), and by junior or senior year, it's special alternative tech courses, a recent innovation, thanks to Massachusetts' graduation requirements. The kids, the instructor told me, tend to go on to culinary college, which seems to me like a strange metric. While I'm happy that cooking is taken as a serious vocation in this country now, it's disappointing to be reminded how far down the Player Piano timeline we are. I mean, that's what sets trades apart, right? Learning by doing, and a tradition of apprenticeship? Do you need that cooking doctorate before you take the $4/hr dishwasher job to get started in the actual industry? (I bet the chemistry requirements would be pretty cool, though.) On the other hand, assuming the normal vocational path is still available, I'd be happy to support that route too (with the fortune I'd save an added bonus). Adding to it all, there is my opinion that general high-level education is good for humans, and the way we tend to squander it when we're young, well, that can be good for us too. (Good times.) But if she's got a real passion there, then she's ahead of the game.

And speaking of paradigms, I wish I could shake the sense that this is all about picking teams for the next generation's class structure. There are lots of ways to live even within the system, and since my town offers so few examples of them, we (by which I mean my wife) have been doing a lot of research into enrichment programs for young people. We've just enrolled Junior in what is basically an educational summer camp, which, at least as far as I can gather from the brochure and the orientation seminar, is totally awesome, with not only classes and workshops, but also optional cruises and outings and all kinds of genuinely fun activities—stuff I wish I had some excuse to do as an adult. It's not exactly the Bohemian Grove, but there's a strong networking component here, and there's much they're encouraged to do together in a variety of overlapping groups: let's forge bonds among the kids labeled up-and-comers, build up those intra-class intangibles. A stronger experience is expected for those who supply more cash, and I expect my little girl might find a small cultural divide between her and the residential students.

The administrator of the program gave us parents a short presentation last week, talking about how much more satisfying (and easier) this job is for him than actually teaching middle school. Well, sure: when you run a word-of-mouth sort of program among the helicopter set, and when you keep the riffraff out with a stiff $2500 minimum requirement for enrollment, no credit cards, thanks, then it probably does make it all a little easier. The imagination and enthusiasm are impressive so far, don't get me wrong, but as for the kids without these opportunities or motivations, they're left to the same devices as before.

And for all this, the course themes, all these new opportunities, are all targeting the petite bourgeoisie. They have cooking, woodwork, art and music, production, sports, along with some business- and law-themed classes, and a smattering of medical-industry sorts of enrichment. Some themed chemistry too, I note approvingly. Not a lot of financial analysis or "leadership" training. And it's fine, I guess, from one point of view, as these are all things that people do in the part of working America that doesn't have it too easy or too hard. And it's a fuller set of ideas than we've been able to showcase so far. Welcome to the middle class, kiddo. I'll do my part and start getting used to the crippling payments that go along with your indoctrination.

The future looks vast from far away, and has a habit of shrinking as you meet it. You find the wide open road gets narrower as you walk, and its direction depends on many more people than you. It's great to be young, to be starting out on the journey. I hope my daughter can find more paths than I've been able to show her, and I wish I were better at pointing them out.

Wednesday, February 23, 2011

On The Structure of Scientific Revolutions. Part 2: Kuhn's Epistemology

Here's the second part of my post. These are the points that are more closely connected to the various discussions I had that motivated me to read Thomas Kuhn in the first place. I do want to reiterate that I think that his idea of paradigm- and revolution-based science history is usefully descriptive, and I mostly like it very much. I do take some exceptions here and there, however, and have some disagreements with respect to its universality. Mostly, I'm interested in challenging it against this epistemological paradigm that I've gone and developed in spite of myself.

A TEXTBOOK MISTAKE
Kuhn generally equates the current mature scientific paradigm with the stultifying stuff taught in textbooks. I have a few texts written before 1962, but those tend to be either highly specialized (not yet obviated, I guess, by new ways of looking at things) or else artifacts that I picked up and keep around as souvenirs instead of sources of information. Maybe things were a little different fifty years ago. I mean, yes, textbooks serve to indoctrinate people into the current state of knowledge, but no, I don't think that these texts define very well what science is. To some smaller points of his: advanced book-writing isn't really frowned upon, and I also disagree somewhat that science separates itself from the larger community quite so much. There's a reputation of intellectual superiority that I think scientists vainly like to keep, but on the other hand, premier publications such as Nature and Science really aim for general understanding of highly complicated fields. Or (I just added) go read the Feynman lectures (these clock in a couple years after Kuhn's essay). My introductory college textbooks often talked about past and current controversies, including the paradigms that stuck and the ones that didn't. The story of a gestalt switcheroo that turns a bug into a feature is an enduring favorite. The sort of triumphant narrative of toppling a progression of barriers that made Kuhn bristle? I don't know that I got that one quite as much, and when I did, it was more concerning the early discoveries. (We'll go back to classical waves, in other words, but not to a continuum of angels.) In my observation, the idea of thriving within a heady open-ended scientific crisis period is closer to the idealized self-congratulating story of many "top tier" scientists (as a colleague once liked to say) today. Even here in the dregs of applied science, "innovation" is the name of the game.

I think anyone working in a research field understands that scientific paradigms are articulated almost like a correspondence, a slow-motion argument consisting of innumerable published articles, conferences, and less-formal meetings. Underlying this communication is the normal science that Kuhn describes, but I don't think the subject matter is chosen solely to gratify a bunch of expected hypotheses. The popular sessions at a conference are the ones chasing after the sexy new field and lighting up the current controversies. Scientists, at least certain kinds of scientists, are just plain hungry for anomalies to fight about. They go looking for a crisis. And even for the bigger paradigm busters, there's plenty of room for brilliant kookery (e.g.) out there on the fringes.

Kuhn made a lot of hay about the seminal insights that John Dalton, a meteorologist, brought to the early days of chemical theory. The mode of thinking that he brought from a different discipline gave him tools to look at chemistry problems in a new way. Again, I don't know what it was like in the early 1960s, and maybe it's Kuhn that helped to begin this newer intellectual paradigm, but much like sexy research, digging around for nuggets in other fields is accepted, common, and encouraged these days. "Interdisciplinary" has become a buzzword too.

WHERE DOES A PARADIGM END AND NORMAL SCIENCE BEGIN?
Kuhn often implies that not all paradigm changes are the same, that there's a gradation in revolutionary goodness. Roentgen did more than just articulate his paradigm when he discovered x-rays, but on the other hand, he was no Copernicus. We can scale down and down, too. Every scientific experiment (or thought experiment) has a challenge and a reconciliation built into it. It's meant to test the paradigm and explain the results. Most researchers will be presented with anomalous measurements even in the course of normal science—if everything goes as expected on the first try, then you really are doing common engineering—which they might ignore, fail to notice, or suitably explain within the existing paradigm. Paradigm shift, Kuhn explains, is a consequence of this kind of normal puzzle-solving difficulty, a question only of how important and persistent the anomalies seem to the community. Kuhn also notes that one person's anomaly is another's puzzle problem, depending on what viewpoint they subscribe to. There is no bright line.

Furthermore, while "paradigm" describes the full body of communication, everyone carries around an individualized understanding of it. If other fields (perhaps even the impure ones) are allowed to come in and interact, that interaction can be a source of competing ideas. As we introduce the idea of competing paradigms and subdivided fields, and as we decline to let the unpopular ones quite fade away, we might observe that all of these ideas, dead and alive, can always be compiled into a paradigm-of-paradigms that we can never approach from the outside. I don't have much to add to that, except to note that it makes an unpleasant point from which to voice disagreement, and also from which to advocate, when a paradigm can have a broad or a narrow meaning as the discussion demands.

DOES AN UNDERSTANDING OF PARADIGMS INVALIDATE THE SCIENTIFIC METHOD?
Prior to reading, as well as throughout the text, I imagined the description of scientific paradigms as a meta-construction built around the normal operation of science. Kuhn calls out normal science as the process of hypothesizing and delivering (until the point where this process fails to deliver) expected results. The anomalies he discusses, the ones that are seen as significant enough (and timely enough, and seen by the right eyes) to demand a new way of looking at things, do, he stresses, come out of the normal operation of science. I don't think he means his views to invalidate this established investigatory process (even if they might require us to think of it differently).

I tend to think of the scientific method mostly as a flowchart, starting with observation, study, or review, followed by hypothesis, tests for agreement, and conclusions based on the results of the test, adding to and constantly revising the body of work. I've said that I don't think of this dogmatically, and see it mostly as a general guideline. Kuhn mentions that scientists tend to proceed day-to-day without thinking very much about the rules they're following, and this is true in my experience. I agree that research is goals-biased, and certainly test methods, standards of proof, and so forth are informed by (or are) the paradigm. The scientific method might count as a rough approximation of the quotidian work ("hey, let's see if this idea works"), and even if it's an imperfect decision-making hierarchy, it gets reinforced at the higher level in the conventions of scientific reporting (the customary sections of a paper—Introduction, Experiment, Results and Discussion, Conclusions—restate it outright) and also at the level of scientific funding decisions (write a proposal, and get money to see if it works). The scientific method is a beloved part of our current science-philosophy paradigm, but much more than that, it is also part of a fundamental literary one. It maps the process of investigation onto a classic story: what is our subject like, what happened to him, and how did he change.
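Since I called it a flowchart, here's the loop it amounts to. A cartoon only: the stub functions are hypothetical placeholders for actual lab work, and nothing here pretends to be more than the control flow.

```python
# A cartoon of the scientific-method flowchart above: observe, hypothesize,
# test, then fold conclusions (or revisions) back into the body of work.
# The stubs are hypothetical placeholders; only the control flow matters.

def observe():
    """Observation / background study."""
    return {"data": [1, 2, 3]}

def hypothesize(observation):
    """Propose an explanation of the observation."""
    return {"claim": "something explains the data", "basis": observation}

def agrees_with_tests(hypothesis):
    """Test for agreement with new measurements (stubbed as success)."""
    return True

body_of_work = []                      # the constantly revised record
hypothesis = hypothesize(observe())

if agrees_with_tests(hypothesis):
    body_of_work.append(hypothesis)    # conclusions join the record
else:
    # A failed test sends you back to observation and hypothesis, which is
    # where Kuhn's anomalies get noticed, ignored, or explained away.
    pass

print(body_of_work)
```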

We like to construct narratives around science, just like everything else. If that can be seen as a template for the scientific method, then can the paradigm approach be mapped that way too? Is the articulation of normal science equivalent to a background study? Is investigating the anomaly the test of normal science? Do the conclusions and revisions amount to the delivery of a revolutionary new paradigm (or the reconciliation with an old one)? Well sure, if we are willing to speak broadly enough.

WHAT DISTINGUISHES SCIENCE?
There remains a need to evaluate theory with respect to observations, and when Kuhn discusses the acceptance of new theories, he addresses this in terms of scientific validation. He denies a Popperian sort of straight-up falsification (rightly I think), and also more probabilistic sorts of validation (that is, accepting things more strongly when they agree better; extraordinary claims require extraordinary evidence). We might take on a new paradigm that's popular, or elegant, or simpler, or seems to promise richer articulation, or fits with the other ones, or maybe it's all just arbitrary. Kuhn speculates that what makes it science isn't necessarily the acceptance criteria, but maybe the fact that it's imagined as intellectual progress. I don't agree with that. I think what makes it science instead of something else is that it's evidence-based.

I remember a quote from my freshman physics book, paraphrasing: in reality, electrons are neither particles nor waves; they're electrons. Kuhn eventually gets to a similar point and cites it as the resolution of a scientific revolution. I don't want to give him this one. I think that the probabilistic way of looking at things, more than Kuhn's, suggests that nature is an independent thing, and that more than one view can be held simultaneously, within some range of validity. Of course, that could just be me thinking like an engineer, bringing in a less-than-pure-science viewpoint of my own, which perhaps has more of a most-workable-understanding-given-the-data sort of culture. I prefer to couch my understanding of science as a series of known assumptions and constraining rules (maybe the same thing as a paradigm as Kuhn means it), under which some theory is known to be useful, sometimes only good-enough useful, and sometimes only preferred because it's consistent with other theory. I don't feel anyone has to take one rigid outlook to the table.

I concede that science revolutions may not always go to the best theory (it's too early to tell when they're busy being all radical), and certainly don't result in the best possible one, but to say that it's a competition between existing paradigms doesn't, to me, refute a probabilistic validation approach very well. At a minimum, there's a requirement of descriptiveness that contributes to the appeal of a new paradigm.

BEWARE OF FALSE SYLLOGISMS!
The old understanding of the scientific method is also useful for categorizing ideas. It's good to keep in mind that a "hypothesis" is a proposal, while a "theory" is well-tested and well-understood within its definitions and constraints. Mostly, this serves as a helpful tool for dealing with poorly informed blowhards.

One thing that probabilistic validation is good for (and which I think a paradigm model deals with less effectively) is keeping down the poorly supported competing theories. It's a continuation of the previous point, but it deserves a special heading. It's true that not all iconoclasts fit within the popular paradigm. On the other hand, just because you are out there taking a chisel to everyone's favorite statues doesn't mean you're a revolutionary. Maybe you're just an asshole. It's good to have some rules of thumb here. You'd better have a damn good argument if you want to show me your perpetuum mobile.

WHY PICK ON SCIENCE?
I am most comfortable spotting paradigms outside of science, as well I might be. Politics and economics seem, to me, to be filthy with the things, and, far more than scientific study, they are unburdened by the rigors of empiricism. Why do people come to suddenly believe in Communism, in consumerism, in American party politics, in popular revolution, in abolition? These are more gestalt-style shifts, nudged on, I often like to think, by events as well as the evolution of scientific paradigms, but colored more heavily by the whimsical human imagination. The failure of old networks to address perceived social crises, suddenly perceived broadly enough, precipitates revolutions of a more political (and generally violent) sort. Kuhn touches on this at the end (it may have been his starting point), but if you've ever witnessed a debate between an American liberal and conservative, then you've seen very clearly a failure to accept the other's set of assumptions and evaluations, not to mention a rather questionable concept of progress, in addition to a craptastic analysis of data (usually worse for the person with a threatened advantage). Living in a political climate that I loathe is difficult, especially when the tools I have for analyzing it are also the ones it provides. I give the social dissenters some major props, including, and maybe especially, those who can spot the system and find a way to conscientiously object to it. I think that many of the alternative social paradigms come from literature and art (and science may owe more to these than is usually acknowledged—I liked Kuhn's point that in the Renaissance, there was little distinction between science and art). I love to see scientific principles applied in a more honest manner than number-crunching your way to a foregone economic conclusion from dubious assumptions, and that's governed a lot of my reading in the past few years. My minor observation is that a more evidence-based approach would do wonders for the world.

Boring! But it's out of my head.