Saturday, March 29, 2008

The Holy House of Mouse

Religious training for me was an odd thing. We weren't a church family, but my parents pushed us kids toward some measure of Christian exposure, just on the off chance it might take. We went to Sunday school and Jesus camp, and got conscripted into inept musical performances at services, which my mom and dad would attend only under pressure to support the kids. (To this day, they'll neither confirm nor deny their agnosticism, which I suppose is how you do it.) The church was like someone else's house that I visited frequently. The faith belonged in the building, but I somehow fit in there too, all its nooks I knew, all the places a kid could tear around and play or act serious when people were looking.

So there I'd find myself for a Sunday a month, accompanying the yearly murder of Bach, or otherwise taking my part in some divine pageant or other, trucking the blue vinyl robes off their shelves, and hanging the silly gold diamond from my neck so that it would fall down my back in the most dignified way. The choir robes, at least those for us irregulars, had all the grace and durability of a graduation gown, consciously designed for appearance and not wear. It conforms with my enduring impression of church: insufficiently vacuumed back rooms, sterile kitchens, rooms full of battered folding chairs, boxes of filthy waiting-room toys, ragged stages heading auditoria of linoleum tiles, institutional smells. It's not as though these places lacked love, but it is impossible to apply the same sort of attention to an official space as to the rickety box you spend most of your days living in. And it's not merely a flaw of the easygoing Protestants. The Catholics in town have it even worse, married to larger notions of frugal grandeur: "stained glass" of peeling applique, threadbare pews and fraying carpets, grand entrances with atrocious drafts. On Sunday evening, whether following the Latin or otherwise, all of the sacred vestments must necessarily get stuffed into some closet or other, maybe next to the choir robes, forgotten between cigarettes, coffee, and whatever business the cloth undertakes until they're used again at the next service. These objects can't take the sort of beating, the sort of love, of everyday use. We don't care for them like we do our dishes or our jeans; it's more like the occasional attention we pay to our Halloween costumes, ignored for fifty weeks a year.

These humble dioceses, they'll attract their aesthetes I suppose, and certainly their locals, but in the business of marketing salvation, you need to have the look of success. Once upon a time, maybe it was enough to wave damnation around and send the flagellants through the town square every once in a while, selling, you know, that pie in the sky. Life used to be cheap enough (and no doubt it'll eventually be cheap enough again) to pull off the soft sell. For the rest of the Thomases, there's always the might of Rome to pound in the nails. What draws the discriminating into the house of Peter? The thought that there's some mere spirit floating through the collective unconscious isn't going to do it; you can find that blowsy poetic crap any time people are left alone long enough to dream. Something must convince our unseemly pragmatic minds away from the low and local versions of wonder, and into the official corporate mysteries. The doctrinaire are pulled to the trappings of wealth; no faith worth its divine endorsement can subsist on Wonder and Welch's. If you've ever wondered what draws the penitents and pilgrims to Rome, it ain't the purity, it's the idea that there is, in fact, no salvation on a budget. I've never been to St. Peter's, but I assume it's spotless, opulent. The Pope either scrubs the grease off the chalices himself, or else wipes his bedazzled sleeve over them and declares the blemishes holy. Whichever, and whoever, and who gives a fuck, so long as it's made of gold.

When my children were young enough to be easily impressed, we toured our share of low-budget amusements. Mother Goose, bereft of copyright and bound to her stations of the cross--here imprisoned the Shoe, there in a treetop Cradle, there with One Shoe On--immortalized, at least for a while, in rotting plastic, gathering mold seven months of the year, and dusted off by lackluster teenagers for the adulation of the cheap summer rubes. Icons of glory, um, lapsed. As children grow, many develop higher expectations of their faith. There is the expensive honor due the institution, sacred obeisances and glorious tithes available only to those of appropriately molded imagination. You can buy a hundred cartoon knockoffs, but in its deepest mysteries, Disneyworld markets the whole-family Hajj with the full force of its copious shareholder accounts, cramming an unfunny cartoon mouse onto cruises, into plastic fairy castles and styrofoam mountains, onto roller coasters, reducing him to ears (a bifurcated and bulbous excuse for a cross), and replicating those dispensable icons in profusion and moving them through innumerable gift shops. Make no mistake, there is no ground-in grime on Goofy's nose, and at the first hint of wear, his rubber suit is rapidly recreated in the smoke-belching Imagineering foundries. I do my best to bring my children up decently heathen, but the Spirit calls to them, it calls.

Seth Stevenson made a good pass at the experience, noting the corporate horrors, the more seductive for the enormous endowment glimmering from every carefully arranged crack. The reaction of the comments section is telling: how dare he question the teachings of Mickey? The truth of any living God must be in the myriad interactions of His smallest elements, eye-motes and tumbling sparrows and all that, but the Mouse's epiphany is decidedly top-down. Yeah, there's quality in there sometimes, but quality is only a gateway. Even the most precious Disney Moments are cribbed from better-written material, and any actual genius is thoroughly digested to pap in the corporate tract before it is finally shit out as canon. The immortal Scheherazade gets a shade of Robin Williams, does a stint with the Disney Princess, dancing on lunchboxes with the rest of the shamelessly co-opted fables, makes a brief stop at Burger King before finally nestling snug in the landfill, eternally staring at wishing stars along with a generation's worth of other plastic dreck.

I guess what really pisses me off about the Disney experience is that vaunted attention to detail. They lay claim to the wonder of the mind in carefully packaged detail, much like the church would corner our spiritual journeys, and much as the less coherent political and commercial powers are well served by a country that sucks up the idea of the American dream, shoehorning us all into the suburbs, onto highways, married with kids, struggling with the shifting minutiae of planned moral behavior, crippled under eternal and massive debt. There's nothing wrong with these narratives in themselves, but it chafes my spirit to watch my life shuffling along according to the mapped outlines. Real imagination takes us beyond the obvious story, at least in our minds, and maybe even pulls us out of the patterns of fervent consumption. To angle for a monopoly on the very process of dreaming, sucking dollars off of it as it tumbles down every step of the value chain--that's inhuman. It takes a mouse.

Wednesday, March 19, 2008

Review of Shadows of the Mind by Roger Penrose

Improbably, "Books for Buds" continues. This one is for TenaciousK, and evidently not a moment too soon. Evidence for TK can still be found here, and although he ventured into the Outland some months ago, you can still catch him around once in a while. The rationale for Shadows of the Mind is that it's about consciousness and the brain, which is something that TK writes about frequently and, evidently, is involved in professionally. Sounds good as far as it goes, but the author of Shadows of the Mind is a mathematician, and approaches the subject from the edges of math philosophy and quantum theory, whereas our subject would be better described as a behaviorist, looking at the various identified or speculated physical and chemical processes in the brain, and assessing how they affect people's actions and thoughts. Does this book really suit TenaciousK? Well, let's put it this way: Roger Penrose opens up with a fair-minded 3000-word discussion about gendered pronoun conventions--I think it works out just fine.

I don't mean to imply that the author's verbosity (or TenaciousK's, for that matter) is a handicap. For all his abundance of words, Penrose is quite a pleasant read. (Careful observers may note that it took me the better part of a month to plow through this anyway. I blame free time.) He writes with an enthusiasm that can be charming in professors, an animated spasticity that translates, with a lot of exclamatory asides and italicized profundity, to the written equivalent of popping out of his chair and gesticulating excitedly whenever the ideas take him, which is frequently. You get the feeling that here's a guy who'd miss meals in the throes of inspiration, and who'd whip about the front of a classroom like a sprite--four chalkboards filled with wild illegible scribbles--as he teaches. It's infectious: I'd absolutely love to see one of this guy's lectures.

I do wonder who the intended students of Shadows of the Mind are supposed to be. I imagine them to be people like me, with an adequate technical education, but outside of the field. He writes generically to lay people, but without the various slapdash theoretical concepts I've gathered over the years (my quantum has always been pragmatic, and still inadequate for my job), I'm not sure I'd have followed what the hell he was talking about at all. He does a good job of keeping the language easy to follow, but in the process, he necessarily obscures mathematical details. He cuts right to the hanging philosophical questions while glossing over how they arise, and while his practical examples are cute, they're a bit confusing. He doesn't leave the reader in a good situation: deferring the details of his ideas, as he must, it's not easy to challenge them, and, well, they're the proverbial extraordinary claims. A lazy dolt like me is stuck going after his logic.

Here's what he's basically done: he has taken the weird mysteries of cognitive science (where does consciousness come from?), and looked for the answers in the difficult fringes of mathematical philosophy (a sound algorithmic system cannot observe certain mathematical facts about itself), quantum theory (at what point does quantum superposition give rise to classical observation, and for that matter, how the hell does gravity fit in?), and cellular biology (is the cytoskeleton involved in information processing?). When he finally gets to them, his propositions are pretty wild, but Penrose is too smart and too honest not to recognize counterarguments, and the problem is that they sound more convincing (to me), and no less weird, than what he's advocating. Why must we insist, for example, that human thought is a sound algorithm? That it is algorithmic (that is, that it follows rules to change its internal state and output, which in this case would be physical rules) certainly appears to be the physical case, and considering the mathematics, a quote from Alan Turing claiming that some level of fallibility is (perhaps) a necessary condition for intelligence resonated better than a chapter's worth of disclaiming that very thing. He needed to ride on a sophisticated appeal to incredulity that such a machine as ours cannot contradict itself internally. The other issue is that, for all his pondering on quantum mechanics, I have difficulty accepting that brain physics is special in any way that device physics isn't. Yes, there are quantum state reductions that must occur for information to propagate in neurons, but this is also true for electrons in transistors (or, you know, in almost every physics we exploit for anything), and, for that matter, more obviously the case.

I don't want to sell his speculations short--they're fascinating hypotheses--and it would be thrilling to have them proven right. But Penrose is also interesting as hell, and expertly grounded, as he points out the disconnects in the theories of AI and quantum mechanics. These discussions make up (perhaps tellingly) the bulk of the book, and I'd probably recommend it for the background more than for his particular conclusions.

Tuesday, March 18, 2008

It was a slow day


…and the sun might have been shining somewhere in another hemisphere. Icy New England winds whipped grit around under the sidewalk lights, which, if I turned around, I'd be able to see through the big windows. I thought about it--turning around that is--but the chilly outdoors didn't hold a lot of allure either.

"Tedious? Could you say it again?"

"Tedious."

"T-E-D-E--" Ooh. Evidently there are no take-backs in official, certified spelling bees. She sighed and continued, eyes drifting floorward and shoulders slackening. "…I-O-U-S."

The panel had been as hopeful as I had been (probably for different reasons), although they tried to keep any feelings under their black suit jackets. I could see it in the slight tilt forward, and hear it in a brief indrawn breath on the microphone. From the back row of the audience, I could only see the backs of the panel members, dark business-wear to a woman, heads varying from black to gray and cropped in severe educator fashions.

"That is" [pause] "incorrect."

I guess they can't point out how close they got either, but everyone knew.

The speaker's hair was dark, cut in a horizontal line just above her shoulders, and it rippled a little bit every time she moved. When she looked down at her sheet, a cheek or jaw sometimes flashed, and I found myself wondering, without much interest, what she might look like, if she had bangs, maybe if she had wrinkles. Evidently she took this thing seriously--more so than anyone else--so maybe she was there in some official lexicographic capacity. But that wouldn't describe the disappointed pause. Did it come from a freshly minted naiveté, still young enough to hope? Maybe it came from a bureaucratic frisson of seeing someone (almost) succeed within the rules. Her voice had neither the piping lightness of a young woman nor the breathy warble of an older one. "Is that person a teacher?" I stage-whispered to my younger daughter, who looked me in the eye for a second before scooting even further out of reach and deeper into her own fantasies, chattering softly. I myself daydreamed about a soft-eyed spinster handing me a form at the DMV.

"Your word is: 'Soldier.'"

Here's a kid with a lot of tells. The odds were bad just by the way he was standing, kind of shambling in place, not suggestive of a lengthy attention span by any means. But in his defense, it's 6:30, he's still at school, and at three times his age, I was also fidgeting. He raced into a reply, without confidence or concern.

"S-O-L-J-U-R"

"That is incorrect."

He grinned his way to the losers' bench. The children who had had their turn were a badly organized row of disconsolate wrecks and, like the last contestant, overanimated goobers. The only thing that kept them from bolting was the glare of the principal. The ones who hadn't yet had their turn remained orderly and attentive.

"M-E-D-I-C-I-N-E"

Score, and about damn time. This is shaping up to be the shortest spelling bee in history, which is good, but there has to be some dignity salvageable from this sham. I mean, there were two hundred people and change stuck in there. Out of the first thirty or so contestants, "medicine" was the third correct answer. What are the odds they could spell 'decimate?' (Actually, decimate wasn't right--what's the word for one in ten left alive? For that matter, does 'medicine' come from the famous Florentine family? Nah, probably not. What's going on up there?)

"Um, could you use define that, please?"

Ha, someone had seen this done. Maybe there would be a competition after all.

"A musical term describing the perfunctory application of a musical note, usually with upbeat rhythmic phrasing."

Somehow, I don't think that helped.

"Um…S-T-A-C-C--"

Yes, yes…

"Um…E-D-D-O"

Jesus.

"That is incorrect."

Staccato signals of constant information… The bastards put those three words together on purpose, didn't they? A loose affiliation of millionaires and billionaires, and… "Daddy, stop singing!" (The nerve of that kid. Well, what could I say?) "Don't cry baby, don't cry."

The little one has been working for months to perfect a withering stare, and I got one of her best. I glanced around at the crowd, and there was no evidence that anyone had made a similar connection. Just what the hell is wrong with these people? I hate this town.

"Your word is: 'Distraction.'"

The colons were all audible.

"D-I-S-C-H-R…"

One game you can play while feigning interest in this sort of thing is to watch the kids, and try to surreptitiously guess which parents are theirs, based on how they dress and how they sit, what they seem to think of all this. The fat kid's parents are garrulous and large, the sorts of people that take up twice their allotted space, figuratively and physically. A few cute kids have ugly parents (and vice versa), but mostly it looks like the same crowd regarding itself through slow glass. The poor girl who flailed on "staccato," her parents appeared confident and caring, and not particularly concerned about this ridiculous event, a smile at the earnest attempt, a sympathetic eye-roll at the missed ending. I liked them on sight, but not so much that I wouldn't avoid a chat if it got me out of there thirty seconds sooner.

"Your word is: 'Chagrin.'"

The only parents I actually know were already sitting at the other side of the room when I walked in, which was just as well. Their daughter didn't come close to getting her word right, tacking on a string of nonsense letters at the end to fill up the space. My own little girl whiffed too, but made a passable guess at a word she'd never seen. She's so much like me it's scary sometimes, but in fifth grade, I was always reading. This one, you have to twist her arm.

In that year, I came in a tightly contested second place in my own school spelling bee, ultimately missing on "freckle," which I spelled, like a dolt, with an "el." I remembered to share the story with Junior on the ride home, and performed my best fatherly scowl when I mentioned reading habits. I felt like a tool, and probably looked like one too.

"Your word is: 'Plebian.'"

"Can you use that in a sentence, please?"

The only thing that was keeping the curtain from falling was that a child, for the victory, also had to correctly spell a bonus word. This was a tough order, under the circumstances. Only one girl made it through the first three rounds, but she missed her game-winner. As a consequence, all eight survivors from the previous round had to be called back up for a replay. I felt bad: some other little kid ended up on the stage after that, spelling for the victory, a little boy whose best effort that day would only get him shamed in the state competition and waste his whole family's day. Well, there was a trophy. The principal was looking uncomfortable on the side of the stage, and if I wasn't imagining it, the spinster's movements were getting just a little bit stiffer as she called off the final word. Could this be it?

"Condescension."

"Okay, C-O-N-D-E-S-" [come on…] -"C-E-N-S-C-I-O-N"

But you know, these people are all right. What's more annoying than a spelling Nazi anyway? There can't be a more useless measure of intelligence than spelling prowess: it's no substitute for actually using the vocabulary, for effective language. The most valid reason to aspire to good spelling is to insulate yourself from looking stupid, but really, at the point you might actually consider using some non-Latinate, multisyllabic string of gibberish, you shouldn't anyway, and even when you insist, no one will get upset if you look it up first. Competitive spelling is the celebration of rote over critical thinking, of memorization over imagination. Maybe it works as a sport? Maybe the inspiring moments come from mining the improbable depths of the brain's resources, making the most tenuous inner connections and coming up right. My early vision of overeducated and underfulfilled parents slavering at the sideline as their diminutive homeschooled twerp morosely prattled out "phlegmatic" or "magniloquent" or whatever in front of a crowd of hundreds at some drab heartland community college theater turned into something more pleasant, the thrill of victory, or better, since it's not the sport that usually gets positive attention, a subplot of secretly held hands and unexpected kisses stolen from other little helicoptered kids, alone backstage for a few minutes, cornered, for once, with someone else's sympathetic fears and passions.

I also can't avoid noticing that I take some pride in being a fairly good, if utilitarian, speller myself, and all of these life prescriptions I'd been spouting looked, just a little uncomfortably, more like projections than anything. It's something that might really have sunk in if I gave it time, but the kids were pouring off the stage by then, and the little one was tugging at my sleeve and the older one needed a hug. The crowd of mostly strangers pushed through the door together tightly for a moment and then spread into the cold evening. The way we look to us all, oh yeah, oh yeah.

Tuesday, March 11, 2008

Random Roundup II

1. Uh Oh, This Means--
I blather on a lot about overpopulation, unsustainable development, crop monoculture, and limited oil resources. A whole lot of end-of-the-world this and crapping-up-the-linens that. Really, blah friggin' blah blah blah.

This, on the other hand, is fucking serious!

2. In Geeky, Condescending Memoriam
98% of the internet is aware that we lost Gary Gygax, the father of Dungeons and Dragons, last week. I'm sorry to see anyone go who made nerds so happy, and I remember a D&D "monster manual" circulating among the kids at Sunday school (of all places) when I was eleven or so, and thinking the pictures were pretty cool.

Still, I've got my credibility to maintain here:
Suck on that, fanfic losers!
Just sayin.

3. The view from the monkey tree
Should prostitution be illegal? Answers range from the patronizing to the defiantly liberal to the, um, anti-patriarchy-patronizing (matronizing?). Much as with casino gambling, I don't regard the oldest profession highly, thinking, at the very least, that it can't be good for the character, but the black market (for prostitutes) and the state monopoly (for gambling) probably don't make things better. (And for what it's worth, I more or less agree with Roy Edroso and John Cole on Eliot Spitzer, as well as with others buried in places more difficult to link: even if he went after deserving targets as the AG, his means were unsettling. [Edit: I'm actually going to have to think about this some more, not that anyone's reading, or otherwise cares. But I still assert:] What an asshole.)

In any case, it's a difficult issue, and I'm going to be pondering it all day as I go about my regular employment. Because you know, they can pay me to put out, but as hard as I fake it, they can't make me really love them.

Sunday, March 02, 2008

Matters of Thought, Part 2

"To put it simply, then, all we have to do is construct a digital device, a computer capable of producing an informational model of absolutely anything in existence. Properly programmed, it will provide us with an exact simulation of the Highest Possible Level of Development, which we can then question, and obtain the Ultimate Answers!" --Stanislaw Lem, The Cyberiad

Witness the lowly field effect transistor as it's used for logic: a source struggling to push electrical current over a parched channel, waiting for the gate to awaken and conjure electrons from deep in the silicon, that they may then flow to the drain and gratify the device's voltaic urges. A transistor has two inputs (a gate and a source) and one output (the drain), which will produce a signal that depends on the state of the first two terminals. In computers, these inputs and outputs are wired to one another so as to create basic logic structures, whose function is to convert a pair of binary signals (1s or 0s, high or low voltage) into a single output signal according to one of sixteen possible operations in boolean arithmetic. It seems ham-handed to use half a dozen of these little machines to run an elementary logical step (you actually need a fair amount of redundancy to guarantee that transistor logic structures are perfectly predictable), but thus are the rules of a Turing machine written into a real device. The practical and theoretical limits of transistor operation are an important concern as computer applications keep screaming for more, faster. The sheer number of devices in a processor implies a heating issue--enough current must be provided to power the next transistor down the line--and as devices get small and tightly packed, leakage currents and crosstalk become a problem, and dissipation of signal at interconnects becomes non-trivial. Even as engineers keep managing to out-think the lower size limits of fabrication, there may be a bottom on performance. The price of fabrication increases as transistor sizes shrink too, and Moore's law may ultimately fail based on what outlandish fabs people are actually willing to develop and pay for.
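
(An illustration, for the curious: here's the boolean bookkeeping in a few lines of Python--my own toy sketch, nothing to do with actual chip design--building the familiar gates out of a single NAND primitive and counting off the sixteen possible two-input operations.)

```python
# Toy sketch (mine, not from any hardware reference): the boolean logic
# that wired-up transistors implement. NAND serves as the lone
# primitive, since every other gate can be composed from it.

def nand(a, b):
    return 1 - (a & b)

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))

def xor_(a, b):
    return and_(or_(a, b), nand(a, b))

# A two-input gate is defined by its outputs on the four input pairs,
# so there are exactly 2**4 = 16 possible boolean operations.
inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
all_sixteen = [[(code >> i) & 1 for i in range(4)] for code in range(16)]
assert len(all_sixteen) == 16
assert [and_(a, b) for a, b in inputs] == [0, 0, 0, 1]
assert [xor_(a, b) for a, b in inputs] == [0, 1, 1, 0]
```

(In CMOS it takes four transistors to realize that one nand() call, which is the ham-handedness in question.)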

For all I know, engineers will get single-molecule transistors into computers in my lifetime, but to keep the prophecy (self-)fulfilled, researchers are also looking for alternate computing schemes that can get around using transistors altogether, maybe something more naturally small. One group at Notre Dame has proposed equivalent logic structures using quantum dots [1]. These are solid crystals that are small enough (several millionths of a millimeter) that the electrons that normally spread across the whole crystal start to behave more like the electrons in atoms, confined to quantum states as defined by a "particle in a box," where the box is the crystal itself. A quantum dot could accommodate an extra electron, and it would tend to be localized on the crystal in a predictable energy state, but it would also be allowed to move to an empty equivalent state on another dot by tunneling, if that move was both energetically favorable and the dots were close enough. When a cell of four equally spaced dots had two electrons to distribute between them, the charges would arrange so that they stayed as far apart as possible, that is, at either pair of opposite corners (dots shaded black in Figure 1). Another cell could be placed nearby, close enough to feel that charge repulsion and arrange its own two electrons according to what its neighbor was doing, but just far enough away that tunneling would be prohibited.

With two distinguishable cell configurations, and a means to arrange them in 2-D space, it's possible to begin building logic structures. Two simple ones are shown in Figure 1: a wire, which can transfer the information imposed by its first cell all the way to the last cell in the chain, and an inverter, which will transfer the opposite information to its destination. In their paper, Porod et al. demonstrate how, from there, standard logic gates can be constructed from these sorts of devices, and suggest several ways to go about building them in real materials (quantum dots are real enough), all possible in principle.

[Figure 1: quantum-dot cells arranged as a wire and an inverter -- http://i219.photobucket.com/albums/cc74/Keifus/Slide1.jpg]
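
To make the cell logic concrete, here's a cartoon of the scheme in Python. Caveat: this is my own sketch, not the paper's physics. Each four-dot cell gets flattened to a polarization of +1 or -1 (the two diagonal electron arrangements), and the three-input majority cell, the workhorse gate in the QCA literature, simply adopts whichever polarization most of its neighbors share.

```python
# Cartoon QCA logic (my sketch, not Porod et al.'s simulations): a
# cell's polarization is +1 or -1, standing for the two diagonal
# arrangements of its electron pair.

def wire(p):
    # The next cell in a chain matches its driven neighbor, since
    # agreement minimizes the electrostatic repulsion between cells.
    return p

def inverter(p):
    # A diagonally offset cell geometry makes the opposite arrangement
    # the energetically favorable one.
    return -p

def majority(a, b, c):
    # Three neighboring input cells "vote"; the sum of three +/-1
    # values is never zero, so the sign is always defined.
    return 1 if (a + b + c) > 0 else -1

# Pinning one majority input to a constant yields familiar gates:
AND = lambda a, b: majority(a, b, -1)   # -1 plays logical false
OR = lambda a, b: majority(a, b, +1)    # +1 plays logical true

assert AND(+1, +1) == +1 and AND(+1, -1) == -1
assert OR(-1, -1) == -1 and OR(+1, -1) == +1
```

With the inverter thrown in, that's a functionally complete family, which is all "standard logic gates can be constructed" needs to mean.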

These quantum dot devices resemble cellular automata (CA). Strictly speaking, CA are computer models, simulations in which the states of individual squares on a grid change based on the states of their nearest neighbors, according to a simple set of rules. The quantum dot array is an obvious analog (a CA processor), where the rules happen to be governed by physics (like charges repel, and tunneling is highly distance-dependent). CA are like mathematicians' drug philosophies, producing an acid haze of states that are deterministic but sometimes hard to predict, possessing, it sometimes seems, lives of their own. The Game of Life is a cellular automaton that was famous enough in the eighties to sneak into a science fiction novel or two. Go ahead and click the link to the game: it's pretty fun to play with for a few minutes.
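
For the link-averse, the whole game fits in a dozen lines of Python (a standard exercise, written here from the well-known rules): a live cell with two or three live neighbors survives, a dead cell with exactly three live neighbors switches on, and everything else goes dark.

```python
from collections import Counter

def life_step(live):
    """One Game of Life generation; `live` is a set of (x, y) cells."""
    # Tally the live neighbors of every cell adjacent to a live one.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Birth on exactly three neighbors; survival on two or three.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# The famous glider: after four generations the same five cells
# reappear, shifted one square diagonally.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):
    glider = life_step(glider)
print(sorted(glider))   # the original pattern, translated by (1, 1)
```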

More complicated cellular automata than Life allow more states, and allow different rules. Stephen Wolfram (the guy who developed Mathematica) published a doorstop a few years ago proclaiming CA computing as A New Kind of Science. (This was controversial, evidently because however amazing it may be, it was light on citing others' work, and thin on peer review.) Even CA with two states and one dimension are weird, it turns out, and several of these appear able to generate random numbers from non-random input. One thing that's interesting about studying cellular automata is that the emphasis is empirical. Even for stuff that's run with computer software, the approach is slanted toward understanding the "behavior" that the simple rules evolve, more than knowing the behavior and using it to predict outcomes. It's not the same animal as, say, differential equations, where even when the solution is broken up into cells for the purposes of computing (talking your standard numerical integration here), the rules are expected to correspond to something. CA may end up behaving like physical things, but the rules aren't necessarily little chunks of continuum mechanics.
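
One of those one-dimensional, two-state rules is Wolfram's rule 30, whose center column comes out irregular enough to serve as a pseudo-random bit stream. A quick sketch from the standard definition (the eight-bit rule number is the entire specification; the wrap-around boundary is my own shortcut):

```python
# Rule 30: each cell's next state is a fixed function of itself and its
# two neighbors. Reading the neighborhood (left, center, right) as a
# three-bit number k, the new state is bit k of the rule number 30.

def rule30_step(row):
    n = len(row)
    return [(30 >> (row[(i - 1) % n] << 2 |
                    row[i] << 1 |
                    row[(i + 1) % n])) & 1
            for i in range(n)]

row = [0] * 31
row[15] = 1                # thoroughly non-random input: one live cell
for _ in range(16):
    print("".join(".#"[c] for c in row))
    row = rule30_step(row)
```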

Porod's group envisioned predictable logic structures with their CA processors, but they didn't have to. They could have laid out a large grid of cells, more like the Game of Life, and, for a given set of "input" values imposed on one bunch of squares, proposed to monitor the "output" for another group after the system had achieved a static point or limit cycle. In other words, it's turning one binary string of numbers into another one, even if the means of getting from here to there is not always obvious. This sort of computation could be used for pattern recognition: for example, an image could be projected onto the surface, and a characteristic number generated that corresponded to the image, matched, ultimately, with the appropriate learning routine. It could be used for cryptography if any of those steps proved irreversible, as some of Wolfram's CA appear to be.

[Figure 2: http://i219.photobucket.com/albums/cc74/Keifus/Slide2.jpg]

What's more, there's no reason for it to be made up of little binary dot structures, and the nodes don't really even have to have finite states either, but can instead go analog. Researchers have put together or proposed this sort of thing using regular circuit elements as the nodes, liquid crystals, magnetic quantum dots, or acoustic pulses [2]. [Disclosure: I worked with one of these guys, and a lot of these points came up in conversations a few years ago.] Almost any physics will do, provided the nodes are isolated, and there is some basic means of exchanging selected information with the nearest neighbors. For a given input, the system may rearrange itself to find the "solution" in response. Computing with neurons is like this too: the frequency states of quasi-periodic electrical oscillations [3] (or whatever, depending on the model) are updated based on disturbances from other nodes, or from the outside world. The DNA computation that I talked about before also strongly resembles processing with cellular automata, where the selection rules for neighbor interactions are highly specific. In a sense, this brand of computation is letting the real world act more like itself, and avoiding the imposition of binary (or other mathematical) logic on it may lead to better models (or at least faster approximate ones). One may notice that the universe itself is a great big tangle of interactions, gradually approaching some limiting cycle (or not): is the universe the ultimate model of itself? Is it doing computations? Yeah sure, why not.

Of course, just because you're in the business of turning numbers into other numbers doesn't mean that you're doing anything useful, and given a system whose innards aren't necessarily worth knowing in mathematical detail, it remains to look at your inputs, and judge the fitness of the outputs. This can be automated as well, and these sorts of processes often use an evolutionary model, in which the best matches of one generation of inputs are selected, and then "mutated" to provide another generation to test against whatever real-world process, activity, or data you're trying to approximate. Simple (that is, non-evolutionary) rules for reacting to external data can be engineered as well, and if you've seen those walking robots, they're very cool. Of course, it's a big step up from here to take machine learning to include models that are self-aware. I don't think this says what consciousness is (for all the mess, it's still looking like a set of instructions under the hood, and it's as hard as ever to say how the gods or ghosts finally come out of the machine), but maybe it's getting a better handle on the fuzzy, approximate way that gray matter--or any matter at all--may actually process information.
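
For flavor, here's the skeleton of such an evolutionary loop in Python--a generic toy, not anybody's published method, with a made-up bit-string target standing in for whatever real-world behavior you're matching:

```python
import random

TARGET = [1, 0, 1, 1, 0, 0, 1, 0]   # hypothetical stand-in for real data

def fitness(candidate):
    # How many positions agree with the target behavior.
    return sum(c == t for c, t in zip(candidate, TARGET))

def mutate(candidate, rate=0.1):
    # Flip each bit with a small probability.
    return [bit ^ (random.random() < rate) for bit in candidate]

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(20)]
for generation in range(100):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == len(TARGET):
        break                        # a perfect match has evolved
    elite = population[:5]           # selection: keep the best matches
    population = elite + [mutate(random.choice(elite)) for _ in range(15)]

print(generation, population[0])
```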

"[The computer] set forth the general theory of a posteriori deities, deities which had to be added to the Universe later by advanced civilizations, since, as everyone knows, Matter always comes first and no one, consequently, could have possibly thought in the very beginning."

[1] Porod et al., Int. J. Electronics 86, 549 (1999). (Score! There's a copy online.)
[2] Harding et al., arXiv:cond-mat/0611462v1 [cond-mat.other], 2006.
[3] Rietman and Hillis, arXiv:cs/0611136v1 [cs.RO], 2006. (I only read half of this one. Already past deadline, you know.)