Prosthetics, athletics, and the human future

The cover article in the latest issue of ESPN Magazine is on the new generation of prosthetics and the difference they’re starting to make in the world of sports; not only are they becoming sophisticated enough to allow athletes who have had limbs amputated to compete on a level playing field with those who haven’t, but some folks are beginning to be concerned that they might provide a competitive advantage. In a classic knee-jerk overreaction, sports governing bodies have begun to respond, not by developing intelligent guidelines for the use of prostheses, but by banning them. Clearly, this isn’t fair.

The bottom line is this: Sports do not need knee-jerk segregation; they need rational and fair regulation. Every organized sport begins the same way, with the creation of rules. We then establish technological limits, as with horsepower in auto racing, stick curvature in hockey, bike weight in cycling. As sports progress, those rules are sometimes altered. The USGA, for instance, responded to advances in club technology by legalizing metal heads in the early ’80s. In Chariots of Fire, the hero comes under heavy scrutiny for using his era’s version of steroids: a coach, at a time when the sport frowned upon outside assistance. So if we can adjust rules of sports to the time, why not for prosthetics? Create a panel of scientists and athletes, able-bodied and disabled, and ask them to determine what’s fair. One example: We know the maximum energy return of the human ankle, so that measurement could be the limit for the spring of a prosthetic ankle. That type of consideration is much fairer than simply locking out an entire group of athletes.

If prosthetic technology can be used to enable people to compete on an even footing (so to speak), then it should be allowed for that purpose; obviously, the rules need to be carefully tuned to be as fair as possible, but the relative difficulty of that task should not be an excuse for not attempting it.

There is, however, a deeper concern here.

If anyone can predict what sports will look like in 2050, it’s [Hugh] Herr, who lost his legs 26 years ago in a climbing accident. Herr wears robotic limbs with motorized ankles and insists he doesn’t want his human legs back because soon they’ll be archaic. “People have always thought the human body is the ideal,” he says. “It’s not.” . . . Bioethicist Andy Miah predicts that one day, “it will be an imperative, and the responsible course of action, to reinforce one’s body through prosthesis when competing at an elite level.” In other words, all pros will have engineered body parts. History will view the steroids witch hunt as a silly attempt to keep athletes from using technology to help regenerate after a season of pain. “In many ways, we’re facing the advent of the bionic man,” says MLS commissioner Don Garber. “It’s something our industry has to start thinking about.”

This is worrisome talk. The desire for a superhuman/post-human existence has done a fair bit of damage over the years, and as science starts to make “improving” ourselves a near-future possibility, we need to be very, very careful with it. We simply are not wise enough or knowledgeable enough to make playing God with our bodies a good idea; and I say that not only as a Christian but as a longtime reader of science fiction. The downside of trying to re-engineer the human body is just too great, and honestly, I don’t think the upside is worth it. If we “improve” everyone, what have we really gained? And if we only “improve” some, haven’t we taken the inequalities that already exist among people and made them worse? Do we really need more reasons for some people to think they’re better than others? These are the things we need to think about very carefully before we start declaring our bodies obsolete.

America’s Stone-Age Navy

OK, so maybe that’s an exaggeration—but when it comes to computer technology, it’s frighteningly close to the truth.

Consider the Arleigh Burke-class Aegis guided-missile destroyer. It is one of the most sophisticated and capable fighting ships the world has ever seen. With its advanced SPY-1 radar, 96 vertical-launch tubes armed with a variety of long-range weapons, an advanced sonar system and antisubmarine warfare capabilities, it has everything a naval warrior could want. Consider, now, the Blackberry that has become ubiquitous in our culture. The two-way communication bandwidth of a single Blackberry is three times greater than the bandwidth of the entire Arleigh Burke destroyer. Looked at another way, the Navy’s most modern in-service multi-mission warship has only five percent of the bandwidth we have in our home Internet connection. And the bandwidth it does have must be shared among the crew and combat systems . . . The recruiting posters promise, “Accelerate your life!” but the best we can do is “decelerate” access to information. The Economist summarized the challenge: “If Napoleon’s armies marched on their stomachs, American ones march on bandwidth.” During the past ten years we have seen an explosive growth in commercial bandwidth, and each year the Navy’s connectivity falls further and further behind. By 2014, our homes will have 250 times more bandwidth than a [guided-missile destroyer], and 100 times more than the next-generation aircraft carrier. We have to reverse this trend. And if we want the Navy to become a more interactive, collaborative, and effective fighting force, we have to leverage the innate collaborative nature of our Millennium Sailors.

I imagine that we’ve survived this handicap (which isn’t the Navy’s alone; it’s a problem for each of the services) to this point because we haven’t been up against opponents with the ability to exploit it. With China rising, we will—and probably sooner than we think. This needs to be fixed.

HT: Max Boot

Outsourcing memory

Have you ever thought about how little we remember for ourselves anymore? Scholars talk about how cultures move from being oral cultures, in which the stories are passed down by word of mouth and held in the collective memory of the tribe, to written cultures, in which they are preserved in books, and now, as we move away from the written word, to what they’re calling “secondary orality.” But this isn’t a move back toward a primary reliance on human memory. Instead, we’re simply replacing written media with visual/aural ones—the reliance on technology continues, as we outsource our memories to books, pictures, video, computers, PDAs, and the like. Indeed, a PDA is basically a handheld prosthetic memory; if you have one, and you remember to use it and keep it with you, you don’t have to remember what you need to do, where and when you need to do it, who you’re going to do it with, or what their phone number is—just press the right button, or buttons, and the box remembers it all for you and tells you what you need to remember when you need to remember it.

The advantage to storing so much of our memory outside ourselves, I think, is that less of our brain is needed for that task, which means there’s more of it we can use for other purposes, like inventing new things. I don’t know if anyone’s ever looked into this, but that might explain the accelerating pace of technological progress. After all, each new invention that frees up a little more of our brainpower from the work of memory gives us that much more brainpower to come up with new ideas and new ways of doing things—and gives us ways to record and store those increasingly complex ideas, allowing us to interact with them more easily and quickly; and as these inventions enable us to share those ideas with others more quickly, efficiently, and completely, the effect multiplies. So in that sense, maybe the fact that we don’t remember as much ourselves, that we rely on other means to do it for us, is one cause of all the material benefits science and technology have given us.

There are downsides, though. Not only can all those things break, or get lost, or simply not be where we need them when we need them, but there’s also the fact that our memories tend to be less vivid and immediate, more distant from us—less real, we might even say. Rather than being part of our present reality, they come to us as shadows of another time. To be sure, this would be the fate of most of our memories regardless, and there will always be things we would rather let slide into oblivion—but what about the key moments in our lives, the ones that make us who we are? Consider that, to a large extent, memory is identity. The more distant our memories become from us—a problem worsened by the speed and busyness of life in the Western world, which leaves us little time to stop and reflect, and remember—the more distant we become from ourselves, and thus from others, and from God.

(Update: I first wrote this, for other purposes, back in the summer of 2006, so I may actually have gotten to this idea first, as I [probably naively] thought I had; but in the interim, David Brooks has gotten here too, from a quite different angle—neither of which is surprising.)