My daughter is turning five soon. She wanted to buy five of those shiny foil balloons that can cost anywhere from three to ten dollars.
Me: Those balloons are really expensive. If we buy five of them, that would cost twenty-five dollars. For that much money you could get two presents. Or five books.
My daughter: But I want five balloons because I’m turning five.
Me: We could return two of your presents and get you five balloons.
My daughter: But I want all of those presents.
Me: Then you only get one special balloon.
My daughter: If you give me five balloons, and you don’t return any of my presents, I will stop the arguing.
She has already mastered the filibuster.
Earlier this year she proved she could be House Majority Leader for the Republicans.
It would have taken me 2,000 words to do what XKCD does in 21 (not counting the mouseover).
(Which is, of course, the title of the concluding double episode of Season 3 of the greatest TV show of all time.)
Today I graduated from the Yale Law School. It has been said about many schools, but about no other school is it more true that getting in is hard, graduating is easy-peasy-parcheesi. I’m not entirely sure what to make of the whole experience.
So instead of reflecting on Yale, I’m posting a speech that I drafted fourteen years ago when I got my Ph.D. in history and that I just found on my hard drive. I believe that I wrote it because I applied to speak at that graduation; in any case, I know that I didn’t speak at graduation, so it must have been rejected.
(It amuses me that I still write the same way I did back then.)
“What is the use of history?” The French historian Marc Bloch put this question on the first page of a book entitled The Historian’s Craft. “The question,” he continues further on, “far transcends the minor scruples of a professional conscience. Indeed, our entire Western civilization is concerned in it.”
The attentive reader, however, will note that Bloch does not answer the question. After circling around it for a few pages, he writes, “our primary objective is to explain how and why a historian practices his trade. It will then be the business of the reader to decide whether this trade is worth practicing.”
Today, in a period of declining enrollments and dwindling institutional support, it is incumbent upon us as historians to say just what our discipline is good for. But while Bloch set out to convince an intellectual audience of history’s legitimacy as a branch of knowledge, today it is a matter of defending history’s value to the university, the state of California, or society as a whole—a value that is increasingly measured in monetary terms. How does history serve the economy of California? How does history train students to be productive members of society? What is the return on an investment in history?
It is no secret that universities everywhere are becoming increasingly attentive to the bottom line. Because of the resulting shakeup, their various schools and departments are coming to rest along a spectrum that ranges from engineering, applied physics, and business, at one end, to literature, classics, and history, at the other. The former are prized as the source of both innovation and skilled labor for high-tech economic growth, and are lavishly funded by the corporations that benefit from them. The latter increasingly appear an atavistic remnant of yesterday’s university, or an obligatory nod toward a notion of the liberal arts to which few people any longer subscribe. It is up to us to either accept or resist this devaluation of history.
When I came to Berkeley almost seven years ago, I assumed that history was useful, and that I fully deserved the money the state of California would contribute to my education. In my first two years, I learned how to argue that point—and to argue it convincingly, I like to think. Now … I’m not quite so certain.
Which, I think, is a good thing. We should not accept with complacency our own arguments for our importance. At this year’s convention of the American Historical Association, I attended a panel on downsizing in the profession. What I was struck by was the virtual consensus that history is valuable in and of itself, that downsizing is bad not only for historians but for society as a whole, and that it is simply a matter of pointing this out to the public at large, which, upon recognizing this, will presumably give us lots of money.
When did a profession supposedly devoted to critical thinking become so uncritical of itself? We have all absorbed the truisms about how History with a capital H is essential to a healthy society, but how much history do we really need? How many historians?
It’s time to face those questions squarely. Let’s not do what Marc Bloch did, and simply prove to ourselves the intellectual merit of our own research methods. Let’s face the problem he raised before setting it aside: “it is undeniable that a science will always seem to us somehow incomplete if it cannot, sooner or later, in one way or another, aid us to live better.” But at the same time, let’s not give in to the world-weary, overeducated cynicism that says history isn’t good for anything except providing employment for people whose principal activity is publishing articles and books that only they can read. Let’s find out if history really is good for something, besides esoteric academic debates.
As historians, it is up to us to answer this question. But it cannot be answered with clever sophistry or impassioned debate; only actions will suffice. If history is supposed to be essential to the moral conscience of our nation, then we have to stand up for what is right, and not bury our heads in books and journals. If it is to instill in future generations an appreciation of our shared human heritage, then we must teach history—from elementary to graduate school—with enthusiasm and conviction, not simply to pay the rent. If our research really does address questions vital to our understanding of the world, we should make it compelling for any reader, not just the academic specialist. If anyone is to learn lessons from history, it is up to us to draw them for all to see.
And if we can’t live up to these demands, let’s admit that history is merely a form of entertainment, in which case, Bloch said, “all minds capable of better employment must be dissuaded from the practice of history.”
I address this challenge to all of today’s graduates, not just to those of my newly doctored colleagues who will become what are known as “professional historians.” We came to this ceremony by many paths and will leave it for many futures, but we are all historians.
As for many of us, today is my last day in academia. I will probably never write another history paper or teach another history class. Yet I will remain a historian, because studying history has made me, in part, who I am today. I have learned a great deal in the past seven years, both in and out of class. I need no longer, as one of my friends proposed to do in his orals, respond to every question by citing the Reform Bill of 1867. I have learned that the past invariably shapes the present, and that we cannot understand why something is the way it is without understanding where it came from. I know that there are many answers to every question, and that the motivations and interpretations of human behavior and experience are endless. And I know there is perhaps no more daunting task than to truly understand why people do the things they do, or even to understand a single human being.
These, to me, are the lessons of history, and we should all be proud to have learned them. Some of us will go on to teach them to future students. But for the rest of us, being a historian does not stop as we leave this theater today. It only becomes more difficult.
Within the walls of academia, what matters is being right—getting the right answer, the brightest new idea, or the most compelling interpretation. But too many people think that personal brilliance and the pursuit of knowledge provide a kind of terrestrial sanctification. The most important thing I learned here is that being right isn’t always what counts. It’s more important—and more difficult—to live your life well, to treat the people around you with unwavering fairness, respect, and generosity. And if history is to prove useful, it should help us to meet that challenge.
We who have studied history should know not to overestimate our own intellectual pursuits. We, too, like the people we study, are human beings condemned to imperfection in an imperfect world. We should distinguish ourselves by our perspective, our judgment, and our realization that great changes are made little by little, one person at a time. We have not been trained to be inventors or statisticians or keepers of sacred texts, but to understand the adventures and misadventures of human beings. In whatever your walk of life, I encourage you to use your training, to draw upon your knowledge of history and your capacity for understanding and say, “I am a historian, and this is where I stand.” And perhaps, together, we can prove that history, and historians, do matter.
In one of my classes yesterday, we were discussing the general topic of bounded willpower: the conflict between the affective self that wants to eat ice cream and the deliberative self that wants to exercise so it will be healthier in the future. This topic brings up an interesting normative question.
The conventional understanding is that the deliberative self is right and the affective self is wrong. For example, if you ask someone if she wants to save more money than she currently is saving, most people will say yes. That’s the deliberative self talking, thinking about the need to have income in retirement. But in practice, even after they say that, people don’t increase their saving, because the deliberative self isn’t strong enough. So, the policy wonks say, we should create devices to strengthen the deliberative self to increase its chances of prevailing against the affective self.
But how do we know that the deliberative self is right and the affective self is wrong? The deliberative self may be more risk-averse, but does that make it right? And what does “right” mean, anyway? Maybe if we led our lives entirely according to the affective self we would be happier than if we led them according to the deliberative self. We would eat more ice cream now, be poorer later, and figure it out then.
There’s a day-long conference on the Dodd-Frank Act at my school tomorrow. I really should go: I might learn something, I would meet people, it would be good for my career, etc. And I was planning to go. But yesterday I decided that I didn’t want to sit in a room all day and listen to economists and lawyers talk about the financial crisis. Sure, it might be healthy, but it didn’t seem all that enjoyable. So I’m skipping it. That is, my deliberative self did a calculation and decided I would be better off letting the affective self win this one. Put another way, I decided that my deliberative self uses too low a discount rate, which is the opposite of the conventional wisdom: most people think the affective self uses too high a discount rate.
In other words, we’ve reached the point where deliberative types like me are using happiness research to try to figure out how to become happier by shutting down the deliberative self.
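The discount-rate point can be made concrete with a toy calculation. This is a minimal sketch with made-up numbers (the payoffs, rates, and the `present_value` helper are all my own illustration, not anything from the post): the same stream of costs and benefits looks worthwhile at a low discount rate and not worthwhile at a high one.

```python
def present_value(payoffs, rate):
    """Discounted sum of payoffs, where payoffs[t] arrives in year t."""
    return sum(p / (1 + rate) ** t for t, p in enumerate(payoffs))

# Hypothetical conference decision: lose 5 units of enjoyment today,
# gain 8 units of career benefit three years from now.
attend = [-5, 0, 0, 8]

patient = present_value(attend, 0.02)    # low rate: the deliberative self
impatient = present_value(attend, 0.40)  # high rate: the affective self

print(patient)    # positive: the patient self says go
print(impatient)  # negative: the impatient self says skip
```

The sign of the answer depends entirely on the rate you plug in, which is the point: calling one self “right” amounts to asserting that one discount rate is the correct one.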
Yesterday a few friends and I spent half an hour with Monty, our small, lovable, pettable eleven-year-old therapy dog, which the Times felt compelled to report on even though it couldn’t come up with any kind of interesting angle.* Afterward we had ice cream. So what if Harvard has ten times as many buildings as we do?
*The curious part of that article is when it claims that we get an “Introduction to Legal Reasoning” at Yale Law School. I can say with confidence that there is no such thing, at least not in a form that would warrant capital letters.
Actually, there are many things wrong with the NCAA basketball tournaments (everything I say here applies equally to the men’s and women’s tournaments). A year ago I criticized the arbitrariness of having a selection committee, arguing instead for a European soccer-style system where the number of slots for each conference is determined a year in advance based on a quantitative formula and then each conference is free to decide how its slots will be filled.
The problem for today is broader, and applies to the Bowl Championship Series as well. Where I come from, the point of sports contests is to win. If North Carolina State beats Houston in the final game, they are the champions, even if we know that Houston would win nine games out of ten. The same goes for Miami beating Nebraska in the Orange Bowl, the Giants beating the Patriots in the Super Bowl, or Liverpool beating Milan in the Champions League final. And that’s also true for every other game along the way. The point is to win, not to have the best team.
Nate Silver breaks this down for basketball teams by using different statistical measures for teams’ talent (how good they should be) and merit (how many games they won and against whom). He does this to show how a team’s actual draw compares to the draw it deserved to get based on its performance during the season. The unfairness that results is a combination of a number of factors, such as the fact that some teams get to play close to home.
(I wrote a 365-word post for the3six5.com, a collective diary. Actually, I wrote two. This is the other one.)
Always stare straight at the camera. If your eyes are moving around, you look like you have something to hide.
I was on TV today for a five-minute interview on one of the business channels. The first time I was on TV, no one told me to stare at the camera, and I looked terrible. Now I know the drill. I was talking about derivatives regulation, right after a story about an options trader who made $8 million betting on Weight Watchers.
The topic was actually interesting, at least to finance nerds. If companies have to put up cash to cover potential losses on their derivatives transactions, will they have less money to build factories and hire workers? That was the point of a recent report arguing that derivatives regulation will increase unemployment. But there’s a flaw in that argument. If companies don’t have to put up cash, they are still on the hook for their derivatives, so that will reduce their access to credit. In a perfect market, you get the same result either way.
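The perfect-market equivalence can be sketched with made-up numbers. This toy example (the figures, the `funds_available` helper, and the assumption that lenders reduce a credit line one-for-one with uncollateralized exposure are all mine, not from the post) shows why posting cash collateral and carrying the exposure uncollateralized leave the same funds free for factories and workers.

```python
def funds_available(cash, credit_line, collateral_posted, uncollateralized_exposure):
    # Posted collateral ties up cash directly; uncollateralized exposure
    # makes lenders shrink the credit line by the same amount instead.
    free_cash = cash - collateral_posted
    usable_credit = credit_line - uncollateralized_exposure
    return free_cash + usable_credit

exposure = 10  # potential loss on the derivatives position

with_collateral = funds_available(cash=100, credit_line=50,
                                  collateral_posted=exposure,
                                  uncollateralized_exposure=0)
without_collateral = funds_available(cash=100, credit_line=50,
                                     collateral_posted=0,
                                     uncollateralized_exposure=exposure)

print(with_collateral == without_collateral)  # True: same result either way
```

In a frictionless market the hit to free cash and the hit to available credit are the same number, just in different places on the balance sheet.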
But that’s hard to fit into a sound bite, which makes the whole exercise seem slightly absurd. Did I really help anyone understand the issue? Or was I just a prop for the news show? And was it worth driving an hour each way?
Usually people like me justify things like this by pointing to option value: it’s always good to be nice to the media and to keep your name in circulation because it could be useful later. But when does it end? I just accepted a great job as a professor at the best law school within an hour of my house. When do you stop buying options and decide you’re happy with things the way they are?
But my daughter sure was excited. As I was taking her to school, she said, “Tell Mommy when you will be on TV so she knows when to turn on the TV and watch you!” And I got to wear the pink tie she gave me for Christmas (along with a bottle of lotion and a cloth to wipe my iPad). So it wasn’t a complete loss.
I am always looking for ways to make my life both more productive and less stressful at the same time, and since I spend most of my working day at the computer, much of that effort has to do with how I use the computer (and iPad and phone, these days). But the issue isn’t whether or not you use the keyboard and mouse; the issue is what you use them for. I think there’s nothing wrong with using your mouse to scroll through an article you are reading — that’s how you learn things, and I find reading relaxing, even at a computer. The challenge is suppressing all the other ways your computer can bother you, or all the impulses to do something else, of which checking your email is probably number one.
I’ve tried a lot of tricks over the years. The problem is finding something effective that you can actually stick to. My current number one trick, which has worked well for several weeks now, is that I only check email between 10 and 6, Monday through Saturday, and when I’m not actively checking and responding to email, I close the window. It doesn’t work perfectly — I still check too often during the day — but it works pretty well. Basically, if it’s before 10 or after 6, I just pretend the email doesn’t exist.
Tomorrow is the first day of Yale Law School’s spring semester, and hence my last first day of school (although actually I probably won’t go to any classes until Tuesday). While most of me will be glad to be done with school — it’s expensive, for one, and it takes up time that I could use doing other things — I’m also sad about it.
Law school has been nice, an opportunity to do things like read books (I even like reading cases) and learn about new things and meet nice new people (who are mainly ten to twenty years younger than I am), without the responsibility and pressure of a real job. But more than that, school has been very, very good to me. I recognized as early as high school that what I was best at was going to school (and I vaguely recall my friend Jed, on the first day of senior year, saying “It’s the last first day of school!”), and except perhaps for the years researching and writing my dissertation, that has proven true over the years.
In the long run, what I’ve learned is that being good at school is not that important in the real world. In the business world, for example, academic and intellectual skills are far less important than the ability to pick up a phone, call someone you hardly know who doesn’t owe you anything, and get him to do something for you — and that’s something they don’t teach in any school. Even in the academic world, the skills you need to take classes are far less important than the ability to identify promising research areas and convince other people (particularly funders) that they are promising. And of course, in life as a whole, being able to get along with other people and enjoy your time with your family and friends is more important than just about anything. But that’s made law school even more enjoyable in some ways, because it’s this little cocoon where I can forget how complicated life can be outside the classroom.
Most of my classmates can’t wait to be done and off to their exciting new jobs (mainly as clerks to federal judges or associates at big, fancy law firms). But I can wait a few more months.
When I was younger, I wanted to go on vacations to places that were historically, culturally, or naturally interesting — you know, places like Paris, Berlin, Yellowstone, etc. Now that I’m over forty and have a family, though, I just want to relax. And especially now that I live in New England, in the winter I just want to go someplace warm.
To that end, we just spent an idyllic week in Miami Beach at the Loews Hotel. I was last at the Loews in September 2000 for one of Ariba’s major user conferences, when we were still the hottest thing on the Internet. I think we had something like 3,500 people at that conference (not all in the Loews, of course). I don’t recall anything about South Beach from that trip, and I suspect I spent all of my time inside or walking to and from the convention center.
This time, though, we sat by the pool in the sun, or lay on the beach in the sun, and ordered overpriced but passable hotel food from the roving waiters. My daughter floated in the pool, or looked for shells, or built sand castles, or pretended to be a mouse in the back of the cabana, or listened to us reading Magic Tree House and Ivy and Bean stories, or did all the other things four-year-old girls do. I had a caipirinha on the beach, like I did in Rio de Janeiro on the last day of a business trip three years ago. I went the whole week without caffeine and four days without email. (I needed email to reschedule an exam when my flight back was canceled due to a huge snowstorm in Connecticut.) I give it three stars.
Apparently, I am a “software developer/legal philosopher”!
(For the record, I did mainly marketing, sales, and consulting, and I dabbled in product management, but I did nothing that could be called development. “Legal philosopher,” though, I’ll take that.)
After taking hundreds of business trips, I have a pretty clear idea of what matters to me in a hotel. Most people would probably say cleanliness is the most important thing, but I disagree. Number one is a comfortable bed. Number two is the heating/cooling system: it has to be able to keep the room at the right temperature, without making noises that will wake me up. If I can have those two things, I can sleep well, which is pretty much all that matters. (Number three, if there is a number three, is a hot, reasonably high-pressure shower. I can live without just about anything else.)
This past week we went to New York to visit my father (and my sister’s family, who were also visiting), and we decided to stay in a hotel. We booked a room in the Hampton Inn (in Elmsford), which is generally my favorite chain (cheap, everything free, newly remodeled, predictable). But we had a terrible time sleeping, because the heater didn’t have a constant fan setting, meaning that it kicked on periodically, and it also blew out extremely hot, dry air when it was on. So we checked out after the first night and switched to . . . the Ritz-Carlton (in White Plains).
One of the momentous events in the way I live and work happened this past spring when I joined the Appleverse,* buying an iPad and a MacBook Air (13.3″, with an SSD) within a month. At the time, I was lukewarm about Apple’s app-based approach to computing — not because I didn’t think it would work, but because I didn’t think it was the best thing for the world.
Seven months on, I love my Mac (and am trying to convince my wife to buy one, too), but I still have mixed feelings about the iPad. I bought it so I could watch Gossip Girl on the Washington Metro, and for that use case it’s just about perfect. But overall it feels like a massive exercise in tradeoffs.