Saturday, February 15, 2014

The brittleness of elaborate thinking

From Norman Levy’s very interesting autobiography, The Final Prize: My life in the anti-apartheid struggle (2d ed. 2012), one striking detail:

Ruth First, in her account of her confinement under the 90-day law [which authorized solitary confinement, without trial, for renewable periods of 90 days, and was a vehicle for torture of opponents of apartheid], noted that activists who sometimes seemed weak and woolly in their thinking held out the longest and did not break under solitary detention. In contrast, others (she had herself in mind) succumbed with insufficient resistance. (Pages 314-15, footnote omitted)

I don’t join First (killed long ago by an apartheid letter-bomb) in her self-criticism; torture, even torture without physical violence, is designed to overpower and usually does. But the contrast she draws is intriguing: those who had fully and precisely elaborated ideological perspectives turned out to be more brittle than those whose thinking was fuzzier. Why would that be?

One likely answer is that ideological precision is always a delusion. We know that all the ideologies of the past were imperfect in one or many ways – it’s easy to see that with the benefit of hindsight. What’s a little more difficult is to realize that the same must be true of our own ideologies, right now, as well.


Meanwhile, the elaborate effort involved in creating a complete intellectual structure – fending off attacks on it, rationalizing its inconsistencies – bespeaks a person distanced from the simple emotional forces that fuel our lives. And that distance, that self-alienation, is not only a tool for the torturer – for those unfortunate enough to encounter one – but also a shadow cast across all of a person’s life.

Saturday, February 8, 2014

Requiring experiential learning in law schools -- the debate continues

The debate continues over whether ABA accreditation standards (specifically, proposed Standard 303(a)(3)) should require law students to spend about one-sixth of their school time -- 15 credits, or approximately one semester -- in experiential education. (I linked to my debate about this issue with Brian Leiter here.) Here (minus a typo, and slightly reformatted) is the comment I submitted on this issue to the Council of the ABA's Section of Legal Education and Admissions to the Bar -- one of many comments the Council received. The latest word: Albany Law Professor Mary Lynch reports that the "Standards Review Committee," which had earlier recommended a lower requirement of 6 credits, yesterday decided to take no position on the 15-credit proposal, which now goes to the full Council for consideration.

January 31, 2014

Dear Members of the Council of the Section of Legal Education and Admissions to the Bar:

I am writing to comment on the proposal by the Clinical Legal Education Association, circulated for comment by the Council, to require all law students to take 15 credits of experiential courses.[1] It is certainly possible to debate some of the details of this proposal, but my purpose here is to defend its central proposition: that all law students should receive significant training in the practice of law before they graduate. This is one of those propositions that almost seem not to need defense: who would imagine a professional school that did not give its graduates training in how to practice their future profession?

To endorse the value of significant training in the practice of law is not at all to deny the value of other things law students do, notably in studying rules of law, cases and statutes, and the ways that lawyers employ these fundamental tools of their trade. The traditional classroom is one way of teaching students these skills, and an important one. But one might think that this method should normally be able to accomplish its central goals in five semesters, or about 75 credits’ worth, of classes, and that if students have pursued their classroom studies for that long and have not yet learned how to use these critical elements of legal reasoning, then it might well be time for some other approach – for example, an experiential one, in which the way students master law’s intellectual moves is by making them on behalf of real (or simulated) clients.

Nor is the endorsement of the value of significant training in the practice of law a rejection of the utility of students’ pursuing advanced study in particular course areas they view as directly related to their future practice. Five semesters of classroom study is a lot of time – time, probably, for the usual required first-year courses, for a battery of upper-level courses that many or most law schools require or encourage, and for some further specialization as well.

But even if we accept the fundamental proposition that law students should get significant training in the practice of law before they graduate, it might be argued, nonetheless, that regular law school classes are themselves an engagement with practice. There is some force to this point. Socratic classrooms are much more engaged, I believe, than lectures. Langdell, as I understand it, aimed to teach students the skill of legal reasoning – and I certainly agree that's a practice skill.

There are two major problems, however, with the "regular coursework as engagement with practice" argument. First, the traditional study of legal reasoning engages only a fraction of the skills a lawyer needs. It includes no interviewing, no counseling, no trial skills (except a measure of advocacy training gained by some students from the give-and-take of class discussion), and no negotiation – and indeed not much training in legal research, nor, in many courses, more than a final exam's worth of training in the many challenges of legal writing. Second, the sad truth is that the charm of this method wears off. Two years of Socratic dialogue does not make the third year of it more profoundly rewarding. Instead, it evidently leaves our students often deeply disengaged, as Mitu Gulati, Richard Sander & Robert Sockloskie have shown in “The Happy Charade: An Empirical Examination of the Third Year of Law School” (2001/02), available at http://www.seaphe.org/richardsander/pdf/Sander-Gulati-HappyCharade-final.pdf.

I think that clinical legal educators have demonstrated over the past forty years that the other skills of legal practice are also, like legal reasoning and legal doctrine, susceptible of scholarly study and capable of being taught. Experiential courses, including clinics, externships and simulation classes, aim to do just that and so to add crucial depth to the practice preparation that law schools provide. Clinics now are offered in a wide range of substantive areas, including not only litigation but also mediation and public policy advocacy, and not only poverty law and criminal law but also corporate and transactional work. Even skills that may not be within the range of clinics (merger & acquisition techniques, for example, for which few clients would welcome student representation) can be addressed in simulation courses. And we have good reason to believe, from two different NALP studies (the “2010 Survey of Law School Experiential Learning Opportunities and Benefits” and the “2011 Survey of Law School Experiential Learning Opportunities and Benefits – Responses from Government and Nonprofit Lawyers”), that simulation courses are valuable, even though law graduates rank clinics and externships still more positively.

One might accept all this, and respond that law schools actually already do give students the experiential learning they need, by means other than course work. It might even be contended, as one blog commenter suggested, that two summer jobs are “not dissimilar” to medical school clinical rotations. I certainly don’t deny that summer jobs can be very valuable. In fact, data from the After the J.D. study, as examined by Rebecca Sandefur and Jeff Selbin in The Clinic Effect, 16 Clinical L. Rev. 57, 85 (2009), suggest that new lawyers viewed summer work as the single most useful experience of their law school years in preparing them for practice, with school-year legal employment coming second. It is not surprising that students find legal work a valuable training ground for future legal work – wouldn’t it be odd if they didn’t? – but isolated summer jobs, as valuable as they are, are not school. Nor do they compare to medical school clinical rotations, which are not only part of the medical school educational program but a very extensive and intensive part of it. (A program like Northeastern’s, in which four different work experiences are deliberately made an integral part of the students’ overall law school experience, might well be another matter – but that goes to the question of how to define eligible practice training experiences, not to the basic need for significant practice training before graduation.)

One might accept this point too, yet respond that the choice of what courses to take should belong to the students themselves. Certainly choice is important. But we teach in law schools, and it’s built in from the start in our schools that most of the courses students take must be law courses – rather than, say, political science or humanities. Even within the domain of law courses, many or most schools restrict student choice significantly – notably by prescribing most or all of the first-year curriculum and sometimes parts of the upper-year curriculum as well. I think the question is not whether we will limit students’ choices, but how much and for what reasons. And on that score, I wonder whether part of the sense that 15 credits of experiential learning is too much might arise from a perception of “skills” as a single subject, rather than as a very wide range of different competencies that get used in different ways in different settings. We routinely allocate 60 credits or more to teaching “doctrine” and the skill of legal reasoning; it does not seem too much to allocate 15 (or some comparable amount) to other skills of practice. Within those practice credits, I’d certainly hope that law schools will offer their students a substantial range of courses from which to choose. 

One last point on the question of “how much?” Bob Kuehn, in his essay “Pricing Clinical Legal Education,” available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2318042, has compiled figures on the “practice-based and clinical education” requirements in a range of other professions: architecture, dentistry, medicine, nursing, pharmacy, social work, and veterinary medicine. Each of these requires at least one quarter of students’ training to be in clinical settings; some require one third, and, last but not least, medicine requires one half (and that doesn’t include the years of supervised post-graduate clinical training that follow receipt of the M.D.). Kuehn also presents cogent evidence that law school budgets can handle the cost of clinical education – not that this education is cost-free (it obviously isn’t), but that the costs can be handled. The CLEA proposal is that approximately one-sixth of law students’ training be in experiential courses. It seems to me that the burden is on those who disagree with this proposal to explain why law students, unlike their peers in other professions, do not need this level of experiential preparation for the work they will soon be doing.

I appreciate the opportunity to submit these comments, and hope the Council will find them helpful.

Sincerely,



Stephen J. Ellmann
Professor of Law 
Director of the Office of Clinical 
       and Experiential Learning
New York Law School




[1] These comments are based on arguments I laid out in an online dialogue with Professor Brian Leiter (and others) on his blog.

Sunday, February 2, 2014

Human community via the Super Bowl

Last night for the first time I saw Super Bowl volunteers in New York. Some were on duty, some making their way home for the night, all wearing yellow windbreakers announcing their role. Perhaps these folks are the NFL’s version of unpaid interns, people who are trying to network their way to a job and meanwhile being exploited by the powerful – and, amazingly, “nonprofit” and tax-exempt – NFL. But my guess is that many are not looking for a job but instead are happy to serve as volunteers for the pleasure of being part of the experience (and the souvenir windbreakers).

They are fans of the Super Bowl itself – and such devoted fans that they give their own time in service on the periphery of an event that they will surely not be able to afford a ticket to attend. I admit it’s hard for me to see this event as the one that people should give their hearts to (I like watching the Super Bowl, but it’s a game, and one that a lot of people profit from rather flagrantly). As my wife asked, isn't there a charity these volunteers could be sustaining? Still, tastes differ. We are creatures who often want to be part of the cause and the crowd and the excitement, and it may be that there simply is no bigger event to join in the US right now than the Super Bowl. Slightly more people voted in the 2012 Presidential election (about 129,000,000) than watched the 2012 Super Bowl (about 118,000,000 at the end of the game), but this year isn’t a Presidential election year.

Perhaps we should accept that human community is a good thing wherever it comes into being, as long as it is not the communalism of mob violence, and so we should welcome the Super Bowl community simply because it is a community. Of course, this is a community built around a very violent game – not deliberately lethal in the manner of the gladiatorial contests with which the Romans entertained themselves, but with plenty of immediate risk and visible and invisible long-term harm. I’m not so ready to say that a community formed around the sight of others enduring and inflicting danger and pain is always good, but it’s impossible to deny that it touches something deep in many people. Hockey, rugby, boxing, auto racing and other sports make that pretty clear.

At any rate, even if Americans increasingly “bowl alone,” as Harvard Professor Robert Putnam has argued, we can certainly declare that they do not Super Bowl alone.

Saturday, February 1, 2014

Are plants thinking? And feeling?

Before I forget, I have to say a word about “The Intelligent Plant.” That’s the name of an article by Michael Pollan in The New Yorker late last fall. It’s a fascinating article, which makes one central point – plants are either thinking or doing something unnervingly like thinking. Not everyone agrees with this point: those who share it created an organization called the "Society of Plant Neurobiology" in 2005, but the word "neurobiology," as applied to plants, was so controversial that the group later renamed itself the "Society for Plant Signaling and Behavior." What seems beyond doubt is that plants have an elaborate sensory apparatus, and it’s not limited to impersonal data like soil composition: “Roots can tell whether nearby roots are self or other and, if other, kin or stranger.” With kin, in one research instance, the plants “restrained their usual competitive behaviors and shared resources.” Most vividly, Pollan describes time-lapse video of a bean plant looking very much as if it is deliberately heading for a particular bar up which it will climb; when it gets there, “the plant appears to relax; its clenched leaves begin to flutter mildly.”

Not to belabor the obvious, but it is troubling to think that one is eating what had been a thinking and feeling creature. I cope with this uncomfortable thought while eating meat, but now it seems possible the same guilt attaches to eating plants. Even vegetarians may be killers. 

Perhaps we shouldn’t worry: apparently none of the scientists studying plant cognition asserts that plants have emotions. But I’m not at all sure why we should be confident of that, if we ultimately believe plants somehow make choices based on data. Emotions help humans make choices; wouldn't they likely be helpful to plants for the same reason? And if cognition can somehow take place without a brain (or at least without any brain-equivalent that's yet been identified), why not emotions too?

If the no-emotions argument is unsatisfying, can we get off the ethical hook because, as one scientist told Pollan, “Plants evolved to be eaten—it is part of their evolutionary strategy”? I don’t think so; Pollan quotes another scientist saying plants may feel pain, and it seems to me that an individual plant may be sorry to give up this life, no matter how much its doing so contributes to its evolutionary strategy. Human beings don’t welcome death, after all, even though that’s certainly part of our biological nature.


The world we are fortunate to inhabit is full of life, life of all sorts, life everywhere. But this profusion of creation seems to rely on mutual slaughter to continue.