Saturday, January 28, 2012

The uneasiness of a Neolithic monument

Why is it important that there is a Neolithic monument, built 11,500 years ago, at Gobekli Tepe in Turkey? As Elif Batuman explains ("The Sanctuary: The world's oldest temple and the dawn of civilization," The New Yorker, Dec. 19 & 26, 2011, at 72), the answer is that the people who built it were apparently hunter-gatherers. That means that hunter-gatherers could accumulate enough surplus food and assemble enough labor power to afford the project, which presumably took years of mass effort to create. (76)

If that's right, then agriculture wasn't necessary to accumulation and social organization. In fact, studies of Neolithic and later skeletons suggest that hunter-gatherers were healthier and better fed than their agriculturalist successors. The people who built this monument were taller than the modern Turks who are excavating it. (80-81)

So why did people turn to agriculture? Batuman suggests that the answer is ideology: "The findings at Gobekli Tepe suggest ... that it was actually the need to build a sacred site that first obliged hunter-gatherers to organize themselves as a workforce, to spend long periods of time in one place, to secure a stable food supply, and eventually to invent agriculture." (73-74) If ideology (say, the exaltation of kings) calls for centralized, hierarchical life, agriculture supplies the economic framework. And this drive pushes people into the new social organization even at the cost of their own health. And, as an archaeologist spending his professional life excavating this site told Batuman, "Ninety percent had to work, and ten per cent lived by wealth. The elite wanted to keep their advantage, and they had the power to do it." (83)

There are other possibilities, surely. In the long run, worldwide, the turn to agriculture has supported a vast increase in the number of people who can live, and (mostly, I hope) live without the peril of starvation. To enable more of us to exist and prosper is, presumably, a net gain for the human race. Perhaps Neolithic people thought their own lives and those of their children would improve if they settled down to farm. If so, their hopes were misplaced for some thousands of years, and that misjudgment could have been an instance of false consciousness, another form of ideological oppression. But it could also have been just a mistake -- perhaps a cognitive error of some sort -- we certainly can't be sure!

Still, if we focus on Batuman's account, what is striking is that it is about as anti-Marxist as you can get. Control of the means of production doesn't shape the form of society. Rather, beliefs generate a form of society, and a disposition of the means of production, that obey their commands. That reversal of the Marxian formula may say a lot about what drives our history, and our lives -- ideas first, perhaps, and objective conditions only second. That may be hopeful.

But it's fascinating that the author intersperses reporting on Neolithic life and archaeological investigation with comments on discrimination against women in the area of Turkey where all this takes place. At least for me, unfamiliar with Turkish names, it only gradually becomes clear that the author is a woman.

But at that point her account of modern Turkey sheds a very bleak light on her archaeological story. Ideology and power drove the ancient societies of this region, and in a direction that made many of their lives worse. Ideology and power still drive modern society, and often still in directions that make our lives worse. Possibly the turn to agriculture was particularly bad for women, condemning them to rigid male rule and "more frequent, more debilitating pregnancies." (82) Again, we don't know. In the long run, I think we are all slowly becoming better off. But it's possible to wonder, in particular, just how much 11,500 years of history has done for at least one half of the human race: women.

Saturday, January 21, 2012

The importance, and limits, of cross-cultural understanding


Anne Fadiman's gripping book, The Spirit Catches You and You Fall Down: A Hmong Child, Her American Doctors, and the Collision of Two Cultures (1997), is a powerful demonstration of the value of cross-cultural sensitivity. It's clear from her book that dedicated, expert American doctors treating a very young Hmong girl for epilepsy and loving, attentive Hmong parents raising her in Merced, California could have engaged with each other far better than they did. More translators would have helped. More interest by the health providers in the parents' perceptions and understandings would have too. Less aura of authority on the part of the providers might also have elicited more trust and candor.

Had the health institutions and professionals managed all this -- in the midst of the many pressures of budget, workload and emotion that were part of their daily life -- communication would have been better. The parents would have better understood what they were being asked to do and why. The health providers, for their part, would have realized more quickly that the parents were profoundly loving and attentive, rather than essentially evasive and noncompliant (though they were these things too).


But even then the distance between the parents and the doctors would have been daunting. The Hmong parents believed that their daughter's epilepsy was caused by her soul leaving her body and getting lost, evidently with the aid of a soul-stealing spirit, a dab -- hence the Hmong term for epilepsy, qaug dab peg, or "the spirit catches you and you fall down." (20) Hmong people find many familiar Western medical interventions -- spinal taps, emphatically, but also lesser intrusions such as drawing blood -- a threat to the body and soul. (33, 61) "The only form of [Western] medical treatment that was gratefully accepted by at least some of the Hmong in the Thai camps [to which they had fled as they escaped from persecution in Laos] was antibiotic therapy, either oral or by injection." (34)

Fadiman speaks only briefly (76-78) about the Merced doctor who had the most Hmong patients. The other doctors in town didn't think he was very skilled. In the few words of his that Fadiman quotes, he does not come across as deeply reflective. But it does appear that what his Hmong patients like is that he doesn't try to push them to accept treatments they don't like -- to be precise, he "doesn't cut." (76) And he has a reason for this stance, quite a good one -- that "It's their body." (77)


One way to understand this statement is that this doctor is truly patient-centered, and that may be so. The principles of nondomination captured in the idea of patient-centeredness -- the medical analogue to lawyers' client-centeredness -- are powerful ones.


But I wonder if this physician was reflecting an insight more sociological than ethical. Fadiman never offers any formula for what the health providers should have done for the little girl who is the center of the story -- that's not her point in the book. But she does express the hope that someday the voices of the American doctors and Hmong family members will merge into one conversation. (ix) Someday, yes. But perhaps we should accept that that may take generations. In the meantime, what the doctor popular with the Hmong offered might be called "peaceful coexistence" -- an acceptance that some gaps can't yet be bridged, some goals can't be accomplished, and that we should try to work together with mutual respect but relatively limited ambitions while the passage of time slowly brings us closer together.

Saturday, January 7, 2012

Can students prepare for law practice in the three years of law school? And what if they can't?

Do lawyers need more than three years of law school? Or could they get all they need to know from a three-year program better assembled than the ones most law schools have now?

Let's approach this question in a practical way.

First, probably just about every law school has a prescribed first-year curriculum consisting of courses introducing students to basic areas of legal doctrine, such as contracts law, and to basic skills of lawyering (sometimes including interpersonal skills such as interviewing and counseling, always including legal research, legal writing, and "thinking like a lawyer"). It's possible to argue over which building blocks are the most fundamental, but I think few would deny the need for something like this starting year.

Second, in the following two years there are often more courses in important areas of law to be taken. Some may be required, others just more or less emphatically recommended. At New York Law School, where I teach, Constitutional Law and Professional Responsibility are required upper-year courses. Other courses that students might be well-advised to take include Evidence; Wills, Trusts & Future Interests; a basic Tax course; Corporations; and a class in Criminal Procedure. These seven courses add up to close to a full second year's worth of study. Students usually spread them across their second and third years, but let's imagine that they take these courses all together, in their second year. That year is then just about full.



That leaves one year. Is there anyone who believes that a real introduction to the many complexities of practice deserves less than one full year? I doubt it. There are certainly people who believe this year, or most of it, should take place after graduation, but who would hire a lawyer with less than a year's experience if he or she could afford a better-prepared attorney? (For that matter, who would hire a lawyer with one year's experience if he or she could afford someone with five years of practice, or ten?)


In fact, law schools today do not allocate a full year to practice training -- whether in clinics, simulations, externships or other courses that focus on introducing students to the world of practice and inculcating professional strengths and values. Judge Jose Cabranes, at the Association of American Law Schools annual meeting yesterday, proposed that the third year be devoted to apprenticeship; that would be a radical step. (I've proposed a version of this idea too, a "clinical year" on the lines of medical school clinical rotations.)


But if we imagine a law school consisting of two years of substantive law and introduction to lawyering skills, followed by one year of practice apprenticeship, we haven't described a program that provides all that a lawyer might need. In a curriculum like this, there's no room for training in economics, or in sociology, or in psychology; nor for jurisprudence or legal history or philosophy. I don't mean that these subjects could never come up. The first-year contracts course could introduce students to the economic analysis of law. A clinic could introduce students to psychological insights about clients. But these introductions will likely be just that. If, say, lawyers who want to deal sensitively with their clients need a serious understanding of how clients' behavior and wishes are shaped by their social and psychological makeup, the three years I've just described won't provide it. 


So what can we do? Well, here are some possibilities:


(1) We accept that we cannot teach anything except substantive law and applied skills; all broader, less strictly legal, forms of knowledge are less important or at least less fundamental. Realistically, if this means that students end their professional schooling without those other forms of knowledge, we must assume most will never acquire them. 

(2) We teach less of either substantive law or applied skills, thus making room for some of the other forms of knowledge in the three years of law school.


(3) We insist (as a colleague of mine suggested) that students study some of the related fields of knowledge before they come to law school; then law school can build on what the students have already learned.


(4) We offer, or require, at least one additional year of training, perhaps in the form of an LL.M.


There's a lot to be said about each of these. But I want to close this post by considering one problem shared by all four of them: they all seem to require law students to get at least the seven years of post-secondary education they now receive, four in college and three in law school -- if not more. Yet the expense of those seven years, the debt that students incur to pay for them, and the troubling state of the law graduate job market have led to calls for programs that would enable lawyers to practice after as few as five years of post-secondary schooling. If we move in that direction, we lose teaching time. If nothing else changes, the result will be that we stop teaching something -- two years' worth of something, if we cut back to five-year programs. What won't we teach? Is there any way we can teach what we now teach more quickly? Can we re-think what we need to teach, and find curricula that will set students on the right course better than our programs now do?


I'll keep talking about these questions in posts to come.

Tuesday, January 3, 2012

Happy New Year -- and another look at the prison conditions case of Brown v. Plata


Back in June of 2011 I wrote a post on a Supreme Court case, Brown v. Plata, limiting the permissible population of California's prisons as a remedy not for unconstitutional overcrowding but for violation -- by reason of overcrowding -- of the right to decent health care while in state custody. I said the decision was a good one, but not entirely.

Here, briefly, is the reason for the qualification: the oddity of limiting prison population without finding that the number of inmates was itself unconstitutional. (Justice Alito's dissent emphasizes that the case did not turn on a finding that the overcrowding was unconstitutional in itself. (Alito, J., dissenting, slip opinion at 1-2.)) This makes the right to adequate health care somehow more demanding than the right against overcrowding. That's not necessarily unreasonable -- the presence of abysmal health care may be more unmistakable, and its consequences sharper, than would be true for overcrowding as such. But I don't think the courts have articulated such a priority listing of rights, and without one the special leverage given to one right over the other is puzzling.

Nor is it easy to produce a listing of human claims that convincingly measures them all against each other. Consider, for instance, the claims of different groups of school children. Do handicapped children have greater rights to public support for their education than poor children? Do members of racial minorities have greater claims than their peers disadvantaged economically or by handicap?

What about weak students compared to average students? Weak students compared to strong students may be easier, since strong students may prosper even without special support. Or, perhaps more likely, strong and weak students will both get special support, leaving those in the middle least assisted. Do those in the middle have a claim comparable to any of these other groups?

There are, to be sure, some weak rights, such as the right not to be discriminated against on some ground society isn't much worried about (say, whether your factory makes margarine or butter). But rights that have real force give those who enjoy them a big, often probably a huge, edge in conflicts over the allocation of society's always scarce resources. That's a central reason people fight to have their claims established as rights.

It's a smart move, and I think the "rights explosion" of the past 50 years has broadly made our society more just -- and did so in this particular case. But in a society less than fully committed to social justice, even declaring a right hardly guarantees it will be fully honored. Meanwhile, the pockets of rights that we do recognize aren't evenly distributed, and so -- sometimes -- they probably create their own degrees of injustice as the weak compete against the weaker for the benefits society is prepared to distribute.

So I'm glad the prisoners won their case. But the priority of health care rights as against overcrowding rights reflects how uneven, if not inequitable, our distribution of rights and resources sometimes is.