1.21 Case Under Submission

Many of the earlier essays in this group focused on various causes of ineffectiveness in the legal system. Most recently I dealt with the extreme ineffectiveness and waste of time involved with grand juries in the State of Alabama. In other essays, I focused on ineffectiveness that results from the economics of law practice. This column continues the focus on ineffectiveness in the legal system. The judicial system itself has certain built-in problems.

Trial court judges are overloaded with responsibilities and overworked. Caseloads are large. The effectiveness of a trial judge depends to some extent on the motivation of the particular judge. The ultimate responsibility for decision making rests with the trial court judge in most instances. There is little supervision of the day-to-day work of a trial court judge.

The appeal system, which will be the subject of a later essay in this series, certainly does not provide supervision. At best, if the trial court judge makes a mistake, it can be reversed by an expensive appeal process after the lapse of several months. That does not constitute any real supervision. There is the Judicial Inquiry Commission, if the judge’s conduct steps considerably outside the norm. But in the normal course of events the work of a trial court judge is unsupervised. Ultimate responsibility must rest somewhere, and in our legal system, it rests with the trial judge. “The buck stops” with the trial court judge, in large measure. For the system to be effective, judges must be totally honest, bright, and possessed of a strong work ethic.

There is an ethical provision that requires a judge to report to a bureaucracy any cases or matters that have been under submission for more than six months! Why should any matter ever remain under submission for over six months? Based on my experience as a circuit judge, I know that a circuit judge often deals with dockets involving thirty or forty cases on any given day. If ten matters are taken under submission each working day, that adds up to fifty under submission in a week, two hundred matters in a month, and, in six months, a whopping twelve hundred matters. Of course, some of those matters would have been decided in the meantime. Nevertheless, simple mathematics tells us that, if there is an average delay of just two months in deciding cases, the judge will always be required to remember the details of several hundred different matters. Judges are human, and the human mind just doesn’t work that well.
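For readers who want to check the arithmetic, here is a minimal sketch. The figures are the hypothetical ones used above (ten matters per day, a two-month average delay), not court statistics; the steady-state line simply applies the standard queueing rule that a standing backlog equals intake rate times average delay (Little's Law):

```python
# Back-of-the-envelope model of a judge's "under submission" backlog.
# All figures are the illustrative ones from the essay, not court data.

TAKEN_PER_DAY = 10         # matters taken under submission each working day
WORKING_DAYS_PER_WEEK = 5
WORKING_DAYS_PER_MONTH = 20

# Cumulative intake if nothing were ever decided:
per_week = TAKEN_PER_DAY * WORKING_DAYS_PER_WEEK      # 50 in a week
per_month = TAKEN_PER_DAY * WORKING_DAYS_PER_MONTH    # 200 in a month
per_six_months = per_month * 6                        # 1,200 in six months

# Steady state: if each matter waits an average of two months before
# decision, the standing backlog is intake rate x average delay.
AVG_DELAY_MONTHS = 2
steady_backlog = per_month * AVG_DELAY_MONTHS         # 400 pending matters

print(per_week, per_month, per_six_months, steady_backlog)
```

Even with matters steadily being decided, the two-month average delay leaves roughly four hundred matters pending at any moment, which is the "several hundred different matters" a judge must carry in mind.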

Effectiveness requires a quicker turnaround. While we are talking about the psychology of retaining enough information to make good decisions, we should remember that the largest amount of memory loss occurs during the first night’s sleep after exposure to pending matters. Recall does not improve with the passage of time for a judge any more than it does for anyone else. If notes grow cold, the facts are difficult to recover, and who can keep up with several hundred sets of notes?

The hope of lawyers and litigants that the judge is spending a great deal of time meditating on a decision in their case is often without any real basis in fact. Congestion of the docket brings about delay, and delay does not make the decision-making process any more effective or efficient. First impressions are often the best impressions, and if a judge is reasonably convinced after hearing arguments, there should be little delay in making the decision. A quick turnaround will improve the effectiveness of decision making in most instances. Needless to say, there are, from time to time, complicated legal issues that require study and careful analysis, and the judge needs to identify those cases and spend the required amount of time in study. However, routinely taking cases under submission and not deciding them within the optimal period of time is very detrimental to the effectiveness of the legal system. While some systems of accountability may be helpful, the only real solution will always remain in the integrity and work ethic of the judges themselves.

1.22 Individualism In the Legal Profession

Individualism in the legal profession, like other walks of life, thrives on legends and myths. The profession cherishes certain images. One of these images is the image of the rugged individual.

Rugged individualism is an important part of the American tradition. Pioneers were rugged individuals. The legal profession, an adversarial profession, cherishes this image, but it is not an accurate image of the profession in modern times. In the days of the pioneers, it probably was a reasonably accurate image of the role of lawyers. Up until the Civil War the primary method of legal education was apprenticeship. Would-be lawyers “read” the law in the offices of practitioners. There were no large firms. Law schools were not an important part of the picture. Individuals were the practitioners. Practitioners such as Daniel Webster, Abraham Lincoln, and Henry Clay certainly fit the definition of rugged individuals. They were powerful orators. They developed strong reputations. Their trials attracted audiences.

The Civil War, and its aftermath, brought many changes to the legal profession. Regardless of many other cultural causes, the Civil War, in a large sense, was about whether there would be an industrial revolution in the United States. It pitted the Northern industrial economy against the Southern agrarian economy. The central commodity for both was cotton. Slavery was an adjunct feature of the agrarian economy and became a popular cause. There were strongly held anti-slavery sentiments, but they did not precipitate the war. But as the saying goes, “All is fair in love and war.” It was inevitable that the industrial political forces would take advantage of the strong anti-slavery emotions. It is a bit ironic that at the same time the Union forces were “dealing with the slavery issue,” they were also removing Native Americans from their own lands, taking their land by military force. The underlying roots of the Civil War were strongly embedded in the desire for industrial progress. The question was whether politics would be controlled by an industrial economy or an agrarian economy.

Cotton was the principal commodity at issue. The industrial revolution in the United States began with the cotton mills of Lowell, Massachusetts. Of course, railroads and manufacturing were also important elements of the industrial revolution. The War decided the issue in favor of industrialism. Shortly after the Civil War ended, in 1871, Harvard Law School launched the case method of study for law schools. Under the leadership of Dean Christopher Columbus Langdell, the case method put a double twist on the idea of precedent in the law. Not only was the previous case a source of law and a basis for a decision; the study of earlier cases became the primary method for the study of law. This seemed to show that courts “make” law.

Civil War veteran Oliver Wendell Holmes, Jr., was one of the Harvard Law School professors. He was also a member of the “Metaphysical Club,” along with the great American psychologist and philosopher William James. Holmes would become an important legal philosopher and a Justice of the United States Supreme Court. No doubt seeking to find a foundation for law based on the strong sentiment for empiricism that was consistent with the pragmatic thought of that club, he made statements like “Law is what a court does.” This furthered the idea that courts actually make law rather than finding it in the beliefs and practices of the culture. (It is one thing for courts to “make” law in struggling to properly decide a case; it is quite another for courts to use the decision in a case as an excuse to preempt the work reserved to the legislative branch.) But this was the milieu in which law school came to be the accepted means of legal education.

Corporations controlled industry. Large law firms like the Cravath firm in New York quickly became the advisors to developing corporate America. Ownership of property quickly transitioned from individual ownership to corporate ownership, with individuals being stockholders who did not manage the wealth. As law firms developed, the “A” students were hired by the leading firms, worked in teams of specialists, and became the advisors of business. Rugged individuals were not in charge.

Even in the changed environment, the mythology of individualism in the practice of law continued. After the turn of the 20th Century, there was the famous Scopes Monkey Trial in Tennessee pitting Clarence Darrow against William Jennings Bryan. The famous names helped to perpetuate the myth of the importance of individualism in the practice of law. However, the main business of advising the railroads and burgeoning corporate America rested with the proliferating large law firms. The actual practice of law as it affected the development of America was not in the hands of individual solo lawyers.

Nevertheless, the imagery is still a highly romantic notion that affects the self-perception of the members of the legal profession. The State of Alabama, during the first half of the Twentieth Century, was gradually emerging from the agrarian economy and moving toward the industrial economy. That provided the perfect setting for To Kill a Mockingbird. In the pleasant glow of that romantic notion of lawyering, the Alabama legal profession in recent years formed the “Atticus Finch Society.” While the bedrock virtues of total honesty and loyalty to the client, the legal system, and the truth remain absolutely necessary for the successful operation of any legal system, reality today does not sustain the Atticus Finch role model in the actual operation of the legal system any more than doctors making house calls provide an adequate image for today’s medical care.

The reality of law today is dominated by and exists in a complex corporate environment. The image of individualism is no longer a valid working model for attorneys. We need to find new images that retain all of the necessary virtues.

1.23 Natural Law and the United States Constitution

The concept of natural law was the prevailing philosophy of law throughout the formative period of modern nation states. The idea was that law is something that occurs naturally. It exists in nature and is there to be discovered. Nature, of course, includes human nature and the nature of human society. In the created order of nature, some solutions to problems are better than others. Under natural law theory, the task of courts and legislative bodies is to find that law and declare it.

At about the time the United States came into existence, a philosophy of law called legal positivism asserted itself. The theory of legal positivism is that law originates purely in human intellectual activity. Humans simply invent law. When the United States Constitution was adopted, the idea was that the Congress (and state legislatures), composed of people chosen by democratic processes, would create laws. The courts would decide cases by applying the law created by the legislative branches of government. The executive branch would carry out enforcement of the law and provide government. All of that was embedded in the United States Constitution with its treasured concept of separation of powers. Theoretically, the branches of government would hold each other in check, preserving liberty to the people.

Early in the history of the United States, the question arose as to which branch, legislative, judicial, or executive, would have the power to ultimately declare the meaning of the United States Constitution. In the landmark case of Marbury v. Madison, the United States Supreme Court decided that it had the authority to examine federal legislation (obviously produced by Congress) to determine whether that legislation is consistent with the Constitution. This decision was made against the background understanding that the legislative branch makes laws and courts decide cases. From that time forward the Supreme Court asserted the right to declare legislation that is inconsistent with the Constitution unconstitutional and unenforceable.

That issue came back to the forefront during the Great Depression, when the conservative Supreme Court of the United States declared New Deal legislation unconstitutional. Roosevelt even threatened to pack the Supreme Court with new members in order to have his legislation declared constitutional. Although he did not do that, during his twelve years in office he appointed a much more liberal Supreme Court that made decisions much more consistent with his thinking.

During the second half of the 20th century, a new era arose in constitutional law under the direction of that liberal Supreme Court. Up to that point, the Supreme Court had limited itself to overturning federal legislation if it was inconsistent with the Constitution. During the second half of the 20th century, however, the court began to declare positive law based on its interpretation of the Constitution. For instance, in the case of Miranda v. Arizona, the Supreme Court spelled out exactly what a police officer had to say to a suspect in a criminal case before interrogating that suspect. The now famous Miranda warnings were not statutes enacted by Congress, but requirements declared by the Supreme Court of the United States. Note the huge difference between declaring legislation unconstitutional on the one hand, and declaring positive law based on what the Supreme Court thinks the Constitution means on the other. Many other instances of the declaration of positive law occurred.

The Supreme Court of the United States appeared to actually select cases for the express purpose of declaring positive law, a function which most of us thought the Constitution reserved to Congress. That assigns to the Constitution an oracle-like quality. What does the Constitution have to say about abortions or gay marriages? I suspect that the writers of the Constitution would be shocked to find that anybody believes that the Constitution had anything to say about either of those topics or other topics for which the Supreme Court has used the Constitution as a theoretical basis for the declaration of positive law.

One of the problems for the natural law theory was the question as to who declares the natural law. At one point in time, the church assumed that role. When the legal positivist movement began, courts were severely criticized for declaring positive law under the guise of “finding” natural law. By any reasonable standard it would certainly appear that the oracle-like approach to the United States Constitution by the federal courts is subject to that same criticism today. With natural law the courts could look to the entire wisdom of the culture. The legislative branch could change the law to reflect the will of the people. But when the United States Supreme Court makes declarations of positive law in the name of the Constitution, their decisions cannot be reversed by the legislative branch. This is a far cry from Marbury v. Madison and review of legislation to determine if it is constitutional. When Congress was creating law, it made sense for courts to review. But who can review the positive law declared by the court? What happened to the checks and balances insofar as the court system is concerned?  And how is that a just power, derived from the consent of the governed?

1.26 Faith About What Is Right

At the time I wrote this essay as an op-ed for the Alabama Gazette, I was teaching a course about faith in my church. I make the rather obvious point that faith is what we really believe. I’m afraid that sometimes we don’t really believe what we say we believe. We always act consistently with our faith. Jesus said, “Ye shall know them by their fruits.” But if our actions always reflect what we believe, then why don’t we always do what we know is the right thing to do?

Plato thought that if a person knows what is right, he or she will act consistently with that knowledge. Plato has a following. A Twentieth Century expert in moral formation, Lawrence Kohlberg, built his cognitive developmental theory of moral development on that premise. At first blush the Plato/Kohlberg position appears to be consistent with my contention that we always act consistently with our faith. But St. Paul said it well: “The good which I would, I do not; but the evil which I would not, that I do.” Flip Wilson may have explained it all when he said, “The devil made me do it.”

Learning correct principles of moral behavior is very important. If we don’t know what is right, there is a strong probability that we won’t do what is right. We need to know what is right. However, Plato and Kohlberg were wide of the mark with their theory that if we know what is right we will do what is right. Merely “knowing” is not enough. Knowing is not complete faith. Faith involves emotions as well as cognitive learning. Faith is what we believe with our whole heart. The “knowing” has to connect to the emotions to produce right behavior. If moral training that connects the emotions to knowledge of right doesn’t happen in early childhood, learning that occurs later in life may not cure the problem. Psychologists Sigmund Freud and Erik Erikson nail it in their psychosocial theories of moral development. They recognize the role that emotion plays in moral development. Parents and peer group install the conscience. We don’t always act consistently with correct moral principles that we “learn,” but we are quite likely to act consistently with those principles if parents and peer group embed them in our emotions.

Even when we knowingly do things that we shouldn’t do, an understanding of faith provides the explanation.  We always act consistently with what we believe at the moment, even if the belief is simply that we can get away with things that we should not do.  In the fleeting but permanent slice of time that we call “now,” when all action takes place, what we do always reflects what we believe, even if shame and remorse overtake us immediately afterwards. 

A properly installed conscience is more than shame.  It is more than fear of detection.  Whatever it is within us that watches our thoughts and connects to our feelings governs behavior.  It pushes us to achieve the ideal self or ego-ideal posited by Sigmund Freud.  The ego ideal is the good person that we want to be. 

A well-developed sense of duty is and always has been essential for human progress. It usually brings about proper behavior. Moral behavior makes the group more efficient. Moral requirements serve the needs of the group. Human survival and social evolution depend on individual conformity to the requirements of the group. A properly functioning moral system promotes human progress. The belief that we might “get away with it” does not always, or even most often, override moral convictions. Sound moral development usually produces morally correct behavior without the necessity of outside compulsion. Knowingly doing what is wrong demonstrates inadequate moral formation. The “internal observer” (Freud’s superego?) watches over our thoughts and feelings, weighs the knowledge of what is right and wrong, the possible embarrassment of detection, and the damage to our ideal image of ourselves that would result from immoral behavior, and makes a judgment about what to do.

If there is a violation of moral expectations, the group will exert moral force to try to make the individual take responsibility for any damage resulting from inappropriate conduct. The group’s greatest ally is a properly installed conscience.

[You may check the names mentioned and the terms italicized in this article on the internet for more information.]

1.29 Opinion Polls

It seems that every time I answer the phone these days it is a new opinion poll. The pollsters have even gotten into cell phones. I probably receive at least twelve or fifteen such calls each week. The calls are intrusive and excessive. The benefits arising from the overburden of surveys are highly questionable. This essay was written for the Alabama Gazette in early 2016, and I receive fewer calls from pollsters now. However, the problems with polls and polling persist.

The problems facing government these days are highly complex, to say the least. Most of us have our hands full simply dealing with the problems that confront us in daily life. So what is the wisdom of politicians seeking opinions from people who, prior to hearing the question, have not given any critical thought to the issue raised? Compiling all of the ignorance in the United States is not likely to create wisdom. The solutions to complex problems require critical thinking.

In many instances the solutions to the problems confronting the United States will require new and imaginative creative thinking. Unfortunately, there is a well-recognized tendency amongst us human beings to reject or attempt to destroy anything that we do not understand. Thus creative solutions to problems are not likely to fare well in public opinion polls. Opinion polls are likely to appeal to the very worst in human nature, invoking prejudice and knee-jerk reactions.

I suspect that a large number of people, among them some of the most intelligent, find the opinion polls objectionable. These intelligent people probably hang up the phone without responding. If this is the case, then that practice certainly calls into question the statistical validity of the huge number of public opinion polls conducted by telephone. Many of the polls are anything but impartial. The prejudices of pollsters are likely to be reinforced by the opinionated people who are anxious to respond to the pollster’s questions, while those on the opposite side tend to hang up.

The impression is that many of the polls are actually instigated by political candidates. If the results of the poll are not beneficial to the candidate, what is the likelihood that the results will be published? What is the likelihood that a political candidate paying for a poll during the political season is doing so simply to inform himself or herself as to how the public feels, in order to make informed political decisions? But if the results turn out to be favorable, what is the likelihood that the candidate will publish them? And what is the likelihood that the publication of such polls will adversely affect the wisdom of public opinion?

All of this is not intended to suggest that the well-considered opinion of the public is unimportant. In another essay I have described the crucial role played by consensus reality, which is actually built on public opinion. The purpose of this essay is simply to suggest that the polls as conducted are not likely to elicit wise public opinion. In order to be valuable, public opinion needs to be well-informed public opinion. It does not need to be opinion expressed after a hard day’s work during which no thought whatsoever was given to the issues raised by the pollster.

This discussion suggests important issues. First it is clear that opinion polls, as presently conducted, are not a desirable way to deal with the determination of public opinion. One of the underlying problems is that most members of the general public do not have enough information at their disposal to formulate decisions on many of the most important public issues. Or, stated differently, most members of the general public are not inclined to avail themselves of the information that is available on important public issues. Sports events are more entertaining than political debates. Political debates are interesting only when they project the prejudices and engage in the demagogic rhetoric that the public enjoys. Unfortunately, the solutions to our problems are not usually found in the debates that we find entertaining.

This article is, itself, an appeal to public opinion. It is a suggestion that the public should voice its opposition to unsolicited public opinion polls. But at the same time it is a suggestion that the public needs to begin acquiring the information necessary to reach informed opinions on public issues. The public needs to find ways and means to express those opinions in a meaningful fashion. And those do not include arming prejudiced news media with an inside scoop on what the public believes.

The main area in which the public needs to form sound judgment is in the selection of the politicians who will ultimately make the decisions. It is neither possible nor desirable for the general public to participate directly in the decision-making process as to the underlying issues. That is why we have a representative democracy.  It is much more important to select leaders who will make wise decisions, and whose judgment can be trusted, than to nail those leaders down on specific issues.

1.32 Ignorance: The Mother of Evil

First, let’s focus on ignorance; then we can develop its relationship to evil. Ignorance certainly includes lack of knowledge, but ignorance is not merely the absence of knowledge. Ignorance has a positive existence in human consciousness. It consists of incorrect beliefs more often than the mere absence of correct beliefs. The incorrect beliefs are strongly and passionately held and adamantly defended. Ignorance doesn’t just stand on a street corner and rant and rave; it talks on iPhones and moves in emails. It gains access to media, public offices, courtrooms, classrooms, and even churches. It enters into and affects behavior just like any other strongly held belief.

Like other strongly held beliefs, ignorance is often shared by the groups of which an individual is a part. It is what the group believes. It is handed from generation to generation in the bosom of the family. The fact that the individual is surrounded by others who maintain the incorrect belief makes it extremely difficult to eradicate. It actively resists truth and correction. The faulty dictum that one opinion is just as good as another is a cornerstone of ignorance. Some opinions obviously are better than others. Ignorance and incorrect opinion form the basis for inappropriate behavior.

Against this background it is not at all difficult to see a connecting link between ignorance and evil. The fact that the Bible teaches that we have all sinned shows that none of us are immune from ignorance. This discussion is about the concepts of ignorance and evil.  It is not a pretext for launching into a tirade about some specific activity that may be going on in the world right now, although there is plenty of ignorance and evil that could be discussed.  No particular issue precipitated this essay!

Just a word or two about good and evil.  Anything that promotes the welfare and survival of humanity is good.  Anything that is harmful to humanity and decreases the likelihood of human survival is evil.  Now let’s get theological. The Old Testament concept of sin, or evil, is breaking the law. The law was (and still is) a gift of God. Jesus made it clear that he did not come to destroy the law but to fulfill the law. The abundant life that he advocated does not happen simply because one does not break the law. Breaking the law is not a good thing to do, but following the law does not necessarily make one good. The New Testament concept of evil is “missing the mark.”  It is failure to live life in a way that brings about good.  When one actively pursues an abundant life, and does things that make the world a better place to live, good arises.

The New Testament concepts of good and evil differ dramatically from beliefs that arise out of the ancient Babylonian religious milieu and its mythology. In that mythology, the hero Marduk destroys the force of evil, which is epitomized by a giant serpent, Tiamat. The material universe arose from the remains of the evil beast. Thus, in that way of thinking, good arises from the destruction of evil. Those fundamental differences in cultural beliefs may explain a lot about the things that are happening in the world today. But they give rise to pretty serious questions about what to do about evil.

Christians believe that the Crucifixion of Jesus was an evil thing. But Jesus said, “Father, forgive them, for they know not what they do.” His request was predicated on the fact that the evil actions arose from ignorance. The Bible says that the wages of sin is death. So theoretically evil ultimately destroys itself, and that will put it out of business. Evil ultimately doesn’t work out. That seems to be consistent with Darwin’s theory of survival of the fittest. But all of that theology and philosophy doesn’t provide great peace of mind for most of us as we watch a world that seems to be churning with evil. It does not eliminate the strong temptation to aggressively fight evil at every opportunity, and to adamantly believe in our cause.

The Christian viewpoint is eminently correct. We cannot save the world merely by fighting evil. Of course, when someone like Hitler comes along, we have to resist. But even that does not create good. A lot of good things were destroyed in World War II. In the final analysis, to promote good, we have to show the world a better way. When we look at the eons involved in the evolution of human consciousness, it becomes clear that we can deal with ignorance and evil only by finding and offering a better way.

1.33 A Story About Schools

Public education was and is the great American dream. Nevertheless, since the 1950s, we have seen a broad-based movement toward the privatization of education. It is against this background that I tell my story. Stephen Covey who wrote The 7 Habits of Highly Effective People suggested that one of those habits is “keeping the main thing the main thing.” For a public education system, excellent, effective education is the main thing.

I was born in rural Macon County and attended Shorter High School, a public school for grades one through twelve. There were fewer than 100 students in all twelve grades. There were nine members of my 1960 graduating class. I used to say that there were five basketball players and four cheerleaders, but that may not be politically correct!

Mrs. Steele Bibb was our principal and a Huntingdon College graduate. Four members of my graduating class, including Betty Menefee, who would later become my wife, signed up to attend Huntingdon. When I almost backed out because of fear of the tuition (they were charging almost $1,000 a year!), I was recruited by Coach Neal Posey. I suspect that he knew that I could not play basketball that well, but also knew my ACT score. Mrs. Bibb knew I liked basketball, and probably “recruited” Neal Posey. Shorter High School provided an excellent education. An amazing percentage of graduates attended college. But soon after my graduation, the case of Lee v. Macon, which desegregated the Alabama public schools, made Macon County the battleground between the politics of Gov. George Wallace and the power of Judge Frank Johnson, neither of whom was an educator.

I had a good academic record at Huntingdon and served as president of the Student Government Association five years after John Ed Mathison and five years before Jeff Sessions. I was easily accepted into the University of Alabama Law School. I was one of the first Huntingdon graduates to attend law school, although there have been many since then. The foundation that the Shorter school provided passed every test.

I returned to Montgomery to practice law.  After a couple of years living in Montgomery, I moved back to Macon County in 1970, in the opposite direction from the “white flight” that was generally occurring.  I continued practicing law in Montgomery. The public school system in Macon County was no longer the same.  Our children, Philip and Mike, attended the Montgomery Academy.  In 1982, I was elected Circuit Judge in Alabama’s Fifth Judicial Circuit, which includes Macon County, where I served for 18 years.

My brother Wade had graduated from Shorter High School in 1956. He attended Troy State Teachers College and got a degree in education. He was hired to teach at the newly formed Montgomery Academy, a private school with emphasis on college preparation. Wade taught mathematics. He continued pursuing his education and got a Master’s degree in education at the University of Alabama.

After teaching at the Montgomery Academy for a number of years and serving as an interim headmaster, he was appointed Headmaster at the Montgomery Academy. He was influential in hiring Mrs. Duke and Mrs. Jolly who had taught both of us at Shorter High School. They taught at the Montgomery Academy for several years. 

My son, Philip, graduated from the Montgomery Academy in 1985, in a class of 40 and was included among eight National Merit finalists in that class. A very bright black student from Macon County was also in that class and several others were enrolled at the Academy.  In 1986 we moved to Tallassee and Mike completed his education in the public school.  Philip now practices intellectual property law in Chicago, and Mike practices law with me in Tallassee.  This is my story.

Without question, much social progress was made by the changes in public education that have occurred since 1960. But something very important was lost. Our governments did not keep the main thing the main thing. Everyone would have been better off if they had.

Today is a new day. We must find ways to rehabilitate the dream of excellence in public education. Privatized education can never reach the talented intellects that are mired in poverty and other disadvantages.  Excellent, effective, free public education is still the great equalizer. But the education system (like the army?) must offer the opportunity to everyone to be all that they can be.  “No child held back” is just as important as “no child left behind.”  We can only renew the dream of universal excellent public education by making the main thing the main thing.