William Abraham was born in Cwmavon, Glamorgan in 1842. Educated at the Cwmavon National School, he became a collier at the age of ten.
In 1873 he became a miners' agent, and in the 1885 General Election he became the Lib-Lab MP for the Rhondda. Abraham was a marvellous public speaker, and the miners gave him the nickname Mabon (the Bard). He also had a remarkable singing voice and would often entertain the miners at conferences and demonstrations.
Abraham remained active in the trade union movement, and by 1907 he was President of the South Wales Miners' Federation and Treasurer of the Miners' Federation of Great Britain.
Abraham won Rhondda in seven successive parliamentary elections and remained an MP until he retired in 1920.
William Abraham died on 14th May, 1922.
If any friction arose and pandemonium threatened - so easy to rouse, so difficult to quell - 'Mabon' never tried to restore order in any usual way. He promptly struck up a Welsh hymn, or that magical melody, "Land of My Fathers". Hardly had he reached the second line, when, with uplifted arms, as though drawing the whole multitude into the circle of his influence, he had the vast audience dropping into their respective "parts", and accompanying him like a trained choir. It was wonderful, almost magical, and the effect was thrilling. When the hymn or song was finished he raised a hand, and instantly perfect silence fell. The storm had passed.
United Methodists at the End of the Mainline
The United Methodist Church stands at a critical moment. Founded in 1968 at a time of ecumenical enthusiasm and euphoria, it now harbors within it forces that threaten to destroy it as a single body. Those forces did not arise overnight; indeed, they stretch back into the parent bodies that merged to form United Methodism. Three groups, the liberals, radicals, and conservatives, are finding their uneasy compromise difficult to maintain.
It has long been agreed that United Methodism is a coalition of diverse conviction and opinion, having been formed under the banner of theological pluralism. Church leaders took the view in the 1970s that the core identity of United Methodism, if there was one at all, was located in commitment to the Methodist Quadrilateral (Scripture, tradition, reason, and experience), and that this not only permitted but in fact sanctioned and fostered doctrinal pluralism.
Doctrinal pluralism, despite its intellectual incoherence, will work so long as something akin to Liberal Protestantism is held by the leadership of the church and so long as those who are not Liberal Protestants acquiesce. In fact pluralism is part of the intellectual structure of Liberal Protestantism. If you believe that Christian doctrine is essentially an attempt to capture dimensions of human experience that defy precise expression in language because of personal and cultural limitations, then the truth about God, the human condition, salvation, and the like can never be adequately posited once and for all; on the contrary, the church must express ever and anew its experience of the divine as mediated through Jesus Christ. The church becomes a kind of eternal seminar whose standard texts keep changing and whose conversation never ends. In these circumstances pluralism is an inescapable feature of the church's life. Pluralism effectively prevents the emergence of Christian doctrinal confession, that is, agreed Christian conviction and truth, and it creates the psychological and social conditions for constant self-criticism and review.
The incoherence of this position is not difficult to discern, despite its initial plausibility. On its own terms it cannot tolerate, for example, those who believe that there really is a definitive revelation of the divine, that the church really can discern and express the truth about God through the working of reason and the Holy Spirit, and that such truth is necessary for effective mission and service. Hence pluralism is by nature exclusionary. Thus it is no surprise that pluralists readily desert their pluralism in their vehement opposition to certain kinds of classical and conservative theology.
Pluralism is at once absolutist and relativist. It is absolutely committed to the negative doctrine that there is no divine revelation that delivers genuine knowledge of God; it is absolutely committed to a radically apophatic conception of Christian theology, so that no human language or concept, no product of reason at all, can adequately express the mystery of the divine; and it is absolutely committed to using theology to articulate Christian doctrine given the needs and idiom of the day. But it is relativist in its vision of what constitutes the material content of Christian doctrine at any point in history. Doctrine for the pluralists is the expression of Christian teaching as worked out by some appropriate theology and expressed in terms adequate to the culture of the day. To them, Christian tradition constitutes a series of landmark expressions of the faith which are worth exploring, but which must change to incorporate new insights and new truth. On this analysis tradition is seen to be a relatively benign, if not strictly binding, phenomenon.
More recently, however, a very different attitude to the church's tradition has emerged. There is now abroad in theology a form of Radical Protestantism which constitutes a whole new vision of Christian faith and existence. Its proponents claim that the tradition is dominated by patriarchy and exclusion, the product of oppressive forces linked to geographical location, social class, race, and gender. It is not to be tolerated, but stamped out and destroyed. Nobody, at least in public, would be prepared to state the matter that bluntly, but that is the truth of the matter.
Like the liberals, the radicals are both absolutists and relativists, but about different matters. They absolutize a commitment to liberation, emancipation, and empowerment. Equally absolute is the privileged position of designated victims of oppression. In some radical circles we can detect that a working doctrine of divine revelation has crept back into their discourse, where certain experiences of oppression and liberation are taken as epiphanies or as visible signs of the reign of God, and anything that questions the truth embedded in these experiences must be suppressed. On the other hand, radicals insist, we should not suppress the diverse convictions, ideologies, theories, and discourses of the new included groups. They become the real focus of pluralism as we try to foster different voices, experiences, readings, and proposals within the carefully circumscribed boundaries.
Within intellectual circles in United Methodism these developments have caused some consternation. Many of the great Liberal Protestant teachers of the tradition in the last generation have become disillusioned by the loss of their cherished conceptions of critical inquiry, courtesy, and academic standards. They are undergoing a mixed sense of despair, betrayal, and alienation. Their ideas of objective scholarship have been overtaken by forms of engaged or committed scholarship which they see as a mixture of radical subjectivism and political manipulation. A fertile few have managed to find a way to take on board some of the new theories without jettisoning the deep structure of their position, but the general sense is one of weariness and deep loss.
Recently, divisions that had only surfaced in academic discussions have begun to move out into the wider church. Significant numbers of women clergy now see opposition to their intellectual positions as ineradicably linked to right-wing Christianity or as inextricably tied to a backlash on the part of white male members in the church. This is entirely in keeping with the underlying convictions about knowledge and power that animate much of the new trend in theology.
These developments are a genuinely new arrival within the borders of United Methodism. This is not, of course, the first time that there has been a changing of the academic guard, but this time we have something more, an intentional political edge that does not permit it to be contained within the standard liberal language of tolerance and civility. "Engaged scholarship" brings into the heart of the discussion considerations related to emotion, commitment, personal identity, subjective reception, and radical enactment in the public arena. There is in fact a missionary dimension that drives its adherents to transform the church and the world. In this respect the new orthodoxy is very much like earlier forms of orthodoxy that sought to serve the church from within a very particular confessional stance. There is also a concomitant concern to link knowledge and action and to relate action to vital spirituality.
Many fine pastors, theologians, and administrators, people who have given a generation of service to the church and who are committed to a small core of Christological conviction surrounded by a very flexible outer ring of conviction, still imagine that things are much the same as they were when they were in seminary. Such leaders have been able to survive intellectually by folding the reigning diversity and pluralism into their conviction that Jesus really is the Son of God and the teacher and savior of the world. Their motto could be summed up: "Stick closely to Christ and leave the rest to God and human history." This is an inadequate body of doctrine for the long haul of history, but it has served a whole generation remarkably well. Although they are aware that the intellectual landmarks are changing, they find it difficult to believe that the basic commitment to civility, relevant evidence, and respect for the tradition of the church across the ages might be overtaken by a very different vision of the church. Yet it is only a matter of time before the changes identified above will force themselves upon these leaders.
To round out this contemporary portrait of the United Methodist Church, something needs to be said about conservative or classical Methodists. It is this group, often identified in secularist fashion as the right wing of the denomination, that is accused of splitting the church.
This charge is puzzling in the extreme, for the practice of even the hard-line conservatives has been anything but schismatic. Rather than pull out, they have opted over many years to stay in and work for renewal. Indeed, most conservatives within United Methodism are instinctively oriented to renewal rather than schism. Those committed to schism have already left and gone elsewhere. The conservative wing of the church is itself a fragile coalition, including those who lean in a catholic direction, those who are card-carrying charismatics, those inclined in an Anabaptist direction, and those who are really pragmatists at heart but for the moment lean to conservatism out of convenience and traditional piety. Those who believe that there is some kind of conspiracy afoot to pull out and form a new church overlook these differences among conservatives, and underestimate the difficulty of bringing them all together. The coalition holds together informally for the most part because of the perceived threat to the integrity and continuity of the Methodist tradition. Take away that threat and the inner divisions within the conservative wing of the church will quickly become visible.
Three additional considerations are pivotal for understanding the current mood among conservatives. First, they have been reasonably effective at the local level; in some cases their success in growing local churches has been spectacular. This has kept them busy and enabled them to ignore those features of the larger church that disturb them. Secondly, they have become more organized politically within the church as a whole. Though still at the margins, they now have to be reckoned with seriously. Thirdly, a network of highly educated conservative academics has begun something of a renaissance of classical Wesleyanism. The development of such a network opens the way for a deeper renewal, looking to issues of principle that would otherwise be ignored and to articulating a more forceful diagnosis of the situation in the church.
Schismatic activity would involve conservatives abandoning their own principles. There are few more telling pieces on the evils of schism and its consequences than that provided by the founder of Methodism, John Wesley. (The irony of Wesley's own position will not, however, be lost on the perceptive reader, for Wesley made this attack on parties within the church all the while he was organizing one of the most effective renewal movements that Anglicanism had seen.)
Consider the following comments:
As . . . separation is evil in itself, being a breach of brotherly love, so it brings forth evil fruit; it is naturally productive of the most mischievous consequences. It opens a door to all unkind tempers, both in ourselves and others. It leads directly to a whole train of evil surmisings, to severe and uncharitable judging of each other. It gives occasion to offense, to anger, and to resentment, perhaps in ourselves as well as in our brethren, which if not presently stopped, may issue in bitterness, malice, and settled hatred, creating a present hell wherever they are found, as a prelude to hell eternal.
Wesley provides a graphic catalogue of woes that follow from division and schism. Evil tempers lead to evil actions, which in turn lead some Christians to abandon the faith and put their eternal salvation at risk. Offense is given to the Holy Spirit, holiness is quenched, and evangelism suffers, for outsiders see no point in becoming Christian. Ultimately both the power and the very form of religion are destroyed. Even a cursory reading of Wesley is an antidote to any thought of schism in the church.
Despite these features of conservative Methodism, others still fear it as a source of division in the church, and perhaps understandably so. A new brand of conservative is emerging who is arguing that United Methodism really does have a substantial doctrine to which the tradition has been and should be committed. Non-conservative United Methodists instinctively fear that such a perspective will divide the church because it involves the marking of boundaries between those who are in and those who are out. In short, critics are relying on the old slogan that doctrine divides while experience unites. The insistence that United Methodism is a confessional church, a central claim of most conservatives, threatens the commitment to pluralism, diversity, and inclusiveness of the last generation of United Methodists. Here we have reached the nub of the charge, for abandoning pluralism and accepting diversity only within agreed boundaries does indeed represent a significant departure from the unstable orthodoxy that has been in vogue for so long.
Yet even this move on the part of conservatives need not lead to schism. On the contrary, those pressing this reorientation have done exactly what those committed to pluralism did a generation ago. They have worked out a careful account of the United Methodist tradition that rivals the prevailing one. They have proposed a deep conversation on the doctrinal identity of United Methodism, and they have insisted that any debate that emerges be conducted in a serious and civilized fashion. Moreover, they readily acknowledge that proposed legislative and other changes, if needed, should be carried out within the corridors and courts of the church in a rational and fair manner. Liberal Protestants should grasp the value of such an approach immediately. It is an open question whether they will actually do so, or whether they will join with Radical Protestants in dismissing this whole exercise as a cover for ideology and a quest for power.
In light of all these considerations, it is quite remarkable that United Methodism has been able to hang together for so long. While other factors are clearly involved, we have been fortunate to have had a cadre of Liberal Protestants who have been able to lead (albeit in a way that has exasperated both conservatives and radicals), and to have had a strong commitment on the part of conservatives to stay on board and work for renewal. However, as I have noted, this is now in the process of disintegrating, and it is the liberal commitment to pluralism that is giving way. Pluralism, much as it continues to be prized among liberals, is a self-destructive notion rejected by both radicals and conservatives. It is an inherently unstable arrangement that cannot survive either the force of logic or the march of events.
We are facing, then, the breakdown of a working consensus, and it is not difficult to imagine what it would take to complete the break. A headstrong figure, the theological and ecclesiastical equivalent of a Ross Perot, might emerge and insist that the whole church follow his way or die. A significant group of bishops could manage to develop an agenda deeply at odds with prevailing circumstances. Some large bodies, or jurisdictions, might become so alienated from the leadership of the church and so upset about funding policies in key areas that they decide to withhold all contributions to the Connection, the governing body of United Methodism.
Suppose there emerged from left or right an issue of moral commitment over which the diverse movements in the church could agree that church-wide action must be taken but could not agree on what action to take. Suppose, further, that this issue was logically related to matters of principle at a deeper level, so that one could not commit oneself on this issue without also making significant commitments about the internal logic and character of the tradition as a whole. Suppose, still further, that those demanding action intended to use not just argument and rhetoric but activist demonstration to secure their ends. Suppose, finally, that they were to form a community of local churches and other entities within United Methodism that both expressed their moral convictions and worked assiduously for the practical adoption of their agenda. If such a scenario were to develop, then there can be no doubting that the community would be ripe for outright schism.
It does not take a rocket scientist to work out what the relevant scenario actually is. Like all mainline Protestant denominations, United Methodism finds itself challenged on its traditional position on sexual morality by the emergence of the conscientious conviction that gay and lesbian relationships are a legitimate expression of God's good and diverse creation. Revisionists are sufficiently agitated by the righteousness of their cause that they deem it essential to make use of both rational and nonrational means to win over the church as a whole. More than a decade ago they took the important step of institutionalizing their position across the denomination.
There is a deep and unintended irony in this development. The theology driving the conscience of change is one that is deeply committed to inclusivism. In this theology gay and lesbian Christians have the same status earlier attributed to slaves and currently attributed to women, the status of those excluded from the traditional church. The clear aim is to include this new minority within the church, but the effect is to drive out those opposed to legitimizing homosexuality. Because they see themselves as agents of reconciliation and unity, the revisionists have difficulty seeing that their position is in effect exclusionary.
Awareness of this paradox may do little to alter the way things will turn out. Perceptive revisionists can see this, and they face a difficult dilemma. One prominent pastor personally committed to the position of the revisionists stated in a pastoral letter to his congregation that were the revisionists successful, those opposed to the legitimization of homosexuality would be forced to make a painful decision: they could either remain within a church that would stand for an agenda they found incompatible with obedience to Christ, or they could leave the church. "On an issue on which the whole body of believers finds so many unresolvable questions, I find it unacceptable to force a large number of our members to face this dilemma."
This is a refreshing acknowledgment of the matter. Equally refreshing in its honesty is the following comment of a senior pastor of a Reconciling (i.e., revisionist) congregation.
Now it is our turn to get honest. Although the creeds of our denomination pay lip service to the idea that Scripture is "authoritative" and "sufficient for faith and practice," many of us have moved far beyond that notion in our theological thinking. We are only deceiving ourselves, and lying to our evangelical brothers and sisters, when we deny the shift we have made.
We have moved beyond Luther's sola Scriptura for the same reason the Catholic Church moved beyond the canonized Scriptures after the fourth century. We recognize that understandings of situations change. "New occasions teach new duties." We have moved far beyond the idea that the Bible is exclusively normative and literally authoritative for our faith. To my thinking, that is good! What is bad is that we have tried to con ourselves and others by saying "we haven't changed our position."
Furthermore, few of us retain belief in Christ as the sole way of salvation. We trust that God can work under many other names and in many other forms to save people. Our views have changed over the years.
Such an admission makes clear that more is at stake on this issue than a new moral judgment of homosexuality. What is at stake are issues of principle, namely the role of revelation and Scripture in the formation of conscience, that affect matters of doctrine ranging from the place of the Methodist Quadrilateral in the formation of United Methodist identity to the place of Christ in salvation.
The dilemma for the conservatives, forced upon them by the attack against traditional teachings, is simple: they perceive their position to be essential to Christianity, so they cannot see it abandoned and retain loyalty to what is left.
Not surprisingly, we can look to the founder of Methodism for guidance. John Wesley recognized that not all internal disputes within the church could be traced back to bad faith or lack of love. Some were matters of conscience. Speaking of his relationship to his beloved Church of England, he wrote:
I am now, and have been from my youth, a member and a minister of the Church of England. And I have no desire nor design to separate from it till my soul separates from my body. Yet if I was not permitted to remain therein without omitting what God requires me to do, it would then become meet, and right, and my bounden duty to separate from it without delay. To be more particular, I know God has committed to me a dispensation of the gospel. Yea, and my own salvation depends upon preaching it: "Woe is me if I preach not the gospel." If then I could not remain in the church without omitting this, without desisting from preaching the gospel, I should be under a necessity of separating from it, or losing my own soul. In like manner, if I could not continue to unite with any smaller society, church, or body of Christians, without committing sin, without lying and hypocrisy, without preaching to others doctrines which I did not myself believe, I should be under an absolute necessity of separating from that society. And in all these cases the sin of separation, with all the evils consequent upon it, would not lie upon me, but upon those who constrained me to make that separation by requiring of me such terms of communion as I could not in conscience comply with.
This is a sobering admonition. Given that it appears within the canonical heritage of United Methodism, it is worth asking whether what it portends can be forestalled. How might division be avoided? We can think of several possibilities, all of them unlikely.
Perhaps there will be decisive new evidence or a fresh interpretation of the available doctrinal and empirical data that will lead one side to convert the other, thereby salvaging unity. This is a very unlikely possibility, for it is implausible to think that radically new evidence will emerge, or that a significantly new reordering of current data will be advanced. The standard lines are well known and unlikely to change.
Perhaps someone with the stature and wisdom of Solomon will emerge and find a way to develop a framework in which both sides could accept each other within an agreed consensus. This is an unlikely scenario for at least two reasons. First, the church as a whole has experimented at length with this very option in its commitment to doctrinal pluralism. As I have repeatedly argued, this is an incoherent and unstable arrangement that is now falling apart. Second, the tradition is too big and too full of parties, caucuses, movements, and organizations to permit such a person emerging on a national scale. The same logic applies to the possibility of concerted effort on the part of the Council of Bishops; the bishops themselves are deeply divided on the relevant issues and have now expressed that division in public.
Perhaps the revisionists will come to acknowledge the consequences of their position and withdraw either to form a new church or to join a church that advocates their position. This too is unlikely.
The revisionists do not present a monolithic front. In fact one of the most interesting features of the revisionist position is that it can harbor both liberals and radicals, a feat of significant proportions given the tension between these two groups. The revisionist position spans the field from those who might entertain second thoughts about their position all the way to those who are absolutely convinced that revision is demanded by the gospel, stems from the guidance of the Holy Spirit, and represents appropriate prophetic action in the current generation. Some of the latter also take the view that all opposition to their cause is prompted by bigotry, intolerance of minorities, and ignorance. Many of them believe that their cause is as correct as that of opposition to slavery and of the opening of ordination to women. Given these sorts of convictions, it is most unlikely that the revisionists will discontinue pursuing their aims within the church.
What then is likely to happen? Initially, much will depend on the speed of developments in the deliberations and actions of three major constituencies within the church: the liberal institutionalists, the racial and ethnic minorities, and the conservatives.
The institutionalists are concerned less with the rightness or wrongness of homosexuality and related issues than with the future of the denomination. Their natural reaction to the church's dilemma is a mixture of anger, distress, irritation, and fear. They would dearly love not to address the issues at all, to muddle through as best they can, and to stay clear of all talk of division and schism. Their heads may well be with the conservatives, but their hearts are with the revisionists; hence they find themselves inwardly torn. They especially fear any discussion that goes to the principles of the tradition, preferring to live as best as they can with whatever compromise is worked out. The time for decision for this group will come when they must enact the practices of the revisionists in their local churches. At that point their heads must win out over their hearts if a schism is to be avoided.
The minority groups (African Americans, Hispanic Americans, and Asian Americans) will also be crucial for future developments. In this case there will be even greater reluctance to side with the conservatives in the church. In the past, these groups have perceived conservatives to be suspect on racism, while on the other hand they worked with liberals in the fight for civil rights, and several of their theological heroes are crucial forerunners if not advocates of radicalism. Their natural alliance would seem to be with the revisionists. Yet much of the theological and liturgical content of the African American, Hispanic, and Asian American traditions is in fact deeply conservative and orthodox. It is, therefore, very possible that the leaders of these traditions could break with their earlier alliances and move in a significantly different direction.
Finally, there are the conservatives. Some of them will undoubtedly take an aggressive line, resorting to legislative action, mass mailings, letter-writing campaigns, verbal agitation, and the like. This is all the more likely in light of the recent narrow acquittal by a church tribunal of a pastor on the charges that he violated church law by performing a wedding ceremony for two lesbian members of his Omaha, Nebraska, congregation.
Other conservatives, those who would gladly identify themselves as moderates, traditionalists, or centrists, may well be glad that there are more radical conservatives around to raise the issues, but they are extremely nervous about any kind of drastic action. Tempted perhaps to take the line adopted by institutionalists, they will bide their time hoping that the crash never comes.
In the short term we need some way to hold off precipitous actions on the homosexual issue that will lead to the division of the church. But it is clear that homosexuality is but one of a number of potentially church-dividing issues. In the long term we need to stimulate conversation toward the emergence of a new theological consensus that might command the allegiance of a majority in the church at large.
However this important conversation continues, and it surely will continue, it must be informed by the very real possibility that the Liberal Protestant project exemplified by United Methodism was flawed from the start. Perhaps the very idea of theological pluralism was bound to self-destruct in time. These are the ominous questions now engaged. The truth and the church we love deserve from parties on all sides of these questions clear thinking, honest speaking, mutual respect, and much prayer and fasting.
William J. Abraham is the Albert Cook Outler Professor of Wesley Studies at the Perkins School of Theology, Southern Methodist University. He is the author of Waking From Doctrinal Amnesia: The Healing of Doctrine in the United Methodist Church (1995) and Canon and Criterion in Christian Theology from the Fathers to Feminism, just out from Clarendon/Oxford University Press.
William E. Abraham, author of "The Mind of Africa"
Born in 1934, William Emmanuel Abraham is a Ghanaian philosopher and author of The Mind of Africa (first published in 1962). A new edition of The Mind of Africa was published by Sub-Saharan Publishers in 2015, and it can be purchased from the African Books Collective online bookstore.
William attended school at Adisadel College in Cape Coast, Ghana, and went on to study philosophy at the University of Ghana, Legon, and then at Oxford University. At Oxford, he became the first African fellow of All Souls, and his interest in African politics quickly developed into a Pan-Africanist perspective. The Mind of Africa, written whilst at All Souls, was a fruit of that enlarged perspective.
Return to Ghana
During a visit to Ghana in 1962, the then President of Ghana, Kwame Nkrumah, persuaded William to move back to Ghana to teach at the University of Ghana, Legon. William subsequently became pro-vice-chancellor of the university, and chair of the three-person vice-presidential committee overseeing Ghana's affairs at times when President Nkrumah was abroad. In 1965 William was elected Member of Parliament for Cape Coast. During this period he also chaired the Abraham Commission into Trade Malpractices in Ghana (1965).
After the coup against Nkrumah
In February 1966, Kwame Nkrumah was overthrown in a police/military coup, and many of those close to him, including William, were arrested. William was imprisoned in Ussher Fort, Accra, for nine months, after which he was released and returned to his duties as a professor at the University of Ghana, before accepting an invitation to be a visiting professor at the University of Indiana. This was followed by a similar role at Macalester College. William finally moved to the University of California, Santa Cruz, to continue his teaching and research, and stayed there until his retirement. He continues as professor emeritus.
William is married to Marya Abraham and lives in St. Paul, Minnesota. He has nine children.
SELECTED PUBLICATIONS BY WILLIAM EMMANUEL ABRAHAM
2017 What Did Jesus Do? Some Theological Reflections, WestBow Press, May 2017. ISBN 1512785628
1987 "African Philosophy: Its Proto-History and Future History", in The Chronicles of Philosophy, Vol. V, D. Reidel
1987 "The Strategy of Plato's Philosophy of Language", in Logos and Pragma, a Festschrift for Professor Gabriel Nuchelmans, Aristarium Series, Vol. 3, Nijmegen
1985 "Sources of African Identity: Philosophical Foundations", in Africa and the Problem of Its Identity, ed. Alwin Diemer, Frankfurt am Main, Bern, and New York
1980 "Monads and the Empirical World in Leibniz", in Theoria cum Praxi, Wiesbaden
1978 "The Origin of Myth and Philosophy", Man and World, Vol. XI, No. 1/2, pp. 165-185
1975 "Africa Rediviva", book chapter in Readings in African Political Thought, G-C. Mutiso and S.W. Rohio, eds., Part VII, Ch. 19
1975 "Leibniz's Philosophy of Logic and Language", Man and World, Vol. 8, No. 3, August, pp. 347-358
1975 "Predication", Studia Leibnitiana, Band VII, Hannover, pp. 1-20
1974 "Disentangling the Cogito", Mind, LXXXIII, pp. 75-94
1972 "The Incompatibility of Individuals", Noûs, VI, 1, pp. 1-13
1972 "The Nature of Zeno's Argument against Plurality in DK 29B1", Phronesis, XVII, 1, pp. 44-52
1969 "Complete Concepts and Leibniz's Distinction between Necessary and Contingent Propositions", Studia Leibnitiana, 1 (4), pp. 263-279
1964 "The Life and Times of William Amo", Transactions of the Historical Society of Ghana
1962 "Creators of Literature", book chapter in Prospect, Alfred Hutchinson & Co., Ltd., London
About Chief William Abraham Hicks
William Abraham Hicks (1769 - 1837?, age 68) became Principal Chief of the Cherokee Nation in 1827 after being elected to succeed his older brother, Charles R. Hicks, the longtime Second Principal Chief who died on 20 January 1827, just two weeks after assuming office as Principal Chief. William served until October, 1828.
In 1832, he became a figurehead for the Cherokee Nation faction advocating a treaty for emigration west of the Mississippi River. In December 1833, members of the Treaty Party elected William Hicks as their Principal Chief (with John McIntosh as his assistant), though Major Ridge and son John Ridge were widely recognized as the true leaders of this faction. He died at Oothcaloga Creek, Georgia before the Removal at age 68.
Charles and William's parents are believed to be a Scottish trader named Nathan Hicks and Nan-Ye-Hi, a half-blood Cherokee woman, who was herself a child of a Swiss man named Jacob Conrad and a Cherokee wife. William married Sarah Bathia Foreman and had 14 children.
CHIEF WILLIAM ABRAHAM6 HICKS, SR, CHIEF (NA-YE-HI5 CONRAD, JENNIE4 ANI'-WA'YA, OCONOSTOTA3, MOYTOY2, A-MA-DO-YA1) was born Abt. 1769 in CNE [GA], and died Bef. November 1837.
He married (1) LYDIA QUA-LA-YU-GA HALFBREED Abt. 1792 in Spring Place, GA, daughter of BIG HALFBREED and QUA-LA-YU-GA CRITTENDEN. She was born Abt. 1776 in CNE [GA], and died 1849.
He married (2) SALLIE FOREMAN 1804 in Tennessee, daughter of JOHN FOREMAN and SUSIE TI-TA-S-GI-S-GI. She was born Abt. 1788 in CNE [TN], and died September 01, 1839 in Fairfield, CNW.
Notes for CHIEF WILLIAM ABRAHAM HICKS, SR, CHIEF:
OCCUPATION: Principal Chief, 1826 - 10/13/1828. Notes of Starr, Letter bks A-F, v1, p119, note C641.
List of students UBM at Spring Place, CN East, 1804-1834. Jerry Clark 8&9 Cher Fam Resch Fall 1992 and Spring 1993, page 10.
In the Cherokee emigration Rolls 1817-1835.
- 1833 Wm Hicks Sr. Age over 50 residing in Oothcaloga GA (b bef1783)
- 1833 Wm Hicks Jr. age under 25 from Oothcaloga GA (b aft1808)
- Wm Hicks Jr. Arrived May 8 1834.
Table 5, p407-418, The Brainerd Journal lists three students that entered the mission on 12/07/1818, Edward, Jesse and a Polly Hicks. (who is Polly Hicks?)
A Gift from Mary Lincoln
After Abraham Lincoln’s death, Mary went into mourning and remained in widow’s clothes until her own death in 1882. She gave some of her White House finery to family members. Her cousin, Elizabeth Todd Grimsley, received this purple velvet ensemble. In 1916 Grimsley’s son, John, sold the ensemble to Mrs. Julian James for the Smithsonian’s new First Ladies Collection.
John Grimsley attributed this dress to a “seamstress of exceptional ability” who “made nearly all of Mrs. Lincoln’s gowns.” Although he mistook her name as “Ann,” he most likely was referring to Elizabeth Keckly.
Little Known Black History Fact: William H. Johnson
William H. Johnson, an African-American man, was the personal valet of President Abraham Lincoln. Johnson was employed by the president well before he went to the White House. He was there when Lincoln received the Republican Nomination for president.
William Johnson accompanied the president to the famous Gettysburg Address in November 1863.
When Lincoln became president, he was pressured to fire Johnson because he wasn’t the traditional “paper bag” skin color of the other employees. Johnson was indeed fired, but Lincoln referred him for a high-profile job with the U.S. Treasury Department. Johnson also continued to do some odd jobs for the president, including fittings, valet and barber services, despite White House protocol.
The close friendship between Lincoln and Johnson was questioned for years: the president co-signed a loan for Johnson and buried him when he died. It may have been out of friendship or out of guilt. William H. Johnson died in January 1864 after nursing President Lincoln back to health when he showed symptoms of smallpox during the trip to deliver the Gettysburg Address.
When Johnson passed away, it was said that President Lincoln had buried his former servant in Arlington Cemetery on a plot with a tombstone that read “William H. Johnson, Citizen.”
President Lincoln never refuted the fact that he and William H. Johnson were friends, not even to the public.
The character of William H. Johnson is loosely portrayed by actor Anthony Mackie in the newly released film “Abraham Lincoln: Vampire Hunter,” in theaters now.
William Lewis Dayton (1807-1864)
- Regent of the Smithsonian Institution from 1861 to 1864
- 1819 (12) Attended the Brick Academy under Dr. Brownlee
- 1825 (18) – Graduated College of New Jersey (Princeton Univ.)
- 1830 – Passed the Bar
- Moved to Freehold, New Jersey
- 1837 (30) – Entered politics – voted NJ State Senator (upper house)
- Justice of the New Jersey Supreme Court, 1838-1841
- 1842 (35) – United States Senator, 1842-1851; appointed by Gov. Pennington to the seat of Samuel Southard (another Basking Ridge native) after Southard’s death
- 1856 (49) – Vice Presidential candidate for the Republican Party, 1856
- Attorney-General of New Jersey, 1857-1861
- 1861 (54) – Minister to France, 1861-1864
- 1864 (57) – Died in Paris
- Both of William’s parents, Joel (Plot 624) and Nancy, are buried in the Basking Ridge Presbyterian churchyard cemetery. His brothers Jonathan and Amos and his sister Bailey are also there.
- William and his wife Margaret Vanderveer (the Somerville line) are buried in Riverview Cemetery in Trenton.
- The Daytons had six children:
- Ferdinand Vanderveer (Buried Riverview Cemetery, Trenton)
- Anna Lewis (Buried Riverview Cemetery, Trenton)
- William Lewis Jr. (Buried Riverview Cemetery, Trenton)
- Edward Lewis (Buried Riverview Cemetery, Trenton)
- Robert (Buried Riverview Cemetery, Trenton)
- Margaret Vanderveer (Buried Riverview Cemetery, Trenton)
About the Writer
Brooks Betz is the official historian for Bernards Township. He is also the founder and trustee for the Mr. Local History Project, a non-profit dedicated to preserving and promoting local history with a social twist in the Somerset Hills of Northern Somerset County, New Jersey.
Who’s Biggest? The 100 Most Significant Figures in History
A data-driven ranking. Plus, have former TIME People of the Year been predictive?
Who’s bigger: Washington or Lincoln? Hitler or Napoleon? Charles Dickens or Jane Austen? That depends on how you look at it.
When we set out to rank the significance of historical figures, we decided to not approach the project the way historians might, through a principled assessment of their individual achievements. Instead, we evaluated each person by aggregating millions of traces of opinions into a computational data-centric analysis. We ranked historical figures just as Google ranks web pages, by integrating a diverse set of measurements about their reputation into a single consensus value.
Significance is related to fame but measures something different. Forgotten U.S. President Chester A. Arthur (whom we rank as the 499th most significant person in history) is more historically significant than young pop singer Justin Bieber (currently ranked 8633), even though he may have a less devoted following and lower contemporary name recognition. Historically significant figures leave statistical evidence of their presence behind, if one knows where to look for it, and we used several data sources to fuel our ranking algorithms, including Wikipedia, scanned books and Google n-grams.
To fairly compare contemporary figures like Britney Spears against the ancient Greek philosopher Aristotle, we adjusted for the fact that today’s stars will fade from living memory over the next several generations. Intuitively it is clear that Britney Spears’ mindshare will decline substantially over the next 100 years, as people who grew up hearing her are replaced by new generations. But Aristotle’s reputation will be much more stable because this transition occurred long ago. The reputation he has now is presumably destined to endure. By analyzing traces left in millions of scanned books, we can measure just how fast this decay occurs, and correct for it.
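The decay adjustment described in the paragraph above can be made concrete with a small model. This is a hypothetical sketch, not the study's actual algorithm: the 70-year half-life and the 100-year "living memory" window are assumed values chosen for illustration, whereas the authors fit their decay rate from traces in scanned books.

```python
# Toy illustration of correcting present-day fame for future decay.
# HALF_LIFE and the 100-year living-memory window are assumptions,
# not values from the study itself.
HALF_LIFE = 70.0  # assumed years for contemporary mindshare to halve

def stable_reputation(current_mindshare, years_since_peak):
    """Project today's measured mindshare to its long-run stable value.

    Recent figures still owe much of their fame to living memory, so
    their score is discounted; figures whose peak lies far in the past
    have already completed this decay and keep their full reputation.
    """
    remaining_decay_years = max(0.0, 100.0 - years_since_peak)
    return current_mindshare * 0.5 ** (remaining_decay_years / HALF_LIFE)

# Equal raw fame today, but very different projected significance:
pop_star = stable_reputation(1.0, years_since_peak=10)      # discounted
philosopher = stable_reputation(1.0, years_since_peak=2400)  # already stable
```

Under this toy model a contemporary star's score shrinks toward its post-memory level, while an ancient figure's score is left essentially unchanged, which is the intuition behind comparing Britney Spears with Aristotle on one scale.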
We don’t expect you will agree with everyone chosen for the top 100, or exactly where they are placed. But we trust you will agree that most selections are reasonable: a quarter of them are philosophers or major religious figures, plus eight scientists/inventors, thirteen giants in literature and music, and three of the greatest artists of all time. We have validated our results by comparing them against several standards: published rankings by historians, public polls, even in predicting the prices of autographs, paintings, and baseball cards. Since we analyzed the English Wikipedia, we admittedly measured the interests and judgments of primarily the Western, English-speaking community. Our algorithms also don’t include many women at the very top: Queen Elizabeth I (1533-1603) [at number 13] is the top-ranked woman in history according to our analysis. This is at least partially due to women being underrepresented in Wikipedia.
Each year since 1927, TIME Magazine has selected an official Person of the Year, recognizing an individual who “has done the most to influence the events of the year.” Our rankings provide a way to see how well these selections have stood up over time. Adolf Hitler proves to be the most significant Person of the Year ever. Albert Einstein was the most significant modern individual never selected for the annual honor, though TIME did name him Person of the Century in 1999. Elvis Presley is the highest-ranked figure to have been completely dissed: no author or artist has ever been so honored.
The least significant Person of the Year proves to be Harlow Curtice, the president of General Motors for five years during the 1950s, who increased capital spending in a time of recession, which helped spur a recovery of the American economy. Other obscure selections include Hugh Samuel “Iron Pants” Johnson, whom Franklin Roosevelt appointed to head the Depression-era National Recovery Administration and fired less than a year later. John Sirica was the District Court judge who ordered President Nixon to turn over tape recordings in the Watergate scandal. David Ho is credited with developing the combination therapy that provided the first effective treatment for AIDS. His contributions to human health arguably deserve a better significance rank than our algorithms gave him here.
William Abraham - History
The story, as Parson Weems tells it, is that in 1754 a strapping young militia officer named George Washington argued with a smaller man, one William Payne, who made up for the disparity in size by knocking Washington down with a stick. It was the kind of affront that, among a certain class of Virginia gentlemen, almost invariably called for a duel. That must have been what Payne was expecting when Washington summoned him to a tavern the following day. Instead, he found the colonel at a table with a decanter of wine and two glasses. Washington apologized for the quarrel, and the two men shook hands.
Whether or not this actually happened—and some biographers believe that it did—is almost beside the point. Weems’ intention was to reveal Washington as he imagined him: a figure of profound self-assurance capable of keeping an overheated argument from turning into something far worse. At a time in America when the code of the duel was becoming a law unto itself, such restraint was not always apparent. Alexander Hamilton was the most celebrated casualty of the dueling ethic, having lost his life in an 1804 feud with Aaron Burr on the fields of Weehawken, New Jersey, but there were many more who paid the ultimate price—congressmen, newspaper editors, a signer of the Declaration of Independence (the otherwise obscure Button Gwinnett, famous largely for being named Button Gwinnett), two U.S. senators (Armistead T. Mason of Virginia and David C. Broderick of California) and, in 1820, the rising naval star Stephen Decatur. To his lasting embarrassment, Abraham Lincoln barely escaped being drawn into a duel early in his political career, and President Andrew Jackson carried in his body a bullet from one duel and some shot from a gunfight that followed another. Not that private dueling was a peculiarly American vice. The tradition had taken hold in Europe several centuries earlier, and though it was frequently forbidden by law, social mores dictated otherwise. During the reign of George III (1760-1820), there were 172 known duels in England (and very likely many more kept secret), resulting in 69 recorded fatalities.
At one time or another, Edmund Burke, William Pitt the Younger and Richard Brinsley Sheridan all took the field, and Samuel Johnson defended the practice, which he found as logical as war between nations: “A man may shoot the man who invades his character,” he once told biographer James Boswell, “as he may shoot him who attempts to break into his house.” As late as 1829 the Duke of Wellington, then England’s prime minister, felt compelled to challenge the Earl of Winchelsea, who had accused him of softness toward Catholics.
In France, dueling had an even stronger hold, but by the 19th century, duels there were seldom fatal, since most involved swordplay, and drawing blood usually sufficed to give honor its due. (Perhaps as a way of relieving ennui, the French weren’t averse to pushing the envelope in matters of form. In 1808, two Frenchmen fought in balloons over Paris; one was shot down and killed along with his second. Thirty-five years later, two others tried to settle their differences by skulling each other with billiard balls.)
In the United States, dueling’s heyday began at around the time of the Revolution and lasted the better part of a century. The custom’s true home was the antebellum South. Duels, after all, were fought in defense of what the law would not defend—a gentleman’s sense of personal honor—and nowhere were gentlemen more exquisitely sensitive on that point than in the future Confederacy. As self-styled aristocrats, and frequently slaveholders, they enjoyed what one Southern writer describes as a “habit of command” and an expectation of deference. To the touchiest among them, virtually any annoyance could be construed as grounds for a meeting at gunpoint, and though laws against dueling were passed in several Southern states, the statutes were ineffective. Arrests were infrequent; judges and juries were loath to convict.
In New England, on the other hand, dueling was viewed as a cultural throwback, and no stigma was attached to rejecting it. Despite the furious sectional acrimony that preceded the Civil War, Southern congressmen tended to duel each other, not their Northern antagonists, who could not be relied upon to rise to a challenge. Consequently, when South Carolina congressman Preston Brooks was offended by Massachusetts senator Charles Sumner’s verbal assault on the congressman’s uncle, he resorted to caning Sumner insensible on the floor of the Senate. His constituents understood. Though Brooks was reviled in the North, he was lionized in much of the South, where he was presented with a ceremonial cane inscribed “Hit Him Again.” (Brooks said he had used a cane rather than a horsewhip because he was afraid Sumner might wrestle the whip away from him, in which case Brooks would have had to kill him. He didn’t say how.)
Curiously, many who took part in duels professed to disdain the practice. Sam Houston opposed it, but as a Tennessee congressman shot Gen. William White in the groin. Henry Clay opposed it, but put a bullet through Virginia senator John Randolph’s coat (Randolph being in it at the time) after the senator impugned his integrity as secretary of state and called him some colorful names. Hamilton opposed dueling, but met Aaron Burr on the same ground in New Jersey where Hamilton’s eldest son, Philip, had died in a duel not long before. (Maintaining philosophical consistency, Hamilton intended to hold his fire, a common breach of strict dueling etiquette that, sadly, Burr didn’t emulate.) Lincoln, too, objected to the practice, but got as far as a dueling ground in Missouri before third parties intervened to keep the Great Emancipator from emancipating a future Civil War general.
So why did such rational men choose combat over apology or simple forbearance? Perhaps because they saw no alternative. Hamilton, at least, was explicit. “The ability to be in future useful,” he wrote, “ . . . in those crises of our public affairs which seem likely to happen . . . imposed on me (as I thought) a peculiar necessity not to decline the call.” And Lincoln, though dismayed to be called to account for pricking the vanity of a political rival, couldn’t bring himself to extend his regrets. Pride obviously had something to do with this, but pride compounded by the imperatives of a dueling society. For a man who wanted a political future, walking away from a challenge may not have seemed a plausible option.
The Lincoln affair, in fact, affords a case study in how these matters were resolved—or were not. The trouble began when Lincoln, then a Whig representative in the Illinois legislature, wrote a series of satirical letters under the pseudonym Rebecca, in which he made scathing fun of State Auditor James Shields, a Democrat. The letters were published in a newspaper, and when Shields sent him a note demanding a retraction, Lincoln objected to both the note’s belligerent tone and its assumption that he had written more of them than he had. (In fact, Mary Todd, not yet Lincoln’s wife, is believed to have written one of the letters with a friend.) Then, when Shields asked for a retraction of the letters he knew Lincoln had written, Lincoln refused to do so unless Shields withdrew his original note. It was a lawyerly response, typical of the verbal fencing that often preceded a duel, with each side seeking the moral high ground. Naturally, it led to a stalemate. By the time Lincoln agreed to a carefully qualified apology provided that first note was withdrawn—in effect asking Shields to apologize for demanding an apology—Shields wasn’t buying. When Lincoln, as the challenged party, wrote out his terms for the duel, hopes for an accommodation seemed ended.
The terms themselves were highly unusual. Shields was a military man; Lincoln was not. Lincoln had the choice of weapons, and instead of pistols chose clumsy cavalry broadswords, which both men were to wield while standing on a narrow plank with limited room for retreat. The advantage would obviously be Lincoln’s; he was the taller man, with memorably long arms. “To tell you the truth,” he told a friend later, “I did not want to kill Shields, and felt sure that I could disarm him . . . and, furthermore, I didn’t want the damned fellow to kill me, which I rather think he would have done if we had selected pistols.”
Fortunately, perhaps for both men, and almost certainly for one of them, each had friends who were determined to keep them from killing each other. Before Shields arrived at the dueling spot, their seconds, according to Lincoln biographer Douglas L. Wilson, proposed that the dispute be submitted to a group of fair-minded gentlemen—an arbitration panel of sorts. Though that idea didn’t fly, Shields’ seconds soon agreed not to stick at the sticking point. They withdrew their man’s first note on their own, clearing the way for a settlement. Shields went on to become a United States senator and a brigadier general in the Union Army; Lincoln went on to be Lincoln. Years later, when the matter was brought up to the president, he was adamant. “I do not deny it,” he told an Army officer who had referred to the incident, “but if you desire my friendship, you will never mention it again.”
If Lincoln was less than nostalgic about his moment on the field of honor, others saw dueling as a salutary alternative to simply gunning a man down in the street, a popular but déclassé undertaking that might mark a man as uncouth. Like so many public rituals of the day, dueling was, in concept at least, an attempt to bring order to a dangerously loose-knit society. The Englishman Andrew Steinmetz, writing about dueling in 1868, called America “the country where life is cheaper than anywhere else.” Advocates of the duel would have said that life would have been even cheaper without it. Of course, the attitudes dueling was meant to control weren’t always controllable. When Gen. Nathanael Greene, a Rhode Islander living in Georgia after the Revolution, was challenged by Capt. James Gunn of Savannah regarding his censure of Gunn during the war, Greene declined to accept. But feeling the honor of the Army might be at stake, he submitted the matter to George Washington. Washington, who had no use for dueling, replied that Greene would have been foolish to take up the challenge, since an officer couldn’t perform as an officer if he had to worry constantly about offending subordinates. Indifferent to such logic, Gunn threatened to attack Greene on sight. Greene mooted the threat by dying peacefully the following year.
Even more than Captain Gunn, Andrew Jackson was an excitable sort with a famously loose rein on his temper. A survivor, barely, of several duels, he nearly got himself killed following a meeting in which he was merely a second, and in which one of the participants, Jesse Benton, had the misfortune to be shot in the buttocks. Benton was furious, and so was his brother, future U.S. senator Thomas Hart Benton, who denounced Jackson for his handling of the affair. Not one to take denunciation placidly, Jackson threatened to horsewhip Thomas and went to a Nashville hotel to do it. When Thomas reached for what Jackson supposed was his pistol, Jackson drew his, whereupon the irate Jesse burst through a door and shot Jackson in the shoulder. Falling, Jackson fired at Thomas and missed. Thomas returned the favor, and Jesse moved to finish off Jackson. At this point, several other men rushed into the room, Jesse was pinned to the floor and stabbed (though saved from a fatal skewering by a coat button), a friend of Jackson’s fired at Thomas, and Thomas, in hasty retreat, fell backward down a flight of stairs. Thus ended the Battle of the City Hotel.
It was just this sort of thing that the code of the duel was meant to prevent, and sometimes it may have actually done so. But frequently it merely served as a scrim giving cover to murderers. One of the South’s most notorious duelists was a hard-drinking homicidal miscreant named Alexander Keith McClung. A nephew of Chief Justice John Marshall—though likely not his favorite nephew, after engaging in a duel with a cousin—McClung behaved like a character out of Gothic fiction, dressing from time to time in a flowing cape, given to overripe oratory and morbid poetry, and terrifying many of his fellow Mississippians with his penchant for intimidation and violence.
A crack shot with a pistol, he preferred provoking a challenge to giving one, in order to have his choice of weapons. Legend has it that after shooting Vicksburg’s John Menifee to death in a duel, the Black Knight of the South, as McClung was known, killed six other Menifees who rose in turn to defend the family honor. All of this reportedly generated a certain romantic excitement among women of his acquaintance. Wrote one: “I loved him madly while with him, but feared him when away from him for he was a man of fitful, uncertain moods and given to periods of the deepest melancholy. At such times he would mount his horse, Rob Roy, wild and untamable as himself, and dash to the cemetery, where he would throw himself down on a convenient grave and stare like a madman into the sky. . . . ” (The woman refused his proposal of marriage; he didn’t seem the domestic type.) Expelled from the Navy as a young man, after threatening the lives of various shipmates, McClung later served, incredibly, as a U.S. marshal and fought with distinction in the Mexican War. In 1855, he brought his drama to an end, shooting himself in a Jackson hotel. He left behind a final poem, “Invocation to Death.”
Though the dueling code was, at best, a fanciful alternative to true law and order, there were those who believed it indispensable, not only as a brake on shoot-on-sight justice but as a way of enforcing good manners. New Englanders may have prided themselves on treating an insult as only an insult, but to the South’s dueling gentry, such indifference betrayed a lack of good breeding. John Lyde Wilson, a former governor of South Carolina who was the foremost codifier of dueling rules in America, thought it downright unnatural. A high-minded gentleman who believed the primary role of a second was to keep duels from happening, as he had done on many occasions, he also believed that dueling would persist “as long as a manly independence and a lofty personal pride, in all that dignifies and ennobles the human character, shall continue to exist.”
Hoping to give the exercise the dignity he felt sure it deserved, he composed eight brief chapters of rules governing everything from the need to keep one’s composure in the face of an insult (“If the insult be in public . . . never resent it there”) to ranking various offenses in order of precedence (“When blows are given in the first instance and returned, and the person first striking be badly beaten or otherwise, the party first struck is to make the demand [for a duel or apology], for blows do not satisfy a blow”) to the rights of a man being challenged (“You may refuse to receive a note from a minor. . . , [a man] that has been publicly disgraced without resenting it. . . , a man in his dotage [or] a lunatic”).
Formal dueling, by and large, was an indulgence of the South’s upper classes, who saw themselves as above the law— or at least some of the laws—that governed their social inferiors. It would have been unrealistic to expect them to be bound by the letter of Wilson’s rules or anyone else’s, and of course they were not. If the rules specified smoothbore pistols, which could be mercifully inaccurate at the prescribed distance of 30 to 60 feet, duelists might choose rifles or shotguns or bowie knives, or confront each other, suicidally, nearly muzzle to muzzle. If Wilson was emphatic that the contest should end at first blood (“no second is excusable who permits a wounded friend to fight”), contestants might keep on fighting, often to the point where regret was no longer an option. And if seconds were obliged to be peacemakers, they sometimes behaved more like promoters.
But if bending the rules made dueling even bloodier than it had to be, strict adherence could be risky too. Some would-be duelists discovered that even the code’s formal preliminaries might set in motion an irreversible chain of events. When, in 1838, Col. James Watson Webb, a thuggish Whig newspaper editor, felt himself abused in Congress by Representative Jonathan Cilley, a Maine Democrat, he dispatched Representative William Graves of Kentucky to deliver his demand for an apology. When Cilley declined to accept Webb’s note, Graves, following what one Whig diarist described as “the ridiculous code of honor which governs these gentlemen,” felt obliged to challenge Cilley himself. Subsequently, the two congressmen, who bore each other not the slightest ill will, adjourned to a field in Maryland to blast away at each other with rifles at a distance of 80 to 100 yards. After each exchange of shots, negotiations were conducted with a view to calling the whole thing off, but no acceptable common ground could be found, though the issues still at stake seemed appallingly trivial. Graves’ third shot struck Cilley and killed him.
Though President Van Buren attended Cilley’s funeral, the Supreme Court refused to be present as a body, as a protest against dueling, and Graves and his second, Representative Henry Wise of Virginia, were censured by the House of Representatives. On the whole, though, outrage seemed to play out along party lines, with Whigs less dismayed by the carnage than Democrats. Congressman Wise, who had insisted the shooting continue, over the protests of Cilley’s second, was particularly defiant. “Let Puritans shudder as they may,” he cried to his Congressional colleagues. “I belong to the class of Cavaliers, not to the Roundheads.”
Ultimately, the problem with dueling was the obvious one. Whatever rationale its advocates offered for it, and however they tried to refine it, it still remained a capricious waste of too many lives. This was especially true in the Navy, where boredom, drink and a mix of spirited young men in close quarters on shipboard produced a host of petty irritations ending in gunfire. Between 1798 and the Civil War, the Navy lost two-thirds as many officers to dueling as it did to more than 60 years of combat at sea. Many of those killed and maimed were teenage midshipmen and barely older junior officers, casualties of their own reckless judgment and, on at least one occasion, the by-the-book priggishness of some of their shipmates.
In 1800, Lt. Stephen Decatur, who was to die in a celebrated duel 20 years later, laughingly called his friend Lieutenant Somers a fool. When several of his fellow officers shunned Somers for not being suitably resentful, Somers explained that Decatur had been joking. No matter. If Somers didn’t challenge, he would be branded a coward and his life made unbearable. Still refusing to fight his friend Decatur, Somers instead challenged each of the officers, to be fought one after another. Not until he had wounded one of them, and been so seriously wounded himself that he had to fire his last shot from a sitting position, would those challenged acknowledge his courage.
The utter pointlessness of such encounters became, in time, an insult to public opinion, which by the Civil War had become increasingly impatient with affairs of honor that ended in killing. Even in dueling’s heyday, reluctant warriors were known to express reservations about their involvement by shooting into the air or, after receiving fire, not returning it. Occasionally they chose their weapons—howitzers, sledgehammers, forkfuls of pig dung—for their very absurdity, as a way of making a duel seem ridiculous. Others, demonstrating a “manly independence” that John Lyde Wilson might have admired, felt secure enough in their own reputations to turn down a fight. It may not have been difficult, in 1816, for New Englander Daniel Webster to refuse John Randolph’s challenge, or for a figure as unassailable as Stonewall Jackson, then teaching at the Virginia Military Institute, to order court-martialed a cadet who challenged him over a supposed insult during a lecture. But it must have been a different matter for native Virginian Winfield Scott, a future commanding general of the Army, to turn down a challenge from Andrew Jackson after the War of 1812. (Jackson could call him whatever he chose, said Scott, but he should wait until the next war to find out if Scott were truly a coward.) And it had to be riskier still for Louisville editor George Prentice to rebuke a challenger by declaring, “I do not have the least desire to kill you. . . . and I am not conscious of having done anything to entitle you to kill me. I do not want your blood upon my hands, and I do not want my own on anybody’s. . . . I am not so cowardly as to stand in dread of any imputation on my courage.”
If he did not stand in such dread, others did, since the consequences of being publicly posted as a coward could ruin a man. Yet even in dueling's heartland south of the Mason-Dixon line, the duel had always had its opponents. Anti-dueling societies, though ineffectual, existed throughout the South at one time, and Thomas Jefferson once tried in vain to introduce in Virginia legislation as strict—though surely not so imaginative—as that in colonial Massachusetts, where the survivor of a fatal duel was to be executed, have a stake driven through his body, and be buried without a coffin.
But time was on the side of the critics. By the end of the Civil War, the code of honor had lost much of its force, possibly because the country had seen enough bloodshed to last several lifetimes. Dueling was, after all, an expression of caste—the ruling gentry deigned to fight only its social near-equals—and the caste whose conceits it had spoken to had been fatally injured by the disastrous war it had chosen. Violence thrived; murder was alive and well. But for those who survived to lead the New South, dying for chivalry's sake no longer appealed. Even among old dueling warriors, the ritual came to seem like something antique. Looking back on life's foolishness, one South Carolina general, seriously wounded in a duel in his youth, was asked to recall the occasion. "Well, I never did clearly understand what it was about," he replied, "but you know it was a time when all gentlemen fought."
- ROSS DRAKE is a former editor at People magazine who now writes from Connecticut. This is his first article for SMITHSONIAN.
The Fullmers were among the earliest settler families of Spring Glen, arriving on March 10, 1889. The head of the family, Edwin Fullmer, served as the second bishop of the Spring Glen Ward. He was born on March 30, 1860 at Provo, Utah. When he was a young boy the family moved to Hobble Creek, just east of Springville. It was there that he married Ada Maria Mendenhall on January 11, 1884. He had met his wife while working as a logger, having headed down Spanish Fork Canyon to find work at a logging camp. He had previous logging experience around Coalville, cutting timbers for the construction of the D & RGW Railway. He and his brother had then worked at the copper belt mine at Marysvale, where they were harassed because of their religion. At that point he headed for Spanish Fork Canyon and met his future bride.
The newlyweds moved to Tucker, now a ghost town, and had three children, who were delivered at the home of Ada's mother in Spanish Fork. Edwin continued to work for the railroad but was unhappy with the necessity of spending so much time away from home and with the frequent accidents that occurred on the railway. Hearing of their concerns, Ada's uncle, James Davis Gay, invited them to come to Spring Glen and sold them some of his property.
The Fullmers arrived in Spring Glen on March 10, 1889 and remained there twelve years. During that time they had six more children. They took up farming on the west side of the river near the homestead of Ada's uncle, James Gay. The town of Spring Glen was located on the east side of the river, and crossing at flood time was always a challenge. However, they were regular in church attendance and in November 1889 Edwin was set apart as first counselor to Bishop Heber J. Stowell at the organization of the Spring Glen Ward. On May 8, 1893 he was ordained bishop.
On their land west of the river the family probably cultivated grain and raised livestock. On other land east of town there were fruit trees, shrubs, bees and berries. Edwin and six other members of the family contracted malaria, which they believed was caused by the damp rising from the trees and the river. To avoid further infection, they moved to a spot on the eastern side of town on a hill by the Spring Glen canal, now Sacamanos. There they built a log cabin which is still standing today. (CR-18-495) The cabin was enlarged on two occasions; a shed-roofed addition to the east was used by Edwin Fullmer as his office.
The family left Spring Glen in 1901 and moved to several different places. First they went to Castle Gate, where Edwin worked in the power house. A year later they went to Scofield, where he worked in the mine with his brother Alonzo. Most of the family was still ill with malaria, and the Fullmers' next child was stillborn. For a while they returned to Spanish Fork, Utah, and then moved to Raymond, Alberta, Canada in the fall of 1903. Their last two children were born in Canada, and the younger members of the family were raised there. In 1924, after their children were grown, Edwin and Ada moved to Legrande, Oregon, where he died on February 28, 1940. Ada also died there ten years later.
In spite of their relatively short tenure in Spring Glen (twelve years) the Fullmers are well-remembered as among the earliest settlers and leading citizens. The preservation of at least one of their cabins is a tangible reminder of their contribution.