Have We Lost Our Morals?

The Moral Basis of a Free Society

by Steve Forbes, The Hoover Institution

Saturday, November 1, 1997

No nation has ever enjoyed the status that America does today. Our strength comes not just from the might of our economy or the brilliant capabilities of the men and women in our armed forces. It comes also from the example we set for the rest of the world of how a free people can adapt to and advance in changing times and circumstances.

While others look to us, however, Americans themselves are seeking answers to some painful and bitter questions.

Can a free society survive the collapse of the two-parent family, where one-third of children are born into homes without fathers? Can a free society long endure a culture in which newborn babies have been thrown into trash dumpsters and young people have doubled their rate of heroin use in a single year?

As the 20th century comes to an end, the world is learning from America that the economic and political freedoms that come from capitalism and democracy are the most powerful and productive way to organize society.

At the same time, we in America are discovering that capitalism and democracy alone are not enough to sustain a healthy, vibrant society. We are learning the hard way that a self-governing nation must consist of self-governing individuals.

A breakdown in the moral fabric of society has dire consequences. An explosion of violence, crime, drug abuse, sexual promiscuity, and out-of-wedlock births undermines the blessings of liberty and prosperity.

The stakes, therefore, are enormous. If America makes the economic, political, and moral changes necessary to move forward in the years ahead, then the rest of the world has a chance of getting it right. But if America drifts off course, then the rest of the world will be in trouble as well.

Americans have always defined true freedom as an environment in which one may resist evil and do what is right, noble, and good without fear of reprisal. It is the presence of justice tempered with mercy.

It is a rule of law based on fundamental moral truths that are easily understood and fairly and effectively administered. It offers individuals and families equal opportunity to better their lives morally, spiritually, intellectually, and economically.

Freedom, in other words, is neither a commodity for dictators to distribute and deny at will nor a moral, spiritual, or political vacuum in which anything goes.

Freedom is a priceless treasure that the state is supposed to safeguard. Why? Because human beings have an intrinsic right to be free, a right that comes not from the state but from God. To the Founding Fathers, this was a “self-evident” truth. It is the essence of the American experiment in self-government.

The Founders, even those most suspicious of organized religion, believed that man’s place in the universe was no accident–that man himself and the world in which he lived were created and sustained by a just and loving God.

“It is impossible to account for the creation of the universe without the agency of a Supreme Being,” wrote George Washington, “and it is impossible to govern the universe without the aid of a Supreme Being.”

 James Madison put it this way: “The belief in a God All Powerful, wise and good, is so essential to the moral order of the World and to the happiness of man, that arguments which enforce it cannot be drawn from too many sources.”

To navigate the oceans without consulting fixed stars, Americans knew, is to risk being turned around by waves and wind, circling aimlessly with dwindling stores of food and water.

To believe in the randomness of man’s appearance on the earth, the Founders likewise intuitively understood, would be to deny the existence of fixed moral truths, established outside of man’s own personal whims and predilections.

In such a world, no one could judge with authority what is right or wrong because everyone would be entitled to his own personal system of values. Hence there could be no equality before the law because the law would consist of whatever people in power declared it to be.

That would elevate jungle law–what Darwin would later term “survival of the fittest”–over the rule of natural law. And that, in turn, would legitimize both the centralized European regimes of the Founders’ time, and the anarchic Beiruts of our day, where the powerful rule over the weak, use force to obtain wealth, and use wealth to reinforce their power.

Instead, the Founding Fathers staked the future of the country on the principle that human beings are created by God, and therefore have certain intrinsic, absolute, nonnegotiable rights.

“All men are created equal,” reads the Declaration of Independence, and are “endowed by their Creator with certain unalienable rights . . . among these are life, liberty and the pursuit of happiness.” Government’s role in society, then, is to “secure” these rights, not create or dispense them. This is the moral basis of a free society.

The order of these rights–first life, then freedom, and then the equal opportunity to pursue one’s own happiness–was written with great care and precision, not haphazardly.

The Founders understood the need to balance man’s right to be free with man’s responsibility to be honest, just, and fair.

For example, if it makes you happy to shoot and kill someone while you rob a bank–well, the law says you’re out of luck. A person’s right to live supersedes your “freedom” to steal and murder. This may seem obvious, but it is profound.

It is also the linchpin of Western civilization. Switch the order of these fundamental human rights–putting happiness before liberty, or liberty before life–and you end up with moral chaos and social anarchy. Deny the God-given nature of these rights and you open the door to tyranny.

“Can the liberties of a nation be sure when we remove their only firm basis, a conviction in the minds of the people, that these liberties are the gift of God?” asked Thomas Jefferson.

Or, as John Adams put it, “We have no government armed with power capable of contending with human passions unbridled by morality and religion. Avarice, ambition, revenge or gallantry would break the strongest cords of our Constitution as a whale goes through a net. Our Constitution was made only for a moral and religious people. It is wholly inadequate to the government of any other.”

In America today, however, not everyone regards these basic moral truths as “self-evident.” Modern liberalism, which rejects absolute moral standards, has abandoned the proper ordering of man’s fundamental rights.

As a result, modern liberalism has undermined a long-held American principle: that the law should protect the weakest among us, not just the strong, the healthy, and the rich.

There is no need here to catalog in detail the results since the 1960s of liberalism’s passions. The effort to legitimize all moral claims, to give personal freedom an utterly free hand, has given us the following: horrific increases in violent crime, out-of-wedlock births, family breakups, and substance abuse; dramatic declines in educational and cultural standards; a proliferation of increasingly bizarre lawsuits; a blizzard of regulations that defy common sense and assault our rights to property and due process; a growing corruption of the tax code; and a judiciary that often acts like an imperial aristocracy hurling decrees down upon the rest of us.

Modern liberalism has adopted a view of liberty that is at the same time too broad and too narrow. Liberalism wrongly insists, for example, on a parent’s freedom to choose an abortion while simultaneously denying parents’ freedom to choose the schools their children may attend.

Ideas have consequences. Liberalism’s moral confusion over the sanctity of human life and the vital importance of the traditional family has reshaped American law and society.

Certainly, crime is not new. But Americans have rarely been so confused about right and wrong, about what is acceptable and what is to be forcefully condemned.

So, we must be clear: A free society cannot survive the collapse of the two-parent family or the absence of fathers, love, and discipline in the lives of so many children. A free society cannot survive an unchecked explosion in violent crime.

Nor can a free society survive a generation of crack babies and teenagers whose minds and bodies have been destroyed by illegal drugs.

Today, movies, television, music, and the Internet bombard young people with cultural messages of sexual revolution and self-absorbed materialism that tempt them away from good moral character rather than appealing to the better angels of their natures. Affluence does not protect children from temptation; sometimes it makes temptation more accessible.

The good news is that this is not the first time we have faced such dark times and turned things around. America has seen several periods of renewal and reform, most notably the Second Great Awakening and the Teddy Roosevelt Era.

Both periods marked a return to America’s founding ideals; both offer guidance as to how we might strengthen our moral commitments while preserving freedom.

Following the Revolutionary War, America experienced a period of moral decline. The chaos of battle, the pain of death and separation, the anxiety of wartime inflation, the excitement of subsequent political change, and the all-consuming nature of building a new nation drained people’s time and energy.

Fewer and fewer people attended church. Spiritual devotion waned and social problems proliferated. From the late 1770s until the late 1820s, per-capita consumption of alcohol in America rose dramatically, to roughly four to five times what it is today.

Everybody took a swig from the jug–teachers, preachers, children. They called it “hard cider,” but it was nothing like the cider we buy at the grocery store today. In those days, it seemed everyone was in a haze by noontime. The social consequences were predictable.

“Illegitimate births were rampant” during the early 1800s, wrote Tom Phillips in his book Revival Signs. “Alcohol, the drug of the day, was destroying families and wrecking futures.

Thomas Paine was proclaiming that Christianity was dead–and certainly the body of faith appeared to be in a coma. Yet even as church rolls were shrinking and greed, sensuality and family breakdown were becoming more widespread, America was about to experience a great spiritual revival.”

Slowly at first, then building over the next several decades, one wave of spiritual renewal and religious rededication after another swept the country in what historians now call America’s “Second Great Awakening.”

In one community after another, people began to wake up from their moral and spiritual slumber as though saying, “If we’re going to have a self-governing nation, it must be occupied by self-governing citizens.”

The first public-health movement in America was launched not by the government but by citizen-activists such as Lyman Beecher, the founder of the American Bible Society and a pastor who went on to form the American Society for the Promotion of Temperance in 1826.

This enterprise became known as the Temperance Movement–and it worked. Within one generation, alcohol consumption in America fell by two-thirds.

Soon pastors and community leaders were opening elementary and secondary schools (this was before “public” education), founding colleges and universities, setting up orphanages and homes for abandoned children, creating shelters for the poor, building hospitals, and exhorting people to stop drinking and spend more time with their families.

The Reverend Thomas Gallaudet opened his school for the deaf. William McGuffey wrote his famous “Eclectic Readers,” of which 120 million copies were printed. The first Young Men’s Christian Association (YMCA) opened in Boston, followed shortly by the first Young Women’s Christian Association.

It was during this rebuilding of the moral foundations of a free society that French historian Alexis de Tocqueville came to America in 1831.

“Upon my arrival in the United States, the religious aspect of the country was the first thing that struck my attention, and the longer I stayed there, the more I perceived the great political consequences resulting from this new state of things,” he wrote. “In France I had almost always seen the spirit of religion and the spirit of freedom marching in opposite directions. But in America I found they were intimately united and that they reigned in common over the same country.”

Eventually the religious and moral renewal of the Second Great Awakening gave birth to the abolitionist movement, one of the nation’s greatest struggles to reassert a moral order based on man’s fundamental rights.

This gets to one of the great strengths of the American democracy. It is not that we do not make mistakes as a people and as a nation. We are, after all, only human. But when we do stumble, we have a record of rediscovering our first principles and resuming the journey toward faith and moral renewal.

In the early years of the 20th century, Americans were filled with optimism. The nation’s rapid industrialization and urbanization created enormous new social, economic, and political problems, but these were confronted by bold, imaginative national leaders and the energetic efforts of people voluntarily working together to promote shared objectives.

The period speaks to us today. The 1890s had been a troubled time. The rise of large corporations and massive industrial monopolies seemed to mock the idea of individual entrepreneurship.

The rise of big cities with corrupt political machines supplanted the tradition of democratic town meetings. People feared that massive immigration, which was several times greater in proportion to our population than what we are experiencing today, would degrade the American character and culture.

How, they asked, could we assimilate so many people from so many different races, nationalities, and religions? These years were also plagued by drug addiction–primarily to opium. Sound familiar?

American churches and synagogues responded to the challenge of the new industrial era by combining a message of spiritual renewal with practical, personal care for those in need.

Dwight L. Moody, a former shoe salesman, became the most influential American evangelist of the 19th century. He launched a Sunday School movement in Chicago to provide moral instruction for more than 1,500 poor, urban street children.

He opened a Bible college to challenge other young people to follow his example of helping destitute and demoralized people turn their lives around. And, in an age without radio or television, he communicated his message of spiritual and moral renewal to millions of people before his death in 1899.

The spiritual and practical needs of America’s burgeoning city populations were also addressed by social reformers such as William and Catherine Booth, whose Salvation Army came to the United States in 1880.

Women took a particular interest in the needs of those who found themselves financially and morally bankrupt.

By 1913, more than 500 urban rescue missions were operating in the United States and Canada, many of them organized and run by women of faith. Catholic nuns and Jewish and other fraternal societies also labored to help the needy everywhere from little mining towns to urban slums.

At the same time, President Theodore Roosevelt was ushering in an era of political and economic reform. He declared in his Inaugural Address, “Much has been given us, and much will rightfully be expected from us. Our forefathers faced certain perils which we have outgrown. We now face other perils, the very existence of which it was impossible that they should foresee. Modern life is both complex and intense, and the tremendous changes wrought by the extraordinary industrial development of the last half-century are felt in every fiber of our social and political being.”

From 1901 to 1909, Roosevelt sought to expand individual opportunity and strengthen individual control over personal, business, and political affairs, as well as to increase America’s economic and military influence in the world.

He busted up family-controlled and anti-competitive trusts and corporate monopolies, attacked government and political corruption in both major parties, supported the right of workers to organize, expanded U.S. trade with other nations, and built up our armed forces, particularly the navy.

He advocated the direct election of U.S. senators, the right of women to vote, the creation of open presidential primaries, and the introduction of citizen initiatives, referenda, and recalls–all of which soon became realities.

Roosevelt reinforced his battle for political and economic reform by publicly, vigorously, and consistently reasserting the notion that there must be a moral foundation to a free society.

It was he, after all, who coined the term “bully pulpit.” While governor of New York, Roosevelt once declared, “It is absolutely impossible for a Republic long to endure if it becomes either corrupt or cowardly,” and he never lost sight of that essential truth.

He rightly believed that private, local, character-forming institutions must be left free to strengthen the moral fiber of the nation. The role of religious faith in society must be affirmed, not undermined.

He did not believe that government should establish a state religion. But he did not shrink from the right or responsibility of a public official to encourage individuals to attend to their moral and spiritual character.

Eight years after leaving the White House, Roosevelt was still offering Americans his “top 10” list of reasons for going to church. “In this actual world a churchless community where men have abandoned and scoffed at or ignored their religious needs is a community on the rapid downgrade,” he wrote in 1917 in Ladies’ Home Journal.

“It is perfectly true that occasional individuals or families may have nothing to do with church or with religious practices and observances and yet maintain the highest standard of spirituality and of ethical obligation. But this does not affect the case in the world as it now is, any more than that exceptional men and women under exceptional conditions have disregarded the marriage tie without moral harm to themselves interferes with the larger fact that such disregard if at all common means the complete moral disintegration of the body politic.”

So, we now ask ourselves big questions such as, “How did America–the most pro-individual, anti-statist nation ever invented–come to permit its government to assume the size and scope it has today?”

The answer is war–the great shaper of this century. Throughout history, warfare fostered government centralization. You cannot face a major external threat unless you have a strong government to marshal the resources necessary to meet that threat. For most of the last 100 years, America has faced a major external threat of one sort or another–first World War I, then World War II, and finally the Cold War.

These conflicts have been cited to justify government expansion in every direction.

How did we justify federal aid to education? The initial rationale was national security. Federal aid for research and development and the space program? National security. Even the interstate highway program begun in the 1950s was partially justified on national security grounds. It seemed natural to some that if government could mobilize resources to fight external enemies, it could solve an array of domestic problems as well. Hence the “War on Poverty.”

It has taken us 50 years to learn, very painfully, the limitations of Big Government. Now that the Cold War is over, we no longer need such a massive, centralized federal government. We now have the opportunity to downsize Washington and shift money, power, and control back to individuals, families, and local communities.

Just as Teddy Roosevelt started the new century by attacking government corruption at its source and busting up anti-competitive monopolies, it is time to start the next century by shrinking Big Government.

So what should be the government’s role in all of this?

Presidents, senators, and other government officials are not archbishops. They do not have primary responsibility for the life of the spirit.

Yet our early presidents and other leading Founders knew well how crucial religion is to the cause of liberty.

The great historian of liberty, Lord Acton, wrote that the history of liberty is, in fact, “coincident” with the history of Christianity. In the words of Jefferson, “God who gave us life gave us liberty.” To save liberty, our Founders never failed to stress the role of faith.

At a particularly difficult impasse at the Constitutional Convention in Philadelphia, Ben Franklin proposed a pause for solemn prayer to Providence, just as in The Federalist Papers, Madison, Hamilton, and Jay three times noted the interventions of divine Providence in the cause of establishing freedom on this continent.

In short, our past national leaders have sensed a duty to express this nation’s need of divine guidance and its gratitude for the Creator’s manifold acts of assistance. In this country, we do not have an established church. But the foundations of our liberty are dug deep in the voluntary and heartfelt faith of millions.

So, what do you think, folks? Are we the people capable of launching another Great Awakening?

Education: Back to Basics

Dr. John Keeney once told me, “We have a system that has worked for 2,000 years. Why would we change it?” He was correct.

Just look at our school system today. It is a mess. All because we tried to “fix” it.

Source: https://educationalroots.weebly.com/

Plato (427-347 BCE)

The quest to define what we know and how we know it, however, begins with the Greek philosophers.

Plato, born in Athens, Greece, was the student of Socrates (469-399 BCE) and later the teacher of Aristotle. He wrote in the middle of the fourth century BCE in ancient Greece.

Plato was influenced by his teacher and mentor Socrates, much as beginning teachers today are guided by experienced mentors.

Socrates believed it is a teacher’s job to propose questions that draw out ideas already present in a student’s mind and make the student think deeply about his beliefs.

Plato later founded the Academy in 387 BCE, where he also taught, and became the founder of Western idealism. Plato believed the primary role of the teacher was to bring about an intellectual conversion experience in the learner, create a quiet educational environment that promotes contemplation and reflection, and then ask leading questions that provoke critical thinking and self-examination in the students.

Now Plato’s thoughts were not without controversy. Plato believed in the state controlling a citizen’s education from birth to death. He believed children should be taken out of the home during early formation to be raised in state-operated nursery schools from birth to age six. (Much like what the current administration is proposing with government day care.)

Children aged 6-18 would then attend school with a curriculum of music, literature, mathematics, and gymnastics.

Plato’s method of education was as follows:

Elementary – All boys and girls would be educated together. They would study mathematics, literature, poetry, and music until they were eighteen years of age.

Military Training – The next two years of the youth’s life would be devoted to physical education alone. Thereafter, the best youths would be selected for the higher education given to future guardians of the state.

Higher Education – Between the ages of 20 and 35, the future guardian would receive a higher education to prepare him for ruling the state. His studies would include mathematics, music, and literature. At the age of 30 he would have enough maturity to begin his study of philosophy.

At 35, his formal education would cease, and he would enter upon a minor administrative position, prior to undertaking more important governing positions. 

Plato died in 347 BCE, leaving the Academy he founded to his sister’s son. The Academy remained a model for institutions of higher learning until it was closed in 529 CE by the Emperor Justinian.

Next came Aristotle (384-322 BCE), another ancient education innovator, who embraced the Greek version of a liberal arts curriculum and emphasized the natural sciences: biology, botany, physiology, and zoology.

He studied with Plato for 20 years at the Academy and eventually joined Plato and Socrates in the first rank of Western education history.

Aristotle took Plato’s philosophical and educational ideas as a jumping-off point, reshaping them throughout his life into a personal philosophy of his own.

Whereas Plato believed truth was found within the mind, Aristotle looked to the world outside the mind to find evidence of what was true. 

Born in 384 BCE in Greece, Aristotle served as a tutor to Alexander the Great for seven years and eventually established a school in Athens known as the Lyceum.

Aristotle believed the purpose of school was to develop and exercise students’ potential for reasoning, form ethical character, and provide a skill and knowledge base.

He thought the purpose of schooling was to develop dispositions and habits that exercise reason and form a person’s philosophy.

Schools were to prepare future citizens with the functional knowledge needed to conduct their political, social, and economic affairs.

His lifelong fascination with science and medicine is reflected in his philosophy of education, one of his central ideas being the definition of humans as rational animals.

As a founder of Western science, he pioneered the categorization of objects and founded natural realism.

He can be seen as a forerunner of the modern university professor who believes research and teaching are inseparable. His writings span a wide range of disciplines including logic, metaphysics, philosophy of mind, ethics, political theory, aesthetics, and rhetoric.

They even delved into such primarily non-philosophical fields as empirical biology, where he excelled at the detailed observation and taxonomy of plants and animals.

In all these areas, Aristotle’s theories provided illumination, met with resistance, sparked debate, and generally stimulated the sustained interest of his followers.

Like Plato, Aristotle recognized the importance of early childhood as a formative period of human development.

He divided schooling into three stages: primary, secondary, and higher education.

Ages 7-14 would attend primary school, with a curriculum that could consist of gymnastics, writing, reading, music, and drawing.

Ages 14-21 would attend secondary school, continuing their primary studies while adding literature, poetry, drama, choral music, and dancing.

The last four years would be spent in military drill, tactics, and strategy.

Higher studies would begin at age 21 and continue as long as the student was willing and able.

Higher education was for males only as Aristotle believed women were not capable of such complex studies.

It is believed Aristotle wrote 150 philosophical essays; the 30 that survive touch on an enormous range of philosophical problems, from biology and physics to morals to aesthetics (art) to politics.

However, many are thought to be simply “lecture notes” rather than complete essays, and a few may not even be Aristotle’s but the work of members of his school.

One of the major rediscoveries made during the era of the Crusades was that of Aristotle’s texts, which had been lost to the West up until that point.

With the rise of Islam and the spread of the Arab Empire, these texts became familiar to Muslim scholars, who translated them into Arabic.

They then spread throughout the Islamic world including Spain. In the 12th century, scholars came from England, Paris, and Italy to seek them out and translate them into Latin.

At that point, Aristotle’s texts spread into the intellectual centers of the West.

Aristotle’s work was rediscovered in the later Middle Ages and adopted by medieval scholars; to his followers he became known as Ille Philosophus (The Philosopher), or “the master of them that know.”

This is where we get the term philosophes, which refers to the writers and thinkers of the Age of Enlightenment beginning in the 17th century.

One of these philosophes was John Locke (1632-1704).

John Locke was born August 29, 1632, in Wrington, Somerset, England. Regarded as one of the most influential Enlightenment thinkers, he was known as the Father of Classical Liberalism (free markets, civil liberties under the rule of law, and limited government).

He was an economist, political operative, physician, Oxford scholar, and medical researcher as well as one of the great philosophers of the late 17th and early 18th centuries.

Locke argued that no government was legitimate under the Divine Right of Kings theory, which held that God chose certain people to rule on earth as instruments of His will.

Under that theory, the monarch’s actions were the will of God, and to criticize the ruler was to challenge God. Locke did not believe in this theory and wrote his own to challenge it.

Locke’s writings also greatly influenced the Founding Fathers of the United States as they wrote the Constitution.

They implemented his idea that the power to govern was obtained from the permission of the people. He believed the purpose of government was to protect the natural rights of its citizens.

He stated that natural rights were life, liberty, and property, and that all people automatically possessed these simply by being born.

When a government did not protect those rights, the citizen had the right to overthrow the government. These ideas were incorporated into the Declaration of Independence by Thomas Jefferson.

Once these ideas took root in North America, the philosophy was adopted elsewhere as justification for revolution.

Locke believed that children are born with their mind a blank sheet of paper, a clean slate, a tabula rasa.

He also maintained that children are potentially free and rational beings, and that the realization of these human qualities tends to be thwarted through the imposition of the sort of prejudice that perpetuates oppression and fallacy.

Locke believed it was a poor upbringing and education that hindered the development of children’s humanity. Locke noted two consequences of the doctrine of the tabula rasa: equality and vulnerability.

Locke believed the purpose of education was to produce an individual with a sound mind in a sound body so as to better serve his country.

Locke thought that the content of education ought to depend upon one’s station in life.

The common man only required moral, social, and vocational knowledge. He could do quite well with the Bible and a highly developed vocational skill that would serve to support him in life and offer social service to others.

 However, the education of gentlemen ought to be of the very highest quality. The gentleman must serve his country in a position of leadership.

Locke believed the gentleman must first have a thorough knowledge of his own language.

The schools of the Puritans in England broke with tradition completely. They sought to educate one for the society in which he would live. The schools were called, therefore, schools of social realism.

Locke held that the content of the curriculum must serve some practical end. He recommended the introduction of contemporary foreign languages, history, geography, economics, math and science. 

Locke proposed the following for the education of the gentleman: 

a. Moral Training. All Christians must learn to live virtuously.

b. Good Breeding. The gentleman must develop the poise, control, and outward behavior of excellent manners. Education must aim, therefore, at developing correct social skills.

c. Wisdom. The gentleman ought to be able to apply intellectual and moral knowledge in governing his practical affairs.

d. Useful Knowledge. The gentleman must receive an education which will lead to a successful life in the practical affairs of the society, as well as that which leads to the satisfaction derived from scholarship and good books.

Before his death, Locke saw four more editions of An Essay Concerning Human Understanding. He died at Oates in Essex on October 28, 1704, 72 years before Thomas Jefferson used his words in our Declaration of Independence.

Next on our list of historic educators is Jean-Jacques Rousseau (1712-1778).

Jean-Jacques Rousseau was an 18th-century thinker who became known as a revolutionary philosopher of education and a forerunner of idealism.

One of the most influential thinkers of the Enlightenment in 18th-century Europe, his ideas concerning education and the role society plays in a child’s development were published in his famous work Emile, which struck some of the sparks that helped light the French Revolution and eventually brought about his exile from Paris.

Born at Geneva on June 28, 1712, Rousseau ultimately believed that people are born basically good but are corrupted by society.

He also thought that individuals only learn these “bad habits” by living in the city, which is why he preferred a rural setting for children to learn.

He wanted to abandon society’s hypocrisy and pretentiousness. He believed that natural education promotes and encourages qualities such as happiness, spontaneity, and the inquisitiveness associated with childhood.

He wanted children to be shielded from societal pressures and influences so that the natural tendencies of each child could emerge and grow without any unwarranted corruption.

Rousseau’s greatest work was Émile, published in 1762. More a tract upon education with the appearance of a story than a novel, the book describes the ideal education which prepares Emile and Sophie for their eventual marriage.

Book One deals with the infancy of the child. The underlying thesis of all Rousseau’s writings stresses the natural goodness of man. It is society that corrupts and makes a man evil. Rousseau states that the tutor can only stand by at this period of the child’s development, ensuring that the child does not acquire any bad habits.


According to Rousseau’s Émile, Book Two, the purpose of education was not for the tutor to prepare the child for any particular social institution, but to preserve the child from the baleful influence of society.

Emile is educated away from city or town; living in the country close to nature should allow him to develop into a benevolent, good adult. The child learns by using his senses in direct experience. He lives in Spartan simplicity.

Book Three describes the intellectual education of Emile. Again, this education is based upon Emile’s own nature. When he is ready to learn and is interested in language, geography, history and science, he will possess the inner direction necessary to learn.

This learning would grow out of the child’s activities. He will learn languages naturally through normal conversational activity. Rousseau assumes that Emile’s motivation leads to the purposive self-discipline necessary to acquire knowledge.

Finally, Emile is taught the trade of carpentry in order to prepare him for an occupation in life.


Book Four describes the social education and the religious education of Emile. The education of Sophie is considered and the book concludes with the marriage of Emile and Sophie.

Emile is permitted to mingle with people in society at the age of 16. He is guided toward the desirable attitudes that lead to self-respect. Emile’s earlier education protects him from the corrupting influence of society.

For Rousseau, the revelation and dogma of organized religion are unnecessary for man. The fundamental tenets of any religion affirm the existence of God and the immortality of the soul, and these are known through the heart alone.


His views of monarchy and governmental institutions outraged the powers that be, and his ideas on natural religion, unorthodox to both Catholics and Protestants, forced him to flee to Prussia (Germany) under the protection of Frederick the Great. There Rousseau studied botany and eventually accepted an invitation to settle in England, where he wrote most of his autobiographical Confessions.

Rousseau’s philosophical ideas on the education of children were radical for their time and created quite an uproar.

Rousseau’s powerful influence on the European Romantic movement and after was due to his vision of a regenerated human nature.

His philosophy revealed a striking combination of idealistic and realistic elements which constantly seemed to open the possibility of a better world.

This optimistic outlook was transmitted through a particularly eloquent and persuasive style, giving the impression of intense sincerity.

Rousseau’s great challenge was to convince the humblest of men that they should never feel ashamed to call themselves human beings.

Rousseau ultimately left the city of Paris to live in the country. He had become completely disillusioned with the norms of society in the city, especially that of the monarchy, and had even become disgusted with his own friends.

He felt that children more fully learned right and wrong by experiencing the consequences of their actions firsthand rather than through the physical punishment most parents of his day practiced.

Rousseau firmly believed that nature is kind and man essentially is not; humans have the potential to be kind, but oftentimes choose not to.

Now, having introduced you to some of the greatest historical figures in the development of modern education, I have saved the best for last.

18th Century Advice: Thomas Jefferson on Education Reform

Elena Segarra / April 14, 2013

The original “Man of the People,” Thomas Jefferson, was born on April 13, 1743.

Jefferson is best known for drafting the Declaration of Independence, but he also wrote prolifically and prophetically on education. “If a nation expects to be ignorant and free, in a state of civilization, it expects what never was and never will be,” he wrote in a letter to a friend.

Jefferson understood that freedom depends on self-government: the cultivation of self-reliance, courage, responsibility, and moderation.

Education contributes to both the knowledge and virtues that form a self-governing citizen.

By proposing a bill in Virginia that would have established free schools every five to six square miles, Jefferson sought to teach “all children of the state reading, writing, and common arithmetic.” With these skills, a child would become a citizen able to “calculate for himself,” “express and preserve his ideas, his contracts and accounts,” and “improve, by reading, his morals and faculties.”

Jefferson viewed this basic education as instrumental to securing “life, liberty, and pursuit of happiness” (echoing Locke) for Americans because it helps an individual “understand his duties” and “know his rights.”

Once taught reading and history, people can follow the news and judge the best way to vote. If the government infringes on their liberties, educated citizens can express themselves adequately to fight against it.

By providing equal access to primary schools, Jefferson hoped to teach children “to work out their own greatest happiness, by showing them that it does not depend on the condition of life in which chance has placed them, but is always the result of a good conscience, good health, occupation, and freedom in all just pursuits.”

While Jefferson supported the idea of public education, he would not have placed schools under government supervision. Instead, he argued for the placement of “each school at once under the care of those most interested in its conduct.” He would put parents in charge.

As Jefferson put it: “But if it is believed that these elementary schools will be better managed by…[any] general authority of the government, than by the parents within each ward, it is a belief against all experience.… No, my friend, the way to have good and safe government, is not to trust it all to one, but to divide it among the many, distributing to every one exactly the functions he is competent to.”

Taxpayers would provide the resources for public education; the community would arrange the schooling. Although we today face a very different set of challenges from those Jefferson faced, his reasoning remains relevant: Those most concerned with a school’s performance, i.e., parents, will best manage education.

We spend more than enough on our struggling education system. Empowering parents with control over dollars, instead of increasing the amount spent on schools, will improve educational outcomes.

During his lifetime, Thomas Jefferson had little success with his efforts to reform the American education system. Yet the principles he promoted hold true today: Our freedom depends on delivering a quality education to future generations.

So, there you have it, folks. Perhaps we should make this show required listening for the Federal Department of Education, DESE, and our school administrators. Maybe they could learn something from our past.

Iran Rocket Launch

DUBAI, Dec 30 (Reuters) – Iran used a satellite launch rocket to send three research devices into space last Thursday, a defense ministry spokesman said, as indirect U.S.-Iran talks take place in Austria to try to salvage a 2015 nuclear deal.

He did not clarify whether the devices had reached orbit, but suggested the launch was a test ahead of coming attempts to put satellites into orbit.

Iran, which has one of the biggest missile programs in the Middle East, has suffered several failed satellite launches in the past few years due to technical issues.

Washington has said it is concerned by Iran’s development of space launch vehicles, and a German diplomat said Berlin had called on Iran to stop sending satellite launch rockets into space, adding that they violated a U.N. Security Council resolution.

The blast-offs have raised concerns in Washington about whether the technology used to launch satellites could advance Iran’s ballistic missile development. The United States says that such satellite launches defy a United Nations Security Council resolution calling on Iran to steer clear of any activity related to ballistic missiles capable of delivering nuclear weapons.

Iranian defense ministry spokesman Ahmad Hosseini said the Simorgh (Phoenix) satellite carrier rocket had launched the three research devices at an altitude of 290 miles and at a speed of 4.5 miles per second.

“The intended research objectives of this launch were achieved,” Hosseini told state television. “This was done as a preliminary launch … God willing, we will have an operational launch soon.”

State TV showed footage of what it said was the firing of the launch vehicle from the Imam Khomeini Space Center in northern Iran at dawn.

“By developing our capacity to launch satellites, in the near future satellites with a wide range of applications… will be placed into orbit,” Hosseini said.

“The payloads launched today were subsystems of satellites that were tested in vacuum conditions and high altitude as well as high acceleration and speed and the data was gathered,” he added.

State media quoted Information and Communications Technology Minister Isa Zarepour as saying: “I hope lessons learned from this research launch will pave the way for operational access to satellite system launch technology.”

Now understand, Saudi Arabia is an archenemy of Iran.

Iran is Shiite Muslim (Shiites believe the leader of the faith must be a direct descendant of Mohammed).

Saudi Arabia is Sunni Muslim (Sunnis say the leader of the faith does not have to be a blood relative of Mohammed).

So, firing this rocket into space has definitely gotten the attention of the Saudis.

DUBAI, Dec 30 (Reuters)

Saudi King Salman said on Wednesday that Saudi Arabia was concerned about Iran’s lack of cooperation with the international community on its nuclear and ballistic missile programs.

King Salman bin Abdulaziz said in an address to the kingdom’s advisory Shura Council that he hoped Iran would change its “negative” behavior in the region and choose dialogue and cooperation.

“We follow with concern the Iranian government’s policy which is destabilizing regional security and stability, including building and backing sectarian armed militias and propagating its military power in other countries,” the 85-year-old ruler said in a speech published by state news agency SPA.

“(We follow with concern) its lack of cooperation with the international community regarding its nuclear program and its development of ballistic missiles,” he added.

Saudi Arabia, a major Western ally in the Gulf, has been locked in a bitter rivalry with Iran across the Middle East where both sides have backed opposing factions in several conflicts including in Yemen, Syria and Lebanon.

Saudi Arabia and other Gulf countries expelled Lebanese envoys in October in a diplomatic spat that has added to Lebanon’s economic crisis. Saudi officials said the crisis with Beirut has its origins in a Lebanese political setup that reinforces the dominance of the Iran-backed Hezbollah armed group.

In a step to ease tensions, Saudi and Iranian officials met in a series of direct talks earlier this year but they have yet to yield a breakthrough.

So, you can see that Iran is triggering a destabilization of the Middle East with these rocket launches.

With Iran in mind, Israel just signed a deal with the US for heavy choppers and refueling planes

By JUDAH ARI GROSS 

The Times of Israel

31 December 2021, 11:06 am  

Israel on Thursday signed a long-awaited agreement to purchase a dozen heavy transport helicopters and two additional refueling planes from the United States, in a weapons deal worth over $3 billion, the Defense Ministry said.

These aircraft, along with a number of additional F-35 fighter jets that Israel plans to purchase from the US, are specifically meant to counter threats posed by Iran, including its nuclear program.

According to the ministry, the wider procurement effort includes, in addition to the fighter jets, transport helicopters, and refueling planes, “advanced aerial munitions, air defense systems, new naval and land-based platforms, and cyber and digital systems.”

The 12 Sikorsky King Stallion heavy transport helicopters will replace Israel’s aging fleet of CH-53 Sea Stallion helicopters, which have been in use for over half a century and have seen a number of maintenance issues in recent years. The first CH-53K helicopters are scheduled to arrive in Israel in 2026, according to the ministry.

Under the agreement, Israel has the option to purchase six more helicopters in the future as well.

In addition, the delegation of the ministry’s purchasing department signed a deal to buy two more Boeing KC-46 refueling planes, which would be needed in order to conduct strikes against targets in Iran, some 1,200 miles from Israel and far outside the regular flight range of Israeli jets.

Israel has already agreed to purchase two of these refueling planes, which are scheduled to arrive in 2025.

The helicopter deal will cost Israel roughly $2 billion and the refuelers will cost another $1.1 billion, with the money coming from the $3.8 billion that Israel receives from Washington as part of the 10-year memorandum of understanding between the two countries, the ministry said.

“The purchase of these platforms is part of a wider effort, which is being led by the Defense Ministry with the IDF over the past year and a half to strengthen the capabilities and force build-up of the IDF against current and future threats, mostly those posed by the ‘third-ring,’” the Defense Ministry said.

In Israeli military parlance, the first ring refers to threats directly on the country’s borders, the second ring refers to slightly farther-flung enemies, like those in Iraq or Yemen, and the third ring refers to those yet further away — in practice, it is almost exclusively used in connection with Iran.

The Biden Team Knows Its Iran Policy Is Failing

By Anthony Ruggiero, a senior fellow at the Foundation for Defense of Democracies.

DECEMBER 31, 2021, 6:00 AM

The Biden administration now admits a nuclear deal with Iran may not happen despite its continued outreach to Tehran. Of course, Biden wants to pin the blame on former U.S. President Donald Trump, whose withdrawal from the original nuclear deal supposedly provided Iran with the pretext to advance its nuclear weapons capabilities.

But the uncomfortable truth is Iran’s most aggressive moves came after U.S. President Joe Biden was elected.

What’s driving Tehran forward is not Trump’s maximum pressure campaign but Biden’s decision to ease that pressure. Simply put: Iran, like China and Russia, is doing what it can get away with.

In early December, the administration acknowledged it is discussing alternatives “if the path to diplomacy towards a mutual return to compliance [with the 2015 nuclear deal] isn’t viable in the near term.”

A U.S. State Department spokesperson made that comment while Israeli Defense Minister Benny Gantz was visiting Washington to propose joint military exercises to prepare for potential strikes on Iran’s nuclear facilities. The need for such consultations indicates a deal is slipping out of reach.

Earlier this month, an unnamed senior U.S. official also warned that “in the first quarter of [2022],” Tehran could “configure things and rapidly get one bomb’s worth of [highly enriched uranium].”

In other words, Iran has taken advantage of lengthy negotiations in Vienna to move toward nuclear breakout, which is when a state achieves nuclear weapons capability.

Washington’s European allies also know the talks are headed for failure. British Foreign Minister Liz Truss said this is Iran’s “last chance” for a deal.

Earlier this month, when an interviewer told U.S. Secretary of State Antony Blinken “the path for diplomacy seems to be failing,” Blinken pivoted to blaming Trump.

He said Trump’s “decision to pull out of the [original] agreement was a disastrous mistake because what’s happened since is that Iran has used that as an excuse, despite the maximum pressure applied against Iran, to also renege on its commitments under the agreement and to relentlessly rebuild the nuclear program that the agreement had put in a box.”

The problem with that argument is Tehran’s most egregious nuclear advances occurred after Biden was elected, not after Trump pulled out of the deal in 2018.

Biden has incentivized Tehran’s march toward the bomb by refusing to impose any consequences on the clerical regime for its provocations. There were five key instances when Biden stuck to his “engagement only” strategy despite Tehran’s nuclear advances.

First, Iran began producing uranium metal, a crucial element in nuclear weapons, in February.

Second, Tehran has obstructed the International Atomic Energy Agency’s (IAEA) investigation into Iran’s undeclared nuclear activities at several suspect nuclear sites.

Third, Iran reduced its cooperation with the IAEA at declared nuclear sites as of February. Since then, the agency has been unable to review data from the surveillance equipment and other techniques used to monitor the status of Iran’s nuclear program.

Fourth, Tehran has increased production of advanced centrifuge parts since August but has not allowed the agency to inventory or verify the location of this equipment.

Fifth, the Biden administration has allowed each IAEA Board of Governors meeting this year to conclude without a censure resolution against Iran.

Tehran’s amassing of knowledge about the development of nuclear weapons will irreparably harm the global nonproliferation regime and lead to a more dangerous world.

If Biden hopes to stop it, he will have to recognize that his decisions as president have brought the United States to this point.

Anthony Ruggiero is a senior fellow at the Foundation for Defense of Democracies and a former senior director for counterproliferation and biodefense on the U.S. National Security Council during the Trump administration. Twitter: @NatSecAnthony

Eugenics

Italy recently raised the question of whether it should treat people over 70 years old who had tested positive for the coronavirus.

This brought to mind an issue that was at the scientific forefront at the beginning of the 20th century.

It was called Eugenics.

Eugenics is the practice of improving the human species by selectively mating people with specific desirable hereditary traits. It aims to reduce human suffering by “breeding out” disease, disabilities and so-called undesirable characteristics from the human population.

Early supporters of eugenics believed people inherited mental illness, criminal tendencies and even poverty, and that these conditions could be bred out of the gene pool.

How many times did you hear your grandparents say, “They’re not our kind of people.”

John Harvey Kellogg, of Kellogg cereal fame, organized the Race Betterment Foundation in 1911 and established a “pedigree registry.”

The foundation hosted national conferences on eugenics in 1914, 1915 and 1928.

John Harvey Kellogg was a doctor who ran a famous turn-of-the-century sanitarium (not the crazy kind, the health kind) where he advocated to his patients, among other things, yogurt enemas, strict vegetarianism and a bland diet, because he believed that spicy, protein-rich foods increased the sex drive and bland foods reduced it.

This is, in fact, where flaked cereals came from; he and his brother developed a number of super-bland foods for consumption at the sanitarium, including, by accident in 1896, Corn Flakes. That’s right — Corn Flakes were originally designed to be an anti-aphrodisiac. A patient of Kellogg’s, C.W. Post, developed his own rival line of flaked cereals — and there you have the origins of most American cereals.

As the concept of eugenics took hold, prominent citizens, scientists and socialists championed the cause and established the Eugenics Record Office.

Eugenics would have been so much bizarre parlor talk had it not been for extensive financing by corporate philanthropies, specifically the Carnegie Institution, the Rockefeller Foundation and the Harriman railroad fortune.

They were all in partnership with some of America’s most respected scientists hailing from such prestigious universities as Stanford, Yale, Harvard, and Princeton.

Stanford president David Starr Jordan originated the notion of “race and blood” in his 1902 racial epistle “Blood of a Nation,” in which the university scholar declared that human qualities and conditions such as talent and poverty were passed through the blood.

In 1904, the Carnegie Institution established a laboratory complex at Cold Spring Harbor on Long Island that stockpiled millions of index cards on ordinary Americans, as researchers carefully plotted the removal of families, bloodlines and whole peoples.

From Cold Spring Harbor, eugenics advocates agitated in the legislatures of America, as well as the nation’s social service agencies and associations.

The Harriman railroad fortune paid local charities, such as the New York Bureau of Industries and Immigration, to seek out Jewish, Italian and other immigrants in New York and other crowded cities and subject them to deportation, trumped-up confinement or forced sterilization.

The Rockefeller Foundation helped found the German eugenics program and even funded the program that Josef Mengele worked on before he went to Auschwitz.

Much of the spiritual guidance and political agitation for the American eugenics movement came from California’s quasi-autonomous eugenic societies, such as the Pasadena-based Human Betterment Foundation and the California branch of the American Eugenics Society, which coordinated much of their activity with the Eugenics Research Society in Long Island.

These organizations–which functioned as part of a closely-knit network–published racist eugenic newsletters and pseudoscientific journals, such as Eugenical News and Eugenics, and propagandized for the Nazis.

Eugenics was born as a scientific curiosity in the Victorian age. In 1863, Sir Francis Galton, a cousin of Charles Darwin, theorized that if talented people only married other talented people, the result would be measurably better offspring.

At the turn of the 20th century, Galton’s ideas were imported into the United States.  American eugenic advocates believed with religious fervor that the same concepts determining the color and size of peas, corn and cattle also governed the social and intellectual character of man.

Elitists, utopians and so-called “progressives” fused their smoldering race fears and class bias with their desire to make a better world. They reinvented Galton’s eugenics into a repressive and racist ideology.

The intent: populate the earth with vastly more of their own socio-economic and biological kind–and less or none of everyone else.

The superior species the eugenics movement sought was populated not merely by tall, strong, talented people. Eugenicists craved blond, blue-eyed Nordic types.

This group alone, they believed, was fit to inherit the earth. In the process, the movement intended to subtract emancipated Negroes, immigrant Asian laborers, Indians, Hispanics, East Europeans, Jews, dark-haired hill folk, poor people, the infirm and really anyone classified outside the gentrified genetic lines drawn up by American raceologists.

How? By identifying so-called “defective” family trees and subjecting them to lifelong segregation and sterilization programs to kill their bloodlines.

The grand plan was to literally wipe away the reproductive capability of those deemed weak and inferior–the so-called “unfit.” The eugenicists hoped to neutralize the viability of 10 percent of the population at a sweep, until none were left except themselves.

Eighteen solutions were explored in a Carnegie-supported 1911 “Preliminary Report of the Committee of the Eugenic Section of the American Breeder’s Association to Study and to Report on the Best Practical Means for Cutting Off the Defective Germ-Plasm in the Human Population.” Point eight was euthanasia.

The most commonly suggested method of eugenicide in America was a “lethal chamber,” or public, locally operated gas chambers.

Eugenic breeders believed American society was not ready to implement an organized lethal solution. But many mental institutions and doctors practiced improvised medical lethality and passive euthanasia on their own.

One institution in Lincoln, Illinois fed its incoming patients milk from tubercular cows, believing a eugenically strong individual would be immune. Annual death rates at Lincoln ran thirty to forty percent. Some doctors practiced passive eugenicide one newborn infant at a time. Other doctors at mental institutions engaged in lethal neglect.

Nonetheless, with eugenicide marginalized, the main solution for eugenicists was the rapid expansion of forced segregation and sterilization, as well as more marriage restrictions.

California led the nation, performing nearly all sterilization procedures with little or no due process. In its first twenty-five years of eugenic legislation, California sterilized 9,782 individuals, mostly women.

Many were classified as “bad girls,” diagnosed as “passionate,” “oversexed” or “sexually wayward.”

In 1933 alone, at least 1,278 coercive sterilizations were performed, 700 of which were on women.

Even the United States Supreme Court endorsed aspects of eugenics. In its infamous 1927 Buck v. Bell decision, Supreme Court Justice Oliver Wendell Holmes wrote, “It is better for all the world, if instead of waiting to execute degenerate offspring for crime, or to let them starve for their imbecility, society can prevent those who are manifestly unfit from continuing their kind…. Three generations of imbeciles are enough.”

This decision opened the floodgates for thousands to be coercively sterilized or otherwise persecuted as subhuman. Years later, the Nazis at the Nuremberg trials quoted Holmes’s words in their own defense.

Only after eugenics became entrenched in the United States was the campaign transplanted into Germany, in no small measure through the efforts of California eugenicists, who published booklets idealizing sterilization and circulated them to German officials and scientists.

Hitler studied American eugenics laws. He tried to legitimize his anti-Semitism by medicalizing it, and wrapping it in the more palatable pseudoscientific facade of eugenics.

Hitler was able to recruit more followers among reasonable Germans by claiming that science was on his side. While Hitler’s race hatred sprang from his own mind, the intellectual outlines of the eugenics Hitler adopted in 1924 were made in America.

During the ’20s, Carnegie Institution eugenic scientists cultivated deep personal and professional relationships with Germany’s fascist eugenicists.

In Mein Kampf, published in 1924, Hitler quoted American eugenic ideology and openly displayed a thorough knowledge of American eugenics. “There is today one state,” wrote Hitler, “in which at least weak beginnings toward a better conception [of immigration] are noticeable. Of course, it is not our model German Republic, but the United States.”

Hitler proudly told his comrades just how closely he followed the progress of the American eugenics movement. “I have studied with great interest,” he told a fellow Nazi, “the laws of several American states concerning prevention of reproduction by people whose progeny would, in all probability, be of no value or be injurious to the racial stock.”

Hitler even wrote a fan letter to American eugenic leader Madison Grant, calling his race-based eugenics book, The Passing of the Great Race, his “bible.”

Hitler’s struggle for a superior race would be a mad crusade for a Master Race. Now, the American term “Nordic” was freely exchanged with “Germanic” or “Aryan.”

Race science, racial purity and racial dominance became the driving force behind Hitler’s Nazism.

Nazi eugenics would ultimately dictate who would be persecuted in a Reich-dominated Europe, how people would live, and how they would die.

Nazi doctors would become the unseen generals in Hitler’s war against the Jews and other Europeans deemed inferior. Doctors would create the science, devise the eugenic formulas, and even hand-select the victims for sterilization, euthanasia and mass extermination.

During the Reich’s early years, eugenicists across America welcomed Hitler’s plans as the logical fulfillment of their own decades of research and effort.

California eugenicists republished Nazi propaganda for American consumption. They also arranged for Nazi scientific exhibits, such as an August 1934 display at the L.A. County Museum, for the annual meeting of the American Public Health Association.

In 1934, as Germany’s sterilizations were accelerating beyond 5,000 per month, the California eugenics leader C. M. Goethe, upon returning from Germany, bragged to a key colleague, “You will be interested to know, that your work has played a powerful part in shaping the opinions of the group of intellectuals who are behind Hitler in this epoch-making program.

Everywhere I sensed that their opinions have been tremendously stimulated by American thought.…I want you, my dear friend, to carry this thought with you for the rest of your life, that you have really jolted into action a great government of 60 million people.”

More than just providing the scientific roadmap, America funded Germany’s eugenic institutions.

By 1926, Rockefeller had donated some $410,000 — almost $4 million in 21st-century money — to hundreds of German researchers.

In May 1926, Rockefeller awarded $250,000 to the German Psychiatric Institute of the Kaiser Wilhelm Institute, later to become the Kaiser Wilhelm Institute for Psychiatry.

Among the leading psychiatrists at the German Psychiatric Institute was Ernst Rüdin, who became director and eventually an architect of Hitler’s systematic medical repression.

Another in the Kaiser Wilhelm Institute’s eugenic complex of institutions was the Institute for Brain Research. Since 1915, it had operated out of a single room. Everything changed when Rockefeller money arrived in 1929.

A grant of $317,000 allowed the Institute to construct a major building and take center stage in German race biology. The Institute received additional grants from the Rockefeller Foundation during the next several years.

Leading the Institute, once again, was Hitler’s medical henchman Ernst Rüdin. Rüdin’s organization became a prime director and recipient of the murderous experimentation and research conducted on Jews, Gypsies and others.

Beginning in 1940, thousands of Germans taken from old age homes, mental institutions and other custodial facilities were systematically gassed. Between 50,000 and 100,000 were eventually killed.

Leon Whitney, executive secretary of the American Eugenics Society, declared of Nazism, “While we were pussy-footing around…the Germans were calling a spade a spade.”

A special recipient of Rockefeller funding was the Kaiser Wilhelm Institute for Anthropology, Human Heredity and Eugenics in Berlin.

At the time of Rockefeller’s endowment, Otmar von Verschuer, a hero in American eugenics circles, served as head of the Institute.

In 1935, Verschuer left the Institute to form a rival eugenics facility in Frankfurt that was much heralded in the American eugenic press.

Verschuer had a long-time assistant. His name was Josef Mengele. On May 30, 1943, Mengele arrived at Auschwitz.

Verschuer notified the German Research Society, “My assistant, Dr. Josef Mengele (M.D., Ph.D.) joined me in this branch of research. He is presently employed as Hauptsturmführer [captain] and camp physician in the Auschwitz concentration camp.

Anthropological testing of the most diverse racial groups in this concentration camp is being carried out with permission of the SS Reichsführer [Himmler].”

Mengele began searching the boxcar arrivals for twins. When he found them, he performed beastly experiments, scrupulously wrote up the reports and sent the paperwork back to Verschuer’s institute for evaluation. Often, cadavers, eyes and other body parts were also dispatched to Berlin’s eugenic institutes.

Rockefeller executives never knew of Mengele. With few exceptions, the foundation had ceased all eugenic studies in Nazi-occupied Europe before the war erupted in 1939. But by that time the die had been cast.

The talented men Rockefeller and Carnegie financed, the institutions they helped found, and the science they helped create took on a scientific momentum of their own.

After the war, eugenics was declared a crime against humanity–an act of genocide. Germans were tried and they cited the California statutes in their defense. To no avail. They were found guilty.

However, Mengele’s boss Verschuer escaped prosecution. Verschuer re-established his connections with California eugenicists who had gone underground and renamed their crusade “human genetics.”

Soon, Verschuer once again became a respected scientist in Germany and around the world. In 1949, he became a corresponding member of the newly formed American Society of Human Genetics, organized by American eugenicists and geneticists.

Human genetics’ genocidal roots in eugenics were ignored by a victorious generation that refused to link itself to the crimes of Nazism and by succeeding generations that never knew the truth of the years leading up to war.

Human genetics became an enlightened endeavor in the late twentieth century. Hard-working, devoted scientists finally cracked the human code through the Human Genome Project.

Now, every individual can be biologically identified and classified by trait and ancestry. Yet even now, some leading voices in the genetic world are calling for a cleansing of the unwanted among us, and even a master human species.

There is understandable wariness about more ordinary forms of abuse, such as denying insurance or employment based on genetic tests.

On October 14, America’s first genetic anti-discrimination legislation passed the Senate by unanimous vote. However, because genetics research is global, no single nation’s law can stop the threats.

So there you have it folks. Hitler’s ideas of forming a master race did not originate in Germany. They originated here, in the elite scientific community of the United States. My wife and thousands of other people have benefited from new cures and treatments brought about by genetic engineering.

However, have we opened Pandora’s box?

If so, what can be done to make sure these discoveries are not abused by those supporting an evil agenda?

What would you fight for?

What would America fight for?

The Economist

December 11, 2021

Eighty years ago Japan bombed Pearl Harbor. It was a grave error, bringing the world’s mightiest country into the war and dooming the Japanese empire to oblivion.

A clear-sighted Japanese admiral supposedly lamented: “I fear all we have done is to awaken a sleeping giant and fill him with a terrible resolve.”

Today Japan is peaceable, rich and innovative. It was the Japanese who rebuilt their country, but their task was made easier by the superpower that defeated them.

Not only was America sponsor to a liberal, capitalist democracy in Japan; it also created a world order in which Japan was free to trade and grow.

This order was not perfect and did not apply everywhere. But it was better than anything that had come before.

Unlike previous great powers, America did not use its military dominance to win commercial advantage at the expense of its smaller allies.

On the contrary, it allowed itself to be bound, most of the time, by common rules. And that rules-based system allowed much of the world to avoid war and grow prosperous.

Unfortunately, America is tiring of its role as guarantor of the liberal order.

The giant has not exactly fallen asleep again, but its resolve is faltering, and its enemies are testing it.

Vladimir Putin is massing troops on the border with Ukraine and could soon invade.

China is buzzing Taiwan’s airspace with fighter jets, using mock-ups of American aircraft-carriers for target practice, and trying out hypersonic weapons.

Iran has taken such a boastful stance at nuclear talks that many observers expect the talks to collapse. Thus, two autocratic powers threaten to seize land currently under democratic control, and a third threatens to violate the Non-Proliferation Treaty by building a nuclear bomb.

The question now becomes: how far would America go to prevent such reckless acts?

Joe Biden can sound forceful at times. On December 7th he warned Mr Putin of severe consequences if Russia were to launch another attack on Ukraine.

He has maintained sanctions on Iran. And in October he said that America had a “commitment” to defend Taiwan, though aides insisted policy has not changed. (America has long refused to say whether it would send forces to repel a Chinese invasion, so as not to encourage any Taiwanese action that might provoke one.)

China was left wondering whether Mr. Biden misspoke or was craftily hinting at a more robust stance.

On December 7th America’s House of Representatives passed a big boost to the defense budget.

And yet, America has become reluctant to use hard power across much of the world. A coalition of hawks and doves in Washington is calling for “restraint”.

The doves say that by attempting to police the world, America inevitably gets sucked into needless conflicts abroad that it cannot win.

The hawks say that America must not be distracted from the only task that counts: standing up to China.

Either of these two visions would entail a partial, destabilizing American retreat, leaving the world more dangerous and uncertain.

Mr. Biden’s debacle in withdrawing from Afghanistan led some to doubt America’s willingness to defend its friends or deter its foes, and many to worry about the competence of its planning.

The president’s loose words about America’s nuclear umbrella have undermined faith among allies that it still protects them.

And though Mr Biden does not insult allies as Donald Trump did, he often fails to consult them, eroding the bonds of trust that have long multiplied American power.

Just as important as the instincts of any one president is the mood of the country that elects them.

America is no longer the confident powerhouse of the 1990s. Its relative power has waned, even if it remains unmatched.

After Iraq and Afghanistan, voters have grown weary of foreign adventures.

Partisan politics, which once stopped at the water’s edge, paralyzes most aspects of foreign policy. Over 90 ambassadorial posts remain vacant, blocked by Congress.

The relentless drama of politics, including such things as disputed elections and mask-wearing, makes America seem too divided at home to show sustained purpose abroad.

Japan and Australia have signaled that they would help defend Taiwan. Britain has joined America in sharing nuclear-submarine propulsion technology with Australia. A new German government is hinting at a tougher line against Russia.

More adaptation to a world with less America will be required. Democracies, especially in Europe, should spend more on defense.

Those, such as Taiwan and Ukraine, at risk of being attacked should make themselves stronger, for example by beefing up their capacity for warfare. The better prepared they are, the less likely their foes are to attack them.

Fans of the rules-based order should share more intelligence with each other. They should bury old quarrels, such as the futile spats between Japan and South Korea over history.

They should forge deeper and broader alliances, formally or informally.

India, out of self-interest, should draw closer to an alliance with Australia, Japan and America.

NATO cannot admit Ukraine, since the rules say an attack on one is an attack on all, and Russia has already occupied Ukrainian territory.

But NATO members can offer Ukraine more arms, cash and training to help it defend itself.

If the current world order breaks down, America’s allies will suffer terribly.

Once it is gone, Americans themselves may be surprised to discover how much they benefited from it. Yet all is not lost.

A determined and united effort by democracies could preserve at least some of the rules-based system and prevent the world from sliding back towards the dismal historical norm, in which the strong prey unchecked on the weak. Few tasks are more important, or harder. 

So, let’s ask the question again, this time looking at our history.

What would America fight for?

The Cause of the American Revolution

Kelly, Martin. “The Root Causes of the American Revolution.” ThoughtCo, Feb. 16, 2021, thoughtco.com/causes-of-the-american-revolution-104860.

No single event caused the revolution. It was, instead, a series of events that led to the war. Essentially, it began as a disagreement over the way Great Britain governed the colonies and the way the colonies thought they should be treated.

Americans felt they deserved all the rights of Englishmen. The British, on the other hand, thought that the colonies were created to be used in ways that best suited the Crown and Parliament.

This conflict is embodied in one of the rallying cries of the ​American Revolution: “No Taxation Without Representation.”

In order to understand what led to the rebellion, it’s important to look at the mindset of the founding fathers. It should also be noted that this mindset was not that of the majority of colonists.

There were no pollsters during the American Revolution, but it’s safe to say the revolution’s popularity rose and fell over the course of the war.

Historian Robert M. Calhoon estimated that only about 40–45% of the free population supported the revolution, while about 15–20% of the free white males remained loyal.

A popular rule of thumb holds that one-third supported the King, one-third supported George Washington, and one-third didn’t care one way or the other.

The 18th century is known historically as the Age of Enlightenment. It was a period when thinkers, philosophers, statesmen, and artists began to question the politics of government, the role of the church, and other fundamental and ethical questions of society as a whole.

The period was also known as the Age of Reason, and many colonists followed this new way of thinking.

A number of the revolutionary leaders had studied major writings of the Enlightenment, including those of Thomas Hobbes, John Locke, Jean-Jacques Rousseau, and the Baron de Montesquieu.

From these thinkers, the founders gleaned such new political concepts as the social contract, limited government, the consent of the governed, and the separation of powers.

Locke’s writings, in particular, struck a chord. His books helped to raise questions about the rights of the governed and the overreach of the British government. They spurred the “republican” ideology that stood up in opposition to those viewed as tyrants.

Men such as Benjamin Franklin and John Adams were also influenced by the teachings of the Puritans and Presbyterians.

These teachings included such new radical ideas as the principle that all men are created equal and the belief that a king has no divine rights.

Together, these innovative ways of thinking led many in this era to consider it their duty to rebel against laws they viewed as unjust.

The geography of the colonies also contributed to the revolution. Their distance from Great Britain naturally created a sense of independence that was hard to overcome.

Those willing to colonize the new world generally had a strong independent streak with a profound desire for new opportunities and more freedom.

The Proclamation of 1763 played its own role. After the French and Indian War, King George III issued the royal decree that prevented further colonization west of the Appalachian Mountains.

The intent was to normalize relations with the Indians, many of whom had fought alongside the French.

A number of settlers had purchased land in the now forbidden area or had received land grants. The crown’s proclamation was largely ignored as settlers moved anyway and the “Proclamation Line” eventually moved after much lobbying.

Despite this concession, the affair left another stain on the relationship between the colonies and Britain.

The existence of colonial legislatures meant that the colonies were in many ways independent of the crown. The legislatures were allowed to levy taxes, muster troops, and pass laws. Over time, these powers became rights in the eyes of many colonists.

The British government had different ideas and attempted to curtail the powers of these newly elected bodies.

There were numerous measures designed to ensure the colonial legislatures did not achieve autonomy, although many had nothing to do with the larger British Empire. In the minds of colonists, they were a matter of local concern.

From these small, rebellious legislative bodies that represented the colonists, the future leaders of the United States were born.

Even though the British believed in mercantilism, Prime Minister Robert Walpole espoused a view of “salutary neglect.” This system was in place from 1607 through 1763, during which the British were lax on enforcement of external trade relations. Walpole believed this enhanced freedom would stimulate commerce.

The French and Indian War led to considerable economic trouble for the British government. Its cost was significant, and the British were determined to make up for the lack of funds. They levied new taxes on the colonists and increased trade regulations. These actions were not well received by the colonists.

New taxes were enforced, including the Sugar Act and the Currency Act, both in 1764. The Sugar Act increased already considerable taxes on molasses and restricted certain export goods to Britain alone.

The Currency Act prohibited the printing of money in the colonies, making businesses rely more on the crippled British economy. 

Feeling underrepresented, overtaxed, and unable to engage in free trade, the colonists rallied to the slogan, “No Taxation Without Representation.” This discontent became very apparent in 1773 with the events that later became known as the Boston Tea Party.

The British government’s presence became increasingly more visible in the years leading to the revolution. British officials and soldiers were given more control over the colonists, and this led to widespread corruption.

Among the most glaring of these issues were the “Writs of Assistance.” These were general search warrants that gave British soldiers the right to search and seize any property they deemed to be smuggled or illegal goods.

Designed to assist the British in enforcing trade laws, these documents allowed British soldiers to enter, search, and seize warehouses, private homes, and ships whenever necessary. However, many abused this power.

In 1761, Boston lawyer James Otis fought for the constitutional rights of the colonists in this matter but lost. The defeat only inflamed the level of defiance and ultimately led to the Fourth Amendment in the U.S. Constitution.

The Third Amendment was also inspired by the overreach of the British government. Forcing colonists to house British soldiers in their homes infuriated the population. It was inconvenient and costly to the colonists, and many also found it a traumatic experience after events like the Boston Massacre in 1770.

Trade and commerce were overly controlled, the British Army made its presence known, and the local colonial government was limited by a power far across the Atlantic Ocean.

If these affronts to the colonists’ dignity were not enough to ignite the fires of rebellion, American colonists also had to endure a corrupt justice system.

Political protests became a regular occurrence as these realities set in. In 1769, Alexander McDougall was imprisoned for libel when his work “To the Betrayed Inhabitants of the City and Colony of New York” was published.

His imprisonment and the Boston Massacre were just two infamous examples of the measures the British took to crack down on protesters. 

After six British soldiers were acquitted and two dishonorably discharged for the Boston Massacre—ironically enough, they were defended by John Adams—the British government changed the rules.

From then on, officers accused of any offense in the colonies would be sent to England for trial. This meant that fewer witnesses would be on hand to give their accounts of events and it led to even fewer convictions.

To make matters even worse, jury trials were replaced with verdicts and punishments handed down directly by colonial judges. Over time, the colonial authorities lost power over this as well because the judges were known to be chosen, paid, and supervised by the British government.

The right to a fair trial by a jury of their peers was no longer possible for many colonists.

All of these grievances that colonists had with the British government led to the events of the American Revolution.

And many of these grievances directly affected what the founding fathers wrote into the U.S. Constitution. These constitutional rights and principles reflect the hopes of the framers that the new American government would not subject their citizens to the same loss of freedoms that the colonists had experienced under Britain’s rule.

Now I know there are other conflicts that triggered a response from the American people, such as the War of 1812, the Spanish-American War, and the Civil War. But in an effort to better focus on conflicts that involved foreign adversaries, let’s jump ahead to WWI.

Following World War I, the United States hoped to avoid further entanglement with European politics that had drawn us into war.

A strong isolationist sentiment developed that questioned the wisdom of our entry into WWI, the Great War, as it was then known.

Americans really didn’t care that much about Germany’s expansionist policies in Europe during that war.

However, the rise of military government in Germany, Italy and Japan and their invasions of neighboring countries prior to WWII became a major concern for United States leaders including President Franklin Delano Roosevelt.

In Europe, Adolf Hitler led the rise of the Nazi Party, which claimed that Germany was treated unfairly in the peace treaty that ended WWI. He also sought to unite all German-speaking peoples, a policy that put him at odds with several neighbors like Austria, Poland and Czechoslovakia.

Great Britain and France tried to negotiate an end to German expansion, but the Soviet Union, to Germany’s east, signed a non-aggression treaty with Hitler that opened the door to Germany’s invasion of Poland in 1939.

France and England came to the aid of the Poles and declared war on Germany. Hitler’s armies quickly overran Poland and then France, leaving Britain alone against the German army and air force.

President Roosevelt wanted to come to the aid of our British allies, but public sentiment still was not yet ready to send American soldiers to fight in another European war.

Meanwhile, Germany and Italy became partners with Japan, which had designs on dominating East Asia.

Japan lacked natural resources like oil and rubber and created plans to attack neighboring countries that could supply them. They invaded Korea and Manchuria and then China.

They also looked southward to the European colonies of Dutch East Asia and British Malaysia. They knew that the United States and Great Britain would fight to stop them.

To weaken U.S. naval forces in the Pacific, Japan bombed the naval base at Pearl Harbor in Hawaii on December 7, 1941.

At that point we had no choice but to join the fight. Bear in mind, our entry came two years after the war had started.

So, America declared war on Japan, and on December 11, Germany and Italy lived up to their agreement with Japan and declared war on the United States.

Let’s move to the next conflict: the Korean War.

Again we ask the question, what would America fight for?

America wanted not just to contain communism but also to prevent the domino effect. Truman was worried that if Korea fell, the next country to fall would be Japan, which was very important for American trade. This was probably the most important reason for America’s involvement in the war.

Online article by Victor Davis Hanson of the Hoover Institution for Prager University.

Mention the Korean War today and most people will look at you with a blank stare. At the time it was fought, just five years after World War II ended, everyone recognized it as a world-shaping conflict, a stark confrontation between the forces of democracy and communism.

It began on June 25, 1950 when Soviet-backed communist North Korea crossed the 38th parallel and invaded its U.S.-backed anti-communist South Korean neighbor.

Within weeks the communists had nearly absorbed the entire country. The United States at first was confused over whether it should—or even could—respond.

America had slashed its military budget after the end of World War II and was short both men and equipment. It still had not awakened fully to the expansionist threat of Soviet Russia.

The Soviets—buoyed by their own recent development of an atomic bomb and Mao Zedong’s communist victory in China—sensed America’s lack of resolve and encouraged the North’s aggression.

Yet within weeks President Harry Truman rushed troops to save the shrinking Allied perimeter at Pusan on the southern tip of the Korean Peninsula.

And by late September 1950, General Douglas MacArthur had successfully completed the Inchon landings and launched counter-attacks. He quickly reclaimed the entire south and sent American-led United Nations forces far into North Korea to reunite the entire peninsula—only to be surprised when hundreds of thousands of Chinese Red Army troops crossed the Yalu River at the Chinese border and sent the outnumbered Americans reeling back into South Korea.

Thanks to the genius of General Matthew Ridgway, who arrived to assume supreme command in South Korea in December 1950, over the next 100 days U.S.-led UN forces pushed the communists back across the 38th Parallel.

The fighting was fierce. Seoul, the capital city of South Korea, changed hands between communist and U.S.-led forces five times before it was finally secured.

During the years 1952 and 1953, the war grew static, neither side able to deliver a knockout blow. Eventually the conflict ended with a tense armistice in July 1953.

For more than 60 years afterward, a cold war persisted between the Stalinist North and what, by the 1980s, had evolved into the democratic, economic powerhouse of South Korea.

Over 35,000 Americans died in the Korean War. The war marked the first major armed conflict of the Nuclear Age, and one in which the United States had not clearly defeated the enemy and thus not dictated terms of surrender.

Was fighting the Korean War and restoring the South—without uniting the entire peninsula—worth the huge cost in blood and treasure?

The natural dividend of saving the South was the evolution of today’s democratic and prosperous South Korea that has given its 50 million citizens undreamed of freedom and affluence—and has blessed the world with topflight products from the likes of Hyundai, Kia, LG and Samsung. South Korea is a model global citizen and a strong ally of the U.S.—and stands in sharp contrast to the communist regime in the North that has starved and murdered millions of its own people and caused untold mischief in the world community.

Had it not been for U.S. intervention and support to the South, the current monstrous regime in Pyongyang would now rule all of Korea, ensuring its nuclear-armed dictatorship even greater power and resources.

The American effort to save South Korea also sent a message to both communist China and the Soviet Union that the free world, under U.S. leadership, would no longer tolerate communist military takeovers of free nations.

The resulting deterrence policy helped to keep the communist world from attempting similar surprise attacks on Japan, Taiwan, and Western Europe.

Finally, the Korean War awakened the United States to the dangers of disarmament and isolationism and led to the bipartisan foreign policy of containment of global communism that in 1989 finally led to the collapse of the Soviet Union, and with it victory in the Cold War.

The Korean War was an incomplete American victory in its failure to liberate North Korea and unite the peninsula, but a victory nonetheless.

And not just from a military perspective, but from a moral one as well. The reason 35,000 Americans died in Korea was to keep at least half the Korean people free.

Korea did not have a single material resource that would have benefited America. Yet, we decided to join the fight.

How about one more folks?

What did the US fight for in Vietnam?

At the heart of the conflict was the desire of North Vietnam, which had defeated the French colonial administration of Vietnam in 1954, to unify the entire country under a single communist regime modeled after those of the Soviet Union and China.

The South Vietnamese government, on the other hand, fought to preserve a Vietnam more closely aligned with the West.

U.S. military advisers, present in small numbers throughout the 1950s, were introduced on a large scale beginning in 1961, and active combat units were introduced in 1965.

By 1969 more than 500,000 U.S. military personnel were stationed in Vietnam. Meanwhile, the Soviet Union and China poured weapons, supplies, and advisers into the North, which in turn provided support, political direction, and regular combat troops for the campaign in the South.

The costs and casualties of the growing war proved too much for the United States to bear, and U.S. combat units were withdrawn by 1973. In 1975 South Vietnam fell to a full-scale invasion by the North.

A public opinion poll of more than 1,000 people in each of 64 countries, conducted in late 2014 by WIN/Gallup International, found that only 44 percent of adult Americans were willing to tell pollsters they’d fight for their country.

The percentage is even less for some U.S. allies, such as Canada (30%), France (29%), the United Kingdom (27%), Italy (30%), Germany (18%) and Japan (11%).

In contrast, 71 percent of Chinese and 59 percent of Russians say they’d fight for their countries.

So, there you have it folks. I’ve given you a few examples of what we as Americans have fought for in our past.

The big question now is: Are we willing to fight for Taiwan? Are we willing to fight for Ukraine?

If not, have we reached a point in our history that we should now concentrate on our own country, turn isolationist, and let the rest of the world fend for themselves?

If so, what are the consequences?

Dollar Diplomacy

What is it?

Dollar diplomacy is a form of diplomacy that involves investing in foreign nations to stabilize them. The term is often used specifically to represent efforts conducted in self-interest by the United States; dollar diplomacy in this sense of the word is diplomacy that will benefit the interests of the United States.

This approach to diplomacy has been practiced for a very long time by a number of nations, not just the United States.

It is this practice that China is now using on the United States and other countries worldwide.

The term was popularized during the presidency of William Howard Taft, who notably used dollar diplomacy to “send dollars instead of guns” to areas in which the United States had an interest.

The government did things like acquiring debt held by poor nations and investing in infrastructure in countries that could not afford it, much as China is currently doing in Africa, South America, and the Caribbean.

In exchange for this, the United States government expected certain concessions from the countries it was assisting.

Sometimes the United States used dollar diplomacy so that it could play a role in shaping regulatory policy in a way that would be beneficial for American companies.

This included pressuring countries to enact lax worker-protection laws, limiting taxation of foreign companies, and other activities.

Dollar diplomacy was also used to secure political power, as seen when the United States reserved the right to choose appointees to key political positions, and sometimes to outright appoint them.

Beneficiaries of dollar diplomacy were in a difficult position. These nations needed the financial assistance and benefited from the expertise, equipment, and funds American companies brought into their borders. Nations didn’t like being ordered around by the United States, however, and felt internal pressure as a result of the control exerted by the United States.

Some citizens of these countries protested, sometimes violently, and the history of dollar diplomacy in areas like Latin America and Southeast Asia played a role in military conflicts sparked by resentful citizens.

The United States argued that while the policy certainly had the effect of opening up foreign markets and creating a favorable business climate for American companies, which benefited the United States, it was also beneficial for recipients.

Dollar diplomacy created jobs, infrastructure, and security for some nations, and in fact the United States continues to invest in foreign allies for the purpose of helping them recover and stabilize after military conflicts, economic downturns, and political turmoil.

The focus today is less on self-interest, however, and more on helping allies and friends of the United States achieve political, economic, and social independence, creating positive long-term allies.

So, how about a little history of U.S. dollar diplomacy?

Let’s start with Hawaii.

The Kingdom of Hawaii was independent from 1810 until 1893 when the monarchy was overthrown by resident American businessmen.

It was then an independent republic from 1894 until 1898, under Sanford B. Dole, when it was annexed by the United States as a territory, becoming a state in 1959.

In 1887, King Kalākaua was forced to sign the 1887 Constitution of the Kingdom of Hawaii, which stripped the king of much of his authority.

There was a property qualification for voting, which disenfranchised most Hawaiians and immigrant laborers and favored the wealthier white community. Resident whites were allowed to vote, but resident Asians were excluded.

Because the 1887 Constitution was signed under threat of violence, it is known as the “Bayonet Constitution”. King Kalākaua, reduced to a figurehead, reigned until his death in 1891. His sister, Queen Liliʻuokalani, succeeded him on the throne. She was the last monarch of Hawaii.

In 1893, Queen Liliʻuokalani announced plans for a new constitution. The United States then sent in a company of U.S. Marines to maintain the peace, thus assuring that American businessmen would remain in power.

In January 1893, Queen Liliʻuokalani was overthrown and replaced by American lawyer Sanford B. Dole, who became President of the Republic in 1894.

After William McKinley won the presidential election in 1896, Hawaii’s annexation to the U.S. was again discussed. The previous president, Grover Cleveland, was a friend of Queen Liliʻuokalani. However, McKinley was open to persuasion by U.S. expansionists and by annexationists from Hawaii.

In 1900, Hawaii was granted self-governance and retained ʻIolani Palace as the territorial capitol building.

Despite several attempts to become a state, Hawaii remained a territory for sixty years.

Plantation owners and key capitalists, who maintained control through financial institutions, or “factors”, known as the “Big Five“, found territorial status convenient, enabling them to continue importing cheap foreign labor; such immigration was prohibited in various states.

In the 1950s the power of the plantation owners was finally broken by descendants of immigrant laborers. Because they were born in a U.S. territory, they were legal U.S. citizens.

The Hawaii Republican Party, strongly supported by plantation owners, was voted out of office. The Democratic Party of Hawaii dominated politics for 40 years. Eager to gain full voting rights, Hawaii’s residents actively campaigned for statehood.

In March 1959, Congress passed the Hawaii Admission Act and U.S. President Dwight D. Eisenhower signed it into law.

Next up: the Spanish-American War.

On April 25, 1898, the United States declared war on Spain following the sinking of the battleship Maine in Havana harbor on February 15, 1898.

The war ended with the signing of the Treaty of Paris on December 10, 1898. As a result, Spain lost control over the remains of its overseas empire — Cuba, Puerto Rico, the Philippine Islands, Guam, and other islands.

Cuba was the first to initiate its own struggle for independence. From 1868 to 1878, Cubans fought for autonomy from Spain. That war concluded with a treaty that was never enforced. In the 1890s, Cubans began to agitate once again for their freedom from Spain.

U.S. interest in purchasing Cuba had begun long before 1898. American sugar interests bought up large tracts of land in Cuba.

The U.S. had more than $50 million invested in Cuba and annual trade, mostly in sugar, was worth twice that much. Support for war had been growing in the United States, despite President Grover Cleveland‘s proclamation of neutrality on June 12, 1895.

But sentiment to enter the conflict grew in the United States when General Valeriano Weyler began implementing a policy of Reconcentration that moved the population into central locations guarded by Spanish troops and placed the entire country under martial law in February 1896.

By December 7, President Cleveland had reversed himself, declaring that the United States might intervene should Spain fail to end the crisis in Cuba.

President William McKinley, inaugurated on March 4, 1897, was even more anxious to become involved, particularly after the New York Journal published a copy of a letter from Spanish Foreign Minister Enrique Dupuy de Lôme criticizing the American President on February 9, 1898.

Events moved swiftly after the explosion aboard the U.S.S. Maine on February 15.

On March 9, Congress passed a law allocating fifty million dollars to build up military strength. On March 28, the U.S. Naval Court of Inquiry found that a mine blew up the Maine. On April 21, President McKinley ordered a blockade of Cuba, and four days later the U.S. declared war.

Following its declaration of war against Spain issued on April 25, 1898, the United States added the Teller Amendment asserting that it would not attempt to exercise control over Cuba.

Two days later Commodore George Dewey sailed from Hong Kong with Emilio Aguinaldo on board.

Fighting began in the Philippine Islands at the Battle of Manila Bay on May 1, where Commodore George Dewey reportedly exclaimed, “You may fire when ready, Gridley,” and the Spanish fleet under Rear Admiral Patricio Montojo was destroyed.

However, Dewey did not have enough manpower to capture Manila so Aguinaldo’s guerrillas maintained their operations until 15,000 U.S. troops arrived at the end of July.

On the way, the cruiser Charleston stopped at Guam and accepted its surrender from its Spanish governor, who was unaware his nation was at war.

Although a peace protocol was signed by the two belligerents on August 12, Commodore Dewey and Maj. Gen. Wesley Merritt, leader of the army troops, assaulted Manila the very next day, unaware that peace had been declared.

In late April, Andrew Summers Rowan made contact with Cuban General Calixto García who supplied him with maps, intelligence, and a core of rebel officers to coordinate U.S. efforts on the island.

The U.S. North Atlantic Squadron left Key West for Cuba on April 22 following the frightening news that the Spanish home fleet commanded by Admiral Pascual Cervera had left Cadiz and entered Santiago, having slipped by U.S. ships commanded by William T. Sampson and Winfield Scott Schley. They arrived in Cuba in late May.

War actually began for the U.S. in Cuba in June when the Marines captured Guantánamo Bay and 17,000 troops landed at Siboney and Daiquirí, east of Santiago de Cuba, the second largest city on the island.

At that time Spanish troops stationed on the island included 150,000 regulars and 40,000 irregulars and volunteers while rebels inside Cuba numbered as many as 50,000.

U.S. army strength at the time totaled 26,000, requiring the passage of the Mobilization Act of April 22 that allowed for an army of at first 125,000 volunteers (later increased to 200,000) and a regular army of 65,000.

On June 22, U.S. troops landed at Daiquirí, where they were joined by Calixto García and about 5,000 revolutionaries.

U.S. troops attacked the San Juan Heights on July 1, 1898. Dismounted troopers, including the African-American Ninth and Tenth Cavalries and the Rough Riders commanded by Lt. Col. Theodore Roosevelt, went up against Kettle Hill, while the forces led by Brigadier General Jacob Kent charged up San Juan Hill, pushing Spanish troops farther inland while inflicting 1,700 casualties.

While U.S. commanders were deciding on a further course of action, Admiral Cervera left port only to be defeated by Admiral Schley.

On July 16, the Spaniards agreed to the unconditional surrender of the 23,500 troops around the city.

A few days later, Major General Nelson Miles sailed from Guantánamo to Puerto Rico. His forces landed near Ponce and marched to San Juan with virtually no opposition.

Representatives of Spain and the United States signed a peace treaty in Paris on December 10, 1898, which established the independence of Cuba, ceded Puerto Rico and Guam to the United States, and allowed the victorious power to purchase the Philippine Islands from Spain for $20 million.

The war had cost the United States $250 million and 3,000 lives, of whom 90% had perished from infectious diseases.

The Platt Amendment, an amendment to a U.S. army appropriations bill, established the terms under which the United States would end its military occupation of Cuba (which had begun in 1898 during the Spanish-American War) and “leave the government and control of the island of Cuba to its people.”

While the amendment was named after Senator Orville Platt of Connecticut, it was drafted largely by Secretary of War Elihu Root.

The Platt Amendment laid down eight conditions to which the Cuban Government had to agree before the withdrawal of U.S. forces and the transfer of sovereignty would begin.

The Platt Amendment’s conditions prohibited the Cuban Government from entering into any international treaty that would compromise Cuban independence or allow foreign powers to use the island for military purposes.

The United States also reserved the right to intervene in Cuban affairs in order to defend Cuban independence and to maintain “a government adequate for the protection of life, property, and individual liberty.”

Other conditions of the Amendment demanded that the Cuban Government implement plans to improve sanitary conditions on the island, relinquish claims on the Isle of Pines, and agree to sell or lease territory for coaling and naval stations to the United States. (This clause ultimately led to the perpetual lease by the United States of Guantánamo Bay.)

Finally, the amendment required the Cuban Government to conclude a treaty with the United States that would make the Platt Amendment legally binding, and the United States pressured the Cubans to incorporate the terms of the Platt Amendment into the Cuban constitution.

The rationale behind the Platt Amendment was straightforward. The United States Government had intervened in Cuba in order to safeguard its significant commercial interests on the island in the wake of Spain’s inability to preserve law and order.

As U.S. military occupation of the island was to end, the United States needed some method of maintaining a permanent presence and order.

However, anti-annexationists in Congress had incorporated the Teller Amendment in the 1898 war resolution authorizing President William McKinley to take action against Spain in the Spanish-American War.

This Teller Amendment committed the U.S. Government to granting Cuba its independence following the removal of Spanish forces.

By directly incorporating the requirements of the Platt Amendment into the Cuban constitution, the McKinley Administration was able to shape Cuban affairs without violating the Teller Amendment.

General Leonard Wood, commander of the U.S. occupation forces and military governor of Cuba, presented the terms of the Platt Amendment to the delegates of the Cuban Constitutional Convention in late 1900.

Although the Cuban delegates realized that the amendment significantly limited Cuban sovereignty, and originally refused to include it within their constitution, the U.S. Government promised them a trade treaty that would guarantee Cuban sugar exports access to the U.S. market. (Dollar Diplomacy)

After several failed attempts by the Cubans to reject or modify the terms of the Platt Amendment, the Cuban Constitutional Convention finally succumbed to American pressure and ratified it on June 12, 1901, by a vote of 16 to 11.

The Platt Amendment remained in force until 1934 when both sides agreed to cancel the treaties that enforced it.

Now, as a spinoff of the Spanish-American War, the U.S. entered the Philippine–American War, an armed conflict between the United States and Filipino revolutionaries.

The conflict arose from the struggle of the First Philippine Republic to secure independence from the United States following the Spanish–American War. The war was a continuation of the Philippine struggle for independence that began in 1896 with the Philippine Revolution.

Fighting erupted between United States and Filipino revolutionary forces on February 4, 1899, and quickly escalated into the 1899 Second Battle of Manila.

On June 2, 1899, the First Philippine Republic officially declared war against the United States. The war officially ended on July 4, 1902.

However, some groups led by veterans of the revolution continued to battle the American forces. Among those leaders was General Macario Sakay, a veteran of the revolution who assumed the presidency of the proclaimed “Tagalog Republic,” formed in 1902 after the capture of President Emilio Aguinaldo.

Other groups continued hostilities in remote areas and islands until their final defeat a decade later, on June 15, 1913.

The war and occupation by the U.S. would change the cultural landscape of the islands. The people dealt with an estimated 34,000 to 220,000 Filipino casualties (with more civilians dying from disease and hunger brought about by the war), the disestablishment of the Roman Catholic Church as the state church of the Philippines (as it had previously been in Spain), and the introduction of English as the primary language of government, education, business, and industry, and increasingly, in future decades, among families and educated individuals.

Under the 1902 “Philippine Organic Act,” passed by the United States Congress, Filipinos were initially given very limited self-government, including the right to vote for some elected officials, such as an elected Philippine Assembly. It was not until 14 years later, with the 1916 Philippine Autonomy Act (or “Jones Act”), passed under President Woodrow Wilson, that the U.S. officially promised eventual independence, along with more Filipino control over the Philippines in the meantime.

The 1934 Philippine Independence Act created, in the following year, the Commonwealth of the Philippines, a limited form of independence, and established a process ending in full Philippine independence (originally scheduled for 1944, but interrupted and delayed by World War II).

Finally in 1946, following World War II and the Japanese Occupation of the Philippines, the United States granted independence through the Treaty of Manila concluded between the two governments and nations.

One final example of US Dollar Diplomacy and this is a dandy.

The Panama Canal

On June 19, 1902, the U.S. Senate voted in favor of building the canal through Panama. Within six months, Secretary of State John Hay signed a treaty with the Colombian Foreign Minister to build the new canal.

We offered Colombia $10 million plus $250,000 per year. The financial terms were unacceptable to Colombia’s congress, and it rejected the offer. Teddy Roosevelt called the Colombians “the blackmailers of Bogotá.”

President Roosevelt responded by dispatching U.S. warships to Panama City (on the Pacific) and Colón (on the Atlantic) in support of Panamanian independence.

Colombian troops were unable to negotiate the jungles of the Darién, and Panama declared independence on November 3, 1903.

The newly declared Republic of Panama immediately named Philippe Bunau-Varilla (a French engineer who had been involved in the earlier French canal attempt) as Envoy Extraordinary and Minister Plenipotentiary.

In his new role, Bunau-Varilla negotiated the Hay-Bunau-Varilla Treaty of 1903, which provided the United States with a 10-mile-wide strip of land for the canal, a one-time $10 million payment to Panama, and an annual annuity of $25,000.

The United States also agreed to guarantee the independence of Panama. Completed in 1914, the Panama Canal symbolized U.S. technological prowess and economic power.

Although U.S. control of the canal eventually became an irritant to U.S.-Panamanian relations, at the time it was heralded as a major foreign policy achievement.

President Theodore Roosevelt laid the foundation for heavy-handed dollar diplomacy in 1904 with his Roosevelt Corollary to the Monroe Doctrine (under which United States Marines were frequently sent to Central America). It maintained that if any nation in the Western Hemisphere appeared politically and financially so unstable as to be vulnerable to European control, the United States had the right and obligation to intervene.

President Taft continued and expanded the policy, starting in Central America, where he justified it as a means of protecting the Panama Canal. In March 1909, he attempted unsuccessfully to establish control over Honduras by buying up its debt to British bankers.

Dollar Diplomacy wasn’t always peaceful. In Nicaragua, U.S. “intervention involved participating in the overthrow of one government and the military support” of another.

When a revolt broke out in Nicaragua in 1912, the Taft administration quickly sided with the insurgents (who had been instigated by U.S. mining interests) and sent U.S. troops into the country to seize the customs houses.

As soon as the U.S. consolidated control over the country, Secretary of State Philander C. Knox encouraged U.S. bankers to move into the country and offer substantial loans to the new regime, thus increasing U.S. financial leverage over the country.

Within two years, however, the new pro-U.S. regime faced a revolt of its own; and, once again, the administration landed U.S. troops in Nicaragua, this time to protect the tottering, corrupt pro-American government. U.S. troops remained there for over a decade.

So there you have it, folks. As we watch the Chinese use dollar diplomacy today, it isn’t very hard to figure out where they got the idea.

Obviously they studied their history.

Who is buying America’s farmland?

Politico Website

AGRICULTURE

China is buying up American farms. Washington wants to crack down.

Bipartisan pressure is building to stop foreign nationals from purchasing American farm operations and receiving taxpayer subsidies.

The push to stop China’s influence in the U.S. economy has reached America’s farm country, as congressional lawmakers from both parties are looking at measures to crack down on foreign purchases of prime agricultural real estate.

House lawmakers recently advanced legislation to that effect, warning that China’s presence in the American food system poses a national security risk. And key Senate lawmakers have already shown interest in efforts to keep American farms in American hands.

The debate over farm ownership comes amid broader efforts by Congress and the Biden administration to curb the nation’s economic reliance on China, especially in key industries like food, semiconductors and minerals deemed crucial to the supply chain.

The call for tighter limits on who owns America’s farms has come from a wide range of political leaders, from former Vice President Mike Pence to Sen. Elizabeth Warren (D-Mass.), after gaining momentum in farm states.

“America cannot allow China to control our food supply,” Pence said recently during a speech at the conservative Heritage Foundation, urging President Joe Biden and Congress to “end all farm subsidies for land owned by foreign nationals.”

Chinese firms have expanded their presence in American agriculture over the last decade by snapping up farmland and purchasing major agribusinesses, like pork processing giant Smithfield Foods.

By the start of 2020, Chinese owners controlled about 192,000 agricultural acres in the U.S., worth $1.9 billion, including land used for farming, ranching and forestry, according to the Agriculture Department.

Still, that’s less than the farmland owned by investors from other nations, such as Canada and European countries, which account for millions of acres each. It’s also a small percentage of the nearly 900 million acres of total American farmland.

But it’s the trend of increasing purchases and the buyers’ potential connections to the Chinese government that have lawmakers spooked.

USDA reported in 2018 that China’s agricultural investments in other nations had grown more than tenfold since 2009.

The Communist Party has actively supported investments in foreign agriculture as part of its “One Belt One Road” economic development plans, aiming to control a greater piece of China’s food supply chain.

“The current trend in the U.S. is leading us toward the creation of a Chinese-owned agricultural land monopoly,” Rep. Dan Newhouse (R-Wash.) warned during a recent House Appropriations hearing.

The committee unexpectedly adopted Newhouse’s amendment to the Agriculture-FDA spending bill (H.R. 4356 (117)) that would block any new agricultural purchases by companies that are wholly or partly controlled by the Chinese government and bar Chinese-owned farms from tapping federal support programs.

That move followed a contentious debate over the potential consequences for Asian Americans if Congress adopted a provision aimed squarely at China.

In other words, if you are going to restrict China, you should restrict other countries, like Canada, as well.

The legislation is expected to reach the House floor before the end of July, as part of a broader appropriations package, although the Senate has not yet drafted its own version of the spending bill.

Scrutiny of foreign-owned agricultural operations receiving taxpayer subsidies has also been rising in recent years after meatpacking conglomerates like the U.S. subsidiary of Brazilian-owned JBS received millions of dollars under the Trump administration’s trade bailout starting in 2018.

Smithfield was also in line to receive money from the program, which was created to help U.S. farmers hurt by trade retaliation from China and other competitors.

But the company backed out of its contract with USDA after an outcry from lawmakers led by Sen. Chuck Grassley (R-Iowa).

The renewed focus on curbing foreign farm purchases comes as Biden and Agriculture Secretary Tom Vilsack roll out a series of actions to bolster the food supply chain, following major disruptions caused by the pandemic.

That effort includes greater scrutiny of large meat processing companies like JBS (formerly Swift & Co.) and Smithfield, as well as plans to tighten the requirements for meat to be labeled a “Product of the USA.”

While lawmakers remain focused on Chinese buyers, other nationals own much more agricultural property in the United States.

Foreign investors by the end of 2019 held an interest in more than 35 million acres — an area bigger than New York State.

The total has grown by an average 2.3 million acres per year since 2015, according to USDA data.

A few states, including top agricultural centers like Iowa and Minnesota, already have varying restrictions on foreign ownership of their farmland.

Those seeking more restrictions say USDA’s numbers actually understate the amount of foreign control over American ag operations.

The data is based on a 1978 law directing foreign nationals to report their U.S. agricultural holdings to USDA — a requirement that can be difficult for the department to enforce.

For example, foreign investors can set up limited liability companies in the U.S. and designate an American owner to circumvent the reporting requirements while still controlling the operation behind the scenes, said Joe Maxwell, president of the progressive advocacy group Family Farm Action.

“It’s a massive undertaking to verify who really owns [the land],” Maxwell said. “These foreign interests are pretty smart. They use different business structures to further conceal it.”

While some states have strict laws in place, others are more open to foreign investments. Texas has the largest amount of foreign-held agricultural land, at 4.4 million acres, followed by Maine and Alabama, according to USDA.

The money flowing into agricultural real estate from other countries also makes it difficult for new farmers in the U.S. to afford land as outside buyers bid up prices. Maxwell said that poses a big risk with an older generation of farmers set to exit the industry.

“When this land changes hands, they’re going to gobble it up,” he said of foreign buyers. “These investments artificially increase the value of that land, which then denies young and beginning farmers opportunities to farm.”

Foreign Purchases of U.S. Agricultural Land: Facts, Figures, and an Assessment of Real Threats

September 8, 2021

Center for Strategic and International Studies

Jamie Lutz is a research associate with the Global Food Security Program at the Center for Strategic and International Studies (CSIS) in Washington, D.C.

Caitlin Welsh is the director of the CSIS Global Food Security Program. 

Foreign ownership of U.S. agricultural land doubled from 2009 to 2019, according to U.S. Department of Agriculture (USDA) records, and policymakers have become increasingly concerned about foreign control of the U.S. food supply.

Q1: Where and why are foreign entities purchasing farmland in the United States?

A1: USDA records provide the best source of information on foreign-held agricultural land, although they are still incomplete and contain many errors.

According to USDA data, foreign investors owned at least 35.2 million acres of U.S. agricultural land in 2019—2.7 percent of U.S. farmland, an area almost the size of Iowa.

While foreign land ownership has been reported in all 50 states and Puerto Rico, the holdings are concentrated in particular states.

The greatest share is in Texas, with over 4.4 million acres, followed by Maine (3.3 million acres) and Alabama (1.8 million acres). Over 40 percent of the additional 3.4 million acres acquired by foreign investors in 2019 was located in Texas, Oklahoma, and Colorado.

Canadian investors hold the largest share of this land, at 29 percent, with the Netherlands, Italy, Germany, and the United Kingdom collectively owning another 33 percent.

The remaining 38 percent is held by entities from almost a hundred other countries.

Although Congress has become increasingly concerned about Chinese land purchases, investors from China currently own only a small fraction of this land, at 191,652 acres (about 0.5 percent of the total).

In 2019, 49 percent of reported foreign-held acreage in the United States was forest land, while 25 percent was crop land, 24 percent was for pasture and other agricultural uses, and 2 percent was for non-agricultural uses (such as homesteads and roads).

The USDA reports that the changes in pasture and crop land holdings since 2009 were mostly due to foreign-owned wind companies signing or terminating long-term leases.

Q2: What threats do foreign acquisitions of U.S. farmland pose?

A2: On a large scale, these acquisitions do not represent a substantial enough portion of food production in the United States to threaten national food security.

The United States currently produces more than enough food per capita, even after adjusting for food waste. Food insecurity among U.S. families is primarily driven by poverty, not a lack of food.

U.S.-based companies also own over nine million acres in other countries.

Even so, large land purchases present various localized concerns in the places where they occur. For example, in water-scarce regions like the Southwest, outside use of freshwater resources can affect water availability for local farms and communities.

Arizona, for instance, has no rules on groundwater pumping as long as it is for a “beneficial use,” which includes agriculture even if the products are shipped elsewhere.

Near a 10,000-acre hay farm run by a Saudi subsidiary, local residents say their wells are going dry.

It is not only foreign companies that take advantage of this regulatory loophole, however—companies from other states and cities around the United States are also buying up land in Arizona to exploit the state’s loose water regulations, putting Arizona’s long-term water resources at risk.

Across the country, the 2013 purchase of Smithfield Foods by the Chinese firm Shuanghui, now called WH Group, received national attention and escalated concerns about Chinese intervention in U.S. food systems.

The deal meant that WH Group now owns the largest pork producer in the United States as well as over 146,000 acres of Missouri farmland.

Missouri had formerly banned all foreign ownership of agricultural land in the state, but one week before Shuanghui took over Smithfield, that rule changed to allow foreign entities to own up to 1 percent of the state’s farmland.

Critics claim that the rule change was instrumental in allowing the deal to go through as written, and some members of Congress warned of Chinese government involvement in the purchase. Smithfield is vertically integrated and owns all aspects of its supply chain, meaning that WH Group now controls a significant portion of U.S. pork production and revenue in addition to farmland.

While a large portion of Smithfield pork was already being exported to China prior to the acquisition, the Covid-19 crisis raised concerns about Chinese control of U.S. food supply chains.

When the pandemic hit, Smithfield increased pork exports to China even as the United States experienced widespread meat shortages due to supply chain disruptions and Smithfield closed some of its plants due to poor working conditions.

This series of events prompted Congress to look at how to prevent Chinese ownership in U.S. agriculture, even though other foreign entities, like Brazilian-owned JBS (Swift & Co.), control similarly large portions of U.S. food supply chains.

Q3: What regulations are currently in place?

A3: The only federal law governing these transactions is the Agricultural Foreign Investment Disclosure Act (AFIDA) of 1978.

The AFIDA requires foreign entities to report transactions of farmland to the USDA and imposes steep penalties for failing to report (up to 25 percent of the fair market value of the land), although these penalties are rarely enforced—the last fine imposed under the act was in 2014.

The USDA claims that its goal is to monitor foreign ownership of land, not to exact penalties. Given the state of the AFIDA data, it seems it is doing neither. The data is entirely reliant on self-reporting, and the USDA does not check it for completeness and accuracy, so there are frequent typos, omissions, and outdated information.

As a result, the public does not have a complete picture of which foreign entities own how much U.S. farmland or what the land is being used for.

On the state level, regulations vary. Most states, like Texas and Maine, have no restrictions on foreign ownership of land, contributing to the large amount of farmland that is under foreign control in these states. 

Six states forbid any foreign landholdings, and some, like Missouri, put caps on how much land can be held by foreign entities.

Q4: What federal solutions are being discussed?

A4: Since the 2013 purchase of Smithfield Foods, multiple bills have been proposed to provide more oversight of foreign investments in U.S. agricultural companies, but until recently they each died in committee.

Even if these bills had passed, they would have strengthened oversight on purchases of U.S. companies, not agricultural land specifically, leaving most land acquisitions unregulated.

Furthermore, policymakers have signaled no efforts to improve state- or national-level information on foreign purchases of U.S. farmland, leaving the true picture obscured.

In the wake of Covid-19 supply chain disruptions and escalating tensions with China, U.S. lawmakers have specifically increased scrutiny of purchases by Chinese investors.

Citing national security concerns, the House Appropriations Committee included an amendment in the recent Department of Agriculture-Food and Drug Administration spending bill that prohibits the purchase of agricultural land located in the United States by Chinese-owned companies.

Representative Grace Meng expressed concern that the amendment could fuel already rising anti-Asian hate, and the USDA and advocacy groups have pointed out that the USDA does not have the authority to intervene in private land deals.

The USDA could enforce the portion of the bill that bans Chinese-owned companies from participating in federal benefits programs, but constitutionally, land purchasing falls under states’ rights, and legislators have made no motions to change that.

The House passed the agriculture appropriations bill on July 29, raising questions about how it will be enforced if it becomes law.

Q5: What is the real threat to U.S. food security?

A5: Land grabbing is more of an immediate threat to food security in other parts of the world than it is in the United States, but it could become a greater threat in the future if more farmland is sold and if foreign investors continue to buy available farmland.

The U.S. farmer population is aging, with an average age of 57.5 in 2017, up from 55 in 2012. The National Young Farmers Coalition (NYFC) anticipates that two-thirds of farmland will change hands over the next decade as farmers retire, meaning that more land could become available for foreign purchase.

It is important to note that foreign entities are not the only ones aggressively buying up U.S. farmland. Many large corporations, pension funds, and wealthy individuals are investing in agricultural land in the United States and abroad.

Advocacy groups like the National Family Farm Coalition argue that the larger threat to national security is corporate capture of U.S. land resources, whether those corporations are U.S.- or foreign-owned.

The NYFC also points to both urban and rural development as a threat to the future of U.S. farms, since converting farmland to other uses drives up prices and makes the land unaffordable for beginning farmers.

For long-term U.S. food security, perhaps the larger concern is why up-and-coming U.S. farmers are unable to buy the land they need.

According to the NYFC, young and aspiring farmers say access to land is their largest barrier to starting a successful farm business. With an aging U.S. farmer population and not enough new farmers able to enter the industry, more land will inevitably be converted to other uses or sold to foreign and domestic investors unless policies are put in place to support the next generation of farmers.

Focusing narrowly on land purchases by Chinese companies or other foreign entities will not address the full scope of this problem.

Policymakers should, instead, consider the many threats facing the future of the U.S. food system and ensure that current and aspiring farmers have the resources they need to secure long-term U.S. food production, starting with access to affordable farmland.

In addition to federal action, some states, like Arizona, could do more to protect their local resources and communities from exploitation by domestic and foreign entities.

For its part, China owned 191,000 acres worth $1.9 billion as of 2019. This might not sound like a lot, but Chinese ownership of American farmland has grown dramatically over the last decade.

Indeed, there has been a tenfold expansion of Chinese ownership of farmland in the United States in less than a decade. 

Six states — Hawaii, Iowa, Minnesota, Mississippi, North Dakota and Oklahoma — currently ban foreign ownership of farmland.

Massive Chinese investment in American farmland is troubling for one very obvious reason: It puts the food security of the nation in the hands of a hostile foreign power.

But there is also the social cost of allowing foreign buyers who have effectively unlimited resources to compete on the real estate market with smaller domestic buyers.

So, the real victims of all this are smaller landholders and our next generation of farmers.

So, what do you think? We have already seen the impact of depending on the Chinese for all sorts of manufactured goods and technology. Should we now let them invest in our food supply?

What about all of the other countries that are investing in our farms? Should they be banned as well?

China and its Worldwide Influence

What is the ‘One China’ policy?

Published 6 October in BBC News

It is the diplomatic acknowledgement of China’s position that there is only one Chinese government.

Under the policy, the US recognizes and has formal ties with China rather than the island of Taiwan, which China sees as a breakaway province to be reunified with the mainland one day.

The One China policy is a cornerstone of Chinese-US relations and a bedrock of Chinese policy-making and diplomacy.

However, Washington maintains a “robust unofficial” relationship with Taiwan, including continued arms sales to the island so that it can defend itself.

Although Taiwan’s government claims it is an independent country officially called the “Republic of China”, any country that wants diplomatic relations with mainland China must break official ties with Taipei.

This has resulted in Taiwan’s diplomatic isolation from the international community.

How did it come about?

The policy can be traced back to 1949 and the end of the Chinese civil war. The defeated nationalists, also known as the Kuomintang, retreated to Taiwan and made it their seat of government while the victorious communists began ruling the mainland as the People’s Republic of China.

So, let’s stop right there for clarification.

The Republic of China is Taiwan.

The People’s Republic of China is mainland China.

Both sides say they represent all of China.

Since 1949, China’s ruling communist party has threatened to use force if Taiwan ever formally declares independence, but it has also pursued a softer diplomatic track with the island in recent years.

Initially, many governments including the US recognized Taiwan as they shied away from communist China.

Bear in mind, China was our ally in WWII (under Chiang Kai-shek).

But the diplomatic winds shifted as China and the United States saw a mutual need to develop relations beginning in the 1970s, with the US and other countries cutting ties with Taiwan in favor of Beijing.

Many however still maintain informal relations with Taiwan through trade offices or cultural institutes, and the US remains Taiwan’s most important security ally.

After years of warming relations, the US established formal diplomatic ties with Beijing in 1979 under President Jimmy Carter.

As a result, the US had to sever ties with Taiwan and closed its Taiwan embassy in Taipei.

But that same year it also passed the Taiwan Relations Act, which guarantees support for the island.

This act states that the US must help Taiwan defend itself – which is why the US continues to sell arms to Taiwan.

The US has also said it insists on the peaceful resolution of differences between the two sides and encourages both sides to pursue “constructive dialogue”.

We maintain an unofficial presence in Taipei via the American Institute in Taiwan, a private corporation through which we carry out diplomatic activities.

Beijing has obviously benefited the most from the policy, which has cast Taiwan out into the diplomatic wilderness.

Taiwan is not recognized as an independent country by much of the world, nor by the United Nations.

But even in its isolation, Taiwan has not entirely lost out.

Taiwan’s president Tsai Ing-wen spoke to President Trump in early December 2016, breaking decades of US diplomatic protocol.

It maintains good economic and cultural ties with neighbors.

It employs a small group of powerful lobbyists in Washington DC including former Senator Bob Dole.

As for the US, it continues to benefit from formal relations with China – its biggest foreign lender and a top trade partner – while quietly continuing to maintain strong ties with Taiwan.

The One China policy is a delicate balancing act that the US has maintained for decades.

Where it goes from here is anyone’s guess.

Bear in mind 34 percent of our total federal debt is owed to foreigners, including China (which owned nearly $1.3 trillion of the total debt, or about 8 percent), closely followed by Japan, which owned $1.1 trillion, or 7 percent.

Previously, Japan had been the top foreign owner of US debt, but China surpassed Japan in September 2008.

The Heritage Foundation

China’s Influence in the Western Hemisphere

April 19, 2005, by Peter Brookes

Senior Research Fellow at The Heritage Foundation

When the People’s Republic of China unleashed its unprecedented economic reforms almost 20 years ago, no one could have imagined the effect it would have on China–or the world.

Finally freed from the shackles of an inefficient Soviet-style command economy, China would experience a remarkable expansion in economic growth, including near double-digit growth for the last 10 years, according to PRC government statistics.

These economic reforms have transformed China into a rising power in world politics. In fact, some would argue that, today, China is no longer a “rising power”–but a “risen power.”

Chinese leaders believe that if the country’s economic growth continues at this pace, China will overcome 150 years of “humiliation” at the hands of foreign powers, returning to its past glory as the “Middle Kingdom.”

In China’s view, eventually, this economic growth will allow it to be able to challenge the world’s most powerful nations, including the United States, for control of the international system.

China is well on its way to doing just that. Today, China, the world’s most populous nation, also has the world’s second largest economy and the world’s second largest defense budget, allowing China to play key, central roles in Asian geopolitics.

But China is also becoming an increasingly important player on the world stage. Although it has long been a permanent member of the U.N. Security Council and a nuclear weapons state, its expanding economic might is resulting in growing political influence beyond Asia as well.

It is hard to find a major international issue in which China is not playing a role: From weapons proliferation, to human rights, to energy security, to North Korea, Iran, Sudan, and the United Nations, China is present, and Beijing is increasingly confident of its high-profile role in world politics.

With increasingly well-developed power derived from economic growth, political stability, and a growing military capability, China sees its re-emergence as a global power, on its own terms, as a certainty.

If all goes according to Beijing’s plans, in the next few decades China will take its “rightful place” among the great powers in the international system–if not atop the international system.

A subset of China’s grand strategy is an “opportunistic” foreign policy aimed at its main competition for preeminence in the international system, the United States.

China is pursuing a foreign policy that aims to support China’s national interests while attempting to balance–or, perhaps, more accurately, unbalance–the predominance of the United States across the globe.

China is looking to “quietly” use its growing economic strength to build new political relationships abroad while exploiting dissatisfaction with the United States wherever possible.

Eventually, in Beijing’s estimation, once China has gathered as many allies and friends as possible and developed its economic and military strength to near that of other major powers, it will be able to challenge the United States directly if necessary.

Put simply: China is using its burgeoning economic power to gain political and economic influence internationally, at America’s expense wherever possible, in an effort to replace the U.S. as the world’s most powerful nation.

For example, China has indicated that it would not support taking Iran to the U.N. Security Council over its nuclear weapons program while signing a 25-year, $100 billion oil/gas deal with Iran. China’s decision obviously pleased Tehran.

Likewise, China also worked hard against a strong U.N. resolution on the genocide in Sudan, which would have placed economic sanctions on the Sudanese government, in an effort to protect its $3 billion oil investment there. Khartoum could not have been happier with China’s support.

The PRC has taken advantage of trans-Atlantic tensions arising from the Iraq war, too. China has seemingly convinced the European Union, led by France and Germany, to lift the EU’s 1989 Tiananmen Square arms embargo.

China wants forgiveness for the Tiananmen Square crackdown, and Europe hopes that ending the ban will result in large commercial deals–and, perhaps, arms deals–for European firms. The U.S. strongly opposes lifting the ban.

Bottom line: China is pursuing a “realist” foreign policy in order to advance its national interests.

The existence of dissatisfaction with Washington or American policies in global capitals only makes it easier. China’s grand strategy certainly applies to Latin America and the Caribbean, too.

The importance of Latin America and the Caribbean to China is multifold, but two issues predominate: Taiwan and access to raw materials, especially energy.

The PRC will not feel its rise to power is complete without returning Taiwan to the Mainland’s political control.

As I stated earlier, Taiwan and China have been separated since the 1949 civil war, and it is Beijing’s view that Taiwan is a “renegade province” that must be “reunified” with the PRC.

To the tremendous frustration of the PRC, the Chinese view of Taiwan’s sovereignty is increasingly out of step with public opinion on Taiwan.

As a result, China is employing every instrument of its national power to effect unification with Taiwan, including an unwillingness to renounce the use of force to resolve Taiwan’s future.

One of China’s tactics is an effort to politically isolate Taiwan internationally by enticing countries that currently diplomatically recognize Taiwan to shift allegiances to the PRC.

Most of the countries that recognize Taiwan are in Latin America, Africa, and the Pacific Islands.

At present, six nations in Central America–Panama, Costa Rica, Nicaragua, El Salvador, Honduras and Guatemala–retain full diplomatic relations with Taiwan.

Beginning with Chile in 1970, all but one South American state–Paraguay–have moved to recognize Beijing.

In the Caribbean, the Dominican Republic, Haiti, St. Kitts and Nevis, and St. Vincent and the Grenadines have relations with Taiwan. Dominica switched allegiances to the PRC last year.

For Taiwan, the states of Central America and the Caribbean, and Paraguay, represent a relatively solid regional commitment to its status as a state separate from China.

These states represent nearly half of Taiwan’s diplomatic recognition around the world, now totaling 25 nations.

Taiwan pays dearly to retain this diplomatic recognition, and if these states were to switch recognition from Taipei to Beijing, Taiwan’s political confidence and its claims of legitimacy as a state would, in Taipei’s estimation, be seriously undermined.

China’s other interest, not surprisingly, is access to natural resources, especially energy.

China is scouring the planet for resources to feed its economy’s insatiable appetite for raw materials.

Since China’s government is not popularly elected, its claim to legitimacy has been its ability to improve the standard of living of the 1.3 billion Chinese people.

Stoking the economic furnaces also allows China to continue its unprecedented military buildup, supported primarily by Russian arms sales, and to provide overseas aid–often without conditions–to countries of interest in an effort to spread its influence.

China is broadly diversifying its energy sources.

It is trying to reduce its reliance on coal, which has made China the world’s second largest polluter.

In its effort to ensure consistent energy supplies, China is expected to divert its overseas investments outside the Middle East to Russia; Southeast Asia (e.g., Indonesia, Burma); Central Asia (e.g., Kazakhstan, Uzbekistan); Africa (e.g., Angola, Sudan); and Latin America (e.g., Colombia, Venezuela).

Petroleum leads the list of resources South American states have to offer China.

Venezuela is the world’s fifth largest petroleum producer, pumping 2.5 million barrels per day and providing the United States with 13-15 percent of its oil imports.

China has invested over $1 billion in petroleum projects in Venezuela and is positioning itself to invest nearly $350 million to extract oil from eastern Venezuelan oil fields, as well as an additional $60 million in natural gas wells. China is also seeking to purchase petroleum from Ecuador, Argentina, Colombia, and Mexico.

Latin America is an important source of a variety of minerals and food items as well.

Aluminum, copper, iron, and soybeans constitute a large part of China’s imports from Latin America.

For commercial purposes, China also obviously has a strong interest in the Panama Canal and access to good port facilities in the Caribbean.

During his visits to Brazil and Argentina in November 2004, Chinese President Hu Jintao announced plans to invest $100 billion in Latin America over the next decade, primarily for infrastructure and energy projects. These investments made by the Chinese government will undoubtedly bring political influence as well.

China is also on a military diplomacy offensive across the globe. China has formed military diplomatic ties with 146 countries and sent military attaches to 103 countries.

China uses these exchanges to gather information on the host country, as well as other countries, if possible, for military doctrine development as well as military intelligence purposes.

In 2004, more than 100 military exchange programs took place, involving Chinese military leaders visiting more than 60 countries and senior officers from about 50 countries visiting China.

Some exchange programs featured joint military exercises, security sessions involving military officers from multiple countries, combined seminars on defense and security, and field trips.

China has military and security interests in Latin America as well. China’s presence at Signals Intelligence (SIGINT) facilities in Cuba directed at the United States is long-standing and well known, but China is also establishing military ties in Latin America.

For example, in 2004, Defense Minister Cao Gangchuan paid a visit to Brazil.

In April 2004, Vice-Chairman of the Central Military Commission Xu Caihou visited Cuba and called on Cuban military units and training centers. Since the late 1990s, at least one high-level visit has taken place every year to Venezuela.

In addition, Chinese intelligence services are undoubtedly active in Latin America and the Caribbean, using Chinese front companies, students, visitors, and intelligence officers to steal and exploit technology and commercial secrets of interest to enhance their military prowess and economic competitiveness.

China has achieved unparalleled growth in its power, influence, and importance over the past 20 years. Its grand strategy is to become the preeminent power in the Pacific–and in the world–replacing the United States as the world’s most powerful nation.

Though it has not reached that point today, China is making progress on both counts. The PRC is seeking friends and allies to advance its agenda in Asia, Europe, Africa, the Middle East–and Latin America.

Like most other nations, China is committed to improving the performance of its economy and spreading its political influence. Its actions are worrisome in Latin America and the Caribbean because some national leaders, such as Venezuela’s Hugo Chávez, welcome the arrival of another world power to offer an alternative to the United States.

There are challenges to China’s advance in Latin America and the Caribbean, including geographic proximity, culture, and language. But if Washington wants to neutralize China’s growing influence in the Western Hemisphere, it needs to take action.

Peter Brookes is Senior Fellow for National Security Affairs and Director of the Asian Studies Center at The Heritage Foundation. These remarks were prepared for delivery at a hearing of the Subcommittee on the Western Hemisphere of the House Committee on International Relations.

So, there you have it, folks. China is a big problem, and it is only getting worse.

I think our former President could see the threat that China poses. I fear that our current leadership is taking a wait and see position on the issue.

Hopefully, as you watch the evening news, this information will help you better understand why China is flying jets near Taiwan, why the United Kingdom and the US cut a deal to build submarines for Australia, and why Russia feels emboldened enough to once again threaten Ukraine.

These are just a few examples of the impact China is now having on the world stage. There are many others. You just have to look at the big picture and follow the money.

Russia and Ukraine

Russia-Ukraine border: Why Moscow is stoking tensions

By Sarah Rainsford
BBC Moscow correspondent

When Russia wanted the US to sit up and take notice last April it sent tanks towards the Ukrainian border.

The show of force worked: President Joe Biden called Russia’s Vladimir Putin and in June the two men met in Geneva.

But whatever they agreed about Ukraine at their summit, something has since gone awry.

In recent weeks, Russian tanks have been moving west towards Ukraine once again, prompting fresh, even starker warnings from US intelligence circles that a cross-border offensive could be in the cards.

This build-up of Russian forces was spotted 185 miles from Ukraine.

Moscow insists that the alarm is “anti-Russian” hysteria, and most analysts agree there’s no rationale for Russia openly entering – and massively escalating – the conflict in Ukraine, where it backs separatist forces but always denies a direct role.

Instead, they see the Kremlin sending a message that it’s ready to defend its “red lines” on Ukraine: above all, that it must not join Nato.

“I think for Putin it’s really important. He thinks the West has begun giving Ukraine’s elite hope about joining Nato,” political analyst Tatiana Stanovaya at R.Politik told the BBC.

“The training, the weapons and so on, are like a red rag to a bull for Putin and he thinks if he doesn’t act today, then tomorrow there will be Nato bases in Ukraine. He needs to put a stop to that.”

Ukraine’s desire to join the security bloc is nothing new, nor is Russia’s insistence on vetoing that ambition in what it sees as its own “back yard”.

But Moscow has been rattled recently by the Ukrainian military using Turkish drones against Russian-backed forces in eastern Ukraine; the flight near Crimea of two nuclear-capable US bombers was an extra irritant.

There’s also concern that the so-called Minsk agreements, a framework for ending Ukraine’s seven-year-old conflict that’s too contentious to actually implement, could be dumped for something more favorable to Ukraine.

In April, Russia found that demonstrative military deployment worked well so it’s repeating the trick.

“Our recent warnings have indeed been heard and the effect is noticeable: tensions have risen,” President Putin told Russian diplomats last week. He argued that tension needed to be increased to force the West to reckon with Russia, not ignore it.

“If the military movements [close to Ukraine] are explicit, then this is not about direct military action – it’s about a signal Putin wants to send,” Andrei Kortunov, head of a think-tank in Moscow, told the BBC.

The signal to Ukraine is not to try anything rash.

So how did we get to this point?

A little history.


APR 16, 2019 History.com

PATRICK J. KIGER

At the height of the 1932-33 Ukrainian famine under Joseph Stalin, starving people roamed the countryside, desperate for something, anything to eat.

The Ukrainian famine—known as the Holodomor, a combination of the Ukrainian words for “starvation” and “to inflict death”—by one estimate claimed the lives of 3.9 million people, about 13 percent of the population of Ukraine.

And, unlike other famines in history caused by blight or drought, this was caused when a dictator wanted both to replace Ukraine’s small farms with state-run collectives and punish independence-minded Ukrainians who posed a threat to his totalitarian authority.

“The Ukrainian famine was a clear case of a man-made famine,” explains Alex de Waal, executive director of the World Peace Foundation at Tufts University and author of the 2018 book, Mass Starvation: The History and Future of Famine.

He describes it as “a hybrid…of a famine caused by calamitous social-economic policies and one aimed at a particular population for repression or punishment.”

In those days, Ukraine—a Texas-sized nation along the Black Sea to the west of Russia—was a part of the Soviet Union, then ruled by Stalin.

In 1929, as part of his plan to rapidly create a totally communist economy, Stalin had imposed collectivization, which replaced individually owned and operated farms with big state-run collectives.

Ukraine’s small, mostly subsistence farmers resisted giving up their land and livelihoods.

In response, the Soviet regime derided the resisters as kulaks—well-to-do peasants, who in Soviet ideology were considered enemies of the state.

Soviet officials drove these peasants off their farms by force and Stalin’s secret police further made plans to deport 50,000 Ukrainian farm families to Siberia, historian Anne Applebaum writes in her 2017 book, Red Famine: Stalin’s War on Ukraine.

“Stalin appears to have been motivated by the goal of transforming the Ukrainian nation into his idea of a modern, proletarian, socialist nation, even if this entailed the physical destruction of broad sections of its population,” says Trevor Erlacher, an historian and author specializing in modern Ukraine and an academic advisor at the University of Pittsburgh’s Center for Russian, East European, & Eurasian Studies.

Collectivization in Ukraine didn’t go very well. By the fall of 1932—around the time that Stalin’s wife, Nadezhda Sergeevna Alliluyeva, who reportedly objected to his collectivization policy, committed suicide—it became apparent that Ukraine’s grain harvest was going to miss Soviet planners’ target by 60 percent.

There still might have been enough food for Ukrainian peasants to get by, but, as Applebaum writes, Stalin then ordered what little they had be confiscated as punishment for not meeting quotas.

“The famine of 1932-33 stemmed from later decisions made by the Stalinist government, after it became clear that the 1929 plan had not gone as well as hoped for, causing a food crisis and hunger,” explains Stephen Norris, a professor of Russian history at Miami University in Ohio.

Norris says a December 1932 document called, “On the Procurement of Grain in Ukraine, the North Caucasus, and the Western Oblast,” directed party cadres to extract more grain from regions that had not met their quotas. It further called for the arrest of collective farm chiefs who resisted and of party members who did not fulfill the new quotas. 

Meanwhile, Stalin, according to Applebaum, already had arrested tens of thousands of Ukrainian teachers and intellectuals and removed Ukrainian-language books from schools and libraries. She writes that the Soviet leader used the grain shortfall as an excuse for even more intense anti-Ukrainian repression.

As Norris notes, the 1932 decree “targeted Ukrainian ‘saboteurs,’ ordered local officials to stop using the Ukrainian language in their correspondence and cracked down on Ukrainian cultural policies that had been developed in the 1920s.”

When Stalin’s crop collectors went out into the countryside, according to a 1988 U.S. Congressional commission report, they used long wooden poles with metal points to poke the dirt floors of peasants’ homes and probe the ground around them, in case they’d buried stores of grain to avoid detection.

Peasants accused of being food hoarders typically were sent off to prison, though sometimes the collectors didn’t wait to inflict punishment.

Two boys caught hiding fish and frogs, for example, were taken to the village soviet and beaten, then dragged into a field with their hands tied and their mouths and noses gagged, and left to suffocate.

As the famine worsened, many tried to flee in search of places with more food. Some died by the roadside, while others were thwarted by the secret police and the regime’s system of internal passports.

Ukrainian peasants resorted to desperate methods in an effort to stay alive, according to the Congressional commission’s report. They killed and ate pets and consumed flowers, leaves, tree bark and roots. One woman who found some dried beans was so hungry that she ate them on the spot without cooking them, and reportedly died when they expanded in her stomach.

“The policies adopted by Stalin and his deputies in response to the famine after it had begun to grip the Ukrainian countryside constitute the most significant evidence that the famine was intentional,” Erlacher says. “Local citizens and officials pleaded for relief from the state. Waves of refugees fled the villages in search of food in the cities and beyond the borders of the Ukrainian Soviet Republic.” The regime’s response, he says, was to take measures that worsened their plight.

By the summer of 1933, some of the collective farms had only a third of their households left, and prisons and labor camps were jammed to capacity.

With hardly anyone left to raise crops, Stalin’s regime resettled Russian peasants from other parts of the Soviet Union in Ukraine to cope with the labor shortage.

Faced with the prospect of an even wider food catastrophe, Stalin’s regime in the fall of 1933 started easing off collections.

The Russian government that replaced the Soviet Union has acknowledged that famine took place in Ukraine, but denied it was genocide.

In April 2008, Russia’s lower house of Parliament passed a resolution stating that “There is no historical proof that the famine was organized along ethnic lines.”

Nevertheless, at least 16 countries have recognized the Holodomor, and most recently, the U.S. Senate, in a 2018 resolution, affirmed the findings of the 1988 commission that Stalin had committed genocide.

Ultimately, although Stalin’s policies resulted in the deaths of millions, they failed to crush Ukrainian aspirations for autonomy, and in the long run they may actually have backfired.

In the case of Ukraine, the famine generated so much hatred and resentment that it solidified Ukrainian nationalism.

Ukraine gained independence after the collapse of the Soviet Union in 1991 and has since veered between seeking closer ties with Western Europe and rejoining its alliance with Russia, which sees its interests as threatened by a Western-leaning Ukraine.

Europe’s second largest country, Ukraine is a land of wide, fertile agricultural plains, with large pockets of heavy industry in the east.

While Ukraine and Russia share common historical origins, the western part of the country has closer ties with its European neighbors, particularly Poland, and nationalist, pro-independence sentiment is strongest there.

However, a minority of the population wants to rejoin Russia and uses Russian as its first language, particularly in the cities and the industrialized east.

An uprising against pro-Russian President Viktor Yanukovych in 2014 ushered in a new, Western-leaning government, but Russia used the opportunity to seize the Crimean peninsula and arm insurgent groups to occupy parts of the industrialized east of Ukraine.

So bottom line, Ukraine used to be a part of the Soviet Union, but when the USSR collapsed, Ukraine sought, and is still seeking, its independence.

Sarah Rainsford was expelled as BBC Moscow correspondent at the end of August after being designated a security threat.

Russian military buildup puts Washington on edge

BY ELLEN MITCHELL – 11/25/21

The Hill website 

So this brings us back to the present situation. 

Washington is on edge as Russia’s military buildup threatens a confrontation, with fears escalating following reports that U.S. intelligence shows Russian forces preparing to push into Ukraine.

Over the Thanksgiving holiday, the Biden administration received reports that nearly 100,000 Russian troops are stationed at various locations near the Ukrainian border, with no sign of those numbers waning.

Tensions have grown so high that the U.S. Embassy in Ukraine on Wednesday warned of “unusual Russian military activity” near Ukraine’s eastern border and in the annexed peninsula of Crimea, telling U.S. citizens not to travel there.

The new warnings come as Ukraine this week began to publicly declare that Russia could invade as soon as January or early February, much like when it annexed the Crimean Peninsula in 2014 and backed an insurgency in eastern parts of the country. More than 14,000 people have since been killed in that conflict. 

A similar land grab, which would be the second in less than 10 years, has global implications and could trigger a massive military conflict as well as geopolitical strife between Russia and Western nations.

“Our concern is that Russia may make the serious mistake of attempting to rehash what it undertook back in 2014, when it amassed forces along the border, crossed into sovereign Ukrainian territory and did so claiming — falsely — that it was provoked,” Secretary of State Antony Blinken said earlier this month. 

But U.S. officials are determined not to be caught off-guard by such a military operation, with Blinken on Saturday indicating the administration was preparing for any aggressive Russian maneuver. 

Reports also emerged this week that the Biden administration is mulling its options to deter the Kremlin, including sending military advisers and new weapons to Kyiv.  

Such an aid package could include helicopters, mortars, air defense systems such as Stinger missiles, and new Javelin anti-tank and anti-armor missiles.

U.S. officials have also reportedly talked with European allies about forming a new sanctions package that could go into effect should Russia invade.

State Department officials have not publicly mentioned any new weapons or sanctions package, but one official told The Hill on Tuesday that the administration has “demonstrated that the United States is willing to use a number of tools to address harmful Russian actions and we will not hesitate from making use of those and other tools in the future.” 

Also in an effort not to be caught flat-footed, administration officials have shared intelligence with allied countries. 

Pentagon officials have also kept in close contact with their counterparts, with Joint Chiefs of Staff Chairman Gen. Mark Milley speaking by phone with Lt. Gen. Valeriy Zaluzhny, the commander in chief of Ukraine’s military, on Monday.

Milley also spoke via telephone on Tuesday with Russia’s top military officer, Gen. Valery Gerasimov.

In addition, the administration has sent U.S. Navy patrol boats to help the Ukrainian navy counter Moscow in the Black Sea.

But even with its threatening stance, one that numerous NATO nations have publicly noted, Russia continues to deny it has any intention to invade its neighbor like it did nearly eight years ago.

Russian spokesman Dmitry Peskov said Tuesday that Russia’s amassing of forces and equipment doesn’t “pose a threat to anyone and should not cause concern to anyone.”

He instead blamed a “targeted information campaign” by Western nations for “building up tension” and said that should the U.S. send additional military assistance to Ukraine, it could lead “to a further aggravation of the situation on the border line.”

So, there you have it, folks. Should we get involved in this mess?

Keep in mind that while all this is happening, China is threatening Taiwan in much the same way that Russia is threatening Ukraine.

Is this a result of the US showing weakness on the world stage or was this bound to happen anyway?

What about Russia’s position? Are they reacting in a similar way to what America did during the Cuban Missile Crisis?

If Ukraine joins NATO, the West could install missile systems there with just a 10-minute flight time to Moscow.

Food for thought.

The 16th Amendment/Taxes

Taxes have been around since the beginning of recorded history. The earliest known tax was implemented in the ancient city-state of Lagash, in what is now Iraq, about 6,000 years ago. In ancient Egypt, residents paid their taxes with grain and livestock.

Taxes have always been controversial because of our tendency to believe that we are paying too much and that our leaders squander our tax dollars on things we don’t want or need.

Surviving hieroglyphic tablets from Egypt tell us that their citizens felt the same way, and the ancient Sumerians had a saying: “You can have a lord, you can have a king, but the man to fear is the tax collector!”

The 16th Amendment: How the U.S. Federal Income Tax Became D.C.’s Favorite Political Weapon

https://ammo.com/

Written by

Jose Nino

The American Revolution was sparked in part by unjust taxation. After all, the colonists in Boston rebelled against Britain for imposing “taxation without representation,” and summarily tossed English tea into the harbor in protest in 1773.

Nowadays Americans collectively spend more than 6 billion hours each year filling out tax forms, keeping records, and learning new tax rules according to the Office of Management and Budget.

Complying with the U.S. tax code is estimated to cost the American economy hundreds of billions of dollars annually – time and money that could otherwise be used for more productive activities like entrepreneurship and investment, or just more family and leisure time.

Most of these six billion hours sacrificed by Americans to Washington each year go toward complying with a tax that didn’t even exist until 100 years ago – the federal income tax.

Worse still, this tax has become a political weapon.

It’s a tax that follows Americans wherever they go in the world, and it’s one that was originally sold to the American people by President Woodrow Wilson as a means of “soaking the rich” during the so-called Gilded Age.

Sound familiar?

How did a country that was founded on the concept of limited government come to embrace such a draconian policy? And what does it say about Washington that tax reform has become synonymous with class warfare and corporate lobbyists?

Could you imagine a time in the U.S. when roads were being paved, there was zero national debt, and the federal government was completely operational – all without income taxes?

This may sound like a Libertarian fantasy, but it’s actually an image of the America of yesteryear. Before the advent of the income tax, the U.S. government relied exclusively on tariffs and user fees to finance operations.

Unsurprisingly, operations were much smaller compared with today’s extravagant government programs like welfare, social security, and subsidies.

But even though spending was more conservative during the Republic’s early years, certain political events motivated the government to consider more direct ways of reaching into the pockets of its citizens.

One of these political events was the War of 1812. This war may have inspired Francis Scott Key to write “The Star-Spangled Banner” as he famously watched the rockets’ red glare over Fort McHenry, but it was also straining our fiscal resources, and the war effort needed to be financed.

Enter the idea of a progressive income tax – based on the British Tax Act of 1798 (which should have been our first warning).

Fortunately, the War of 1812 came to a close in 1815, and the discussion of enacting an income tax was tabled for the next few decades.

Ever so stubborn, progressive individuals were hell-bent on enacting income taxes, and they eventually found a way to do this at a local and state level. In time, they would reignite a new movement for the adoption of the federal income tax.

With state governments increasingly building public infrastructure projects and introducing compulsory public education, the money for these programs had to come from somewhere.

For the income tax advocates whose hopes were dashed during the War of 1812, state income taxes served as a consolation prize. In turn, income tax supporters immediately got to work and started to chip away at state legislatures.

In the mid-19th century, the fruits of the income tax crowd’s labor began to pay off as several states got the ball rolling.

Slowly but surely, income taxes started to make their way from one state legislature to the next. But once the Civil War arrived, income taxes got a tremendous push.

Ripped apart at the seams by the Civil War (1861-1865), the Union government was desperate for funds to finance its ambitious quest to restore order to the nation.

Like the War of 1812, proposals for income tax were on the menu. Unlike the preceding war period, however, the U.S. was able to successfully enact an income tax.

Abraham Lincoln signed the Revenue Act of 1861 as a means to finance the expensive war effort.

This was followed up with other measures like the Revenue Act of 1862 and Revenue Act of 1864, which created the nation’s first progressive income tax system and the precursor to the Internal Revenue Service (IRS).

What seemed like a monumental victory for income tax supporters who hoped for a long-lasting income tax system would vanish once the Civil War ended.

No longer needing a massive army to put down rebels and stitch the country back together, the U.S. government let Civil War era income taxes expire once Reconstruction was in full swing.

How did the U.S. government go from embracing massive government expansions during the Civil War to later reverting to its Constitutional roots of limited government during the next decade?

There is reason to believe that taxes in the 19th century tended to be temporary in nature given the American people’s ideological tendencies.

Most people were still skeptical of government overreach, especially during the Civil War – a time when habeas corpus was suspended and the first income tax was implemented.

Shell-shocked from a horrific experience that laid waste to countless urban centers and left hundreds of thousands of Americans dead, the American populace wanted a return to normalcy. And that meant scaling back government as much as possible.

Even Henry Ward Beecher, the brother of the famous author Harriet Beecher Stowe, was skeptical of the Radical Republicans’ zealous plans to grow government during the Reconstruction period.

Historian Tom Woods, in The Politically Incorrect Guide to American History, records Beecher’s thoughts on the matter:

“The federal government is unfit to exercise minor police and local government and will inevitably blunder when it attempts it…To oblige the central authority to govern half the territory of the Union by federal civil officers and by the army, is a policy not only uncongenial to our ideas and principles, but pre-eminently dangerous to the spirit of our government.”

Many Americans would have agreed with Beecher’s assessment.

But with the arrival of the Progressive Era, the rules of the political game began to change. Ideas of expansive government, routinely scoffed at by intellectuals, politicians, and the American population at large throughout the first half of the 19th century, gained fierce traction in the latter half of the century.

Decades of legislative pressure and constant hand-wringing finally began to pay off for income tax supporters. The arrival of the Progressive Era was like Christmas for political figures in favor of an activist state.

This was a time when reformers actively pushed for an energetic government to solve all of society’s ills, most notably poverty and income inequality.

Although they were shut out of the federal government throughout the Gilded Age, Progressives focused their attention on local and state races.

Additionally, academia became more receptive to the technocratic message of Progressivism, as academics like John Dewey gained prominence during this period and made progressive ideas popular in Ivory Tower circles.

Many will scoff and think that Ivory Tower ideas have no impact in changing society – that such ideas are simply too dense and inaccessible for the masses.

However, free market economists like Nobel laureate F.A. Hayek understood the indispensable role ideas play in politics. In his essay The Intellectuals and Socialism, Hayek argued that when ideas promoting activist government become prominent in academia and the broader culture, they eventually consume the political class whole.

The idea of an income tax would have been laughed out of the venue in previous decades. But in the 1890s, it was all the rage at universities throughout the U.S.

Soon, political winds started to blow in a more favorable direction.

For a brief moment, Progressives got their wish when Congress enacted an income tax in the mid-1890s.

The Wilson-Gorman Tariff Act, which had an income tax provision attached to it, drew the ire of then-President Grover Cleveland for its last-minute amendments.

Nevertheless, the Wilson-Gorman Act became law without Cleveland’s signature. The Supreme Court would later strike down the income tax provisions of the Wilson-Gorman Act in 1895’s Pollock v. Farmers’ Loan & Trust Co. case.

Pollock v. Farmers’ Loan & Trust Company, 157 U.S. 429 (1895), affirmed on rehearing, 158 U.S. 601 (1895), was a landmark case of the Supreme Court of the United States. In a 5-to-4 decision, the Supreme Court struck down the income tax imposed by the Wilson–Gorman Tariff Act for being an unapportioned direct tax. The decision was superseded in 1913 by the Sixteenth Amendment to the United States Constitution, which allows Congress to levy income taxes without apportioning them among the states.

Congress had previously introduced an income tax during the American Civil War, but this tax had been repealed in 1872. In 1894, Congress passed the Wilson-Gorman Tariff Act, which lowered tariff rates and made up for some of the lost revenue by introducing taxes on income, corporate profits, gifts, and inheritances. Chief Justice Melville Fuller’s majority opinion in Pollock held that a federal tax on income derived from property was unconstitutional when it was not apportioned among the states according to representation in the House of Representatives. Fuller also held that federal taxation of interest earned on certain state bonds violated the doctrine of intergovernmental tax immunity. In one dissent, Associate Justice Henry Billings Brown wrote that the majority opinion “involves nothing less than the surrender of the taxing power to the moneyed class.”

The Supreme Court’s rejection of the income tax was no trivial setback, but it only intensified the conversation about the need for an income tax. Progressives now smelled blood in the water and would come back with a vengeance in less than two decades.

Not letting the temporary setback of Pollock v. Farmers’ Loan & Trust Co. deter their activism, Progressives continued plowing ahead, making their ideas more palatable to the political class and the masses.

Progressivism reached its high point during the administration of Woodrow Wilson, but it was back in 1909, under William Howard Taft, that Congress passed the 16th Amendment and sent it on to the states for ratification.


By 1913, the amendment had been approved over the objections of conservatives like Richard Byrd, who made an eloquent plea to defeat it in the Virginia House of Delegates. His anti-tax speech has been repeated by conservatives in various forms ever since.

“A hand from Washington will be stretched out and placed upon every man’s business; the eye of the Federal inspector will be in every man’s counting house . . . The law will of necessity have inquisitorial features, it will provide penalties, it will create complicated machinery. Under it men will be hailed into courts distant from their homes. Heavy fines imposed by distant and unfamiliar tribunals will constantly menace the tax payer. An army of Federal inspectors, spies and detectives will descend upon the state.” – Delegate Richard Byrd

Obviously he could see the future.

After ratification of the 16th Amendment, Congress enacted the Revenue Act of 1913, levying a 1% tax on incomes over $3,000, with a 6% surtax on incomes above $500,000.
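To make the arithmetic concrete, here is a minimal sketch in Python of how those two rates would combine on a given income. Note that it simplifies the actual 1913 schedule – which had graduated surtax brackets between the normal tax and the top 6% rate – down to just the two figures cited above, so treat it as illustrative rather than a faithful model of the statute.

```python
def tax_1913_simplified(income):
    """Simplified 1913 income tax: a 1% normal tax on income above the
    $3,000 exemption, plus a 6% surtax on the portion above $500,000.
    (The real act had graduated surtax brackets in between; this sketch
    keeps only the two rates cited in the text.)"""
    normal_tax = max(income - 3_000, 0) * 0.01
    surtax = max(income - 500_000, 0) * 0.06
    return normal_tax + surtax

# A $5,000 earner owed 1% of $2,000 = $20 - a token amount.
print(tax_1913_simplified(5_000))       # 20.0
# A $1,000,000 earner owed 1% of $997,000 plus 6% of $500,000 = $39,970.
print(tax_1913_simplified(1_000_000))   # 39970.0
```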

A few years later in 1918, after the United States entered World War I, the top tax bracket was increased to 77% on income over $1,000,000.

So, supporters of the income tax sold it as a tax that would only target the filthy rich.

Again folks, does this sound familiar?

In 1917, the lowest tax bracket paid two percent, while the highest income earners saw their taxes skyrocket to 67 percent.

At the time, politicians reassured their constituents that those rates would not be permanent, and they would eventually be scaled back. Little did taxpayers know what the 1930s and 1940s had in store for them.

By soaking the rich and redistributing their wealth, politicians claimed to be champions of the common man, all while consolidating their power in D.C.

However, economic realities and political backlash have constrained politicians’ ability to raise taxes indefinitely.

Power-hungry politicians needed a little outside help to make their wildest fantasies become reality. That help usually came in the form of a political crisis, which politicians exploited to the fullest.

The New Deal was the first era to witness income taxes rise to astronomical rates.

On the eve of the 1929 stock market crash, the highest income earners paid a marginal tax rate of 25 percent. But once the Great Depression was well underway in the mid-1930s, the top tax bracket was paying 63 percent, and the United States’ entrance into World War II catapulted these rates toward 94 percent.
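A note on reading these figures: they are marginal rates, applied only to the slice of income above the top bracket’s threshold, not to every dollar earned. Here is a minimal sketch of that distinction; the $200,000 threshold commonly cited for the wartime 94% bracket is used purely for illustration.

```python
def top_bracket_tax(income, threshold, marginal_rate):
    """Tax generated by the top bracket alone: only income above
    the threshold is taxed at the marginal rate."""
    return max(income - threshold, 0) * marginal_rate

# At the wartime peak, a $500,000 earner paid 94% only on the $300,000
# above the (commonly cited) $200,000 threshold - $282,000 from that
# bracket alone, not 94% of the entire $500,000.
print(top_bracket_tax(500_000, 200_000, 0.94))  # 282000.0
```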

Changes in political practice, such as the abandonment of war bonds – debt securities the government issued to finance war efforts – also altered the realities facing the political class.

War bonds fell out of use as the populace grew skeptical of military action abroad, and their abandonment made the income tax and deficit spending a necessity.

With war bonds out of the picture, the U.S. relied more on income taxation and central banking to finance military actions and domestic programs after World War II.

As a result, the income tax soon became a part of the average American’s life, whether they liked it or not.

One of the sneakiest aspects of the income tax is the practice of withholding. Instead of paying a lump sum on April 15th, most taxpayers have their income taxes deducted from their paycheck.

Their employer essentially becomes an unpaid tax collector that gradually extracts their income in relative silence.

Come Tax Day, many Americans receive money back after paying excess taxes all year, so they’re left feeling like they’ve been given the gift of free money.

Sounds too good to be true, right? In reality, the government is forcing taxpayers to lend it money to finance lavish programs, at zero interest.
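To see why a refund amounts to an interest-free loan, consider a back-of-the-envelope sketch. The monthly figures and the 5% savings rate below are hypothetical, chosen only to make the mechanics visible:

```python
# Hypothetical example: the employer withholds $500/month, but the
# worker's actual tax liability for the year is only $4,800.
monthly_withheld = 500
actual_liability = 4_800

total_withheld = monthly_withheld * 12        # $6,000 taken over the year
refund = total_withheld - actual_liability    # $1,200 back on Tax Day

# Had the worker kept the $100/month overage in a savings account at a
# hypothetical 5% annual rate, simple interest on the average balance
# (about $600) would be roughly $30 - interest the government never pays.
foregone_interest = (refund / 2) * 0.05

print(refund)             # 1200
print(foregone_interest)  # 30.0
```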

Ironically enough, withholding wasn’t an original feature of the income tax. It wasn’t until World War II that the practice of tax withholding was standardized through the Current Tax Payment Act of 1943.

Withholding later became a permanent feature of the tax code, despite originally being intended as a temporary wartime measure.

So, there you have it folks. A little history of how the government was able to sell the idea of income taxes.

Wars, and a huge bureaucratic federal government that seems to think its role is to solve all of society’s ills (most notably poverty and income inequality), are the two biggest reasons I can come up with. I am sure there are more.

What do you think? Are taxes a necessary evil, or have we lost sight of what the government’s role in society truly is?