Afghanistan

Covered by: Fox News — Greg Norman, Michael Ruiz, Peter Aitken, Adam Shaw, Lucas Tomlinson, Lucas Manfredi and Edmund DeMarche

White House national security adviser Jake Sullivan on Monday acknowledged the security situation in Afghanistan “unfolded at unexpected speed,” while maintaining that President Biden stands by his decision to withdraw U.S. troops. 

Heavily armed Taliban fighters swept into Afghanistan’s capital of Kabul on Sunday after the government collapsed, and the Afghan president fled the country, signaling the end of the United States’ 20-year effort to rebuild the nation after the withdrawal of the U.S. military from the region. 

Sullivan, during an appearance on ABC News’ “Good Morning America” Monday, defended Biden’s decision to withdraw troops.

“The president did not think it was inevitable that the Taliban were going to take control of Afghanistan,” Sullivan said. “He thought the Afghan national security forces could step up and fight because we spent 20 years, tens of billions of dollars, training them, giving them the best equipment, giving them support of U.S. forces for 20 years.”

“When push came to shove, they decided not to step up and fight for their country,” Sullivan said, adding that the president was faced with the question of whether U.S. men and women should be “put in the middle of another country’s civil war when their own army won’t fight to defend them?” 

“And his answer to the question was ‘no,’ and that is why he stands by this decision,” Sullivan said. 

The Taliban is pushing to restore the Islamic Emirate of Afghanistan, the formal name of the country under Taliban rule before the militants were ousted by U.S.-led forces in the wake of the 9/11 attacks, which were orchestrated by Al Qaeda while it was being sheltered by the Taliban. 

“A decade ago, we got Usama bin Laden, we degraded Al Qaeda, we stopped terrorist attacks against the United States from Afghanistan for 20 years,” Sullivan said. “What the president was not prepared to do was enter a third decade of conflict flowing in thousands more troops, which was his only other choice to fight in the middle of a civil war that the Afghan army wouldn’t fight for itself.” 

He added: “He would not do that to America’s men and women or their families. And that is why he made the decision to withdraw U.S. forces from Afghanistan this year.” 

OK. So let’s get a few things straightened out right off the bat.

With the recent events in Afghanistan, the media has been using the names Taliban and Al Qaeda almost as if they are the same group.

In fact, it was Al Qaeda that planned and carried out the September 11 attacks in the United States. The terrorist network had its roots in Afghanistan, fighting against the Soviet occupation of the 1980s. But it is composed mostly of Arabs and Islamic militants from countries other than Afghanistan.


On the flip side, the Taliban is made up of ethnic Pashtun Afghans who grew up in refugee camps or religious boarding schools in Pakistan — called madrassas — during the Soviet occupation of their homeland.

So, think of it as Taliban local, Al Qaeda global.

Amnesty International UK

Afghanistan has a tumultuous recent past. In the last three decades, the country has been occupied by communist Soviet troops and US-led international forces, and in the years in between has been ruled by militant groups and the infamous oppressive Islamic Taliban.

Now folks, I have to tell you who the biggest losers in all this turmoil will be. The Afghan women.

That is what I find fascinating about Mr. Sullivan and the Biden Administration’s position on all this.

How many times have we heard Biden, Pelosi, and people like AOC harp on the issue of women’s rights? Yet here they are putting millions of women in harm’s way.

Throughout the changing political landscape of Afghanistan in the last fifty years, women’s rights have been exploited by different groups for political gain, sometimes being improved but often being abused.

‘Afghan women were the ones who have lost the most from the war.’

Horia Mosadiq (pronounced ho ree a, like Gloria) was a young girl when the Soviet Union invaded Afghanistan in 1979.

Horia states, ‘Think of women in Afghanistan now, and you’ll probably recall pictures in the media of women in full-body burqas, perhaps the famous National Geographic photograph of “the Afghan girl”, or prominent figures murdered for visibly defending women’s rights. But it hasn’t always been this way.’

She goes on to say, ‘As a girl, I remember my mother wearing miniskirts and taking us to the cinema. My aunt went to the university in Kabul.’

Until the conflict of the 1970s, the 20th Century had seen relatively steady progression for women’s rights in the country.

Afghan women were first eligible to vote in 1919 – only a year after women in the UK were given voting rights, and a year before women in the United States were allowed to vote.

In the 1950s gendered separation was abolished; in the 1960s a new constitution brought equality to many areas of life, including political participation.

But during coups and Soviet occupation in the 1970s, through civil conflict between Mujahideen groups and government forces in the ’80s and ’90s, and then under Taliban rule, women in Afghanistan had their rights increasingly rolled back.

The Taliban are now notorious for their human rights abuses. The group emerged in 1994 after years of conflict.

Many of their members were former Mujahideen fighters who had been trained in Pakistan during Afghanistan’s civil war in the ’80s and ’90s.

They came together with the aim of making Afghanistan an Islamic state. The Taliban ruled in Afghanistan from 1996 until 2001.

Under the Taliban, women and girls were discriminated against in many ways, for the ‘crime’ of being born a girl.

The Taliban enforced their version of Islamic Sharia law. Women and girls were:

  • Banned from going to school or studying
  • Banned from working
  • Banned from leaving the house without a male chaperone
  • Banned from showing their skin in public
  • Banned from accessing healthcare delivered by men (with women forbidden from working, healthcare was virtually inaccessible)
  • Banned from being involved in politics or speaking publicly.

There were many other ways their rights were denied to them. Women were essentially invisible in public life and imprisoned in their homes.

In Kabul, residents were ordered to cover their ground and first-floor windows so women inside could not be seen from the street.

If a woman left the house, it was in a full body veil (burqa), accompanied by a male relative: she had no independence.

If she disobeyed these discriminatory laws, punishments were harsh. A woman could be flogged for showing an inch or two of skin under her full-body burqa, beaten for attempting to study, stoned to death if she was found guilty of adultery.

Rape and violence against women and girls was commonplace.

Afghan women were brutalized in the law and in nearly every aspect of their daily life. In 1996, a woman in Kabul had the end of her thumb cut off for wearing nail polish.

According to a fifteen year-old girl in Kabul, “In 1995, they shot my father right in front of me. It was nine o’clock at night. They came to our house and told him they had orders to kill him because he allowed me to go to school.

The Mujahideen had already stopped me from going to school, but that was not enough. I cannot describe what they did to me after killing my father…”

The US led an international military campaign intervening in Afghanistan immediately following the attacks of September 11, 2001.

World leaders, including those from the UK and USA, regularly cited the need to improve Afghan women’s rights as part of the justification for the intervention.

The Taliban were ousted from power by the end of 2001.

In the years following international intervention, many schools opened their doors to girls and women went back to work. There was progress towards equality: a new constitution in 2003 enshrined women’s rights in it, and in 2009 Afghanistan adopted the Elimination of Violence Against Women (EVAW) law.

But the Taliban and other highly conservative insurgent groups still controlled some parts of Afghanistan, and violence and discrimination against women and girls continued – all over Afghanistan.

In 2011, a Thomson Reuters Foundation poll named it ‘the most dangerous country’ to be a woman.



So now let’s review a little history about Afghanistan.

CIA World Factbook

Ahmad Shah DURRANI unified the Pashtun tribes and founded Afghanistan in 1747.

The country served as a buffer between the British and Russian Empires until it won independence from notional British control in 1919.

A brief experiment in democracy ended in a 1973 coup and a 1978 Communist counter-coup.

The Soviet Union invaded in 1979 to support the tottering Afghan Communist regime, touching off a long and destructive war.

The USSR withdrew in 1989 under relentless pressure by internationally supported anti-Communist mujahedin rebels.

(Yes folks, we supported the mujahedin, many of whom later joined the Taliban, while they were fighting the Soviet Union)

A series of subsequent civil wars saw Kabul finally fall in 1996 to the Taliban, a hardline Pakistani-sponsored movement that emerged in 1994 to end the country’s civil war and anarchy.

Sept. 2001: After the 9/11 attacks, President George W. Bush gave the Taliban an ultimatum to hand over Osama bin Laden.

The Taliban refused and in October the U.S. led a campaign that drove the Taliban out of major Afghan cities by the end of the year.

The UN-sponsored Bonn Conference in 2001 established a process for political reconstruction that included the adoption of a new constitution, a presidential election in 2004, and National Assembly elections in 2005.

In 2002, Hamid Karzai became interim president of Afghanistan. The Taliban continued to wage guerrilla warfare near the border with Pakistan.

In December 2004, Hamid KARZAI became the first democratically elected president of Afghanistan and the National Assembly was inaugurated the following December.

In Feb. 2009 President Obama ordered 17,000 additional troops to Afghanistan.

In Aug. 2009 President Karzai won re-election in a vote marred by fraud.

In Dec. 2009 President Obama issued orders to send 30,000 troops in 2010, bringing the total American force to about 100,000.

Now a couple of other interesting things to think about.

Afghanistan is the world’s largest producer of opium.

The Taliban and other antigovernment groups participate in and profit from the opiate trade, which is a key source of revenue for the Taliban inside Afghanistan.

Widespread corruption and instability impede counterdrug efforts, and most of the heroin consumed in Europe and Asia is derived from Afghan opium. So, the Taliban has plenty of money to carry on their conflict.

Also, the Palestinian terrorist group Hamas is congratulating the Taliban for their recent takeover of Afghanistan. 

In a statement, the militants say they welcome “the defeat of the American occupation on all Afghan land” and praised the Taliban’s “courageous leadership on this victory, which was the culmination of its long struggle over the past 20 years,” according to the AP. 

Finally, in June, the Pentagon’s top leaders said an extremist group like Al Qaeda may be able to regenerate in Afghanistan and pose a threat to the U.S. homeland within two years of the American military’s withdrawal from the country.

Two decades after the U.S. invaded Afghanistan because the Taliban harbored Al Qaeda leaders, experts say the Taliban and Al Qaeda remain aligned, and other violent groups could also find safe haven under the new regime. 

Based on the evolving situation, officials now believe terror groups like Al Qaeda may be able to grow much faster than expected.

So there you have it folks. I, just like you, have a lot of questions.

Did we do the right thing by pulling out of Afghanistan?

Did we just make a bad situation much worse?

What about the plight of the Afghan women and men who supported us during the conflict? Do we just turn our backs on them?

Have we now established a safe haven for future terrorist groups with plans to attack the United States?

Happy 200th Birthday Missouri

Missouri gets its name from a tribe of Sioux Indians of the state called the Missouris. The word “Missouri” often has been construed to mean “muddy water,” but the Smithsonian Institution Bureau of American Ethnology has stated it means “town of the large canoes,” and authorities have said the Indian syllables from which the word comes mean “wooden canoe people” or “he of the big canoe.”

Missouri has been nicknamed several times, but the “Show Me State” probably is the one used most. The saying gained favor in the 1890s, although its origin is unknown.

Whatever its origin, much of the credit for popularizing the expression goes to Congressman Willard D. Vandiver of Cape Girardeau County.

During an 1899 speech in Philadelphia, the noted orator used the phrase, “I’m from Missouri; you’ve got to show me.”

The expression soon caught the public fancy, portraying Missourians as tough-minded demanders of proof.

The first Europeans to visit Missouri were French explorers from Canada. Father Jacques Marquette and Louis Joliet, who descended the Mississippi from the north in 1673, supplied the first written accounts of exploration in Missouri.

In 1682, the area was claimed for France by Robert Cavelier, Sieur de La Salle.

As part of the Louisiana Purchase Territory, Missouri has belonged to three nations.

France ceded the area to Spain in 1762.

Although Spain held it for forty years, its influence was slight. The early culture of the region was determined mostly by the French.    

It was the French who were responsible for the first permanent settlement of Ste. Genevieve in the mid-1730s. Ste. Genevieve stood alone in the huge upper Louisiana Territory until the establishment of St. Louis as a fur trading post in 1764.

In the mid-1760s, Pierre LaClede established his fur-trading post just below the joining of the Missouri and Mississippi Rivers.

LaClede and his stepson, Rene Auguste Chouteau, named the post after King Louis IX who had been made a saint.  

With such a favorable location, St. Louis soon became the most prosperous outpost in the western region.

By secret treaty in 1800, Spain returned the Louisiana Territory to the control of France. Napoleon Bonaparte, anxious to rid himself of the vast and troublesome frontier, sold it to the United States in 1803 for a total of $15 million.

About this time President Jefferson organized the Lewis and Clark Expedition, which was the first extensive exploration of the northwestern part of the new territory.

The explorers left the St. Louis/St. Charles area in 1804.   

Missouri was organized as a territory in 1812 and was admitted to the Union as the 24th state on August 10, 1821.

Missouri Governor Alexander McNair was at the Capitol (which still stands) in St. Charles, when he heard that the territory had become a state.

Missouri became the second state (after Louisiana) of the Louisiana Purchase to be admitted to the Union.

Becoming a state wasn’t easy. When the Missouri Territory first applied for statehood, a debate ensued over the government’s right to restrict slavery.

In 1819, the Democratic-Republican Party (an American political party founded by Thomas Jefferson and James Madison) had a monopoly over American politics, as the Federalist Party (the opposing party founded by Alexander Hamilton) had ceased to exist following the War of 1812.

However, factions existed within the Democratic-Republican Party which proved to be a real problem during Missouri’s bid for statehood.

Representative James Tallmadge proposed as a condition of Missouri’s statehood that no further slaves could be imported into the state and that all children born after Missouri’s admission to the Union would be born free.

This condition, known as the Tallmadge amendment, set out a plan for gradual emancipation in Missouri. Many northerners supported this amendment.

Northerners mainly supported this amendment, not because of slavery, but because they wanted to limit the political influence of southerners.

On the floor of the House, Representative Thomas W. Cobb of Georgia looked Tallmadge dead in the eye and told him: “You have kindled a fire which all the waters in the ocean cannot put out, which seas of blood can only extinguish.” So basically, the request for Missouri to become a state brought about the first rumblings of a civil war.

The House vote on the Tallmadge amendment was divided along sectional lines, with northern representatives voting 80 to 14 in favor and southern representatives voting 64 to 2 against. The amendment narrowly passed the House.

However, in the Senate southerners maintained greater influence and were able to block the passage of the amendment. The Tallmadge amendment failed which led to a deadlock in Congress.

When Congress took their annual recess, the statehood bill lapsed, and we were denied statehood.

When the 16th Congress convened in December 1819, Congressmen reignited debates over Missouri statehood.

However, President James Monroe, Speaker of the House Henry Clay and key Senate members worked behind the scenes on a compromise to solve this crisis. 

The Senate linked the admission of Maine to the Union to Missouri’s admission, essentially holding Maine statehood hostage.

The Senate would only let Maine’s statehood bill go through if Congress admitted Missouri into the Union without the Tallmadge amendment.

However, most northern Congressmen held out until Senator Jesse Thomas of Illinois (who owned slaves) proposed that slavery be allowed in Missouri but prohibited in the remainder of the Louisiana Purchase north of the 36°30’ parallel, Missouri’s southern boundary.

Enough northern Congressmen came around in support of this Thomas amendment to pass the Missouri Compromise in March 1820.

Passed as a package, the Missouri Compromise included the Thomas Amendment and stipulated that Maine (a free state) and Missouri (a slave state) would be admitted into the Union at the same time.

This set a precedent that states would be admitted in pairs to maintain sectional balance in the Senate and the Electoral College.

So, after being denied admission in 1819, Missouri was now admitted as a state in 1821.

Although admitted as a slave state, Missouri nevertheless remained with the Union throughout the Civil War. However, in 1854, the Kansas-Nebraska Act repealed the Missouri Compromise by replacing the Thomas amendment with popular sovereignty, which led to Bleeding Kansas.

At the beginning of the Civil War, most Missourians wanted only to preserve the peace. However, the state governor, Claiborne Fox Jackson, was strongly pro-southern and attempted to align Missouri with the Confederacy.

At the beginning of the war, Union forces occupied the state and General John Frémont declared martial law from his headquarters in Jefferson City.

Needless to say, this did not sit well with the citizens of our state.

The Governor and most of the legislature were forced to flee to southern Missouri where they actually passed an ordinance of secession.

The most important and bloodiest battle fought in Missouri was the Battle of Wilson’s Creek near Springfield. Other important battles in Missouri were fought at Carthage, Lexington, Westport and Boonville – the first engagement within the state.

Following the Civil War, Missouri became known as the “Gateway to the West.” (also “The Outlaw State”)

Many settlers would start out here on their way to California, Oregon, and other areas out west.

This was one of the last places where wagon trains could stop for supplies before beginning their long trip. Both the Santa Fe Trail and the Oregon Trail began in Missouri.

As the state developed, we had a huge impact on our young nation.

In 1873, Susan Elizabeth Blow opened the first public kindergarten in the United States in St. Louis after having become interested in the kindergarten methods of philosopher Friedrich Froebel while traveling in Germany a few years earlier. Blow later established a training school for kindergarten teachers.

During World War I, Missouri provided 140,257 soldiers, one-third being volunteers.

Missouri contributed such notable leaders as Gen. John J. Pershing of Laclede, Missouri, commander of the American Expeditionary Forces in Europe, and Provost Marshal Enoch H. Crowder of Grundy County, Missouri, who drew up the Selective Service Act.

During World War II, Missouri contributed a total of over 450,000 men and women to the various armed forces. Eighty-nine top officers were from Missouri including Gen. Omar N. Bradley of Clark, Missouri, and Lt. Gen. James H. Doolittle of St. Louis.

The nation’s leader during the last year of the war was Lamar, Missouri-born Harry S Truman, first Missourian to become President of the United States. After assuming office upon the death of Franklin D. Roosevelt in 1945, President Truman was re-elected to a full four-year term.

His was the fateful decision to use the atomic bomb, forcing the Japanese surrender that was signed on the deck of the battleship USS Missouri in Tokyo Bay.

Now it would be wrong to not include two other key events that shaped Missouri history.

December 16, 1811, brought the first of four large earthquakes that would go down in Missouri history as the most powerful ever to hit the eastern part of the United States. The epicenter of the quake was near what is now New Madrid, Missouri, in the southern “boot heel” of the state.

The tremors from these quakes caused waterfalls on the Mississippi River and were felt as far as Quebec, Canada.

Another key event, the 1904 World’s Fair, also known as the Louisiana Purchase Exposition, opened in St. Louis to showcase art, industry and science at the advent of the 20th century.

It also marked the 100th anniversary of the Louisiana Purchase. The fairgrounds were constructed by over 10,000 workers in a 1,200-acre park. It told the story of American progress since the Louisiana Purchase and gave many foreign nations the opportunity to display their national history and technology.

The fair also hosted the 1904 Summer Olympic Games. This was the first time the games were held in the United States.

Now we must also include a list of famous Missourians in our talk today.

Daniel Boone (adventurer) (Born 1734; died 1820) – Boone was a pioneer, scout, Indian fighter and, in later years, a Missourian. He came to Missouri from Kentucky in 1799 and served as a local judge. From his home at Defiance, which he built with his son, Nathan, he explored much of the state. He died in his Defiance home.

Omar N. Bradley (military leader) (Born 1893; died 1981) – General Bradley was born in Clark, Missouri. He commanded the largest American force ever united under one man’s leadership during World War II. After the war, Bradley became the first chairman of the Joint Chiefs of Staff (in charge of all the armed services). A five-star general, he served in the military longer than any other soldier in U.S. history: 69 years.

Christopher (Kit) Carson (adventurer) (Born 1809; died 1868) – Born in Kentucky, Kit Carson moved to the Boonslick district of Missouri in 1811, an area he called home for nearly half his life. He led an adventurous life as a Santa Fe Trail teamster, trapper, scout, and Indian fighter. Carson served as a guide for Lt. John Charles Fremont’s western expeditions and helped in the California conquest in 1846 during the Mexican War.



William Clark (explorer) (Born 1770; died 1838) – As half of the famous duo Lewis and Clark, Clark is best known for the expedition he and Meriwether Lewis led westward to the Pacific. Clark returned with information about the western region of the United States. In 1806, Clark began a long and successful Missouri career when he was appointed the principal U.S. Indian agent for tribes in the territory. From 1813 to 1820, he served as governor of the Missouri Territory. In 1822, he was appointed U.S. Superintendent of Indian Affairs at St. Louis, a post he held until his death.

Samuel Clemens (author) (Born 1835; died 1910) – Growing up in Hannibal, Clemens watched riverboats on the Mississippi. From riverboat language he took a name, Mark Twain (two fathoms, or 12 feet), that would become famous worldwide for his books involving characters like Huckleberry Finn and Tom Sawyer. One of America’s greatest writers, Mark Twain is remembered today at his boyhood home in Hannibal and at his nearby birthplace in Florida, Missouri.

Walt Disney (cartoonist) (Born 1901; died 1966) – Disney grew up in Marceline and Kansas City. Disney created the first animated cartoon with sound, “Steamboat Willie,” which introduced the world to Mickey Mouse. Disney’s first animated feature film was “Snow White and the Seven Dwarfs.”

Phoebe Apperson Hearst (volunteer, children’s activist) (Born 1842; died 1919) – Born in Franklin County and married in Steelville, she moved to San Francisco with her husband George Hearst, also a Missourian, who amassed a fortune in the mining fields of Nevada. Although Mrs. Hearst supported the arts, she is best remembered for her early support of kindergartens and as a co-founder of the National Congress of Mothers, known today as the PTA. She is the mother of publisher William Randolph Hearst.



Edwin Powell Hubble (astronomer) (Born 1889; died 1953) – Hubble was born in Marshfield and became one of the world’s leading astronomers. The Hubble Space Telescope, launched in 1990, is named in his honor.

James Cash (J.C.) Penney (businessman) (Born 1875; died 1971) – Penney, born in Caldwell County, founded the J.C. Penney Company. He started as a dry goods clerk and bought stock from his employer in a store that he named the Golden Rule Store. He bought more stores in 1904, which led to a nationwide chain, and in 1912 Penney named the stores J.C. Penney Stores.

John J. Pershing (military leader) (Born 1860; died 1948) – Pershing, a six-star general born near Laclede, is the only American to be named General of the Armies in his own lifetime. His career included service in the Spanish-American War and in the fight against Mexican bandit Pancho Villa. In World War I, he commanded the American Expeditionary Force in Europe.

Joseph Pulitzer (newspaperman) (Born 1847; died 1911) – Pulitzer made his way from his birthplace in Mako, Hungary, to St. Louis in 1865, a city he called home for almost 20 years. In 1869 he was elected to the Missouri Legislature. In 1878 Pulitzer bought the St. Louis Dispatch, merged it with the St. Louis Post, and created St. Louis’s leading newspaper, known today as the St. Louis Post-Dispatch. He covered shocking stories to sell newspapers, an approach nicknamed “yellow journalism.” He also bought the New York World and became known as a publisher around the world. The Pulitzer Prize, journalism’s most prestigious award, is named after him.

Eviction Moratorium

Tucker Carlson, Fox News.

On New Year’s Day of this year, Rochelle Walensky was just a college professor in Massachusetts. You’d almost certainly never heard of her. You definitely didn’t vote for her at any point, because Walensky had never run for office.

As of January first, Walensky’s political power was precisely the same as yours and everyone else’s in this supposedly self-governing republic: she had one vote out of a nation of 320 million people. And then, just a few weeks later, everything changed, for her, and for the rest of us.

Joe Biden appointed Walensky to run the Centers for Disease Control and Prevention in Atlanta. At the time, it didn’t seem like a huge deal. The CDC is not a legislative body. It is a public health bureau. It was originally designed to fight malaria, and it did a good job.

The CDC gathers information about diseases and then releases guidance about those diseases to the country. The CDC does not make laws in this country. It’s not allowed to. Under the U.S. Constitution, making laws is the exclusive role of Congress.

You vote for your senators and congressmen and they decide what the rules are. That’s known as representative democracy. It’s been our system for nearly 250 years. But apparently, it is now over. Rochelle Walensky now makes the laws.

Walensky announced last month that she has decided to nationalize America’s rental properties, millions and millions of them from Maine to California. Tenants are no longer required to pay their rent.  

Property owners cannot evict them under any circumstances. Making someone pay to live on your property is now a federal crime. Try it, and you can wind up in prison, with hundreds of thousands of dollars in fines.

At the same time, you should know, property owners will still be required to pay the banks that hold their mortgages. There’s no moratorium on mortgages.

It’s hard to overstate what a momentous change this is. It means among other things that private property no longer exists in the United States. You thought you owned your home. Not anymore. Rochelle Walensky does. She’ll decide who can live there, under what circumstance and for how long. 

Is this a good idea? Of course not.  It’s totalitarian. But there’s an even more pressing question at the center of this story — a principle that defines what kind of country this is and what kind of country it will be going forward: Where did Rochelle Walensky get the power to do this, to suspend private property rights in America? The answer is, she simply asserted the power.

Walensky claimed she had the authority, and no one stopped her from exercising it. She signed an official-looking order declaring that her opinion is now the law, and so it is the law. But wait, you say. That doesn’t seem very American. Shouldn’t somebody vote on this? If we’re going to continue to pretend this is a democracy, and you hear that on television constantly, then shouldn’t our elected lawmakers make the laws?

Nope. And they’re not going to. Nancy Pelosi has refused to call a vote on the matter, and she runs the Congress, she decides. Most Republicans haven’t said a word. That means that an unelected college professor you hadn’t heard of six months ago is now in charge of your country.  

If you’re wondering how all of this can possibly be legal, rest assured that it’s not — it’s not even arguably legal. We know for a fact that it’s not. The Supreme Court just ruled on the question, specifically.

The court found that the CDC does not have the right to institute a nationwide eviction moratorium. Period. Only Congress can do that. Now, the court didn’t make us guess on their view on this, the court put that in writing, in the clearest possible language. There’s no debate about that. The Biden Administration just ignored what the court said.

How can they do that?  

Congresswoman Maxine Waters of Los Angeles understands exactly how they did it. One thing Maxine Waters knows well is how third-world regimes operate. When you want something, you simply take it. You’ve got the guns. Who’s going to stop you? Might makes right.

Nancy Pelosi knows that too. Pelosi knows that what Rochelle Walensky just did is illegal by definition. She also knows that openly ignoring a Supreme Court ruling will mean the end of our current system. That’s fine with Nancy Pelosi.  

“The CDC has the power to extend the eviction moratorium,” Pelosi said. She didn’t explain where that power comes from. She simply declared that it exists, as dictators do.  

Keep in mind that even Joe Biden, who knows very little, knows that what his administration has just done is against the law. He said it on camera yesterday: 

JOE BIDEN: I’ve sought out constitutional scholars. To determine what is the best possibility, that would come from executive action or the CDC judgment. What could they do that was most likely to pass muster? Constitutionally. The bulk of the constitutional scholarship says that it’s not likely to pass constitutional muster. Number one. But there are several key scholars who think that it may and it’s worth the effort. 

So the eviction moratorium has been in place for months; it has just recently been extended. People have debated this, jurists have weighed in on it, and so we know it won’t pass “constitutional muster,” says Biden. In other words, it’s illegal.

Now folks, believe it or not, our founding fathers could see this coming 232 years ago.

Dave Roos is a freelance writer based in the United States and Mexico. A longtime contributor to HowStuffWorks, Dave has also been published in The New York Times, the Los Angeles Times and Newsweek.

When the 13 United States of America declared independence from Great Britain in 1776, the founders were attempting to break free from the tyranny of Britain’s top-down centralized government. Very similar to what we are seeing here in the US today.

But the first constitution the founders created, the Articles of Confederation, vested almost all power in individual state legislatures and practically nothing in the national government.

The Articles of Confederation set up a government run by Congress, with no President. Can you imagine?

The result—political chaos and crippling debt that almost destroyed the country before it got started.

So the founders met again in Philadelphia in 1787 and drafted a new Constitution grounded in a novel separation of state and national powers known as federalism.

While the word itself doesn’t appear anywhere in the Constitution, federalism became the guiding principle to safeguard Americans against King George III-style tyranny while providing a check against rogue states.

The Articles of Confederation were written and ratified while the Revolutionary War was still raging. The document is less of a unifying constitution than a loose pact between 13 sovereign states intending to enter into “a firm league of friendship.”

Absent from the Articles of Confederation were the Executive and Judicial branches, and the national congress had only the power to declare war and sign treaties, but no authority to directly levy taxes.

As a result, the newly independent United States was buried in debt by 1786 and unable to pay the long-overdue wages of Revolutionary soldiers.

The U.S. economy sank into a deep depression and struggling citizens lost their farms and homes.

In Massachusetts, angry farmers joined Shays’ Rebellion to seize courthouses and block foreclosures, and a toothless congress was powerless to put it down.

George Washington, temporarily retired from government service, lamented to John Jay, “What a triumph for the advocates of despotism to find that we are incapable of governing ourselves, and that systems founded on the basis of equal liberty are merely ideal & fallacious!”

Alexander Hamilton called for a new Constitutional Convention in Philadelphia in 1787 where the Articles of Confederation were ultimately thrown out in favor of an entirely new form of government.

When the United States cut ties with Britain, the founders wanted nothing to do with the British form of government known as “unitary.”

Under a unitary regime, all power originates from a centralized national government (Parliament) and is delegated to local governments. That’s still the way the government operates in the UK.

Instead, the founders initially chose the opposite form of government, a confederation.

In a confederation, all power originates at the local level in the individual states and is only delegated to a weak central government at the states’ discretion.

When the founders met in Philadelphia, it was clear that a confederation wasn’t enough to hold the young nation together. States were fighting over borders and printing their own money. Massachusetts had to hire its own army to put down Shays’ Rebellion.

The solution was to find a middle way, a blueprint of government in which the powers were shared and balanced between the states and national interests.

That compromise, woven into the Constitution and the Bill of Rights, became known as federalism.

The Constitution and the Bill of Rights created two different kinds of separation of powers, both designed to act as critical checks and balances.

The first and best-known of the separation of powers is between the three branches of government: Executive, Legislative and the Judiciary.

If the president acts against the best interests of the country, he or she can be impeached by Congress.

If Congress passes an unjust law, the president can veto it.

And if any law or public institution infringes on the constitutional rights of the people, the Supreme Court can remedy it.

But the second type of separation of powers is equally important, the granting of separate powers to the federal and state governments.

This is the key to where we are today in our fight with an all too powerful federal government.

Under the Constitution, the state legislatures retain much of their sovereignty to pass laws as they see fit, but the federal government also has the power to intervene when it suits the national interest.

As we discussed last week, under the “supremacy clause” found in Article VI, federal laws and statutes supersede state law.

Federalism, or the separation of powers between the state and federal government, was entirely new when the founders included it in the Constitution.

While it functions as an important check, it’s also been a continual source of contention between the two levels of government.

In the final run-up to the Civil War, the Southern states seceded from the Union in part because they believed the federal government was unconstitutionally encroaching on their “domestic institutions” of slavery.

According to James Madison, a committed federalist, the Constitution maintains the sovereignty of states by enumerating very few express powers to the federal government, while “[t]hose which are to remain in the State governments are numerous and indefinite.”

Article I Section 8 contains a list of all of the “enumerated” powers that are exclusively delegated to the federal government. Those include the power to declare war, maintain armed forces, regulate commerce, coin money and establish a Post Office.

But that very same Section 8 also includes the so-called “Elastic Clause” that authorizes Congress to write and pass any laws that are “necessary and proper” to carry out its enumerated powers.

This is where the train jumps off the tracks.

These powers are known collectively as “implied powers” and have been used by Congress to create a national bank, to collect a federal income tax, to institute the draft, to pass gun control laws and to set a federal minimum wage, among others.

Other than that, the Constitution grants almost all other power and authority to the individual states, as Madison said.

While the Constitution doesn’t explicitly list the powers retained by the states, the founders included a catch-all in the 10th Amendment, ratified in 1791:

“The powers not delegated to the United States by the Constitution, nor prohibited by it to the States, are reserved to the States respectively, or to the people.”

Those so-called “reserved” powers include all authority and functions of local and state governments, policing, education, the regulation of trade within a state, the running of elections and many more.

So this brings us back to our original topic of the eviction moratorium. Our forefathers saw the threat of an all-powerful central government and set up a system to control it.

When followed, it works.

The question now is, “What happens when the system is implemented, and the federal government chooses to ignore the results?”

The South answered that question in 1861.

It is my sincere hope that we have not reached that point again, but I think you can now see how critical the eviction moratorium issue is in deciding the future of our country and where we go from here.

Are we beyond the point of no return?

Should we get the vaccine? What does history tell us?

Wellcome Collection is a free museum and library that aims to challenge how we all think and feel about health.

Article by Owen Gower, Museum Manager at Dr Jenner’s House, the former home of vaccine pioneer Edward Jenner. He regularly speaks and participates in interviews about Edward Jenner, the history of smallpox and the development of vaccination.

Edward Jenner, the country doctor who pioneered vaccination, left an extraordinary medical legacy as the father of modern vaccination.

The legend usually repeated is that Jenner, a family doctor from Gloucestershire, England, had observed that milkmaids working in the countryside around his hometown of Berkeley had remarkably clear complexions and were never afflicted by the scars of the feared disease smallpox.

When he asked about this, he was told that they had all contracted cowpox in the course of their work and it was this that protected them from smallpox.

Jenner decided to try an experiment, and when Sarah Nelmes consulted him about the blisters she had acquired after milking a cow named Blossom, the doctor acted quickly.

Using pus from Nelmes’s lesions, he deliberately infected James Phipps, the eight-year-old son of his gardener, first with cowpox and, later, with smallpox.

To everyone’s relief, James did not contract smallpox. Jenner’s theory had been correct, and vaccination was born.

Edward Jenner wanted vaccination to be free at the point of delivery … available to everyone, no matter who they were or where they were from.

However, what is often forgotten is the rigorous scientific method behind Jenner’s experiment.

For some years prior to this first vaccination in 1796, he had been gathering evidence supporting the theory that those who had once contracted cowpox were immune from smallpox.

But his evidence was predominantly based on hearsay and required scrutiny in the form of clinical trials.

This is how, on 14 May 1796, he came to take fluid from a cowpox blister on Nelmes’s hand and scratch it onto the skin of James Phipps, who had previously had neither cowpox nor smallpox.

As expected, Phipps contracted cowpox and, once his fever subsided, Jenner then attempted to inoculate him using live smallpox.

This technique, also known as variolation, involved deliberately infecting a patient with a mild dose of smallpox in the expectation that it would provide protection from a more severe infection.

It had long been practiced in China, India, the Ottoman Empire, parts of Africa, and had gained popularity in Western medicine after 1721, when it was championed by Lady Mary Wortley Montagu.

Lady Mary had arranged for her children to be variolated after witnessing the practice in Turkey, and soon persuaded Caroline, Princess of Wales, to have her own children inoculated.

Jenner once wrote, “On average I am at least six hours daily with my pen in my hand bending over writing paper till I am grown as crooked as a cow’s horn and tawny as whey butter.”

Through modern eyes we might be taken aback by the ethical implications of deliberately infecting a child with smallpox; however, at the time the technique was considered to be the “gold standard” for artificially inducing immunity.

Does this sound familiar, folks?

Infection with the mild disease cowpox was perhaps more controversial, but Jenner’s theory looked to be correct when, despite exposure to the deadly virus, Phipps did not contract smallpox.

Some months later, Jenner attempted to inoculate Phipps with smallpox again, but to no effect, and then tried the same experiment on numerous others.

Jenner’s trials were controlled, repeatable and, crucially, widely disseminated through his 1798 publication ‘An Inquiry Into the Causes and Effects of the Variolæ Vaccinæ’.

Having shown that cowpox could protect against smallpox, Jenner devoted the rest of his life to telling the world about vaccination and how to perform it safely and effectively.

In the garden of his house in Berkeley, Gloucestershire, Jenner turned a rustic, thatched summerhouse into the world’s first free vaccination clinic.

There he ensured that this life-saving medical intervention was available on the basis of need, rather than ability to pay.

Many consider the Temple of Vaccinia, the grand name given by Jenner to this building, to be a symbol not just of hope in the fight against disease but of the principles and values of a later free public health service: the National Health Service.

John Baron, Edward Jenner’s biographer, wrote that “the discovery of vaccination was ushered into the world with singular modesty and humility”. And so it was that Jenner, without fanfare or ceremony, made his research on vaccination against smallpox freely available to the world.

Jenner did not seek to profit from his work and discouraged others from doing the same.

If anything, Jenner’s own income and medical practice suffered from the long hours he invested corresponding with those who were interested in adopting vaccination.

At the heart of Jenner’s commitment to free access for all was his practice of opening his garden once a week so that the poor of the local area could be vaccinated.

Jenner’s tireless work to share news of vaccination was grounded in his own deep-seated compassion and desire to bring about a world free of smallpox.

The fact that Jenner rarely travelled, preferring home comforts to a life on the road performing mass vaccinations, does not contradict these values. Jenner was, first and foremost, a community doctor.

In 1804 his friend W J Joyce observed, “The Doctor very well understands the art of dealing with their prejudices and it gave me great pleasure to observe the gentle and effectual manner with which he endeavored to soothe their mind.”

Jenner knew his patients and understood that they might have concerns about this new practice. That they consented to receive vaccination illustrates their level of trust in him.

Jenner primarily vaccinated within his normal practice area and taught others how to do the same in their own communities.

This method of working persisted even to the final days of the World Health Organization Smallpox Eradication Program of 1966–80, when an international team of medics supported local healthcare workers to ensure vaccination was accepted in areas where people remained unprotected.

Edward Jenner wanted vaccination to be free at the point of delivery, carefully explained by trusted and trained local healthcare workers, and available to everyone, no matter who they were or where they were from.

He considered himself the “Vaccine Clerk to the World” and was not interested in geopolitical divides, for “the Sciences are never at War” and he knew that it would require an international effort to realize his dream of the global eradication of smallpox.

Jenner’s willingness to teach anyone to vaccinate contributed to the prompt uptake of this new practice throughout the world and was rewarded with international recognition and respect.

In 1807, with Britain and France locked in conflict, Jenner petitioned Napoleon for the release of two friends who were being held as prisoners of war.

Napoleon was told to dismiss the request until his wife Joséphine insisted he look again at who it was from. “What that man asks is not to be refused” came the now famous reply from Napoleon.

How about another vaccine story? I found another great article by Lily Rothman on the Time website, dated July 6, 2015.

It deals with the vaccine developed to address the disease of rabies.

Rabies is among the most terrifying viruses to contract. According to the Centers for Disease Control, “once clinical signs of rabies appear, the disease is nearly always fatal.”

Luckily for us—and our pets—Louis Pasteur developed a vaccine that can stop things from getting to that point.

The first time the vaccine was ever administered to a human being, on July 6, 1885, was by Pasteur himself.

Knowing that the disease was otherwise fatal, both doctor and patient (or, rather, patient’s mother) were willing to risk whatever harm might come from the injection, which had only been tested on dogs.

As TIME recounted all the way back in 1939:

One hot July morning in 1885, feverish little Joseph Meister was dragged by his frantic mother through the streets of Paris in search of an unknown scientist who, according to rumors, could prevent rabies.

The nine-year-old Joseph had been bitten in 14 places by a huge, mad dog and in a desperate attempt to cheat death, his mother had fled from their hometown in Alsace to Paris.

Early in the afternoon Mrs. Meister met a young physician in a hospital. “You mean Pasteur,” he said. “I’ll take you there.”

Bacteriologist Louis Pasteur, who kept kennels of mad dogs in a crowded little laboratory and was hounded by medical criticism, had never tried his rabies vaccine on a human being before.

But moved by the tears of Mrs. Meister, he finally took the boy to the Hotel-Dieu, had him injected with material from the spinal cord of a rabbit that had died from rabies.

For three weeks Pasteur watched anxiously at the boy’s bedside. To his overwhelming joy, the boy recovered.

By that fall, when his nation’s Academy of Sciences acknowledged the success, “hundreds of persons who had been bitten by mad dogs rushed to his laboratory.”

As for the little boy, Joseph Meister? He ended up working as a janitor at the Pasteur Institute. There, TIME reported in 1939, Meister entertained visitors with tales of his time as the pioneering doctor’s patient: “I shall see always Pasteur’s good face focused on me,” he told them. 

How about one more folks?

Christopher Klein, History.com

Let’s talk about Jonas Salk and the polio vaccine.

In the early 20th century, polio was one of the most feared diseases in America.

While most scientists believed that effective vaccines could only be developed with live viruses, Jonas Salk developed a “killed-virus” vaccine by growing samples of the virus and then deactivating them by adding formaldehyde so that they could no longer reproduce.

By injecting the benign strains into the bloodstream, the vaccine tricked the immune system into manufacturing protective antibodies without the need to introduce a weakened form of the virus into healthy patients.

Many researchers such as Polish-born virologist Albert Sabin, who was developing an oral “live-virus” polio vaccine, called Salk’s approach dangerous.

Sabin even belittled Salk as “a mere kitchen chemist.”

After successfully inoculating thousands of monkeys, Salk began the risky step of testing the vaccine on humans in 1952.

In addition to administering the vaccine to children at two Pittsburgh-area institutions, Salk injected himself, his wife and his three sons in his kitchen after boiling the needles and syringes on his stovetop.

Salk announced the success of the initial human tests to a national radio audience on March 26, 1953.

Now folks, do you see a central theme here?

Smallpox, rabies, polio: all deadly diseases with no cure at the time.

Jenner, Pasteur, and Salk all saw that something needed to be done.

In every case, they took tremendous risks. Had their patients died as a result of their experiments, they would have been charged with murder.

Think about that.

Jenner injected his gardener’s 8-year-old son with an experimental vaccine. Likewise, Louis Pasteur risked his entire reputation and career to save the life of a nine-year-old boy infected with rabies.

Finally, what greater risk can one take than to try an experiment on oneself and one’s own family, as Jonas Salk did with his polio vaccine?

Now folks, I am not telling you to run out and get the vaccine if you haven’t already done so.

What I am trying to do, is what I do every week with my shows.

I am trying to educate. I am tired of the national news pushing fear and lies.

I want people to know as much information as possible when it comes to making a decision as important as one’s health.

Vaccinate, don’t vaccinate, that is entirely up to you.

Is there a risk? You bet. As you can see from today’s show, there is always a risk. All I ask is that you do your research before you make any decisions.

As I have always said, “I don’t have a problem with people that don’t know, but I have a huge problem with those who don’t want to know.”

Today, as we face a new disease in the form of Covid-19, more than 140 leaders around the world have called for a patent-free “people’s vaccine.”

Discussions focus on equitable access, treatment focused on need, rather than ability to pay, and a method of distribution that is both rapid and fair.

And if all that sounds familiar, it should. These ideas are not new: they are Edward Jenner’s founding principles of vaccination all the way back in 1798.

Education in America

In a recent article from the Foundation for Economic Education, Dr. Robert Peterson points out:

“For two hundred years in American history, from the mid-1600s to the mid-1800s, public schools as we know them today were virtually non-existent, and the educational needs of America were met by the free market.”

Yes, you heard me correctly, the free market.

Think about this. Would you like to choose what is being taught to your children?

Would you like to send your kids to a school free of teaching Critical Race Theory or the 1619 Project?

Would you like to send your kids to a school that taught patriotism?

Would you like them to be taught the role of good citizenship?

Would you like to send them to a school where the teachers and the Administration could provide discipline for acts of misbehavior?

Would you like to send your kids to a school that allowed school prayer, celebrated Christmas, and required the Pledge of Allegiance to be recited every morning at the start of class?

Would you like your kids to graduate capable of critical thinking and of expressing their thoughts clearly, both orally and through the written word?

I won’t even get into the issue of being able to solve simple math problems.

Now I can tell you, our forefathers are rolling in their graves watching what is happening to the education of our children.

From the 1650s to the 1850s, America produced several generations of highly skilled and literate men and women who laid the foundation for a nation dedicated to the principles of freedom and self-government.

The private system of education in which our forefathers were educated included home, school, church, voluntary associations, philosophical societies, circulating libraries, apprenticeships, and private study.

 It was a system supported primarily by those who purchased the services of education, and by private benefactors.

All was done without a mandate by, or oversight of, the federal government.

Dr. Lawrence A. Cremin, a distinguished scholar in the field of education, has said that during the colonial period the Bible was “the single most important cultural influence in the lives of Anglo-Americans.”

Thus, the cornerstone of early American education was the belief that “children are a heritage from the Lord.”

Parents believed that it was their responsibility not only to teach their children how to make a living, but also how to live. How to live.

Folks, that is the missing link. You want to stop the craziness running rampant through our schools and our cities?

We must teach our children how to live.

As our forefathers searched their Bibles, they found that the function of government was to protect life and property.

Education was not a responsibility of the government.

Education began in the home and in the fields.

Education in early America began in the home at the mother’s knee, and often ended in the cornfield or barn by the father’s side.

The task of teaching reading usually fell to the mother, and since paper was in short supply, she would trace the letters of the alphabet in the ashes and dust by the fireplace.

 The child learned the alphabet and then how to sound out words. Then a book was placed in the child’s hands, usually the Bible.

As many passages were familiar to him from church and family devotions, he would soon master the skill of reading.

The Bible was supplemented by other good books such as Pilgrim’s Progress by John Bunyan, The New England Primer, and Isaac Watts’s Divine Songs.

From volumes like these, our founding fathers and their generation learned the values that laid the foundation for free enterprise.

Armed with love, common sense, and a nearby woodshed, colonial mothers often achieved more than our modern-day elementary schools with their federally funded programs and education specialists.

These colonial mothers used simple, time-tested methods of instruction mixed with plain, old-fashioned hard work.

 Children were not ruined by educational experiments developed in the ivory towers of academia.

Home education was so common in America that most children knew how to read before they entered school.

“Children were often taught to read at home before they were subjected to the rigors of school. In middle-class families, where the mother would be expected to be literate, this was considered part of her duties.”

Without ever spending a dime of tax money or consulting a host of bureaucrats, psychologists, and specialists, children in early America learned the basic academic skills of reading, writing, and ciphering necessary for getting along in society.

Even in Boston, the capital city of the colony in which the government had the greatest hand, children were taught to read at home.

A Boston bookseller’s stock in 1700 included no fewer than eleven dozen spellers and sixty-one dozen primers.

The books were bought by parents, and illiteracy was absent because parents taught their children how to read outside of a formal school setting.

Coupled with the vocational skills children learned from their parents, home education met the demands of the free market. For many, formal schooling was simply unnecessary.

The fine education they received at home and on the farm held them in good stead for the rest of their lives, and was supplemented with Bible reading and almanacs like Franklin’s Poor Richard’s.

Some of our forefathers desired more education than they could receive at home. Thus, grammar and secondary schools grew up all along the Atlantic seaboard, particularly near the centers of population, such as Boston and Philadelphia.

In New England, many of these schools were started by colonial governments, but were supported and controlled by the local townspeople.

In the Middle Colonies there was even less government intervention. In Pennsylvania, a compulsory education law was passed in 1683, but it was never strictly enforced.

 Nevertheless, many schools were set up simply as a response to consumer demand. Philadelphia, which by 1776 had become second only to London as the chief city in the British Empire, had a school for every need and interest.

Quakers, Philadelphia’s first inhabitants, laid the foundation for an educational system that still thrives in America. Because of their emphasis on learning, an illiterate Quaker child was a contradiction in terms.

Other religious groups set up schools in the Middle Colonies. The Scottish Presbyterians, the Moravians, the Lutherans, and the Anglicans all had their own schools.

In addition to these church-related schools, private schoolmasters, entrepreneurs in their own right, established hundreds of schools.

Historical records, which are by no means complete, reveal that over one hundred and twenty-five private schoolmasters advertised their services in Philadelphia newspapers between 1740 and 1776.

 Instruction was offered in Latin, Greek, mathematics, surveying, navigation, accounting, bookkeeping, science, English, and contemporary foreign languages.

Incompetent and inefficient teachers were soon eliminated, since they were not subsidized by the State or protected by a union.

Teachers who satisfied their customers by providing good services prospered.

One schoolmaster, Andrew Porter, a mathematics teacher, had over one hundred students enrolled in 1776. The fees the students paid enabled him to provide for a family of seven.

Libraries

In addition to formal schooling in elementary and secondary schools, colleges, and universities, early America had many other institutions that made it possible for people to either get an education or supplement their previous training.

An individual who never attended school could receive an excellent education by using libraries, building and consulting his own library, and by joining a society for mutual improvement. In colonial America, all of these were possible.

Consumer demand brought into existence a large number of libraries. Unlike those in the Old Country, which were open only to scholars, churchmen, or government officials, these libraries were open to ordinary people and were rarely supported by government funds.

The first non-private, non-church libraries in America were maintained by membership fees, called subscriptions or shares, and by gifts of books and money from private benefactors interested in education.

Soon libraries became the objects of private philanthropy, and it became possible for even the poorest citizens to borrow books. Sometimes the membership fee was completely waived for an individual if he showed intellectual promise and character.

The sermon was also an excellent educational experience for our colonial forefathers. Sunday morning was a time to hear the latest news and see old friends and neighbors. But it was also an opportunity for many to sit under a man of God who had spent many hours preparing for a two, three, or even four hour sermon.

Thus, without ever attending a college or seminary, a church-goer in colonial America could gain an intimate knowledge of Bible doctrine, church history, and classical literature.

The first Sunday Schools also developed in this period. Unlike their modern-day counterparts, colonial Sunday Schools not only taught Bible but also the rudiments of reading and writing. These Sunday Schools often catered to the poorest members of society.

Philosophical Societies

Another educational institution that developed in colonial America was the philosophical society. One of the most famous of these was Ben Franklin’s Junto (pronounced Hun-toe), where men would gather to read and discuss papers they had written on all sorts of topics and issues.

 Another society was called The Literary Republic. This society opened in the bookbindery of George Rineholt in 1764 in Philadelphia. Here, artisans, tradesmen, and common laborers met to discuss logic, jurisprudence, religion, science, and moral philosophy (economics).

Traveling lecturers rented halls and advertised their lectures in local papers.

By 1776, when America finally declared its independence, a tradition had been established and voluntarism in education was the rule.

Our founding fathers, who had been educated in this tradition, did not think in terms of government-controlled education.

 Accordingly, when the delegates gathered in Philadelphia to write a Constitution for the new nation, education was considered to be outside the jurisdiction of the civil government, particularly the national government.

Madison, in his notes on the Convention, recorded that there was some talk of giving the Federal legislature the power to establish a national university at the future capital. But the proposal was easily defeated because the Founding Fathers supported the local institutions which had sprung up all over the country.

 A principle had been established in America that was not to be deviated from until the mid-nineteenth century. Even as late as 1860, there were only 300 public schools, as compared to 6,000 private academies.

The results of colonial America’s free market system of education were impressive indeed.

Almost no tax money was spent on education, yet education was available to almost anyone who wanted it, including the poor.

No government subsidies were given, and inefficient institutions either improved or went out of business.

 Competition guaranteed that scarce educational resources would be allocated properly.

The educational institutions that prospered produced a generation of articulate Americans who could grapple with the complex problems of self-government.

The Federalist Papers, which are seldom read or understood today, even in our universities, were written for and read by the common man.

Literacy rates were as high or higher than they are today.

A study conducted in 1800 by DuPont de Nemours revealed that only four in a thousand Americans were unable to read and write legibly.

In 1772, Jacob Duché, later the Chaplain of the Continental Congress, wrote:

The poorest laborer upon the shore of Delaware thinks himself entitled to deliver his sentiments in matters of religion or politics with as much freedom as the gentleman or scholar. Such is the prevailing taste for books of every kind, that almost every man is a reader; and by pronouncing sentence, right or wrong, upon the various publications that come in his way, puts himself upon a level, in point of knowledge, with their several authors.

Ben Franklin, as well, testified to the efficiency of the colonial educational system.

 According to Franklin, the North American libraries alone “have improved the general conversation of Americans, made the common tradesmen and farmers as intelligent as most gentlemen from other countries, and perhaps have contributed in some degree to the stand so generally made throughout the colonies in defense of their privileges.”

The experience of colonial America clearly supports the idea that the market, if allowed to operate freely, could meet the educational needs of modern-day America.

I, for one, would like to see education in America return to a free-enterprise, market-driven, competitive system.

Why? Abraham Lincoln said it best years ago, “The philosophy of the classroom will be the philosophy of the government in the next generation.”

Think about that folks.

Have We Lost Control?

OK Folks, I have two issues that have come to the forefront that have me riled up.

The first is the recent news that the country’s largest teachers union has moved to undermine the left-wing talking point that critical race theory is not taught to children — by voting to promote it and arguing it is “reasonable and appropriate” to use CRT in social studies classes.

The National Education Association has approved a plan to “publicize” critical race theory and dedicate a “team of staffers” to assist union members looking to “fight back against anti-CRT rhetoric.”

New Business Item 39 also declares that the union opposes bans on critical race theory and the New York Times’ controversial 1619 Project – bans which roughly half the U.S. states have already implemented.

Additionally, the resolution calls for the union to “join with Black Lives Matter at School and the Zinn Education Project to call for a rally this year on Oct. 14 — George Floyd’s birthday — as a national day of action to teach lessons about structural racism and oppression.”

The third paragraph pledges to accomplish the following:

“Publicly (through existing media) convey its support for the accurate and honest teaching of social studies topics, including truthful and age-appropriate accountings of unpleasant aspects of American history, such as slavery, and the oppression and discrimination of Indigenous, Black, Brown, and other peoples of color, as well as the continued impact this history has on our current society. The Association will further convey that in teaching these topics, it is reasonable and appropriate for curriculum to be informed by academic frameworks for understanding and interpreting the impact of the past on current society, including critical race theory.”

However, despite the approval, a note on the union’s website reads, “This item cannot be accomplished with current staff and resources under the proposed Modified 2021-2022 Strategic Plan and Budget. It would cost an additional $127,600.”

The move comes as districts around the country and liberal pundits have attempted to fend off anti-CRT parents by telling them the curriculum is too complex for K-12 students and is only taught to students in graduate-level courses.

That claim is made despite evidence that critical race theory seminars are being offered to teachers and administrators and examples of CRT-themed topics being introduced in some classrooms.

Conservative lawmakers have already secured bans on CRT in roughly two dozen states, with Iowa going as far as to declare it “discriminatory indoctrination.” And the topic on its own has prompted impassioned public comments at school board meetings around the country.

Critics say it’s a racist philosophy in and of itself that encourages stereotyping and labeling while highlighting divisiveness and anti-American rhetoric rather than unity and the virtues of the founding documents.

The NEA represents more than 2 million members – well over half of the 3.2 million public school teachers the U.S. Department of Education estimated were working in the country last year.

Michael Ruiz is a U.S. and World Reporter for Fox News.

Now for the second issue that really got to me last week.

Historynet.com

Claire Barrett
July 2, 2021

In a 285-120 vote, the House of Representatives voted last Tuesday to remove all Confederate statues from public display in the U.S. Capitol.

The bill, if passed in the Senate, directs the architect of the Capitol to identify and remove all statues and busts that depict members of the Confederacy from public display within 45 days of the resolution’s enactment. The statues would then be returned to the home states, with the option to replace them with another honoree, NPR reports.

In addition to statues that depict or glorify the Confederacy, the bill specifically mentions the removal of former Chief Justice of the United States Roger Taney, author of the landmark 1857 Dred Scott v. Sandford decision that declared no individual of African descent was, or could ever be a U.S. citizen. His statue is to be replaced with one of Thurgood Marshall, the first Black Supreme Court Justice.

“The halls of Congress are the very heart of our democracy,” said House Speaker Nancy Pelosi, D-Calif., ahead of the vote. “The statues that we display should embody our highest ideals as Americans, expressing who we are and who we aspire to be as a nation. Monuments to men, or people who advocated cruelty and barbarism to achieve such a plainly racist end are a grotesque affront to those ideals.”

The removal of the Capitol statues is complicated, however, as Congress has no authority over much of it, writes the New York Times.

A law that originated during the Civil War declares that each state may send two statues of deceased citizens who were “illustrious for their historic renown or for distinguished civic or military services,” and whom the state considers “worthy of this national commemoration” to be featured in Statuary Hall or elsewhere in the Capitol.

Many of the statues sent by Southern states were erected in the early 20th century during a time in which the narrative of the “Lost Cause” was largely being propagated.

Some Republican lawmakers noted their support for the removal of statues but stated they would vote ‘no’ on the bill due to discontent over the legislative process.

In a statement, Alabama Republican Rep. Mo Brooks denounced “cancel culture and historical revisionism,” before adding, “I support federalism and a state’s right to decide for itself who it should honor. As such, I will proudly vote ‘No’ on H.R. 3005. Alabama, not New Yorkers, Californians, or anyone else, should decide who we wish to honor in Alabama’s contribution to the National Statuary Collection.”

Now folks, we have a big problem. The federal government has moved into areas of our lives in which they have absolutely no authority and as such, have divided our country in ways we haven’t seen since the Civil War.

Good grief folks, have we lost our collective minds?

The Federal government now contends it has the power to determine what we teach our kids and who we can honor from our past?

These are just two of the latest over-reaches by our federal government.

Open borders, increased crime in our cities, voting rights: all are coming under Federal control while we sit back and watch.

So, when did it all change?

I found a great article by Donald A. Loucks of the Statesman News Network (Statesman.com), posted back on Oct. 7, 2016.

There were three things that happened in our government in 1913 that completely changed the way the Federal Government operated, not in a good way. By way of background, it should be noted that Karl Marx published the Communist Manifesto in 1848 and immediately acquired a devout following. It is astounding how much of its content has already been applied to the destruction of the Constitutional Republic that is the United States.

Marx theorized that a country could be toppled into Communism in certain stages, the last stage being Democracy. A quote from Plato comes to mind: “Dictatorship naturally arises out of democracy, and the most aggravated form of tyranny and slavery out of the most extreme liberty.”

So, what to do about a constitutional republic like the United States? We hear our country referred to as a “Democracy” all the time in the news media. When was the last time you heard it referred to as a Republic? That’s my point. We no longer know any better.

So, now let’s get back to 1913. Marx wanted central control of money, a means of confiscating wealth, and, of course, a change in our government making it more “Democratic.” He would have been pleased. He got them all.

First, the Federal Reserve Act established a means for a centralized bank to contract or expand the money supply. Inflating the currency reduces its value, thus taking that value from the citizens as a nearly invisible tax.

Second, the 16th Amendment authorized the implementation of a federal income tax which was sold to the people as only a 1% tax on the very rich.

That has morphed into a monster. Another quote, this one from Daniel Webster: “An unlimited power to tax involves, necessarily, the power to destroy.” What a great means to accomplish the Marxian task of destroying the middle class.

Third, the 17th Amendment changed the method of selecting U.S. Senators to popular election. Few now know that senators were originally appointed by their state’s legislatures.

Senators were essentially ambassadors of the States to the Federal Government. This method protected the government and the citizens from spur-of-the-moment action, fads, and stirred-up urgency by special interest groups and news media (yes, they had newspapers back then).

The hue and cry at the time was that the Senate was “too stodgy” and obstructionist, thus impeding progress. No wonder this amendment was championed by the progressives of the day. Marx would have loved this Amendment. Remember, more democracy (mob rule) equals closer proximity to tyranny.

All the amendments so far were adopted the old-fashioned way delineated in the first part of Article V of the Constitution, where two-thirds of both Houses of Congress propose and pass an amendment, which is then ratified by three-quarters of the States.

Here is an interesting fact: the Constitutional Convention of 1787 was a “runaway.” The delegates did not follow their instructions. Fortunately, what they produced was a masterpiece. A slightly flawed masterpiece, but a masterpiece nonetheless.

Who were these men? Among them were surviving signatories of the Declaration of Independence who originally numbered fifty-six.

Twenty-nine had seminary degrees. Granted, a seminary degree is somewhat different than one now. But how many members of Congress have an education that encompasses even part of that field of study?

It is the character of those in politics today that is our greatest threat.

Think about what Marx said was necessary to achieve communistic tyranny, and think about those in government who may not even have any knowledge about what that might mean. That is what is most frightening.

Of the three changes since 1913, I contend that we lost control with the 16th Amendment. Once the feds had our money, they had the power to rule.

Money raised through income tax is used to pay for the programs, benefits, and services provided by the US government for the benefit of the people.

Essential services such as national defense, food safety inspections, and federal benefit programs including Social Security and Medicare could not exist without the money raised by the federal income tax.

While the federal income tax did not become permanent until 1913, taxes, in some form, have been a part of American history since our earliest days as a nation.

While taxes paid by American colonists to Great Britain were one of the main reasons for the Declaration of Independence and ultimately the Revolutionary War, America’s Founding Fathers knew that our young country would need taxes for essential items such as roads and especially defense.

Providing the framework for taxation, they included procedures for the enactment of tax law legislation in the Constitution. Under Article I, Section 7 of the Constitution, all bills dealing with revenue and taxation must originate in the House of Representatives. Otherwise, they follow the same legislative process as other bills.

Before final ratification of the Constitution in 1788, the federal government lacked the direct power to raise revenue.

Under the Articles of Confederation, money to pay the national debt was paid by the states in proportion to their wealth and at their discretion.

One of the goals of the Constitutional Convention was to ensure that the federal government had the power to levy taxes.

Even after the ratification of the Constitution, most federal government revenues were generated through tariffs — taxes on imported products — and excise taxes — taxes on the sale or use of specific products or transactions.

In 1913, with the costs of World War I looming, ratification of the 16th Amendment permanently established the income tax. The 16th Amendment states:

“The Congress shall have power to lay and collect taxes on incomes, from whatever source derived, without apportionment among the several States, and without regard to any census or enumeration.”

The 16th Amendment gave Congress the power to tax the incomes of all individuals and the profits of all businesses.

The income tax enables the federal government to maintain the military, construct roads and bridges, enforce the laws and federal regulations, and carry out other duties and programs.

By 1918, government revenue generated from the income tax exceeded $1 billion for the first time and topped $5 billion by 1920.

The introduction of the mandatory withholding tax on employee wages in 1943 increased tax revenue to almost $45 billion by 1945.

In 2010, the IRS collected nearly $1.2 trillion through income tax on individuals and another $226 billion from corporations.

So there it is folks. We can all argue about states’ rights when it comes to critical race theory, tearing down statues, gay marriage, marijuana, gun control, and any number of other issues.

However, as long as the federal government holds control of our tax dollars, there is really nothing we can do.

We either comply with what Washington says, or we risk losing federal funding.

Bottom line, we are being blackmailed with our own money.

Does anyone out there have a solution?

Please call in and let us know this morning.

Is Technology Making Us Dumber?

Article in Psychology Today by John Elder Robison

Robison states, I grew up believing knowledge was something to be treasured. Not anymore. Any fool with a cell phone or a laptop can look up life’s answers at the drop of a hat, provided there’s cell phone service. So where does that leave the knowledgeable geeks of yesterday? I guess what was special has become ordinary, at least on first glance.

What happened? Did the pocket Internet make everyone smarter? Or does it just facilitate snappy comebacks, with a sixty-second web browser delay?

Robison says, I used to think the Internet was a tide that lifted all boats, knowledge wise, but now I wonder if the opposite is true. I think the Internet and information technology in general makes us dumber, in some key ways.

He goes on to say, When I was a kid, you had to actually memorize and know the capitals of states if you wanted to talk geography. And you never knew when that might happen.


So what, today’s young people say. The iPhone will tell you more about geography in sixty seconds than you could possibly remember. That’s true, but by relying on the computer, we stop training our minds, and we stop filling our memory banks.

By doing so, I believe we diminish our ability to solve life’s problems unaided, and we become more and more dependent on machines.

When the machines give us answers, we seem superficially smarter, but we really are dumber, because we’re not building the networks in our brains to solve a whole host of problems.

Want another example of this? Think navigation. I went my whole life looking at maps and finding my way. I have a long, long history of reaching my destinations, whether on foot, by boat, or by car. I looked at a map, related it to the world around me, and found my way.

All too often, navigation today is handed off to a machine. Many motorists can’t make sense of a basic road map, or estimate the distance between two points on a printed page. They are lost if their machine loses touch with the satellites.

Most of the time, technology works as it should. People get to their destinations faster thanks to computers. But people who rely on machines have given up something vital yet intangible.

They’ve lost the ability to think through a navigation problem themselves. They have become slaves to machines out of intellectual laziness, and the laziness makes them less smart.

The brain wiring that solves navigation problems allows us to solve other problems too. Computers don’t have that flexibility, and neither do we when we abdicate our thinking to machines.

I think this point is lost on many young people today. After all, if they have not developed certain processing abilities in their minds, how can they know what they are missing?

I know, because I see what I lose when I rely on technology and it fails. I think of my frustration when my car gets lost, and I recall all those times when I solved my own problems and found my own way, uneventfully although a bit slower.

For many people, web browsing has replaced book reading. Recent studies suggest that their attention spans are reduced as a result. When we rely on a computer to look up facts, instead of our own memory, the price may not be obvious. But I believe it’s there, and it’s real.

The Independent (UK), Genevieve Roberts, July 2015

A recent study suggests 90 per cent of us are suffering from digital amnesia. More than 70 per cent of people don’t know their children’s phone numbers by heart, and 49 per cent have not memorized their partner’s number.

While those of us who grew up in a landline-only world may also remember friends’ home numbers from that era, we are unlikely to know their current mobile number, as our phones do the job.

Kaspersky Lab concludes we don’t commit data to memory because of the “Google Effect” – we’re safe in the knowledge that answers are just a click away, and are happy to treat the web as an extension of our own memory.

Dr Maria Wimber, lecturer at the University of Birmingham’s School of Psychology, worked with an internet security firm on their research. She believes the internet simply changes the way we handle and store information, so the Google Effect “makes us good at remembering where to find a given bit of information, but not necessarily what the information was.

It is likely to be true that we don’t attempt to store information in our own memory to the same degree that we used to, because we know that the internet knows everything.”

These findings echo Columbia University Professor Betsy Sparrow’s research on the Google Effect on memory, which concluded, “Our brains rely on the internet for memory in much the same way they rely on the memory of a friend, family member or co-worker.

We remember less through knowing information itself than by knowing where the information can be found.”

This even extends to photographs. A Fairfield University study in 2013 found that taking photos reduces our memories. Participants were asked to look around a museum, and those who took photos of each object remembered fewer objects and details about them than those who simply observed.

Dr Wimber says: “One could speculate that this extends to personal memories, as constantly looking at the world through the lens of our smartphone camera may result in us trusting our smartphones to store our memories for us. This way, we pay less attention to life itself and become worse at remembering events from our own lives.”

But is this making us more stupid? Anthropologist Dr Genevieve Bell, a vice-president at Intel and director of the company’s Corporate Sensing and Insights Group, believes not.

She says technology “helps us live smarter” as we’re able to access answers.

“Being able to create a well-formed question is an act of intelligence, as you quickly work out what information you want to extract and identify the app to help achieve this. To me, this suggests a level of engagement with the world that’s not about dumbness.”

In contrast, Nicholas Carr, author of The Shallows: How the Internet is Changing the Way We Think, Read and Remember and The Glass Cage: Where Automation is Taking Us, believes we should be alarmed.

“We’re missing the real danger, that human memory is not the same as the memory in a computer: it’s through remembering that we make connections with what we know, what we feel, and this gives rise to personal knowledge.

If we’re not forming rich connections in our own minds, we’re not creating knowledge. Science tells us memory consolidation involves attentiveness: it’s in this process that you form these connections.”

He believes the combination of the Google Effect and the constant distraction of smartphones, constantly delivering information, is concerning. A Microsoft study found the average human attention span fell from 12 seconds in 2000 to eight seconds today.

“There is a superficiality to a lot of our thinking,” Carr says. “Not just the cognitive side, but also the emotional side. That not only reduces richness in one’s own life and sense of self, but if we assume that rich, deep thinking is essential to society then it will have a detrimental effect on that over the long run.”

Carr believes our brains are not like hard drives, or refrigerators that can get overstuffed so there’s no more room. In contrast, he says they expand: “It’s not as if remembering and thinking are separate processes. The more things you remember, the more material you have to work on, the more interesting your thoughts are likely to be,” he says.

The greatest leaders in history had one thing in common folks. They all read voraciously.

Abraham Lincoln, Teddy Roosevelt, Winston Churchill and Albert Einstein in the past, just to name a few.

Doesn’t apply today?

What if I told you that Jeff Bezos, Warren Buffett, Reese Witherspoon, Elon Musk, and yes, LeBron James all put down their phones and take the time to read voraciously? It’s true.

So, is reading as opposed to depending on technology the answer?

Here are a few benefits from an article written by Jeff Somers on the Barnes & Noble website.

1. Situational Awareness.
People who read heavily encounter a lot of unexpected situations, if only on the page, and are thus trained to carefully note details on the fly. This skill carries over into real life, where the same attention to detail and anticipation of small mysteries that serves us well while reading complex novels allows us to assess a situation quickly and stay aware of what’s going on around us (even when we’re reading while walking).

2. Forethought.
Books allow us to experience different lifestyles, cultures, historical periods, and points of view—many more than we’d be able to in real life. This in turn allows us to see patterns play out both across history and within the imagination. As the saying goes, history repeats itself. The more we read, the more we’re able to see those same patterns in our daily lives, predict their outcomes, and adjust our behavior accordingly.

3. Empathy.
The ability to experience other people’s points of view means voracious readers are more empathetic. Because we’ve walked virtually in other people’s shoes, readers have been forced to imagine themselves in various situations—and we know we might be found wanting if something similar happened to us. This understanding of our own frailties makes us more likely to be kind to others.

4. The Ability to be Alone.
Solitude is a superpower, especially in the crowded, digitally linked modern age. People who can tolerate and make good use of alone time are people who have confidence and self-reliance. Reading is a way of training ourselves to value solitude instead of fearing it.

5. A Sense of History.
People who don’t know their history are doomed to repeat it. People who devour books of any kind are bound to pick up a better sense of history than most, with the end result being that they can see patterns, big and small, and have some idea of where those patterns will end. This isn’t just about the “big” patterns of history like fascism or economic cycles, but also the “small” patterns of interpersonal relationships, tolerance of other beliefs and lifestyles, and self-care.

6. The Sherlock Holmes Scan.
Reading teaches us to pay attention. All it takes is to be fooled once by the secret villain in a book, or the hero in disguise, and the reader vows to never be fooled again, and to catch every single detail in future stories. We inevitably bring this skill into our daily lives, and find ourselves assessing the people we meet with a head-to-toe scan straight out of a Sherlock Holmes story.

7. No Screens Necessary.
Readers are blackout-proof. While digital books and ereaders are great tools, someone who simply enjoys reading for reading’s sake is well-equipped to survive blackouts, disasters, zombie apocalypses, and long flights sandwiched in-between strangers in the middle seat. They don’t need electricity, screens, or passwords—they just need the words.

8. Intense Concentration.
Reading requires—and teaches—concentration. Attention spans elongate, vocabularies improve, and the ability to appreciate subtleties beyond the superficial elements of the plot takes root. Concentration and attention span are increasingly important in a world where fewer people seem to possess them, positioning book nerds to inherit the Earth—or dominate the idiocracy.

9. Time Management.
Reading a lot of books isn’t about passing a few moments before bedtime. It takes discipline and planning to get through so many. The more books we manage to cram into a week, the better we become at managing every moment. That skill translates into careers and home lives—and maps to any resource in need of proper management.

10. Writing.
Finally, bigger readers are better writers. It’s no coincidence that the cornerstone of all advice on improving writing is to read more. Voracious readers become much better writers, if only because we have a much larger source of inspiration—and writing has become even more important in the internet age, which remains reliant on text for information exchange.

So there you have it folks. Call me old fashioned. I have over 1300 books in my library and have read them all. Does that make me smart? Absolutely not.

I learn something new every day. Why? Because of the skill set one gains from reading. That is the key. Pick a book you will enjoy. Read it for its entertainment value alone, and guess what? When you have finished, you will have picked up skills that will be of benefit to you every day.

So, back to our original question, is technology making us dumber?

New Leadership in the Middle East

Once again, our national media is leaving us totally in the dark when it comes to world affairs.

All we have heard about this week is a new strain of Covid and a focus on Russia and Vladimir Putin.

Do you remember a few weeks back when we talked about Hamas firing thousands of rockets at Israel?

Do you remember that Iran was supplying those rockets to Hamas?

What if I told you that a huge change in that area took place last week with virtually no news coverage?

Well, here it is. Iran just elected a new president and Israel just swore in a new prime minister.

Do you think anyone would have been interested back in WWII if Hitler and Churchill had been replaced in the middle of the conflict?

Well, let’s look at what just happened.

There are two key players here.

Ebrahim Raisi (Ebra Him Rye See), the new President of Iran, and Naftali Bennett, the new Prime Minister of Israel.

Let’s start with Iran’s new President.

The most powerful figure in Iran is the Supreme Leader. There have only been two since the Islamic Revolution in 1979 – Ayatollah Ruhollah Khomeini (the founder of the republic) and his successor, the incumbent Ayatollah Ali Khamenei.

Khomeini created the role at the top of Iran’s political structure after the regime of Shah Mohammad Reza Pahlavi was overthrown.

The Supreme Leader is the commander-in-chief of Iran’s armed forces and controls the security services. He also appoints the head of the judiciary, half of the influential Guardian Council’s members, Friday prayer leaders, and the heads of state television and radio networks.

The Supreme Leader’s multi-billion-dollar charitable foundations also control large swaths of the Iranian economy.

Ayatollah Ali Khamenei became Supreme Leader upon Ruhollah Khomeini’s death in 1989. He has maintained a firm grip on power and suppressed challenges to the ruling system.

The President of Iran is elected for four years and can serve no more than two consecutive terms.

The constitution describes him as the second-highest ranking official in the country. He is head of the executive branch of power and is responsible for ensuring the constitution is implemented.

The president has significant influence over domestic policy and foreign affairs. But it is the Supreme Leader who has the final say on all state matters.

Ebrahim Raisi, the new President, stated Monday that he would not meet with US President Joe Biden under any circumstances, including if Washington met all of Tehran’s demands in the ongoing Vienna talks to revive the 2015 nuclear deal.

Raisi, an ultraconservative cleric and close ally of Iran’s supreme leader, was speaking at his first press conference since winning Friday’s presidential election. He will take over from President Hassan Rouhani in early August.

Asked if he would be willing to meet with Biden to resolve the disputes between the US and Iran if Washington lifted sanctions on Tehran and met Iran’s demands first, Raisi answered with a resounding “no.”

The US and Iran have engaged in indirect talks in Vienna since April to revive the 2015 nuclear deal, which Washington withdrew from under former President Donald Trump in 2018.

Raisi urged Washington to return to the deal and lift all sanctions on Iran. “All sanctions imposed on Iran must be lifted and verified by Tehran,” he said.

Raisi reiterated Iran’s position that its ballistic missile program and support of regional militias are “non-negotiable.”

Asked about his role in Iran’s mass execution of political prisoners in 1988, Raisi described himself as a “defender of human rights.”

“If a prosecutor defends the rights of the people and the security of society, he should be commended and encouraged. I am proud to have defended security wherever I was as a prosecutor,” Raisi, who was Tehran’s deputy prosecutor in 1988, said.

Rights groups say Raisi was a leading member of what came to be known as the “death committee,” a group of Iranian judiciary and intelligence officials put together by then-Supreme Leader Ruhollah Khomeini to oversee the mass execution of thousands of political prisoners in 1988.

Rights groups estimate that as many as 5,000 people were executed.

Raisi was sanctioned by the US in 2019 for human rights abuses, including the 1988 executions.


Israel condemned on Sunday the election of Ebrahim Raisi as Iranian president, saying his would be a “regime of brutal hangmen” with which world powers should not negotiate a new nuclear deal.

The new Israeli Prime Minister, Naftali Bennett, convening his first televised cabinet session since taking office last week, described Raisi’s ascent as enabled by Iranian Supreme Leader Ali Khamenei rather than by a free and popular vote.

“Raisi’s election is, I would say, the last chance for world powers to wake up before returning to the nuclear agreement, and understand who they are doing business with.”

“A regime of brutal hangmen must never be allowed to have weapons of mass-destruction,” he said. “Israel’s position will not change on this.”

Raisi has never publicly addressed allegations around his role in what Washington and rights groups have called the extrajudicial executions of thousands of political prisoners in 1988.

The former US administration of Donald Trump agreed with Israel and quit the Iran Nuclear Deal.

Current President Joe Biden wants a US return to the deal. Iran denies seeking nuclear weaponry.

So who is this new Iranian President?

Yaghoub Fazeli, Al Arabiya News website

Published: 20 June 2021

Ultraconservative cleric Ebrahim Raisi won Iran’s presidential election with 61.95 percent of the votes in an election that saw the lowest turnout in the history of the Islamic Republic.

The interior ministry announced the result on Saturday, putting voter turnout at 48.8 percent.

The senior judge will leave his current post as head of the judiciary in early August to replace President Hassan Rouhani.

With all serious rivals barred from running by the Guardian Council – an unelected body that answers to the supreme leader only – his victory came as no surprise.

“For Iranians, the contest was yet another indicator of the irreconcilable gap that exists between the state and society in their country,” Behnam Ben Taleblu, an Iran expert and senior fellow at the Foundation for Defense of Democracies, told Al Arabiya English.

Raisi was born in 1960 in the northeastern city of Mashhad into a religious family.

He received a doctorate degree in law and jurisprudence from Mottahari University in Tehran, according to his campaign website.

Raisi has been a key figure in Iran’s judiciary since the early 1980s.

In 1985, Raisi moved to the capital Tehran, where he served as deputy prosecutor.

Other senior positions Raisi served in include deputy chief justice from 2004 until 2014, and attorney-general from 2014 until 2016.

Raisi’s name is tied to Iran’s mass execution of political prisoners in 1988, when he was allegedly a leading member of what came to be known as the “death committee,” a group of Iranian judiciary and intelligence officials put together by then-Supreme Leader Ruhollah Khomeini to oversee the mass execution of thousands of political prisoners at the time.

Most of the victims were leftist activists and members of dissident groups. Rights groups estimate that as many as 5,000 people were executed, while some put the number at 30,000 without offering evidence to support their claim.

Iran has never fully acknowledged the executions, and Raisi himself has never publicly addressed the allegations against him.

In 2019, the United States sanctioned Raisi for human rights abuses, including the 1980s executions.

Rights group Amnesty International said on Saturday Raisi must be investigated for crimes against humanity.

“That Ebrahim Raisi has risen to the presidency instead of being investigated for the crimes against humanity of murder, enforced disappearance and torture, is a grim reminder that impunity reigns supreme in Iran,” Amnesty Secretary General Agnès Callamard said in a statement.

Raisi owes his prominence today to a campaign – seemingly being driven by the highest centers of power in Iran – that has aimed over the past six or so years to portray him as a humble, anti-corruption, and no-nonsense figure.

In 2016, Supreme Leader Ali Khamenei appointed Raisi as the custodian of a multi-billion dollar religious conglomerate encompassing businesses and endowments that oversees the holy Shia shrine of Imam Reza in Mashhad, the home city of both Khamenei and Raisi.

Raisi then ran for president in 2017, losing to Rouhani. But his rise within Iran’s ruling establishment went on uninterrupted. In 2019, Khamenei appointed him head of the judiciary, one of the most senior positions within the Islamic Republic.

What to expect

Iran’s foreign policy is set by the supreme leader, not the president, and is therefore unlikely to undergo major change with Raisi as president.

“Abroad, Raisi is poised to implement Khamenei’s vision. Raisi himself is no visionary, nor does the Iranian president have the power to deviate from a pre-ordained path,” Ben Taleblu said.

The Islamic Republic’s core policies “will largely remain the same” with Raisi in office, Jason Brodsky, a senior Middle East analyst at Iran International TV, told Al Arabiya English.

The United States and Iran have engaged in indirect talks in Vienna for months to revive the 2015 nuclear deal that Washington withdrew from under former President Donald Trump in 2018.

Raisi said during a televised presidential debate earlier this month that he is not opposed to the nuclear deal, and Iran’s top nuclear negotiator said Thursday the presidential election would have no impact on the ongoing negotiations in Vienna.

However, “An Islamic Republic with Raisi at the helm means the mask has come off. Moreover, it means that Iran has less of a compunction about hiding its spots,” Ben Taleblu said.

Now let’s turn to Israel and the new Prime Minister.

— Isabel Kershner, New York Times

Who is Naftali Bennett?

Mr. Bennett, 49 years old, is a former Israeli military commando who later co-founded an antifraud software company and made millions of dollars when it was sold.

He is also a former defense and education minister as well as a former aide to former Prime Minister Netanyahu. Born in Haifa to American parents, he is a fluent English speaker, like his former mentor. Mr. Bennett was a commander in an elite Israeli special forces unit, which Mr. Netanyahu also served in at an earlier time.

Mr. Bennett entered the Israeli Parliament in 2013 as the leader of the Jewish Home party, a religious Zionist party.

He formed Yamina (Ya Mee Nah) in 2018, splitting off from more conservatively religious and even more hawkish Israeli politicians—though he remained in a formal alliance with them until earlier this year. He opposes a Palestinian state and supports annexing parts of the occupied West Bank.

Israel’s Parliament, the Knesset (kuh nes sit), approved a new coalition government by a single-vote margin on Sunday. The vote, 60 to 59 with one abstention, officially ended the longtime reign of Benjamin Netanyahu, the dominant Israeli politician of the past generation, as the Parliament gave its vote of confidence to a precarious coalition government stitched together by widely different anti-Netanyahu forces.

After his supporters cheered the announcement of his election, Naftali Bennett exchanged a brief handshake with Mr. Netanyahu before walking to the lectern at the front of the parliamentary chamber and taking the oath of office as prime minister.

Yair Lapid, a centrist leader, is set to take Mr. Bennett’s place after two years, if their government can hold together that long.

They lead an eight-party alliance ranging from left to right, from secular to religious, that agrees on little but a desire to oust Mr. Netanyahu, the longest-serving leader in the country’s history, and to end Israel’s lengthy political gridlock.

In a speech made before the confidence vote, Mr. Bennett hailed his unlikely coalition as an essential antidote to an intractable stalemate.

“We stopped the train before the abyss,” Mr. Bennett said. “The time has come for different leaders, from all parts of the people, to stop, to stop this madness.”

Before and after the fragile new government was announced, Mr. Netanyahu and his right-wing allies labored hard to break it before it could take office.

They applied intense pressure on right-wing opposition lawmakers, urging them to peel away from their leaders and refuse to support a coalition that includes centrists, leftists and even a small Arab Islamist party.

It was a watershed moment for politics in Israel, where Mr. Netanyahu, 71, had served as prime minister for a total of 15 years, including the last 12 years uninterrupted.

But given Mr. Netanyahu’s record as a shrewd political operator who has defied many previous predictions of his political demise, few Israelis are writing off his career.

Even out of government and standing trial on corruption charges, he remains a formidable force who will likely try to drive wedges between the coalition parties. He remains the leader of the parliamentary opposition and a cagey tactician, with a sizable following and powerful allies.

Israel has held four inconclusive elections in two years and has gone much of that time without a state budget, fueling voters’ disgust with the nation’s politics. Sound familiar, folks?

No one was able to cobble together a Knesset majority after the first two contests, and the third produced an unwieldy right-center coalition that collapsed after months amid recriminations.

The new coalition proposes to set aside some of the toughest issues and focus on rebuilding the economy. But it remains to be seen whether the new government will avoid another gridlock or crumble under its own contradictions.

Some of its factions hope to see movement away from the social policies that favored the ultra-Orthodox minority, whose parties were allied with Mr. Netanyahu. But Mr. Bennett’s party, which has a partly religious base, is wary of alienating the Haredim (Har ay deem), as the ultra-Orthodox are known in Hebrew.

Supporters also hope for a return to a long tradition of Israel cultivating bipartisan support in the United States.

Mr. Netanyahu had grown more aligned with Republicans and was embraced by Donald J. Trump, the former president. It was uncertain where relations would go under President Biden.

Naftali Bennett, as Israel’s new prime minister, has insisted that there must never be a full-fledged Palestinian state and that Israel should annex much of the occupied West Bank.

He leads a precarious coalition that spans Israel’s fractious political spectrum from left to right and includes a small Arab party — much of which opposes his ideas on settlement and annexation. That coalition proposes to paper over its differences on Israeli-Palestinian relations by focusing on domestic matters.

Mr. Bennett has explained his motives for teaming up with such ideological opposites as an act of last resort to end the political impasse that has paralyzed Israel.

“The political crisis in Israel is unprecedented on a global level,” he said in a televised speech on Sunday. “We could end up with fifth, sixth, even 10th elections, dismantling the walls of the country, brick by brick, until our house falls in on us. Or we can stop the madness and take responsibility.”

So there you have it folks.

On one side you have the Iranians solidifying their complete control of the country and being led by tyrants who will kill anyone who opposes them.

On the flip side you have Israel who ousted a powerful leader and replaced him with a shaky coalition of opposing political parties.

I think I know where this is headed. What do you think?

Missouri Reparations?

Recently, President Biden traveled to Tulsa, Oklahoma and gave a speech commemorating the Tulsa Race Riot of 1921.

In his speech he stated, “We can’t just choose to learn what we want to know and not what we should know. We should know the good, the bad, everything. That’s what great nations do: They come to terms with their dark sides. And we’re a great nation.”

On Friday, April 12, 1861, Confederate General P.G.T. Beauregard gave the order to open fire on Fort Sumter, a federal fort situated at the mouth of Charleston Harbor in South Carolina. This engagement triggered the beginning of the Civil War.

This war claimed a higher percentage of the American population than any other war we have ever fought. The problem was, no matter who died, Union or Confederate, an American died.

President Abraham Lincoln immediately called on the states to supply 75,000 troops. The Governor of Missouri received this request, and this was his response.

“Mr. President, your request is illegal, unconstitutional, revolutionary, inhuman, diabolical, and cannot be complied with. Not a man will the state of Missouri furnish to carry out such an unholy crusade!”

So, how did we get to this position? Let me tell you a story.

I’m told that’s what I do best. Tell stories. So, let’s start with this rogue Governor.

Claiborne Fox Jackson was a Democrat in the Missouri House of Representatives prior to being elected Governor. An interesting fact about the Governor is that he had married the daughter of Dr. John Sappington of Fayette. At the time, Jackson was a young man with a successful mercantile business there in Fayette, Missouri.

Unfortunately, Jackson’s wife died of a fever. So, he married the second daughter of Dr. Sappington. Several years later, Jackson’s second wife also died of a fever. So, what did he do? You guessed it. He went back and married Dr. Sappington’s third daughter! Dr. Sappington is said to have stated at the wedding, “Don’t come back again, ’cause you can’t have my wife!”

Jackson was elected Governor in August of 1860 and ran on a platform that stated he would keep Missouri in the Union in the event of Civil War. He won by a margin of 139,000 to 17,500.

So, it was clear where the majority of Missourians stood. Stay in the Union… Or so it appeared.

In the presidential election at the same time, there were 4 candidates.

Abraham Lincoln who was opposed to slavery.

John Bell who wanted to preserve the Union at all costs even if it meant slavery continued in the South.

John Breckenridge who was pro-slavery.

Stephen Douglas who pushed for popular sovereignty (let the people of the state decide).

So, you had clear platforms on which to vote in the presidential election of 1860.

In Missouri, Stephen Douglas (popular sovereignty) took first.

John Bell (save the Union) took second. Breckenridge (pro-slavery) took third and Abraham Lincoln (opposed slavery) came in dead last.

As you can see, it is hard to tell exactly where Missouri stood. We elected a Governor who would keep us in the Union, but we voted for a president who would let the people of the state decide and rejected Lincoln’s anti slavery position.

Here is a key point. Slavery was the trigger to the Civil War, but the true underlying issue was states’ rights. Does the Federal Government have the right to tell a state what it can and cannot do? The Supreme Court had ruled that a slave is property, and the US Constitution guarantees citizens the right to life, liberty, and property.

If an issue is not specifically called out in the US Constitution, the states have jurisdiction. So, does the federal government have the right to tell a state they can’t have slaves (property)?

Don’t get me wrong. Slavery was an abominable practice. However, at that time, a slave was simply property and if the federal government could take your slaves, what would be next? My horse, my cattle?

As soon as Governor Jackson was elected, he stated in his inaugural address, “We owe it to our southern brethren to come to the aid of the South.” In other words, Jackson had tricked the people of Missouri and was a closet secessionist!

He now called on the people of Missouri to meet in Jefferson City and to bring whatever arms they had to form a militia.

By May 3, 1861, 800 members of Jackson’s militia had set up camp in north St. Louis in preparation to take the federal arsenal. However, St. Louis was dominated by a powerful pro-union politician, Frank Blair.

Mr. Blair now called upon the local union military commander, Captain Nathaniel Lyon, and the two men recruited four regiments of union soldiers made up of local St. Louis citizens, primarily of German descent.

On May 10, 1861, Captain Lyon’s 3,000 Union troops marched north and surrounded the 800 Missouri militiamen, forcing them to surrender. The Union forces then marched their prisoners south through the streets of St. Louis, headed for the Gratiot Street prison. The citizens of St. Louis lined the streets, watching as the Missouri militia was forced at bayonet point toward their destination.

This did not sit well with the people of St. Louis. It is one thing to say you will stay in the Union. It is something altogether different when the federal government is taking control of state authority.

As a result, people along the route of the forced march started shouting and throwing stones at the Union forces.

More and more people joined in, and Captain Lyon, fearing he would lose control of his prisoners, gave the order to fire into the crowd. That’s right. Federal troops shooting Missouri citizens!

When the smoke cleared, 15 unarmed St. Louis citizens lay dead, including women and children. This became known as the St. Louis Massacre.

As word spread throughout the state of what had happened, many people who had sat on the fence now sided with the pro-South Governor.

On June 11, 1861, Governor Jackson and Sterling Price traveled to St. Louis to discuss the situation with Captain Lyon and Frank Blair. They met for four hours. Governor Jackson said he would stay in the Union provided all federal troops were removed from the state of Missouri. To Frank Blair this was totally unacceptable, and the discussion turned into a shouting match.

Captain Lyon now declared, “If I have to kill every man, woman, and child in the state of Missouri to keep it in the Union, I will do so!” He then turned to his second in command and said, “This means war. Escort these men out of my lines!”

Think about that. The federal government was telling our governor that he must do what Washington said or they would kill every Missourian!

This being the case, on October 30, 1861, the Missourians under Price and Jackson formally joined the Confederate cause by holding a meeting of the state legislature in Neosho, Missouri. There, they passed a resolution voting in favor of secession and named Jackson Governor of the Confederate State of Missouri. There is still a question today as to whether this was legal, since not all members of the legislature were present for the vote.

Governor Jackson headed south with the politicians and established a Missouri state government in exile in Marshall, Texas.

On August 30, 1861, Union General Frémont had declared martial law in Missouri, making the Union military judge, jury, and executioner throughout the state.

All civil authority was now suspended and handed over to military courts. He also sent 40,000 Union soldiers into the state to hunt down and kill anyone supporting the Confederacy.

This is when the people of Missouri rose up and formed guerrilla bands to fight the federal occupation of our state.

Guess what. Our resistance fighters were so successful that the federal government gave the Union military permission to make war on the citizens of Missouri.

The reasoning: the military couldn’t defeat the guerrillas in the field, so they figured the fighters must be getting their support from the people of Missouri.

We are now under martial law, and the military, as judge, jury, and executioner, has the full backing of the federal government.

In other words, all of your rights under the Constitution have been suspended.

The military now went after the citizens through a series of edicts known as General Orders. Here are a few.

General Order #32, issued by Union General Halleck on December 21, 1861: anyone caught in the act of sabotage will be immediately shot. No quarter, no trial.

General Order #19: “Every able-bodied man capable of bearing arms and subject to military duty is hereby ordered to repair without delay to the nearest military post and report for duty.” (So you couldn’t be neutral. If you didn’t have a Union uniform, you must be a guerrilla.)

Another clause of General Order #19 stated that, to arm the military, Union forces had the right to seize all guns. This offered an excellent excuse for Union forces to enter private homes and take what they wanted.

Shortly thereafter, on August 12, 1862, General Schofield issued General Order #9, which stated that while the Union army was in the field, it could help itself to any supplies it needed from any citizens deemed not loyal to the Union.

Brigadier General Thomas Ewing now became commander of Union forces in the western half of Missouri.

He too decided that the guerrillas couldn’t be defeated as long as the citizens kept helping them, so he went after all the citizens of Missouri. Union, pro-South, or neutral, it didn’t matter.

He started by arresting and imprisoning the wives, moms, and sisters of the guerrillas. They were rounded up and put in makeshift jails in KC.

General Ewing soon realized he didn’t have enough jail space for all of them, so he proposed the removal of all guerrilla families from Missouri.

On August 14, 1863, an old building on Grand Avenue in KC, being used as a prison, collapsed. Inside were the relatives of Quantrill’s men.

On the same day the prison collapsed, General Ewing issued General Order #10, stating: “The wives and children of known guerrillas, and also women who are heads of families and are willfully engaged in aiding guerrillas, will be notified by such officers to remove out of this district and out of the state of Missouri forthwith.”

This was the final straw for Quantrill and his men. They mounted up, headed for Lawrence, Kansas, and conducted the now-famous sack of Lawrence on August 21, 1863.

Roughly 150 of the male citizens of Lawrence were killed, and the town was burned to the ground.

In response, General Thomas Ewing issued General Order #11 on August 25, 1863, which called for:

The forced removal of all Missouri citizens from Jackson, Bates, Cass, and half of Vernon County. Again, you could be the wife of a Union soldier at home with your kids. It didn’t matter; you were a Missourian.

You had 15 days to get out, or you would be shot.

For hundreds of miles, every home, barn, and structure was then burned to the ground and all the fields set afire. For years after the war, these four counties were known as the Burnt District.

General Order #23 had been issued back in December 1862 to implement martial law. It created a Provost Marshal General in St. Louis and district provost marshals throughout the state.

These provost marshals had complete authority to arrest and imprison people at will.

The provost marshals now came up with a system of loyalty oaths. You had to swear an oath of loyalty to the Union and post a $1,000 bond.

Don’t have the money? No problem, we’ll take the deed to your home and hold it as bond.

Now it was simply the provost marshal’s word against yours as to whether you were loyal to the Union, and you could lose everything.

In April 1863, the KC Journal reported that the provost marshals held bonds totaling over $27 million.

If you didn’t take the oath, you were arrested and imprisoned.

If you broke the oath, you were shot.

Remember, no jury trial, no representation.

Some entire towns vanished because everyone was arrested.

In June of 1863, General Schofield issued general orders stating that for every Union soldier killed, $5,000 would be assessed and collected from the people living in the community where the death occurred.

Opposition to the Union cause, by utterance or through the press, was forbidden between 1861 and 1865. Orders were sent out that all newspapers had to be submitted to the military for inspection prior to publication, and all newspaper editors were to take an ironclad oath of loyalty to the US. So much for freedom of the press.

On September 17, 1863, General Schofield issued General Order #96, which basically stated that it was against the law to incite rebellion through published material. Anyone found guilty faced fine and imprisonment, and the paper would be shut down.

So, between 1861 and 1865 in Missouri, we saw our federal government impose martial law, establish military commissions, arrest and imprison people at will, seize their property and their guns, banish people from the state, and eliminate the people’s right to free speech.

President Biden, you say you are in favor of reparations for the descendants of slaves and the victims of the Tulsa Race Riots? While you are doling out money, please include the families of the St. Louis Massacre and the descendants of all Missouri families who suffered under the atrocities of martial law in Missouri between 1861 and 1865.

I will finish with your own quote, Mr. President: “We can’t just choose to learn what we want to know and not what we should know. We should know the good, the bad, everything. That’s what great nations do: They come to terms with their dark sides.”

Look at Missouri history if you are looking for a dark side, Mr. President.

Socialism. Why is it different worldwide?

In theory, socialism and communism sound good, with everyone doing their share and working together to provide for the greater good.

No crime, no poverty, free education, everyone has a job. Everyone has food to eat. Everyone has a place to live.

Each utilizes a planned production schedule to ensure the needs of all community members are met.

In a communist society everything is owned by the working class and everyone works toward the same communal goal. There are no wealthy and poor classes. Instead, all are equal.

Production from the community is distributed based upon need, not by effort or amount of work.

It is expected that basic needs for each worker are met by the community, and there is no more to be obtained through working more than what is required.

For example, if a worker puts in more time at work, he sees no additional reward.

The worker receives the same government benefits and rations of food and clothing as before.

Therefore, this type of economy often results in poor production, mass poverty, and little advancement. This is what happened to the Soviet Union, where by the 1980s poverty had become so widespread that unrest and upheaval ultimately caused that nation to collapse.

Socialism shares similarities with communism, but to a lesser extreme.

As in communism, equality (or, as we hear now, equity) is the main focus.

Instead of the workers owning the facilities and tools for production, workers are paid and allowed to spend their wages as they choose, while the governing body owns and operates the means of production for the benefit of the working class.

Each worker is provided with necessities, so they are able to produce without worrying about their basic needs. Still, advancement and production are limited because there is no incentive to achieve more.

Without motivation to succeed, such as the ability to own an income-producing business, workers’ human instincts tell them to do the minimum. There are no rewards for working harder than your neighbor.

Both communism and socialism are opposites of capitalism, with no private ownership of the means of production and an emphasis on class equality.

In capitalism, reward comes naturally without limitation to workers who work harder than their neighbor.

When there is profit, the owner can freely keep it, and he has no obligation to share his spoils with anyone else.

A capitalist environment facilitates competition, and the result is unlimited advancement opportunity.

In modern society, many countries have adopted pieces of socialism into their economic and political policies.

For example, in the United Kingdom, markets are allowed to fluctuate rather freely, and workers have unlimited earning potential based on their work. However, basic needs like healthcare are provided to everyone regardless of time or effort in their work.

Welfare programs like food stamps in the United States are also forms of socialist policy that fit into an otherwise capitalist society.

So, socialism and communism both involve ceding to the state control over the distribution of goods and services for the masses.

This involves giving up individual rights and giving the state a good measure of control over our personal lives. This road always leads to tyranny, no matter what you pave it with, and no matter what you name it.

The historical fact is that the government in these systems, and their leaders, eventually take control of everything that’s produced—medicine, education, housing, food, transportation, etc.

The government then bureaucratically rations out—as they see fit—the means of human survival. In the end, you’ve basically got an elite corps of people with the power to decide which folks are more equal than others.

Socialism also has a way of producing bloated bureaucracies that in turn produce ever greater poverty. Along the way, this produces even more corruption and cronyism than we see today in our own government.

Censorship becomes the norm because dissent cannot be tolerated or the system would collapse.

The more than 100 million victims of communism/socialism worldwide show just how slippery a slope this is.

Now here is a theory I have.

Socialism/communism is like a religion. Think about it.

How many religions see the Bible as the governing document of their faith?

All can agree on certain teachings of the Bible. Even Mohammed stated that we are all people of “the book.” So did Martin Luther and the Mormons, just to name a few.

Yet somehow, with millions of people studying the teachings of the Bible, we now find ourselves with thousands of religions based on the same book.

When you look at socialism and communism, you can see a similar phenomenon.

Both communism and socialism arose in response to the Industrial Revolution, during which capitalist factory owners grew extremely wealthy by exploiting their workers.

Early in the industrial period, workers toiled under horrendously difficult and unsafe conditions.

They might work 12 or 14 hours per day, six days per week, without meal breaks.

Workers included children as young as six, who were valued because their small hands and nimble fingers could get inside the machinery to repair it or clear blockages.

The factories often were poorly lit and had no ventilation systems, and dangerous or poorly designed machinery all too frequently maimed or killed the workers.

In reaction to these horrible conditions within capitalism, German theorists Karl Marx (1818-1883) and Friedrich Engels (1820-1895) created the alternative economic and political system called communism.

In their books, The Condition of the Working Class in England, The Communist Manifesto, and Das Kapital, Marx and Engels decried the abuse of workers in the capitalist system, and laid out a utopian alternative.

Under communism, none of the “means of production” (factories, land, and so on) are owned by individuals. Instead, the government controls the means of production, and all of the people work together.

The wealth produced is shared out among the people based on their needs, rather than on their contribution to the work.

The result, in theory, is a classless society where everything is public, rather than private, property.

In order to achieve this communist workers’ paradise, the capitalist system must be destroyed through violent revolution.

Marx and Engels believed that industrial workers (the “proletariat”) would rise up around the world and overthrow the middle class (the “bourgeoisie”).

Once the communist system was established, even government would cease to be necessary, as everyone toiled together for the common good.

The theory of socialism, while similar in many ways to communism, is less extreme and more flexible. Communism is simply socialism at the end of a gun barrel.

For example, although government control of the means of production is one possible solution, socialism also allows for workers’ cooperative groups to control a factory or farm together.

Rather than crushing capitalism and overthrowing the middle class, socialist theory allows for the more gradual reform of capitalism through legal and political processes, such as the election of socialists to national office.

Thus, while communism requires the violent overthrow of the established political order, socialism can work within the political structure.

So let’s go back to my theory. If socialists/communists agree with the teachings of the Communist Manifesto, why do we see so many variations of communism/socialism throughout the world?

For the same reason we have Baptists, Catholics, Protestants, Mormons, etc., all using the same book but practicing their faith differently.

Christians look at other Christians and agree on the book, they just don’t agree on the interpretation of what it says. In their eyes, the other religion is “just not doing it right”.

Same thing is true with those who follow the teachings of Marx and Engels.

Socialists/communists look at other socialists/communists and say, “You’re just not doing it right”.

Bear in mind, socialism was adopted as a party platform in France, Spain, Germany, Italy, Russia, Yugoslavia, England, and yes, even the United States in the 20th century.

Let’s look at the history.

The first two years after Stalin’s death reflected the insecurities and lack of unity among the new leaders of the Soviet Union. Nikita Khrushchev, the new leader of the USSR, later noted that Stalin had told his successors, “When I’m gone, the Imperialist powers will wring your necks like chickens.”

As such, the successors were on the defensive from the day they started. Anybody who rose up to threaten them was dealt with severely.

In July of 1953, when the armistice was signed in the Korean War, Moscow agreed to provide financial backing, technicians, and technology for more than 100 construction projects in Communist China. In a display of increasing respect for the Chinese, Khrushchev traveled to Beijing in 1954, the first of many foreign trips he made as party leader.

The most significant turn in Soviet foreign policy that year, however, was regarding Yugoslavia and its leader Tito, whom Stalin had claimed was no better than the United States and its President.

The problem was that Tito was pursuing his own brand of socialism. In May 1955, Khrushchev led a delegation to Belgrade to talk to Tito.

Khrushchev hoped to get Yugoslavia to join the Soviet bloc. In his visit with Tito, Khrushchev agreed that each socialist country had the right to determine its own form of socialism.

Folks this is where it all started.

Khrushchev said, in effect, “Your thoughts on socialism are good, Tito, you’re just not doing it right.”

In a speech to the Party Congress, Khrushchev stated that the break with Yugoslavia in 1948 was Stalin’s fault and a big mistake.

Despite these concessions, Tito remained committed to his own socialist path and had no desire to become just another country in the Soviet bloc.

When Poland and Hungary heard of Khrushchev’s visit with Tito, these countries launched their own efforts to pursue socialism in their own way. They thought maybe Khrushchev “wasn’t doing it right.”

These were unintended consequences of Khrushchev’s attempt to get Tito on board. 

Just as with Yugoslavia and Tito, the Soviet Union and China did not agree on the true teachings of Marx and Lenin. China did not agree with Khrushchev’s continuing criticism of Stalinism.

So now China says, “Russia isn’t doing it right”. 

Mao Zedong wanted Soviet help in modernizing China and developing a nuclear force. Khrushchev was not willing to give China nuclear weapons and was unwilling to risk war with the United States.

In the summer of 1960, Khrushchev ended Soviet economic and technical aid to China and recalled all Soviet experts in the country.

Khrushchev now called Mao “adventurous” and a “racist.” The Chinese referred to Khrushchev as a “buffoon.”

So China now took the lead in spreading its versions of communism/socialism throughout Asia, including Vietnam, Cambodia, and North Korea.

It is interesting to note that the Japanese are mainly given credit as the catalyst for the spread of Communist thought in China, because Japan invaded China on July 7, 1937.

During the years this war was being fought, puppet governments supported by the Communist Party were set up in rural villages.

Peasants supported these governments because not only did they give them a say, but the governments “provided self-defense, education, agricultural cooperation, support for full-time guerrillas, and other needs of the villages.”

Basically, these local institutions taught peasants the meaning of government, especially during times of war.

In addition to teaching government, the mass movements endorsed by the Communist Party sparked “the feeling of belonging and of having a stake in government.”

This was entirely new to the Chinese masses, and it brought with it an exhilarating sense of self-determination.

Although the same philosophy guiding the Russian revolution guided China, differences of opinion existed from the beginning. These grew over time, eventually leading to a major political rift known as the Sino-Soviet split.

The early Communist Party in China adhered closely to Russian political philosophy. However, Mao Zedong, a founding member of the Chinese Communist Party, disagreed with the concept of a workers’ revolution in China.

Reasoning that the majority of the Chinese population were peasants, Mao refocused the goal of Chinese communism toward the concept of a peasant revolution.

China restructured its government and eliminated much of its culture during two movements known as the Great Leap Forward and the Cultural Revolution.

During the Great Leap Forward, which took place from 1958 to 1961, the government took land from peasants, who were then organized into farming cooperatives, a policy that ultimately led to famine.

The Cultural Revolution was a state-sponsored eradication of traditional Chinese culture that took place from 1966 to 1976 and resulted in the destruction of temples and schools as well as the murder of people associated with traditional values.

After Mao’s death, China restructured its government, and shifted to a system known as market socialism, which differed from the USSR in its reliance on a free market. This system is still in place in China, which now has one of the world’s strongest economies.

So, while Communist China did have an immoral leader in Mao Zedong, the Communist Party was able to adapt to the times by putting economic reform before political reform. Ultimately, this historically brilliant move, led by Deng Xiaoping, was arguably what kept the Communist Party in power and will do so for many years to come.

So there you have it folks.

Is the current push we are seeing for socialism in our country simply because our leaders in Washington think the system works, and that all previous leaders who attempted it just “weren’t doing it right”?