Printing Money

A few weeks back, a listener asked that I address what happens if we just print money to pay off the debt we have created fighting the virus.
Great question. Let’s look at history for the answer.
At the end of WWI, England and France were faced with having to pay back the US the billions of dollars they had borrowed during the war.
So the Allies decided that Germany should pay for all the damage caused by the war, and this became a key issue in the peace settlement known as the Treaty of Versailles.
France estimated the amount at 200 billion.
Needless to say, Germany said no way and insisted it shouldn’t have to pay anything.
Eventually they did offer to pay 7 billion.
Germany said that until the allies got off their back and let their economy recover, they wouldn’t be able to pay any war debts.
Part of the problem here is that the US was demanding that France and England pay off their debts; they owed us $10.35 billion.
As such France and England were putting the screws to Germany.
The Treaty of Versailles stipulated that Germany had to pay the Allies $5 billion a year for five years, with a balloon payment to be determined later.
This proved devastating to the German economy.
The Germans now offered to pay 15 billion but again the Allies refused the offer and said it was an insult.
The French now sent troops into the Ruhr valley in 1923 to force the Germans to work and hand over all profits from their businesses.
Germany now resorted to passive resistance and told the workers to simply stay home.
German inflation now went through the roof. The only way the Germans could pay their war debts was to simply print money.
Now here is my concern. We just spent $4 trillion on the COVID-19 bailout. We don’t have that kind of money. So what can we do?
I am afraid we will follow in Germany’s footsteps and simply print more money. This could trigger an economic disaster. Don’t believe me? Let’s look at what happened to Germany.
Printing money following WWI devalued the German mark and triggered hyperinflation.
Before the war, the German mark traded at about 4.2 to the dollar (roughly like our quarter).
By the end of 1922 it was at 7,000 to the dollar.
When the French occupied the Ruhr in 1923 and the workers went on strike, the exchange rate collapsed to more than one trillion marks to the dollar!
Let me put it in simple terms.
In 1914, before World War I, a loaf of bread in Germany cost 13 cents. Two years later it was 19 cents, and by 1919, after the war, that same loaf was 26 cents – doubling the prewar price in five years.
Now the German government started printing money to pay its war debt. A German loaf of bread now jumped to $1.20. By mid-1922, it was $3.50. Just six months later, a loaf cost $700, and by the spring of 1923 it was $1,200.
By September 1923, it cost $2 million to buy a loaf of bread. One month later, it cost $670 million, and the month after that $3 billion. Within weeks it was $100 billion, at which point the German mark completely collapsed.
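To get a feel for how fast that compounding runs away, here is a minimal sketch in Python that works out the jump from one quoted price to the next. It uses only the bread-price figures quoted above, which are illustrative round numbers, not precise historical data.

```python
# Bread prices as quoted above (illustrative figures, not exact historical data).
prices = [
    ("spring 1923", 1_200),
    ("September 1923", 2_000_000),
    ("October 1923", 670_000_000),
    ("November 1923", 3_000_000_000),
    ("weeks later", 100_000_000_000),
]

# Print how many times the price multiplied between each pair of dates.
for (prev_when, prev_price), (when, price) in zip(prices, prices[1:]):
    print(f"{prev_when} -> {when}: price multiplied by about {price / prev_price:,.0f}x")
```

Even with rough numbers like these, the point is obvious: each jump is bigger than the last, which is exactly what runaway money printing produces.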
The whole time the German government kept printing more money, so much so that people burned it in their fireplaces because it was cheaper than wood.

The only thing that saved Germany was a program developed by an American banker named Charles Dawes. The US could see that we would never get our money until Germany could recover and start paying its debts to France and England. So we stepped in, told France and England to back off Germany, and then helped rebuild Germany following WWI.
The plan was proposed by the Dawes Committee on April 9, 1924, and accepted by the Allied and German Governments on August 30, 1924.
Knowing what happens when you simply print money to pay your debts, the US had developed a system to control runaway inflation.
The Federal Reserve System.
The Progressives in Congress, under President Woodrow Wilson, passed the Federal Reserve Act on December 23rd, 1913.
1. They set up 12 regional banking districts, each with a Federal Reserve bank.
2. The Federal Reserve banks were owned by the member banks of the Federal Reserve System (all national banks were required to join).
3. Member banks had to subscribe 6% of their capital to the Federal Reserve bank in their region.
4. Federal Reserve banks would then use this capital to back Federal Reserve notes (dollars; take a look at one, it says “Federal Reserve Note”).
Now this is over simplified, but here is how the system works.
The government prints money and distributes it through the Federal Reserve banks. The government basically loans the money to the Federal Reserve banks and charges them interest, let’s say 2%. This is what is referred to as the prime rate.
The Federal Reserve bank then has cash to loan to the member banks (your local banks). The Federal Reserve bank tacks on its percentage, let’s say another 2%. We are now at 4%.
So now you want to buy a car. You go to the local bank to get a loan. Your local bank needs to make a profit, so they tack on another 2%. So you get a loan and pay 6%.
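If it helps to see that arithmetic laid out, here is a minimal sketch in Python of the markup chain just described. The 2% figures are the illustrative numbers from above, not actual Federal Reserve rates, and the whole thing is the same oversimplification as the prose.

```python
# A toy model of the markup chain described above (illustrative rates only).
def retail_rate(base_rate, markups):
    """Stack each lender's markup on top of the base rate."""
    rate = base_rate
    for markup in markups:
        rate += markup
    return rate

base = 0.02              # what the government charges the Federal Reserve bank (illustrative)
markups = [0.02, 0.02]   # the Federal Reserve bank's cut, then your local bank's cut

print(f"Rate on your car loan: {retail_rate(base, markups):.0%}")  # prints 6%
```

Each layer simply adds its cut, which is why a change at the top of the chain shows up in the rate you pay at the bottom.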
Now, I am not an economist, and like I said, this is an oversimplification, but it basically shows how the feds control the economy.
If the interest rate is low, more people buy stuff. Cars, appliances, houses, etc. Businesses now raise their prices because people have money to spend. In other words, inflation. This is how the feds can pump money into the economy.
If interest rates are high, people tighten their belts and don’t buy new cars, appliances, and houses. Businesses now have to lower their prices to get people to buy their stuff. Deflation.
Now bear in mind, you took out a loan for that new car when interest rates were low. Now they are high, but you are still paying on that five-year loan.
Where does that money go? To your local bank, which then has to pay back the Federal Reserve bank. This is how the feds can pull money out of the system.
See how this works? By setting the prime rate, the Federal Reserve can control the economy and basically decide whether to make the prices of everything you buy higher or lower. Money in, money out.
So who runs this mess? The Federal Reserve Board.
Originally, it was made up of the Secretary of the Treasury, the Comptroller of the Currency, and five members appointed by the President; today it is a seven-member Board of Governors, all appointed by the President.
Bet you will pay attention next time they talk about the prime rate.

Florence Nightingale

History.com Editors
Florence Nightingale (1820-1910), known as “The Lady With the Lamp,” was a British nurse and the founder of modern nursing.
Her experiences as a nurse during the Crimean War were the source of her views about sanitation, and her efforts to reform healthcare greatly influenced the quality of care in the 19th and 20th centuries.
She was born on May 12, 1820, in Florence, Italy to Frances Nightingale and William Shore Nightingale. She was the younger of two children.
Nightingale’s affluent British family belonged to elite social circles. Her mother, Frances, was from a family of merchants and took pride in socializing with people of prominent social standing.
Despite her mother’s interest in social climbing, Florence herself was reportedly awkward in social situations. She preferred to avoid being the center of attention whenever possible.
She was however, strong-willed and often butted heads with her mother, whom she viewed as overly controlling.
Florence’s father was William Shore Nightingale, a wealthy landowner who had inherited two estates—one at Lea Hurst, Derbyshire, and the other in Hampshire, Embley Park—when Florence was five years old.
Florence was raised on the family estate at Lea Hurst, where her father provided her with a classical education, including studies in German, French and Italian.
From a very young age, Florence Nightingale was active in helping others, ministering to the ill and poor people in the village neighboring her family’s estate.
By the time she was 16 years old, it was clear to her that nursing was her calling. She believed it to be her divine purpose.
When Nightingale approached her parents and told them about her ambitions to become a nurse, they were not pleased.
In fact, her parents forbade her to pursue nursing. During the Victorian Era, a young lady of Nightingale’s social stature was expected to marry a man of means—not take up a job that was viewed as lowly menial labor by the upper social classes.
When Nightingale was 17 years old, she refused a marriage proposal from a “suitable” gentleman.
Determined to pursue her true calling despite her parents’ objections, in 1844, Nightingale enrolled as a nursing student at the Lutheran Hospital of Pastor Fliedner in Kaiserwerth, Germany.
In the early 1850s, Nightingale returned to London, where she took a nursing job in a Middlesex hospital for ailing governesses. Her performance there so impressed her employer that Nightingale was promoted to superintendent within just a year of being hired.
The position proved challenging as Nightingale grappled with a cholera outbreak and unsanitary conditions conducive to the rapid spread of the disease.
Nightingale made it her mission to improve hygiene practices, significantly lowering the death rate at the hospital in the process. The hard work took a toll on her health. She had just barely recovered when the biggest challenge of her nursing career presented itself.
In October of 1853, the Crimean War broke out. The British Empire was at war against the Russian Empire for control of the Ottoman Empire (The Turks). Thousands of British soldiers were sent to the Black Sea, where supplies quickly dwindled. By 1854, 18,000 soldiers had been admitted into military hospitals.
At the time, there were no female nurses stationed at hospitals in the Crimea. The poor reputation of past female nurses had led the war office to avoid hiring more.

But, after the Battle of Alma, England was in an uproar about the neglect of their ill and injured soldiers, who not only lacked sufficient medical attention due to hospitals being horribly understaffed, but also languished in appallingly unsanitary and inhumane conditions.
In late 1854, Nightingale received a letter from Secretary of War Sidney Herbert, asking her to organize a corps of nurses to tend to the sick and fallen soldiers in the Crimea.
She quickly assembled a team of 34 nurses from a variety of religious orders and sailed with them to the Crimea just a few days later.
Although they had been warned of the terrible conditions there, nothing could have prepared Nightingale and her nurses for what they saw when they arrived at Scutari, the British base hospital in Constantinople.
The hospital sat on top of a large cesspool, which contaminated the water and the hospital building itself. Patients lay in their own excrement on stretchers strewn throughout the hallways. Rodents and bugs scurried past them.
The most basic supplies, such as bandages and soap, grew increasingly scarce as the number of ill and wounded steadily increased. Even water needed to be rationed. More soldiers were dying from infectious diseases like typhoid and cholera than from injuries incurred in battle.
So Nightingale quickly set to work. She procured hundreds of scrub brushes and asked the least ill patients to scrub the inside of the hospital from floor to ceiling.
Nightingale herself spent every waking minute caring for the soldiers. In the evenings she moved through the dark hallways carrying a lamp while making her rounds, ministering to patient after patient. The soldiers, who were both moved and comforted by her endless supply of compassion, took to calling her “the Lady with the Lamp.” Others simply called her “the Angel of the Crimea.” Her work reduced the hospital’s death rate by two-thirds.
In addition to vastly improving the sanitary conditions of the hospital, Nightingale created a number of patient services that contributed to improving the quality of their hospital stay.
She instituted the creation of an “invalid’s kitchen” where decent food was cooked for patients with special dietary requirements.
She established a laundry so that patients would have clean linens. She also instituted a classroom and a library for patients’ intellectual stimulation and entertainment.
Based on her observations in the Crimea, Nightingale wrote Notes on Matters Affecting the Health, Efficiency and Hospital Administration of the British Army, an 830-page report analyzing her experience and proposing reforms for other military hospitals operating under poor conditions.
The book sparked a total restructuring of the War Office’s administrative department, including the establishment of a Royal Commission for the Health of the Army in 1857.
Nightingale remained at Scutari for a year and a half. She left in the summer of 1856, once the Crimean conflict was over, and returned to her childhood home at Lea Hurst.
To her surprise she was met with a hero’s welcome, which the humble nurse did her best to avoid. The previous year, Queen Victoria had rewarded Nightingale’s work by presenting her with an engraved brooch that came to be known as the “Nightingale Jewel” and by granting her a prize of $250,000 from the British government.
Nightingale decided to use the money to further her cause. In 1860, she funded the establishment of St. Thomas’ Hospital, and within it, the Nightingale Training School for Nurses. Nightingale became a figure of public admiration. Young women aspired to be like her.
Eager to follow her example, even women from the wealthy upper classes started enrolling at the training school. Thanks to Nightingale, nursing was no longer frowned upon by the upper classes; it had, in fact, come to be viewed as an honorable vocation.
Unfortunately, while at Scutari, Nightingale had contracted “Crimean fever” and would never fully recover.
By the time she was 38 years old, she was homebound and bedridden, and would be so for the remainder of her life.
However, fiercely determined and dedicated as ever to improving health care and alleviating patients’ suffering, Nightingale continued her work from her bed.
She remained an authority and advocate of health care reform, interviewing politicians and welcoming distinguished visitors from her bed.
In 1859, she published Notes on Hospitals, which focused on how to properly run civilian hospitals.
Throughout the U.S. Civil War, she was frequently consulted about how to best manage field hospitals. Nightingale also served as an authority on public sanitation issues in India for both the military and civilians, although she had never been to India herself.
In 1908, at the age of 88, she was awarded the Order of Merit by King Edward. In May of 1910, she received a congratulatory message from King George on her 90th birthday.
In August 1910, Florence Nightingale fell ill, but seemed to recover and was reportedly in good spirits. A week later, on the evening of Friday, August 12, 1910, she developed an array of troubling symptoms and died unexpectedly at 2 p.m. the following day, Saturday, August 13, 1910, at her home in London.
One of the most famous figures in medical history, Florence Nightingale made groundbreaking achievements in hygiene, sanitation, and the nursing profession that helped revolutionize modern medicine.

Online or In Class?

Today I chose a topic dear to my heart, education.
I have a grave concern that one of the impacts of the coronavirus will be the acceleration of a trend that I feel is destroying our system of education, not only here in the US, but worldwide.
I am referring to online education.
My position is not a popular one, but I hope you will take the time to hear me out as to why I feel classes taught in a classroom by a real live teacher are hands down a much better alternative than the online approach exploding in our primary and secondary school systems as well as in our colleges and universities.
I found an interesting article by an educator in India named Sudhanshu Sinhal who agrees with my position, and he provides some valid points in support of traditional classroom teaching.
1. Promotes collaborative learning
Basically, the classroom environment is essential to promote and stimulate collaborative learning. Collaborative learning increases a student’s self-awareness about how other students learn and enables them to learn more easily and effectively, transforming them into keen learners inside and outside the classroom.
2. Enhances critical thinking skills
It enhances students’ critical thinking skills. Teaching in a classroom gives students the opportunity to engage in live discussions in which they are forced to use their critical thinking skills to formulate opinions or arguments.
3. Improves social skills
Inside a classroom, students experience social interactions with peers and establish rapport with teachers. Helping children develop socially is an important aspect within the realm of their academic education.

4. Builds organizational skills
Classroom teaching teaches students how to develop organizational skills, beginning with the basics, such as arriving to school on time. In a live classroom, students are held accountable for being prepared to do school work, which includes having done their homework the night before, being ready for pop quizzes, turning in assignments by their due date and being prepared for in-class discussions. In effect, students learn how to organize their time, prioritize their assignments and get their homework done.
5. Keeps students stimulated
The physical presence of a teacher keeps students stimulated through interactive and interesting activities. This enables students to retain more from what they have learned during a session.
6. Teaching style can be modified according to the student’s issues
Teachers can modify their teaching style based on the types of learners in their classroom, e.g., classroom activities can help visual learners, interactions can help auditory learners, etc. Teachers can get a clear idea of whether students are following what has been taught or whether they require further explanation. At the same time, students can get their doubts clarified immediately before moving ahead in a topic.
7. Develops important personality and career building skills
Classroom teaching develops conflict-resolution skills and presentation skills, as students learn to present their ideas confidently in front of peers; it also develops team spirit and teaches them to get along with those from different cultural backgrounds. Such experiences are valuable in shaping students’ communication and listening skills, as well as growing and maturing emotionally.
These points show that although India is moving towards online education in large numbers like the rest of the world, classroom teaching has certain plus points that online teaching simply cannot replace.

In my research I also found a great article written by a Mr. Seth Hughes on the website Owlcation.
Owlcation is a great site created by educators and experts on topics related to education.
In the article, Mr. Hughes states that online courses are becoming more and more popular. The ability to relax at home and use our own personal computers to obtain college credit is taking more and more students out of classrooms and putting them online.
Mr. Hughes seems to agree with my position as well and makes several good points in support of traditional classroom learning over online courses.
Lack of Interpersonal Skill Development
Online courses usually require no face to face interaction with classmates or teachers. Information on assignments is posted online and may be completed at leisure without having to attend class meetings.
While the leisure of this concept is nice, it takes away from the interpersonal skills that students need to acquire along with their education.
When in a classroom, students may be required to speak their mind. They may be required to give presentations or speeches. They will have to work in groups with all kinds of people with many differing viewpoints. Online courses require none of that.
Businesses often tell university faculty that they wish graduating students had more interpersonal skills. They say it is crucial to success in their careers. Traditional style learning teaches these things.
Now obviously if businesses are telling universities they wish these skills were more evident, there is room for improvement. Still, online courses are not the answer. If anything, online courses will only hurt a student’s ability to speak and interact with others in a way that will help them in their life and career.
When students are required to interact with classmates and professors, they gain confidence in their ability to speak and interact with others. It grants them the opportunity to learn how to carry themselves in a professional manner. Because online learning can’t do this, its value to a student is significantly lower.
Lack of Memory and Learning Development
Why do many students sign up for online courses? Well, one reason is of course the fact that they do not have to attend an actual class and can learn at home.
A more serious, unspoken reason however may be the fact that online learning does not require the student to study or memorize material in the way that traditional learning does.
Students taking a test or quiz online do not have to worry about a professor catching them cheating. Students have the ability to use a book and quickly look up answers online.
Even while many tests are timed and many professors don’t mind the use of a book on certain things, is this really the way a student should learn?
When someone does not have to study and memorize material, it does not embed in his or her long term memory the way it does when they do study it. This is a serious flaw with online courses. It does not promote memory development.
Students may not realize it while enrolled in a difficult course, but they will better appreciate the education they receive in the classroom if they are required to put in more effort in obtaining it.
A child does not learn how to spell by looking up words in a dictionary; he memorizes the words using flashcards or something similar. This is what enables him to actually learn.

Lack of Student Motivation
One problem with online classes is that all too often, they motivate us to get a degree, but not to learn.
Having debates and in class discussions with multiple professors who all have unique personalities motivates students to develop their own opinions. It motivates them to want to voice their opinions. If a student is scared to voice his or her opinion, the classroom is the place he will practice overcoming that fear, not online.
When students receive face to face verbal feedback and constructive criticism from their professors, it instills in them a motivation to not want to let their professors down. It is the relationships and bonds that are formed that give face to face learning an advantage over online learning.
Motivation is a skill that cannot be developed when students are allowed to complete tasks at their own leisure. They may get the work done, but this does not teach a student how to complete a task under the pressure of time.
During their careers, they will need to complete tasks in a set amount of time, whenever they are told to. If a professor assigns a paper at the start of class and requires it be handed in at the end, that is good preparation for a career. Reading about a similar assignment online that is due in six days doesn’t teach this.

A recent seminar of 85 companies was held in Ireland. Business leaders from these companies were asked which competencies they most wanted to see from graduates.
The two answers with the highest percentages were teamwork and communication.
Key skills you learn in the classroom.
Finally, and most important, the greatest argument I can make for traditional classroom learning is academic freedom.
Prior to the advent of online courses, teachers had tremendous academic freedom.
They chose the text to be used, wrote the lesson plans, and developed the testing methodology.
This being the case, it was fairly easy for school administrators to tell good teachers from bad: who had mastered their subject material and who had not, and who was capable of delivering the lesson plans, engaging the students on a personal level, and evaluating their progress.
Not so with online learning.
First comes the issue of choosing a text. As I said, it used to be up to the instructor. Not so with online learning. That decision is now made by the administration, in many cases without any input from the instructors.
Think about that. One of the last courses I taught was American History I. The required text started off by saying our country was founded by white, racist, slave holding imperialists.
Back when instructors had a say, they had the option of choosing another text. Not so with online learning.
The classes are all standardized and what the lesson plan says is what the instructor must teach.
In essence, the teachers have now become nothing more than facilitators when it comes to online learning.
Bear in mind, what is taught is what is tested. So regardless of what the instructor thinks, the students must learn what is presented online in the lesson plan.
Now here is the rub. What if the instructor decides he/she doesn’t agree with what is being taught or the way it is being presented?
They can make their feelings known to the administration, but in nearly every case the instructor is told, this is the way it will be done or you can look for a job elsewhere.
This is a huge problem. Book publishers have now jumped on board with the online learning craze and now cut huge contracts with schools and universities.
These contracts obligate the schools to use certain publishers exclusively. So, if the school wants to push a certain agenda, such as the benefits of socialism, climate change, revisionist history, etc, they work with the publisher to develop a text and online courses to address those issues.
Again, in most cases, faculty have little or no say in course development and once the course is rolled out, it becomes the standard that all students and instructors must follow.
Let me ask you listeners. Did you have a favorite teacher in grade school, high school, or college? What made them your favorite? I will bet you that what made them great can’t be found online.
How many of you talked to other students about the teachers and found out who were the good ones and who to avoid? That choice as well is gone with online learning.
Still don’t believe me?
I found one final article that supports what I just told you. It was written by Jonathan Rees, Professor of History at Colorado State University — Pueblo. He is co-author of Education Is Not an App: The Political Economy of University Teaching in the Internet Age.
As Professor Rees points out, no single individual can produce a filmed class for tens of thousands of people all by themselves. As a result, people other than the professor whose name is on the course end up having a disproportionate impact on how faculty operate their classes — certainly much more than they might in ordinary face-to-face classes.
For example, Karen Head, an assistant professor at Georgia Tech, wrote about her online teaching experience for the Chronicle of Higher Education. At one point she noted, “Even with our team of 19, we still needed several other people to provide support.” This kind of intervention would simply not be economical or feasible if the instruction wasn’t occurring online in front of tens of thousands of students.
In ordinary face-to-face classes, an instructor can safely dismiss the advice of instructional designers and other consultants. When tens of thousands of dollars are needed just to get your course off the ground, faculty have to give up some control almost by definition.
Professors who resist this kind of interference with their instructional prerogatives risk having their courses get outflanked by online equivalents.
This has already happened at the University of Oklahoma where an online US history course that is co-branded with the History Channel offers the same credit available to people who take a similar course in Norman with living, breathing faculty members.
Unfortunately, as Jennifer Davis, a tenured professor in the History Department there, reported in the comments on the Chronicle blog post announcing this program, “This course was created with zero input from our department. From the little information I have gleaned from press reports, it fails to meet the basic requirements for a general education course.
There are no exams, very little reading, and a total [of] 6–10 pages in writing assignments.” Lose your students to these kinds of online programs, and your university may eventually have little use for you.
There are some things that need to remain the same as time goes on and societies evolve. Education is one of them. While there is an argument that online learning allows people to learn at their own pace, it is still not as valuable as the education obtained from a traditional style classroom.
My good friend and mentor, Dr. John Keeney once told me as we discussed the push for online learning, “The traditional system of students in a classroom with a teacher has worked for 2000 years. Why would we want to change that?”
Keep in mind, this system is only true if the teachers and professors are good at what they do and if the student is willing to learn. If not, then regardless of the method, nothing will be learned.
It just does not make sense to pay for online education when you gain more skill sets and knowledge with traditional style learning (which is typically less expensive).
Now, many people cannot attend regular class meetings and therefore must utilize online learning. This is the case much of the time with graduate degrees. But for the person who is able to choose online or traditional education courses, the decision should be clear.
A student must ask himself, “Am I going to college to develop life skills and learn, or to get a diploma?”
Unfortunately, with the current pressure on students to obtain a degree in order to get a job, many students are forced to put the diploma in front of the education.

Colonial Disease

Last week I was talking to my 97-year-old mother about our current crisis in dealing with the coronavirus.
She pointed out that her generation had dealt with terrible outbreaks of disease and yet they somehow managed to survive.
Think about it. They survived polio, tuberculosis, scarlet fever, and many other terrible diseases in their day, and did not have the medical expertise or facilities we now have available.
In my research I found a fascinating site. The Tully (NY) Area Historical Society News & Databases. So today I thought I would share what I found.
For many of us deadly diseases such as typhoid, smallpox, cholera, yellow fever, measles, and polio are all diseases of the distant past. They have been eradicated or vaccinations have been found.
However, our ancestors all lived with the threat of disease.
Disease was common in the large cities of Europe, from where most early colonizers came.
The Black Death, or Bubonic Plague, of Medieval Europe was still in recent memory for most immigrants. It was just accepted that disease and premature death was part of life.
Europeans brought many Old World diseases like smallpox, influenza, and measles to the New World. Many of these settlers had survived outbreaks in their homelands and developed immunities.
However, the native American Indians had no previous exposure, and diseases decimated their populations. With their numbers weakened, Native Americans were unable to stop European immigrants from settling on their lands. With Native American and European troops marching over colonial America, diseases traveled swiftly from colony to colony, country to country.
It was disease, not the rifle that wiped out the American Indian population.

So let’s look at a few of the diseases that wreaked havoc not only on the Native Americans, but on all of our forefathers who traveled to America.
The first disease that we will examine is smallpox. The disease was called smallpox because its victims often had pox marks on the face or body.
In her book, Pox Americana, Elizabeth Fenn describes the 32-day timeline of the smallpox virus. She calls it a parasitic virus that consumes its host.
Introduced by either touch or inhalation, smallpox usually incubates in its new victim for 10-12 days before the first symptoms present themselves.
At first, the victim may feel as if they have the flu with symptoms such as backache, headache, fever, and nausea. At this stage, the victim can contaminate others, yet not even know they are seriously infected.
On about the 15th day a rash and eruptions appear on the skin. They eventually burst, then scab over. Sometimes victims developed internal sores that would cause them to bleed to death from various body openings, such as the eyes and ears. The entire process is full of immense pain and suffering.
By day 30, if the victim is still alive, he or she is no longer contagious, though often scarred for life. The very young and the very old were the largest group of victims.
In one study that Fenn recorded, there was a 29% mortality rate for children under one year of age, 8% for children 5-14 years, and 32% for adults over 45 years.
The mortality rate was highest for pregnant women: 96%. Surprisingly enough, smallpox could survive for long periods outside the human body. A blanket used on a victim, unwashed and stored away for years, could still contain an active virus. You can see why smallpox was so feared in North American colonies.
Large cities usually quarantined ships with known carriers of disease. Eventually, quarantine laws were enacted. Sometimes a house was chosen outside of town in which infected individuals lived out their quarantine.
Whole communities were often quarantined. As Fenn points out, these tactics sometimes worked. For example, in 1721, 900 of the 10,700 citizens of Boston fled to the countryside to escape the smallpox. The problem with such isolation methods was that those who fled the city often carried the disease into the countryside with them, thus spreading, rather than containing, the disease.
Fenn gives a conservative estimate of 130,658 deaths from smallpox during the years 1775-82. And not only were many lives lost, but schools, businesses, and governments all virtually shut down during smallpox epidemics. Life came to a complete and deathly halt during these Colonial epidemics.
Eventually Dr. Edward Jenner developed a vaccine for smallpox in 1796. Today the World Health Organization considers the disease extinct.
Small pox is considered by many to be the most horrifying epidemic of colonial times because of the terrible suffering it inflicted, the high mortality rates, and disfigurement it left behind.
However, John Duffy claims, in his book Epidemics in Colonial America, that respiratory diseases weakened and eventually killed more colonists than smallpox.
Also, the endemic diseases of malaria and dysentery were common in colonial times. These diseases did not always kill people; often they were weakened enough that a mild outbreak of influenza or measles would finish them off.
Children and older adults were often the first victims in any epidemic. Along with age, poor diet and hygiene also contributed to the high mortality rates in colonial North America.
Now on to the next major diseases.
A disease is considered what they call endemic if it is localized and recurs year after year.
Malaria and dysentery were endemic for the entire colonial period.
Malaria was also known as “ague”, “quartan ague”, “tertian ague”, and “the Kentish disorder”.
Passed to humans by mosquitoes, the parasites that cause malaria lead to chills and fever, vomiting, and other flu-like symptoms. Eventually, victims either die because their red blood cells are destroyed and anemia results, or because the capillaries leading to the brain or vital organs become clogged. Either way, malaria was a horrible disease.
Specific colonies (New Jersey, Pennsylvania, New York, and Delaware) were more prone to malaria because of their climate and the resident mosquitoes.
The mosquito’s role in transmitting the disease was not discovered until the twentieth century. Colonists believed in the miasmic theory: that breathing air near stagnant water made one sick. The term malaria itself comes from the Italian words mala and aria, which mean “bad air”.
Malaria was so common that outbreaks often were not recorded. Indeed, Duffy says that for the colonists, “the spring and fall flare ups of malaria were as inevitable as the seasons themselves”.
In his opinion, malaria was “directly or indirectly one of the most fatal of colonial diseases”. Colonists who survived malaria often developed some immunity and resistance, but often they were physically weakened and more susceptible to other diseases or another malaria outbreak.
Newly arrived immigrants were often among the fatalities from the diseases. Indeed, Duffy claims that malaria was more widespread and affected more people than small pox or yellow fever and calls it a “major hurdle in the development of the American colonies”.

Charleston, which was often hard-hit with other diseases, was never really bothered by malaria. Duffy attributes this to the city’s close proximity to salt water, an inhospitable climate for mosquitoes.
New England, while troubled by malaria outbreaks during the early colonial days, had almost eradicated the disease by the American Revolution. However, outbreaks became more severe in the other colonies in the late 1700s.

Another terrible disease to afflict our forefathers was dysentery.
Like malaria, dysentery had devastating effects on colonists. Also known as “camp fever” and “camp disorder,” the disease was spread by bacteria or parasitic worms through feces-contaminated water and food, flies, or the handling of food in unsanitary conditions.
A soldiers’ camp or immigrant ship would be just such a place for dysentery to start.
Those with the disease usually died from dehydration because of severe diarrhea. Children were especially susceptible to the disease.
While today we have a better understanding of the importance of cleanliness and greater availability of antibiotics, in colonial times people just had to endure and let the disease run its course.
Some people suffered for long periods before dying while others suffered for short amounts of time; there was no timeline of infection and recovery with dysentery like with small pox.
Unfortunately, like malaria, dysentery often left its survivors sufficiently weakened to be susceptible to other diseases.
Yellow Fever
John Duffy calls it a “strange and unaccountable pestilence that brought death in a horrible fashion to its victims”. Known also as “Bilious Plague” and “Black Vomit,” it was transmitted by mosquitoes and often arrived from the West Indies.
The mosquitoes are thought to have survived in buckets or containers of stagnant water on board ship, and once they hit port cities with temperate climates, like Charleston, yellow fever quickly spread.
The virus transmitted by the mosquitoes had a quick onset and left its victims tired, feverish, and jaundiced, and they often hemorrhaged.
Duffy says it “was one of the most dreaded diseases in the affected regions [and had it] adapted itself to all areas in the British American colonies, yellow fever would undoubtedly have ranked with small pox as a leading cause of death”.
Duffy states that between 1760 and 1793, yellow fever disappeared from the colonies. He credits this to stronger and better-enforced quarantine laws.
Diphtheria and Scarlet Fever
In a time when almost half the children under age five died, diphtheria and scarlet fever were major causes of childhood death. The symptoms of both diseases were so similar that they were not differentiated until the late 1800s.
Sore throats, major symptoms for both diphtheria and scarlet fever, were common and often unattributable to a specific disease. Duffy believes diphtheria was present in the early colonial days but “it did not attract the attention of the medical profession until the occurrence of a huge outbreak in New England during 1735-36”.
Diphtheria is caused by bacteria; a membrane would form in the throat, swell, and often lead to suffocation.
The bacterium was contagious and “transmitted from person to person by droplet infection produced from respiratory secretions”. Until a vaccination was developed in the 1920s, the only solution was a tracheotomy that would bypass the membrane and allow the victim to breathe freely.
Families often lost half and sometimes all of their children to the disease.
In the epidemic of 1735-36, there were over 1,000 deaths, and 90% of the deaths in New Hampshire were children under the age of 10.
By 1740, over 1,000 people had died in Connecticut alone. Such statistics led Noah Webster to call diphtheria “the plague among children”.
Scarlet Fever
Like diphtheria, many of this disease’s victims were small children. Spread by close contact, streptococcus leads to a high fever, vomiting, sore throat, enlarged tonsils, and a red or “scarlet” rash.
In severe cases, the throat enlarges and the victim suffocates, similarly to diphtheria. Even though Duffy believes scarlet fever was first identified in 1675, it was often confused with smallpox, measles, and diphtheria.
It was called a “frontier disease” because it did not start in large urban centers like other diseases.
Typhus
While there are a variety of typhus fevers, the most common and fatal in colonial times was caused by Rickettsia prowazekii.
Also known as “jail fever”, “prison fever”, “military fever”, “hospital fever”, “camp fever”, “putrid fever”, “ship fever”, and “spotted fever,” it was transmitted through the feces of body lice.
Duffy tells us typhus was more common in the filthy, poor, and overcrowded urban European communities than in the American colonies “where land was cheap and the economy predominantly agricultural”.
Therefore, the disease was never endemic and only reached epidemic proportions either when large immigrant groups arrived in port from Europe or during wartime. There were outbreaks of the disease during battles of the American Revolutionary War. Besides these few outbreaks, typhus was rarely present in colonial America.
When typhus first presents itself, the symptoms are similar to influenza and include headache, fever, chills, and weakening of the limbs. Eventually a rash develops and covers the body.
Victims often become “deranged with fever…and death becomes a welcome release”. The best prevention for typhus is adequate sanitation. Pesticides are also used to kill the lice and prevent the spread of the disease.
Now the one we are probably most familiar with is Typhoid Fever.
Caused by poor sanitation, infected water, overcrowding, and war, typhoid fever was transmitted by the bacterium Salmonella typhi in feces, vomit, and urine.
Experts were not sure when typhoid fever first appeared in the colonies, but the “burning fevers” of the Jamestown settlers in 1607 may have been typhoid fever.
Typhoid was “one of the first epidemics to occur in the United States [and] took place in Virginia between 1607-1624. Approximately 85% of the arrivals in James River died from disease which began as a typhoid epidemic”.
Typhoid was also known as “slow fever”, “nervous fever”, “continued fever”, “burning fever”, “long fever”, and “bilious fever”.
Some of the early symptoms of typhoid were fever, overall pain, loss of appetite, headache, muscle pain, cough, and restlessness. Later, delirium and severe constipation and dehydration occur, which can lead to cardiac arrest and death. The disease runs its course in one to three weeks.
While the epidemics of typhoid in colonial times were not as frequent as smallpox and yellow fever, Duffy tells us typhoid has a “high rank among destructive sicknesses…[due to] the increasing extent of the sickness and the number of deaths” from 1730-1770.
With increased sanitation, the epidemics began to subside. Today, along with proper sanitation, the typhoid vaccine, first developed in 1896, has eradicated the disease from most developed nations.
So there you have it folks. Is the coronavirus terrible? You bet. Is it the worst disease America has ever had to deal with? Maybe not.

Lister, Pasteur, and Morton

According to School History, a website that provides history teaching modules for instructors in the United Kingdom, Joseph Lister was an English surgeon, the first to provide a solution to the problem of wound infection following surgical operations.
He was born on April 5, 1827, the fourth of seven children, in Upton, a village near London. His father, Joseph Jackson Lister, was a prosperous Quaker merchant.
Joseph Lister received his B.A. and then, in 1852, his medical degree from University College London. He then became assistant to a leading surgeon, Prof. James Syme of the University of Edinburgh.
In 1856 he married Syme’s daughter, Agnes, giving up his religion to do so. It was a very happy marriage, although they were disappointed in not having any children.
In 1860, Lister was appointed Regius Professor at the University of Glasgow. There he found the mortality following surgical operations even higher than in Edinburgh.
At that time surgery was a last resort because of ‘surgical diseases’ which would frequently kill all the patients in a hospital ward. These diseases were usually blamed on gases which presumably hovered about the hospital and caused wounds to rot.
As a student, Lister had examined dangerous material under a microscope, suspecting that something in the wound rather than in the atmosphere caused the disease.
This, along with his subsequent work on the contraction of arteries was related to the subject of his first important scientific work, published in 1857 and entitled ‘An Essay on the Early Stages of Inflammation’.
These studies enabled him to understand a paper by Louis Pasteur which he read in 1865. It proved, among other things, that microbes cause decay. Lister applied this theory to wound infection.
He used carbolic acid to kill the germs in several cases of compound fractures, which generally became infected and required amputation. His approach was successful.
In 1867 he published ‘On the Antiseptic Principle in the Practice of Surgery’. His method was not rapidly adopted, mainly due to opposition to the germ theory. However, despite the controversy, his successes and his perseverance could not be ignored.
Within a few years, antiseptic surgery put an end to surgical diseases; new operations could be performed. Modern scientific surgery was born. Later, the antiseptic method was replaced by the aseptic method, the emphasis being shifted from killing germs to keeping them from wounds.
Lister became interested in Pasteur’s work in 1864 in Glasgow, when he came into contact with the microbiologist’s works ‘On the organized bodies which exist in the atmosphere’, published in 1861, and ‘Investigation into the role attributable to atmospheric gas’ (1863).
At the time, Lister was working at the Glasgow Royal Infirmary and was struck by the number of people who died following surgical procedures. In fact, people were more susceptible to death because of the ‘cross-infections’ present in the hospitals.
Lister became more and more interested in Pasteur’s work, and started to carry out experiments in order to find out whether he could cure infections caused by germs with antiseptics. In his successful attempts, Lister realized that the study of microorganisms and surgery go hand in hand, since microorganisms can definitely affect the human body and the immune system.
Thanks to Pasteur’s work ‘On the organized bodies which exist in the atmosphere’, Lister came to the conclusion that air in itself is not poisonous: rather, it is the microscopic particles in the air and the minute germs that give a specific quality to the air.

Moreover, Lister was particularly intrigued by a statement made by Pasteur in the scientific magazine Annals of Natural Science (in March and April 1865), in which the French microbiologist drew an analogy between fermentation and the processes of infection.
Lister came to the conclusion that germs could definitely affect and poison the human tissue.
Throughout his experiments, Lister had carefully reproduced Pasteur’s experiments and had made a contribution that sought to eliminate hospital infections.
So who was this Louis Pasteur guy?
Louis Pasteur was born in the market town of Dole in eastern France on December 27, 1822.
His father was Jean-Joseph Pasteur, a decorated former sergeant major in Napoleon Bonaparte’s army, who now worked as a tanner. His mother was Jeanne-Etiennette Roqui. Louis had an older sister and two younger sisters.
When Louis was four years old his family moved to the nearby town of Arbois. He started school at age eight at the École Primaire Arbois – it was actually a single room in the town hall. He could already read, having been taught by his father.
In 1844, at the age of 21, he entered the École Normale Supérieure, a teachers college in Paris.
In 1845 he earned his science degree. Fortunately, a chemistry professor at the college by the name of Antoine Jérome Balard had liked what he’d seen of Pasteur.
Balard was an eminent scientist, famed for his discovery of the element bromine in 1826. He offered Pasteur work as a chemistry graduate assistant along with the opportunity to carry out research for a doctorate.

According to his biography by the Science History Institute, in 1857 Louis Pasteur took a position with the École Normale as director of scientific studies.
In the modest laboratory that he was permitted to establish there, he continued his study of fermentation.
His studies centered around various applications of his pasteurization process, which he originally invented and patented (in 1865) to fight the “diseases” of wine.
He realized that these were caused by unwanted microorganisms that could be destroyed by heating wine to a temperature between 60° and 100°C. The process was later extended to all sorts of other spoilable substances, such as milk.
At the same time Pasteur began his fermentation studies, he adopted a related view on the cause of diseases. He and a minority of other scientists believed that diseases arose from the activities of microorganisms—germ theory.
Opponents believed that diseases, particularly major killer diseases, arose in the first instance from a weakness or imbalance in the internal state and quality of the afflicted individual.
Pasteur disagreed and wanted to move into the more difficult areas of human disease. He looked for a disease that afflicts both animals and humans so that most of his experiments could be done on animals, although here too he had strong reservations.
Rabies, the disease he chose, had long terrified the populace, even though it was in fact quite rare in humans. Up to the time of Pasteur’s vaccine, a common treatment for a bite by a rabid animal had been cauterization with a red-hot iron in hopes of destroying the unknown cause of the disease, which almost always developed anyway after a typically long incubation period.
As a child, his village in rural France had been terrorized by rabid wolves.
A church bell would ring and everyone would lock themselves inside until the wolves were killed or simply left.
Most human victims of rabies died a painful death and the disease appeared to be getting more and more common in France.
Pasteur vowed that one day he would find a cure for the disease, so when he finally became a famous scientist with access to a lab, he spent his free time in the evenings, trying to find a cure.
Though he could not identify the germ, he did find that the rabies germ attacked the nervous system only after it had made its way to the brain.
He traced the germ to the brain and spinal cord of infected animals and by using dried spinal cords, he produced a vaccine for rabies.
It is interesting to note that one of the biggest problems he had was finding and keeping a live rabies virus long enough to experiment on it before its host died.
He solved this problem by infecting rabbits, then dogs.
As the rabid animal began to decline, he would introduce a healthy animal to the cage, and it would then be infected.
Now, what is interesting is that he needed to examine the spinal cord of a live, rabies-infected dog. He could not bring himself to do this, even though he suspected it would prove his theory that this was a neurological virus.
He told his assistant that he could not bring himself to operate on the dog and when Pasteur left for the evening, the assistant took it upon himself to do the operation.
The next morning he told Pasteur, who was horrified. However, the samples taken proved the theory.
The first vaccine was now tried out on animals.
Pasteur injected ‘clean’ animals with the rabies germ found in a spinal cord that was fourteen days old. At this age, the germ was relatively weak and unlikely to threaten the life of the animals.
He then used spinal cords that were thirteen days old, twelve days etc. on the animals until they were injected with the most virulent germ found in infected spinal cord that was fresh.
All of the dogs survived this. But Pasteur faced a serious problem. What worked on animals might not work on humans.
In 1885, a young boy, Joseph Meister, who lived in the village Pasteur grew up in, had been bitten by a rabid dog, and was brought to Pasteur by his mother who was hysterical with grief.
They had come by carriage to Paris as fast as they could, but a lot of time had passed.
The mother had heard of Pasteur’s experiments and begged him to try it on her dying son.
The boy almost certainly would have died an agonising death if nothing was done so Pasteur took the risk of using his untested vaccine.
Pasteur then gave the boy a series of 13 shots over 13 days with each shot introducing a stronger form of the rabies virus.
The boy survived and Pasteur knew that he had found a vaccine for rabies.

How about one final medical breakthrough?
William T.G. Morton was a Boston dentist who, in 1846, experimented in search of a painkiller.
Morton hired an assistant named Spears, the town drunk, whom he used as a guinea pig.
Morton came up with ethyl ether, which he prepared from grain alcohol and sulfuric acid.
He would have Spears sniff the concoction from a bowl until he got high.
Morton was afraid he would kill Spears, so he now experimented on his dog.
He increased the dose each time until the dog passed out. The dog survived, but Morton was afraid he would kill it, so he now experimented on himself.
The first time he tried it he was out for eight minutes.
He now created a chart by sitting with a stopwatch inhaling more and more of the ether and recording how long he was out.
On September 30, 1846, a patient with a toothache came to Morton’s office begging that the tooth be removed.
Morton told the man about his experiment, and the guy was willing to try it.
Morton gave him the ether and the man passed out. Morton now removed the tooth.
He looked at the chart and saw the man should wake up in five minutes. He did not.
At 10 minutes he was still out. At 20 minutes he was still out. One hour passed and the man was still out.
Morton now feared he had killed the man. After one hour and 15 minutes the man came to and was thrilled that his pain was gone and he remembered nothing.
Morton now went back to his chart and tried to figure out what happened. It finally dawned on him that he weighed nearly 300 pounds and his patient weighed 120.
So he now adjusted the chart for weight.
In October 1846 Morton demonstrated his discovery at the Massachusetts General Hospital and it was a complete success.

So there you have it folks. Joseph Lister, Louis Pasteur, and William Morton. Three medical pioneers who stepped up to the plate in times of human suffering.
As we battle our current fight with the coronavirus, let’s not forget our history and know that the next generation of pioneers is out there, right now, seeking the answers to our current dilemma.

Wartime Production and the Coronavirus

The country was anything but ready for a major conflict in 1941. Due to the Great Depression and a national unwillingness to get involved in conflict overseas, the United States had been unprepared.
But with the attack on Pearl Harbor and the United States’ entry into the war, just like today, the nation had to come to grips with its unreadiness.
The country’s industrial sector was still reeling from the Depression, and owners weren’t thrilled about the thought of investing in defense production. “Many American producers of primary materials were reluctant to expand facilities, and many manufacturers reluctantly converted assembly lines from peacetime goods to vitally needed armaments,” according to historian Barton J. Bernstein.
To break through that reluctance, President Franklin D. Roosevelt pursued sweeping war powers.
He now requisitioned supplies and property and forced entire industries to produce wartime products.
Instead of producing products for civilians, the nation’s factories became powerhouses pumping out planes, tanks, military vehicles, weapons, ships and other defense-related products.
U.S. manufacturing output grew by 300 percent during the war, and despite wartime scarcity, consumer spending increased, too, thanks to higher employment and wages.
“Powerful enemies must be out-fought and out-produced,” President Roosevelt told Congress and his countrymen less than a month after Pearl Harbor.
“It is not enough to turn out just a few more planes, a few more tanks, a few more guns, a few more ships than can be turned out by our enemies,” he said. “We must out-produce them overwhelmingly, so that there can be no question of our ability to provide a crushing superiority of equipment in any theatre of the world war.”
Two years earlier, America’s military preparedness was not that of a nation expecting to go to war.
In 1939, the United States Army ranked thirty-ninth in the world, possessing a cavalry force of fifty thousand and using horses to pull the artillery.
Many Americans — still trying to recover from the decade-long ordeal of the Great Depression — were reluctant to participate in the conflict that was spreading throughout Europe and Asia.
President Roosevelt did what he could to coax a reluctant nation to focus its economic might on military preparedness. If the American military wasn’t yet equal to the Germans or the Japanese, American workers could build ships and planes faster than the enemy could sink them or shoot them down.
In the wake of Pearl Harbor, the president set staggering goals for the nation’s factories: 60,000 aircraft in 1942 and 125,000 in 1943; 120,000 tanks in the same time period and 55,000 antiaircraft guns.
In an attempt to coordinate government war agencies Roosevelt created the War Production Board in 1942 and later in 1943 the Office of War Mobilization. To raise money for defense, the government relied on a number of techniques — calling on the American people to ration certain commodities, generating more tax revenue by lowering the personal exemption and selling government war bonds to individuals and financial institutions. All of these methods served to provide the government with revenue and at the same time keep inflation under control.
War production profoundly changed American industry. Companies already engaged in defense work expanded. Others, like the automobile industry, were transformed completely. In 1941, more than three million cars were manufactured in the United States. Only 139 more were made during the entire war. Instead, Chrysler made fuselages. General Motors made airplane engines, guns, trucks and tanks. Packard made Rolls-Royce engines for the British air force.
And at its vast Willow Run plant in Ypsilanti, Michigan, the Ford Motor Company performed something like a miracle 24 hours a day. The average Ford car had some 15,000 parts. The B-24 Liberator long-range bomber had 1,550,000. One came off the line every 63 minutes.
America launched more vessels in 1941 than Japan did in the entire war. Shipyards turned out tonnage so fast that by the autumn of 1943 all Allied shipping sunk since 1939 had been replaced. In 1944 alone, the United States built more planes than the Japanese did from 1939 to 1945. By the end of the war, more than half of all industrial production in the world would take place in the United States.
Wartime production boomed as citizens flocked to meet the demand for labor.
While 16 million men and women marched to war, 24 million more moved in search of defense jobs, often for more pay than they previously had ever earned. Eight million women stepped into the work force and ethnic groups such as African Americans and Latinos found job opportunities as never before.
“Most of the people who got out of high school if they were female and didn’t go to the war, they went to Mobile,” said Emma Belle Petcher, who moved to the city from the tiny town of Millry, Alabama. “That was the place to go and get a job. And there were all kinds of jobs.”
World War II utterly transformed Mobile and its economy. Local shipyards won contracts to build Liberty ships and destroyers in 1940, and by the time America entered the war in late 1941, Mobile was already booming. The Alcoa plant processed millions of pounds of aluminum used to build many of the 304,000 airplanes America produced during the war; the Waterman Steamship Company boasted one of the nation’s largest merchant fleets, and Mobile became one of the busiest shipping and shipbuilding ports in the nation. In 1940, Gulf Shipbuilding had had 240 employees; by 1943, it had 11,600. Alabama Dry Dock went from 1,000 workers to almost 30,000.
Like the shipyards in Mobile and plane-repair facilities near Sacramento, factories in Waterbury, Connecticut were transformed to keep up with the war. The Mattatuck Manufacturing Company switched from making upholstery nails to cartridge clips for the Springfield rifle, and soon was turning out three million clips a week. The American Brass Company made more than two billion pounds of brass rods, sheets and tubes during the war. The Chase Brass and Copper Company made more than 50 million cartridge cases and mortar shells, more than a billion small caliber bullets and, eventually, components used in the atomic bomb.
Scovill Manufacturing produced so many different military items, the Waterbury Republican reported, that “there wasn’t an American or British fighting man … who wasn’t dependent on [the company] for some part of the food, clothing, shelter and equipment that sustained [him] through the … struggle.”
Many factories ran around the clock. “It was seven days a week,” said Clyde Odom of Mobile. “And during the war when it was so strong, it was twelve-hour days five days a week, ten hours on Saturday, eight hours on Sunday.”
“Money seemed to be the least of the concerns,” Ray Leopold of Waterbury said. “The thing was to produce material that will win the war and bring their boys home.”
Following WWII, as the Cold War heated up, President Harry Truman and his advisors saw Korea as a pivotal front. When Soviet-backed North Korea invaded South Korea in June 1950, catching the United States unaware, western powers worried that it was the first foray of a larger communist world takeover and braced for military intervention.
Once again, the United States was unprepared for war. Defense production had dropped off and industries were once again catering to civilian needs. Even the kinds of tools that would be needed to produce more military materials were in short supply, and experts agreed the nation was not ready for another war.
If Communists attempted to fight their western opponents on another front, too, the United States would be unable to respond.
In July 1950, Truman warned Congress that the seemingly inevitable war in Korea would cause supply shortages and inflation at home and asked them—and the nation—to ramp up defense spending at home.
“The things we need to do to build up our military defense will require considerable adjustment in our domestic economy,” he said in an address. “Our job now is to divert to defense purposes more of [our economy’s] tremendous productive capacity–more steel, more aluminum, more of a good many things.”
Truman had been involved in defense production during the previous war, chairing a special committee that exposed abuses and waste in war production. Now, faced with the prospect of a massive, well-organized enemy, he requested the authority to oversee another economic mobilization. In September 1950, Congress passed the Defense Production Act.
“While not as sweeping as the executive powers granted during World War II,” writes historian Paul G. Pierpaoli, “the Defense Production Act was nonetheless an unprecedented foray into government planning and control during a time when no formal war had been declared.”
The law let the president force manufacturers to prioritize defense production, set price ceilings, expand private and public production capacity and more. It has since been reauthorized 53 times.
Over the years, the law’s definition of “national defense” has broadened, and now includes homeland security and infrastructure assistance to foreign nations. Broadly speaking, the law lets the president force industries to make government contracts a priority.
It’s been invoked to do everything from fund biofuel research to prioritize government contracts in the wakes of hurricanes. It’s also been used to increase production of things like batteries for military use and specialized circuits and materials deemed important for national security.
So, as the globe confronts the coronavirus pandemic, one urgent problem is the shortage of key pieces of equipment, including high-quality masks, test kits and—perhaps most important of all—ventilators. It seems hundreds of thousands of lives might be saved, if only manufacturers could quickly ramp up the production of such equipment, perhaps by a factor of 100 or 1,000, within a few weeks.
We know the United States has done something similar, on a nationwide scale, once before—eight decades ago during the emergency of World War II.
Might there be lessons to be learned now, from that history? Here are a few takeaways from our past that we could use today.
1. If the government wants machines fast, it better promise to buy them.
During World War II, manufacturers of key items were guaranteed that national government agencies would purchase all of their output, even if the equipment ultimately wasn’t needed. The lesson for 2020 is that if we want more ventilators as soon as possible, the national government needs to guarantee it will purchase them.

2. Production can be scaled up fast when companies cooperate
During World War II, expert manufacturers shared designs and techniques with other firms, so that key items could be produced on multiple production lines, simultaneously.
This was done on a grand scale in the U.S. aircraft industry. For example, Pratt & Whitney, the top aircraft engine manufacturer, shared drawings and knowledge with Ford Motor Co. and General Motors, the giant automakers, so that they could mass-produce engines.
Similarly, Boeing worked with competitors, including Lockheed and Vega, so the Boeing-designed B-17 bomber could be made at its rivals’ plants in California, as well as in the company’s home facility in Seattle.
Today, public authorities and business leaders might use a similar approach, by arranging to have the best versions of key items like tests, ventilators, medications and vaccines made by multiple companies, using temporary deals that would bypass delays that might come from concerns about proprietary technology and competitive advantage.
3. The government can build and own new factories, and let companies run them
World War II saw the emergency construction of manufacturing plants, the vast majority of which were paid for and owned by U.S. government agencies but operated by companies in the private sector.
Indeed, the main mechanism for the biggest expansions of industrial capacity in World War II was the government-owned, contractor-operated (GOCO) plant.
The famed Kaiser shipyards, which turned out big merchant vessels in a matter of days, were GOCO facilities, as were most of the big new bomber plants run by the top airframe manufacturers, including Douglas, Martin and North American.
The atomic bomb project, like the conventional explosives program, was based on GOCO plants, run by some of the country’s leading corporate manufacturers, including DuPont and Eastman Kodak.
Today, the GOCO model could be useful in cases where new production lines—for respirators, vaccines or other items we realize are now essential—need to be built fast, without waiting to see whether private capital will fund them.
4. In a pinch, create homegrown alternatives
The record of World War II shows it is possible, if not easy, to produce emergency alternatives closer to home when global supply chains are disrupted.
The U.S. was forced into an all-out emergency scramble to replace imports with new domestic sources, most importantly in the case of rubber. American authorities had failed to anticipate the enormous problem for the war effort created when Japan's victories in early 1942 cut off imports of natural rubber from Indonesia.
This threatened to cripple the production of items like military trucks and aircraft, which needed rubber for their tires. However, a massive, rapid effort, using GOCO plants and the expertise of oil, chemical and tire companies, allowed the United States to rapidly build from scratch a big new synthetic rubber industry.

Today, as the disruption of global supply chains is making it harder to procure a variety of key components for the coronavirus fight, it makes sense for policymakers and business leaders to engage in some quick planning and cooperation, to find and finance domestic substitutes.
So there you have it folks. Even though we are dealing with a new enemy, a lot can be learned from those who lived before us about how to survive our current threats.

Middle East Oil & WWI

When World War I ended, new countries were born and borders were redrawn in the Middle East. But those changes were marked by missteps that have led to many of the conflicts that have made it one of the most volatile regions in the world.
Here is why.
In order to understand, we have to go back to WWI.
The British, French and Russians had been jockeying for position over the declining Ottoman Empire (modern day Turkey) for decades before World War I.
But as the war unfolded, Germany’s spreading influence in the region brought concern from all parties. Great Britain wanted to protect its interests in the region – mainly oil and mobility via the Suez Canal – so Britain and its most important colony, India, sent troops to the Middle East.
On Nov. 5, 1914, France and Britain declared war on the Ottoman Empire, which had joined Germany and Austria-Hungary.
During the war, Arab rebels who wanted to be free from the Ottoman Empire asked the British for help. The British supported that request, with the help of France.
When the war ended, the two European powers implemented a mandate system in the Treaty of Versailles that split up the former empire’s countries without consulting the people who lived there.
The Ottoman Empire now became Turkey. It’s still the biggest and most powerful country in the region.
Lebanon was created as a state separate from Syria. Both were put under French rule and stayed that way until after World War II.
Mesopotamia (Iraq) had been made up of three former Turkish provinces – Mosul in the north (known as Kurdistan), Basra in the south, and Baghdad in the middle. After the war, they were united as one country under British colonial rule.
Palestine was put under British control and divided into two parts, with the eastern portion becoming Jordan.
Britain also extended its influence over Iran.
In 1932, Arabia’s kingdoms and dependencies were combined into one, called Saudi Arabia.
When Iraq was put under British rule after the war, it triggered conflict in the region that continues today:
• British leaders didn’t understand Iraq’s political or social issues and underestimated the popularity of the Arab nationalist movement (which was opposed to British rule).
• Iraq’s provinces were each ruled by tribes and sheiks and had their own ethnic, cultural and religious identities. They weren’t used to a centralized government, which now included the voices and protections of minorities like Jews and Christians, so conflict erupted from the start.
• In 1921 a conference was held in Cairo to try and settle the conflicts.
• Agreements made at the conference drastically reduced British troop levels in a region that had little civil order and governmental oversight.
• The British also scrapped their promise to create an independent Kurdistan in Iraq’s north. To this day, Kurds in Turkey and elsewhere continue to defend their desire to become an autonomous region.
• The conference led to the next major point, the appointment of Faisal as king.
• At the conference, Faisal Bin Al Hussein (Faisal for short) was installed as Iraq’s king, since he had been key to the success of the Arab revolt against the Ottomans. But as ruler, he rejected British control and wanted to forge a single national identity, despite the aforementioned tribes, religions and ethnic groups.
• Since then, mostly Sunni Arabs have had political control over land that was largely populated by Kurds and Shiites, and each group’s grievances have brought about violent confrontations.
The Cairo Conference’s decision to install Faisal as king in Iraq also deeply affected Palestine and Jordan. Faisal’s brother, Abdullah, had been trying to regain Syrian independence from the French. But the British didn’t want to cause conflict with France, so they threatened Faisal, telling him he wouldn’t get to rule Iraq if Abdullah attacked Syria.
To appease Abdullah, the British created Trans-Jordan from Palestinian land and made Abdullah its king. This division set the foundation for the Arab-Israeli conflict we see today, since it cut in half the land that would be considered for a future Jewish national homeland.
In Iran, the Anglo-Persian Agreement of 1919, drawn up after World War I, would have given Persia (Iran) British money and advisors in exchange for access to oil. But the agreement was rejected by the Iranian Parliament in 1921.
Iran’s king, Ahmad Shah Qajar, was removed from power in 1925 by the parliament after his position was weakened in a military coup.
Reza Pahlavi, a former military officer, was named the new king and, in 1935, renamed the nation Iran. He was deposed in 1941 following an invasion by Soviet, British and other commonwealth forces looking to secure oil reserves from possible German seizure. His son, Mohammad Reza Pahlavi, then became king (shah, as they call it).
Unrest due to corruption and the shah’s efforts to westernize the country finally bubbled over in 1979, and the shah was forced to leave Iran.
The Ayatollah Ruhollah Khomeini, who had previously been exiled, returned to become the country’s supreme spiritual leader, and he made Iran a theocracy.
Iranian revolutionaries, angered by American interests and political dealings in their country, also stormed the U.S. Embassy, accusing the U.S. of harboring the exiled shah, who had relied on the U.S. to stay in power. Hostages were taken, ties were severed, and thus began the lack of diplomatic relations between the U.S. and Iran that continue to this day.
The Sykes-Picot Agreement of January 3, 1916 was a secret treaty between Britain and France to carve up the Middle East after WWI.
France would get the territory of modern day Syria, Lebanon, and northern Iraq, while Britain would get the territory of modern day Egypt, Israel, Jordan, and southern Iraq.
In an effort to win the war, the British supported the Arabs in launching a revolt against the Turks during the war.
The British promised Hussein, the Sharif of Mecca, that he would be made king of a unified and independent Arab state after the war if he revolted against the Turks. Hussein agreed. His son Faisal, advised by Lawrence of Arabia, led the Arab revolt against the Turks. The Arab revolt played an important role in the collapse of the Ottoman Empire.
At the peace conference, the British broke their promise to establish a unified and independent Arab state. Instead, they created a handful of new nations in the Middle East that would be dominated by Britain and France.
In 1920, with British backing, Faisal became ruler of the newly proclaimed Kingdom of Syria, but he had no real independence. He was exiled by the French in July 1920. The French created the state of Lebanon in 1920 and transferred territory from Syria to Lebanon. This act of imperialism still irritates Syrians today.
The Sykes-Picot Agreement also led to the creation of Iraq. According to Sykes-Picot, the British would get Baghdad and Basra, while the French would get Mosul in the North.
The British realized the importance of oil much earlier than the French, and the British suspected there was oil in Mosul. In 1918, the British convinced the French to relinquish their claim to Mosul. In this way, the British took control of the entire territory that is now Iraq. The British formed the Kingdom of Iraq in 1921, and Faisal was made king.
Now, British promises to European Jews further complicated the situation in the region. On November 2, 1917, the British government issued the Balfour Declaration — a public statement supporting a homeland for the Jewish people in Palestine.
Sykes-Picot gave the British control of Palestine. In 1921, the British carved Jordan out of Palestine and made Hussein’s son Abdullah king. However, the creation of Jordan infuriated both the Jews and the Arabs. On the one hand, the Jews thought the Balfour Declaration granted them the entire territory of Palestine. On the other hand, the Arabs in Palestine revolted at the idea of a Jewish homeland in their territory. There has been tension between the Jews and Muslims in the region ever since.
So bottom line is, the British broke promises made to both the Arabs and the Jews.

The war and peace treaties resulted in the creation of new and unsustainable nation states in the Middle East. For those living in the Middle East, even the names Iraq, Syria, Lebanon, Jordan, Israel, etc., are constant reminders that Britain and France betrayed the Arabs. In the end, British and French imperialism in the Middle East caused a century of turmoil in the region. So now to the oil:
Before the discovery of oil in the Middle East, Iraq, Iran, Saudi Arabia, and Kuwait were all poor, undeveloped countries. The situations in all of these countries were similar. The majority of the population consisted of extremely poor peasants. No middle class existed to curb the power of the few rich families, and a person had little chance of improving his status. The countries had few natural resources, and for the most part the land was not suitable for farming.
At the beginning of the twentieth century oil was discovered in Iran and later in Saudi Arabia, Kuwait, and Iraq as well. Extraction of the discovered oil reserves in these undeveloped nations was a problem.
Developed countries already had the money, technology, and knowledge of the industry required to mine and market the oil within their borders. Countries like Venezuela and Saudi Arabia did not have many people with the money or knowledge of the industry required to make use of the natural resources their country controlled.
As a result nations with extensive petroleum reserves were unable to mine or market their petroleum. They needed the aid of the large foreign oil corporations in order to realize any profits from their resources.
At the time, the international petroleum industry was almost entirely developed and dominated by seven companies.
Five of the companies were American (Chevron, Exxon, Gulf, Mobil, and Texaco), one was British (British Petroleum), and one was Anglo-Dutch (Royal Dutch/Shell).
These major oil companies saw the opportunity for profit presented by the impoverished petroleum rich countries and decided to take advantage. This led to a series of concession agreements between the seven major oil corporations of the world, and the soon to be oil producing countries in the Middle East, Africa, and South America.
The details of these contracts varied, but they all shared a few common features. The governments gave the companies exclusive rights to explore and develop oil production within a limited area for a limited amount of time. The companies owned all the oil they extracted. However, the companies took on all of the financial and commercial risks involved with the enterprise, and they had to make payments to the host governments in the form of taxes, royalties, production fees, and so on.
At face value the contracts might seem to be a good deal, considering the host nations were profiting, without doing any work, from resources that they could not mine themselves.
In reality though, the contracts were not at all fair to the developing nations. The contracts were for a finite amount of time and area, but they covered huge expanses of land.
Contracts with three companies covered the whole of Iraq. A single contract covered the entire southern half of Iran, and another one covered all of the United Arab Emirates.
On top of this they were of incredibly long duration. Iran’s initial deal, which was not unusually long, lasted for sixty years. When it expired, a new contract was negotiated to last twenty-five years with the option of renewal for up to fifteen extra years.
The oil companies, who realized what a good deal this was for them, did not allow the oil possessing countries any means of backing out of their contracts.
The way the contracts were set up, the developing nations were unable to alter their contracts, short of nationalization, without the companies themselves agreeing.
Most of the countries even signed away their right to tax the companies in exchange for one time royalty payments.
For the first few decades the undeveloped nations with oil were happy to have the contracts. The oil deals brought an unprecedented amount of money to the poverty stricken nations. However, it was not long before they began to realize that they were being exploited.
Venezuela, which had the most favorable concession agreement, was the first to act. Since the country still maintained its right to raise taxes on the companies, Venezuela passed legislation in 1943 designed to increase the total royalties and tax paid by oil companies to 50% of their total profits.
The oil companies did not actually have a major problem with this change. They already had to pay income taxes not only to the oil producing countries, but also to their own governments.
Five out of the seven big companies were American. In the United States any tax that the oil companies paid to the oil producing nations was directly deducted from their income tax.
As a result the tax hike did not really reduce the profits seen by the oil companies. More important than the companies’ hold on the individual oil producing countries was their grip on the oil market as a whole. They were essentially, though not legally speaking, a cartel.
Since some companies had a surplus of oil and others did not have enough, they worked out an agreement whereby the companies with surpluses would sell their extra oil to the others at a reduced rate. This had the effect of limiting the supply, and increasing prices (The United States government tried for years to catch the oil companies for anti-trust law violations, but was unsuccessful since their actions were technically legal).
The higher prices of oil actually benefited the oil producing countries since their profits were directly tied to the oil companies. However, the same power which allowed the companies to control prices also gave them the ability to control where that extra money went.
The seven major oil companies each had rights in several different producing nations and controlled almost all of the world’s oil supplies.
Since they each had several countries from which they could extract their oil, they could easily reduce production in one location and raise it in another, giving them a powerful bargaining advantage over individual countries.
At that time the individual governments knew very little about what was going on with their oil. They had no idea where the oil was being extracted from, who it was being sold to, or at what price. All they really knew was how much oil the companies claimed to have sold, and how much money they received for it. There was no communication between countries, so no government knew how much other nations were making from their oil.
However, Saudi Arabia soon became aware that any payments made by the companies to oil producing nations were deducted from the companies’ income tax. So, naturally, the Middle East nations now demanded more money.
The United States government cared more about ensuring access to oil than the extra tax revenue, however, so they allowed the oil companies to consider the increased payments a tax rather than a royalty so that it could still be deducted from their income tax. It was not long before all of the oil producing nations had fifty-fifty profit sharing contracts.
Most of the Middle Eastern countries were content with the fifty-fifty profit sharing. Iran, however, had a more radical idea in mind. The sentiment grew in the Iranian Parliament that nationalization was the way to go. Prime Minister Ali Razmara, who was the main anti-nationalization force, was assassinated in 1951 and a nationalization bill was passed in the Iranian Parliament soon afterward.
British Petroleum (BP) was the only oil company that had a concession agreement in Iran. In the interest of maintaining profits, BP increased production in Iraq and Kuwait while looking to England for support in keeping their interests in Iran after negotiations failed.
Both England and America attempted to work out deals with Iran’s new Prime Minister, Mohammad Mossadegh, but none were reached, and as a result oil exportation ceased entirely.
After two years without oil income, the country was feeling the effects and Mossadegh began to lose support. So in 1953 the CIA (at the request of England) funded a coup which returned power to the Iranian Shah and landed Mossadegh in prison.
Consequently, the movement for nationalization in Iran was crushed. After the destabilization of their government and three years without oil revenue Iran ended up with a fifty-fifty deal equivalent to what they had been offered before trying to nationalize.
On top of this, oil interests in Iran were spread among all of the major oil companies, not just BP. This not only helped to increase the oil companies’ hold on the market, it also made negotiations much more complicated for the Iranian government. England and America had made an example of Iran that would not be forgotten by any of the oil producing nations for a long time.
Payments to the host governments were calculated on “posted prices,” reference prices published by the companies themselves rather than actual market prices. This system worked in favor of the oil companies because they controlled the posted prices. They were able to increase the actual price of oil without changing the posted prices. Thus, an increase in their oil profits did not necessarily mean an increase in the taxes they paid. The oil producing nations knew very little about the oil industry beyond what the companies told them, so they were fairly oblivious when posted prices did not increase with actual oil prices.
The developing nations might not have noticed the fact that they were being slighted as prices increased, but they definitely took note when posted prices started to decrease. As the cost of oil dropped in the late fifties Middle Eastern countries began to complain when the oil companies repeatedly reduced their posted prices.
It aggravated them even more that the oil companies would drop the prices without warning them in advance.
With the outcome of Iran’s attempt at nationalization in mind, however, none of them actually did much beyond voicing their discontent and making a few empty threats.
By 1960, however, the oil producing nations had had enough of the companies reducing posted prices without warning. Another price drop in August 1960 pushed Iran, Iraq, Kuwait, and Saudi Arabia over the edge.
The first meeting of OPEC was held on September 10, 1960.
The five founding members (Iran, Iraq, Kuwait, Saudi Arabia, and Venezuela) drafted a set of resolutions at their first conference.
The OPEC members didn’t even demand the freezing of posted prices, but requested only that they be warned before the prices were lowered. The weak tone of the resolutions suggests that even together the nations were afraid to really stand up to the oil giants.
The oil producing nations had begun to unite, but they were still not able to work together. Iraq attempted to take over Kuwait almost immediately after the founding.
Iran was basically a puppet for the US government. The Iranians reported everything that went on in OPEC meetings directly to the American government, seriously undermining the group’s effectiveness.
Saudi Arabia, realizing that the countries must all work together if anything were to be accomplished, would not agree to anything that Iran did not back. As a result, the weak stance taken in their initial resolutions would continue to characterize the actions of OPEC during its first ten years.
OPEC might not have significantly affected the way the oil industry was run during its early years, but it did have an important effect on its member nations. Before the group was formed there had been very little cooperation between oil producing nations.

Though OPEC could not make all of its members work together right away, it gave them a foundation on which to build. Beyond this, it also helped the member nations better understand the oil industry as a whole.
It was not until OPEC that the oil producing nations really became aware of the details of how the oil companies mined and sold their oil and to whom. This greater knowledge of the oil industry combined with the support that OPEC provided would give the producing nations the edge in negotiations with the oil companies.
The rest, as we say, is history. Callers?

Unit 731

While many people know about the atrocities performed by the German Nazis during World War II, few know of the similar, if not worse, experiments committed by the Imperial Japanese Army during this same time.
Japan’s focus during World War II was to create and use biological and chemical weapons.
With the recent scare of the Corona Virus and the rumor that it may have started in a biological weapons lab in China, I thought it might be interesting to share with you a fascinating story that most people know very little about.
Unit 731 was a biological and chemical research program headed by Shiro Ishii beginning in 1936.
The program was disguised as a Water Purification Plant and a Timber Mill and was staged in various cities throughout China.
Unit 731 conducted numerous experiments and vivisections on living human beings, without anesthesia, for almost ten years. These experiments ranged from performing amputations to infecting victims with biological diseases such as bubonic plague, cholera, typhus, and sexually transmitted diseases.
Unit 731 also conducted biological and chemical attacks on many cities in China and even tried to attack the United States.
Japanese military scientists were famous for their work, were known as “science fanatics,” and made Japan the world leader in battlefield medicine.
By practicing this type of research Japan was able to keep her soldiers healthy, and healthy soldiers could fight more effectively and become a stronger force to be reckoned with.
While Japan was at war with China, a young Japanese man was rising quickly in the scientific and medical community.
Shiro Ishii was born into a wealthy family on June 25, 1892 in a town two hours from Tokyo. This is said to be the reason for his very self-centered personality and the drive to succeed. His socio-economic status earned him respect amongst military and education officials.
In 1916, Shiro was accepted into the Kyoto Imperial University, which was renowned for its medical department, especially in the field of bacteriology. Shiro graduated from the Kyoto Imperial University and joined the Army Medical Corps.
He also married the daughter of the President of the Kyoto Imperial University. It is speculated he only did this to get in better with the school’s president so he could advance his career.
Shortly after joining the Army Medical Corps, Shiro represented Japan at the signing of the 1925 Geneva Protocol, which banned the use of biological and chemical weapons in war. Shiro “reasoned that if something were bad enough to be outlawed, then it must certainly be effective.”
From then on, Shiro insisted on turning the “silent killer” into the “silent ally,” even knowing it had been banned at Geneva.
Shiro lobbied the military on how cost effective establishing a bacteriological and chemical program could be.
Developing and studying biological and chemical agents was far less expensive than training and equipping more and more troops to do half the job that bio-chemical weapons could.
Shiro Ishii was given an explicit imperial order by Emperor Hirohito and was now able to develop a program designed to specifically research and develop biological and chemical weapons. This Program became Unit 731.
Shiro Ishii’s main area of expertise was in bacteriological research.
The bacteriological research division was in charge of studying bacteriological agents such as cholera, dysentery, epidemic hemorrhagic fever, and various sexually transmitted diseases.
Bacteria Mass Production and Storage was also an important area of study. It is said that “at Unit 731’s height of production, they had the capacity to create enough bacteria to kill the world’s population several times over.”
The first city targeted by Japan’s Unit 731 was Harbin. Harbin was a booming city with a huge population, perfect for what Shiro and his war criminal partners hoped to accomplish.
The laboratory at Harbin was referred to as an “Epidemic Prevention and Water Purification Plant.” The walled compound was called the Zhongma Fortress because of its sheer size. A revolt happened inside the fortress and a few captives were able to escape and talk about the atrocities happening within its walls.
The South Manchurian Railway worked into Shiro’s plans, as he was able to transport victims easily to and from his laboratories. It is said that at least 9,000 people died from experiments in Harbin.
The next place Unit 731 began to experiment on people in secrecy was a small district of Harbin called PingFang. PingFang’s lab was disguised as a Lumber Mill. The running joke amongst the Japanese soldiers was that victims were called “maruta” which means “logs.” It is estimated anywhere from 3,000 to 12,000 people died in PingFang from various tests.
There were no criteria for becoming a victim of Unit 731’s awful experiments. If you were captured, you were experimented on.
The majority of people experimented on were Chinese followed by Russians. Men, women, and children were all used. Prisoners, criminals, local populace, soldiers and anyone else the Japanese could get their hands on became test subjects.
There was even a report of experiments being performed on a four-day-old baby. It didn’t matter if you were young and healthy or elderly and sick.
The newly promoted General Shiro Ishii now had everything he needed to begin his human experimentation program. He had plenty of funds coming from the Imperial government, hundreds of thousands of potential human test subjects, and doctors and professionals willing to perform the experiments. Now all Unit 731 had to do was start researching, experimenting, and putting their experiments to use.
Experiments were conducted so the doctors could learn more about how humans live and die. These included studies of dehydration, starvation, frostbite, and air pressure; some inmates had their eyes blown out, and others received transfusions of animal blood.
Even children and babies were killed this way.
While these are terrible examples, a major horror that needs to be discussed is the use of “vivisection.”
Vivisection is the act of surgically operating on something living. This means they would literally cut open victims while they were still alive, most of the time without the use of anesthesia. Anesthesia was not used because the doctors felt it could throw off results.
After all, soldiers in the battlefield rarely had anesthesia for their wounds, and the goal was to create a more powerful army by giving the Imperial Japanese Army an advantage and taking away their enemies’ advantage, whatever it might be.
Unit 731 also invested pretty heavily in studying amputation and its effects on the human body. Loss of a limb was a typical battlefield injury that could result in mass casualties for any army.
Some of the researchers would amputate hands and feet and sew them back in opposite places to see if the person could function.
Sometimes they would amputate a limb just to calculate how long it took a victim to bleed to death, or if they would even bleed to death.
Another major area of research that had been heavily invested in was the study of cholera. “Cholera can be life-threatening but is easily prevented and treated.” Cholera can be contracted by a person drinking or eating anything carrying the Vibrio cholerae bacterium. This is usually caused by fecal matter getting into the drinking water.
Water sources were usually contaminated by armies practicing a scorched-earth policy as they retreated. The water filtration process during World War II was very poor. Luckily, the Japanese developed a water purification system that helped combat the issue. However, at the time they were still unable to figure out a way to treat cholera once a person became infected. For this reason Unit 731 researched and studied cholera extensively.
Unit 731 infected food and water consumed by victims. The cholera would spread like wildfire and the inhabitants would suffer greatly. The only ones left were those who were too sick to move.
Between 1932 and 1945, Unit 731 carried out multiple biological attacks in the PingFang area of Manchuria, mainly using cholera. During these attacks it is estimated that between 1,000 and 2,000 people were either killed or injured. The purpose of these attacks was to test their biological weapons capabilities.
From 1940 to 1942 Japan again attacked Chinese cities with cholera. This time more than 10,000 people were killed or injured; exact numbers are unknown.
Plague was another biological agent Unit 731 dipped its toes into, more specifically bubonic plague. The bubonic plague outbreaks in Europe have been viewed for centuries as among the most devastating biological killers of all time.
Japan knew if they could use it to their advantage, maybe they would be capable of destruction powerful enough to make their army the strongest of any nation.
“Plague is a bacterial disease, caused by Yersinia pestis, which primarily affects wild rodents. It is spread from one rodent to another by fleas. Humans bitten by an infected flea usually develop a bubonic form of plague.”
Japan’s use of plague was rather unique. They could easily infect cities without building an actual bomb to disperse it. Once the plague bacterium was concentrated, Unit 731 researchers would infect fleas with the plague and release fleas in villages. The fleas would get onto rats which would further spread plague.
Unit 731 medical officers would then vivisect patients to see how the symptoms were affecting their victims in real time. The purpose was to see how and where the plague developed.
To spread some of the fleas, Japanese soldiers would release balloons filled with infected fleas and flour. The balloons would pop sending the fleas and flour crashing into villages. The flour would attract rats and become infested with fleas.
While Japan tested many biological and chemical attacks against Chinese cities between 1932 and 1945, it is also important to note that Japan either attempted or planned to attack the United States as well. There were more than 9,000 balloons launched towards the West Coast of the United States in an attempt to kill American civilians and start fires.
Soon these balloon bombs were appearing in American skies, though civilians had no idea what they were.
One of these balloon bombs exploded near the town of Bly, Oregon, killing six people.
Another biological and chemical attack planned against the United States came in 1945. Japan wanted to use kamikaze pilots to dive-bomb San Diego with planes carrying canisters of plague-infested fleas. The plan was called “Cherry Blossoms at Night.”
Japan had planned to use submarines to get closer to the United States in order to launch their kamikaze pilots and begin the attack sometime in September 1945. However, with the Japanese surrender in August 1945, the plans were destroyed along with all of the other germ warfare program evidence.
As World War II came to a close and Japan came under attack by the Allied forces, the Japanese did not want the outside world to know what they had been doing, so they began destroying all evidence of Unit 731.
As the United States government began learning of the experiments and attacks conducted by Unit 731, something strange happened. Unit 731 members were paid and released for crimes that Nazi doctors had been tried and executed for.
The United States government paid off Unit 731 members and completely pardoned them in exchange for their data and research.
As a result, none of the members of Unit 731 were prosecuted by the United States government. They were released with full pardons back to Japan, and many of these researchers went on to gain prestigious positions in schools and hospitals.
So there you have it folks. While everyone has focused on the horrors perpetrated by the Nazis during WWII, Japan slipped under the radar and helped create the threat of biological warfare we live under today.

Revolution 2020

OK Gang. This is a show I have been wanting to do for a long time. Unfortunately, like many of you, I am guilty of putting off till tomorrow what should be done today.
This show involved a lot of research, and quite frankly, sometimes I am just lazy. However, knowing I won’t be here for a couple of weeks, I felt it necessary to tackle this topic and allow you to think on it until my return. So here we go.
Watching the events happening in Washington this past year and listening to all the talking heads on the national media discussing the presidential candidates and the upcoming election, I can’t help but look at the amazing parallels between what is happening now, and what has happened in the past.
I have a theory, and quite frankly, it scares me to death. Knowing what I do about history, I see great peril in our future if we do not change our present course.
Throughout history we have seen tremendous turmoil every time a nation finds itself divided on political lines. Our country is at that point.
Eight years under President Obama awakened the far left in America. Now, looking at eight years under President Trump, we are seeing the rise of the far right in our nation. Moderate political views no longer exist.
Civilized discussions between left and right no longer exist. You are either all in one camp, or all in the other.
This sets the stage for what we have seen in the past.
I am not afraid of what was put in place by President Obama or for that matter, what President Trump is doing.
I am, however, terrified of what comes next. People say, “No way a Bernie Sanders can win the election and turn us into a socialist nation.” I say, you are dead wrong.
Let us now turn to our history.
In the winter of 1916-17, conditions got worse for the people of Russia. Their wages were far behind inflation and especially harsh weather caused fuel and food shortages.
On March 8th, International Women’s Day, women textile workers went on strike, took to the streets, and demanded more bread. Other workers soon joined them, and within two days more than 200,000 strikers were marching in the streets.
Czar Nicholas Romanov, ruler of Russia, was informed by telegraph of what was happening in the capital. He now sent orders that, at all costs, the military must restore order there.
The city’s military commander ordered police and troops to disperse the demonstrators, firing upon them if necessary. After some shooting, the key turning point came on March 11, when soldiers in one regiment after another refused to fire on their fellow citizens.
On March 12, some members of the Russian Parliament, the Duma, defied the Czar and formed a Provisional government. The Provisional government now called for Nicholas to abdicate the throne in favor of his son, Alexei.
Meanwhile, Nicholas’s generals convinced him that in the interest of Russia and the war effort (WWI) he should step down from the throne. On March 15 he signed the abdication papers. He also abdicated for his son.
The Provisional government renounced the crown and decided they would pick their own leader. Russia had been ruled by the Romanov family for the past 300 years.
On March 17 the public was informed the Romanov Dynasty had collapsed. The new Provisional government consisted primarily of moderates, with only one socialist in the cabinet: the Minister of Justice, Alexander Kerensky, who pushed for a parliamentary form of government like that of England.
Kerensky was chosen to lead the new government primarily because he had a much more militant posture.
He was a radical lawyer who was born in the same town as Lenin and had earned a reputation as a fiery speaker and Duma politician. He was also the only cabinet member who belonged to the Petrograd Soviet. Therefore they saw him as someone who would have the support of the people and nearly all of the students at the universities.
While the revolution of March was taking place, the Bolshevik leader, 46-year-old Vladimir Lenin, was still living in Switzerland, having been exiled from Russia 20 years earlier. Lenin contacted the Germans and told them that if they would get him across Germany and into Russia he would launch a revolution, and if he was successful he would immediately pull Russia out of the war.
On the night of April 16, along with a number of other prominent Bolsheviks, he arrived at Petrograd’s Finland Station. In speeches that night and the next day, April 17, 1917, Lenin proclaimed his agenda.
He opposed the Soviets’ (the local councils’) policy of continued support for the ongoing war and their support for the Provisional government.
On April 17 Lenin also called for the establishment of a republic of soviets of workers, the confiscation of all landed estates, and the nationalization of all land.
He wanted government control over banking, production, and distribution and the abolition of the police, army, and bureaucracy. He now changed his party’s name from Bolshevik to the Communist Party.
Lenin called for a revolution of the proletariat, the working class, and he concluded that under the firm guidance of the party the poor peasants could also play a major part in the revolution.
He was convinced the only way to move to communism was through radical revolution. Lenin believed that his party spoke for the true interest of the workers. He believed that the party should be run as a strongly centralized organization looking out for the welfare and equal treatment of all people. Sound familiar?
Key to the success of Lenin’s party was the support of the workers. As WWI progressed, Russia’s economic problems grew worse. Coal, metal and other resources became harder to obtain. Inflation spiraled upward. Business leaders became increasingly resentful of the Provisional Government.
When an all-Russian conference of factory workers met about a week before the November Revolution, 96 of 167 delegates were members of Lenin’s Bolshevik party.
In 1917, Lenin told the people that if they put him in power, he would pull Russia out of WWI and give free land to all the peasants.
More than 1 million troops now deserted the front. Most soldiers were peasants, so naturally they supported Lenin’s party. Lenin called for land seizures. He wanted to take the land from the state and the Nobles and provide land to the poor people.
As the central government lost its authority these peasants formed local Soviets (councils) and their members became the chief centers of local power.
By October 1917, Lenin’s party had gained a majority of both the Petrograd and the Moscow Soviets. Lenin now traveled the countryside speaking of the glory and the honor of becoming a member of the communist party. The all-Russian Congress of Soviets was scheduled to meet in early November.
On November 6, the day before the congress was to meet, Kerensky triggered a Communist coup by ordering the closing of the Bolshevik press. Troops under the direction of Lenin and the Petrograd Soviets took up positions to prevent any counter-revolutionary moves and almost immediately assumed the offensive.
On November 7, meeting little resistance, they took control of the vital buildings of Petrograd.
Kerensky escaped in an automobile to the US Embassy and eventually fled to America. The palace takeover in the early morning hours of November 8, after surprisingly little bloodshed, completed the coup.
In the early hours of November 9, the Russian Congress approved a new government. Lenin was made chairman of the Council of People’s Commissars and, as we say, the rest is history.
So, the government of Russia, ruled by Czars for 300 years, was overthrown by a group of students led by Alexander Kerensky who wanted a government like that of England.
The country fell into armed camps: those who wanted the Czar back and those who wanted a parliamentary government.
Seeing the chaos and the weakened state of Russia’s government, Lenin saw control of Russia lying there on the table and simply stepped in and took it.
The people of Russia did not want Communism once they saw what Lenin had in store for them. In fact, in the elections held right after he came to power, they voted against Lenin’s party. By then it was too late.
Lenin simply ignored the results of the election, closed the Russian Parliament, and used the power of the military to force his will on the people.
No one thought it could happen, but it did.
How about another example.
In 1916, eighteen socialists were expelled from the German Parliament (the Reichstag) for voting against the war (WWI).
A small group of German socialists openly opposed the war. They were called Spartacists.
By September 1918 the Germans were on their last legs and were now forced to negotiate a peace settlement.
The morale of the people now fell apart. They couldn’t believe they had lost the war! Everything they heard up until now said they were winning.
Kaiser Wilhelm, the leader of Germany, was forced to abdicate in disgrace.
The Socialists now rose up and proclaimed a republic.
The German socialists were a lot different however than their Russian counterparts. They were not communists. They would not take power without the support of the majority of the workers. This they never received.
They also never seriously considered a coup like Lenin’s.
The German people had just suffered a terrible defeat and were starving but unlike the Russians, they did not want to add to their problems by launching a socialist revolution.
In Germany the people looked upon the socialists and communists as the ones to blame for betraying the nation at a time when it was hurting. Remember, the socialists had opposed entering the war.
As a result, the returning troops saw the socialists as stabbing the country in the back after they had just fought so hard to defend it.
Once the army came home it regrouped and formed the “free corps,” which then terrorized the socialists, giving rise to a new form of government, Fascism.
The Fascists hated socialists, intellectuals, Jews, and the old aristocracy. Their sole aim was to restore order and power to Germany at any cost.
So here we go again. The country was divided politically. Some people wanted the Emperor, Kaiser Wilhelm, back, some wanted socialism, and some wanted a new form of government, fascism, led by a strong, all-powerful leader who would take control and reestablish order.
Now at the end of WWI, all the winners in the conflict met in France and laid out the terms of Germany’s surrender in The Treaty of Versailles.
Germany was forced to accept full guilt for the war. The industrial and mineral-rich Saar Basin and the left bank of the Rhine River were to be internationalized, and the right bank demilitarized for 30 miles.
Germany had to pay for all civilian damages caused to all allied nations during the war. Initial payment was $5 billion a year for 15 years with a balloon note to come later.
All coal output from the Saar Basin would go to France.
Finally, Germany would lose all its African colonies and could not join the League of Nations. They would also lose the Marshall, Marianas, and Caroline Islands in the Pacific, which would go to Japan.
Following WWI, with the abdication of Kaiser Wilhelm, the Weimar Republic of Germany was formed. It was this government that was forced to sign and agree to the harsh terms of the Treaty of Versailles.
So the people hated this new government. The Weimar Republic was made up of moderate Social Democrats under President Friedrich Ebert and the more radical Independent Social Democrats, who were hoping for a more fundamental socialist revolution.
Germany now faced insurrections by both the left and the right, inflation that reached a trillion marks to the dollar, and the occupation of the Ruhr valley by the French. Ebert died in the midst of a campaign against him by right-wing critics and was succeeded by Marshal Paul von Hindenburg.
Although Hindenburg sought German unity, he promoted the interests of the Junkers, the German landed aristocracy. (sound familiar?)
A president supporting the wealthy class and an opposing party supporting socialism.
Hindenburg ran for the presidency again in 1932 as the only one who could defeat the National Socialist (Nazi) party candidate, Adolf Hitler.
So who was this Hitler guy? Just like Lenin, Hitler saw that with all the political turmoil in Germany, control of the nation was just sitting there waiting for someone to step in and take it.
After WWI Hitler rose to power as leader of the National Socialist German Workers’ Party (the Nazis).
It was only one of many political parties claiming that the government had betrayed the people.
The new party grew slowly, and principally in Bavaria. Convinced of the necessity of violence to achieve its ends, the party soon organized the Storm Troops, or SA, to defend its meetings; to disrupt the meetings of liberal democrats, socialists, Communists, and trade unionists; and to persecute Jews, especially Jewish merchants.
It was aided in these activities by former WWI army officers.
In 1921 Hitler was elected “unlimited chairman” of the Nazi party.
In the meantime, as the German Communist party, founded in 1919, grew in strength, the National Socialists (Nazis) concentrated much of their propaganda on denunciations of Bolshevism, which they characterized as a conspiracy of international Jewish financiers.
They also proclaimed their contempt for parliamentary democracy and pushed for a dictatorship.
In order to gain support for the Nazi party, Hitler pushed the theory that there was a great Jewish conspiracy and he now bought into the idea of creating a master race.
It is interesting to note that in Germany itself there was no Jewish problem. The German Jews numbered about half a million and made up less than 1% of the total population.
In Germany there was less resentment towards Jews than there was in England or France at the time.
Many contend that the hatred of the Jews stemmed from the fact that they owned big business in Germany. What? Go after big business as a political ploy? Surely not!
In fact, Jewish influence was confined to big department stores and some of the big newspapers and entertainment industries.
However, the theme of anti-Semitism worked to help unify Hitler’s party with a common goal.
Hitler contended that the Jews ruined small businessmen, corrupted the German women, organized revolutions, and spoiled German culture.
Jews also were said to overcharge the workers, make bad movies, create ugly commercialism, spy for Russia (what?), and sell out Germany to Wall Street capitalists. Hello... sound familiar?
The German people, shell-shocked from war and the Great Depression, fell for it hook, line, and sinker.
In a society where the common man felt helpless to strike back, Hitler gave him a common enemy: the Jew.
Hitler now launched a huge propaganda campaign.
He organized huge parades and rituals that dazzled the German people.
Hitler now had the backing of nearly all the German people in bringing about a renaissance of the German spirit.
Hitler also staged rallies at which books he deemed harmful to the German people were burned.
Some of the German intellectuals could see the writing on the wall and now chose to flee Germany.
Among them were Albert Einstein and Nobel Prize-winning author Thomas Mann.
On June 30, 1934 Hitler launched a campaign that came as a shock to even some of his closest followers.
Hitler saw the SA (storm troops) and their leader Ernst Rohm as a threat to his power now that he had gained control of the government.
But let’s back up for a moment. Through the campaigning of Hitler and his followers, the Nazi party became the largest party in the German Parliament (the Reichstag) by 1933. They did this at the ballot box, not at the end of the barrel of a gun.
Hindenburg was still president, but he now faced a Reichstag dominated by the Nazi Party. Again, sound familiar?
The Nazi party members now demanded that Hindenburg name Hitler as Chancellor (Prime Minister). Under extreme political pressure, Hindenburg did so.
Now all that stood between Hitler and complete control was President Hindenburg.
Ernst Rohm, leader of the military arm of the Nazi party, now asked Hitler to give him control of the German army, a move opposed by the army’s high command. Rohm also sided with left-wing dissidents who antagonized Hitler’s wealthy conservative supporters.
In order to placate the army and the industrialists, Hitler had Rohm and the dissidents murdered in a blood purge, also called the Night of the Long Knives, on June 30, 1934.
Hitler had hundreds of SA members killed, then waited two weeks before explaining what had happened. He told the people that he had simply been trying to save the nation from revolutionaries.
President Hindenburg even thanked Hitler for saving the German nation!
Hindenburg died on August 2, 1934, and Hitler took complete control of the government, combining the offices of president and chancellor in himself.
So there you have it again, folks. Just like Russia, Germany found itself politically divided and opened the door to total chaos.
Now what scares me is that this pattern keeps repeating itself. In Italy, King Victor Emmanuel was fighting against the Italian Socialist Party.
Benito Mussolini saw this, formed his own fascist party, and stepped in to aid the king. He gained the support of the king and the Italian people. Once he had defeated the socialists, he simply went to the king and told him that he, Mussolini, would now be running the country.
In Spain, the monarchy had given way to a republic torn between socialists and fascists. Francisco Franco, seeing the political turmoil there, led a military coup and took control of that country.
Simply a European thing you say?
In China, the party of Sun Yat-sen was fighting a civil war to create a parliamentary government in opposition to the ruling warlords. Japan saw this happening and promptly invaded northern China (Manchuria), the first step on the road to WWII.
So there you have it, folks. Have I convinced you that to continue down our current political path is suicide?
Our hostility to opposing political views and our total lack of compromise mirrors exactly what we have seen in the past.
The question is, have we learned a damned thing, or are we so stubborn that we will follow the same path of destruction I have shared with you?

A New Way Forward?

What would you say about legislation requiring the government to use tax money to transport back to the United States convicted criminal illegal aliens who had already been deported, calling it the “right to come home”?
What if that bill also created a situation in which a known Mexican drug cartel leader could be released from prison, enter the United States “illegally,” and it would no longer be a crime?
Such radical and self-destructive legislation has already been proposed.
It’s called the New Way Forward Act (H.R. 5383), and it goes way beyond Hillary Clinton’s 2013 call for a “hemispheric common market” with “open borders,” according to Fox News commentator Tucker Carlson.
This is no exaggeration. Introduced by Democratic Representatives Jesús Garcia (Ill.), Pramila Jayapal (Wa.), Karen Bass (Calif.), and Ayanna Pressley (Mass.) in December 2019, the New Way Forward Act (NWFA), is supported by almost one-fifth of House Democrats, including Representatives Ilhan Omar (Minn.) and Alexandria Ocasio-Cortez (N.Y.).
Just as shockingly, this nation-rending bill has received virtually no media attention, which is tragic given that it “would entirely remake our immigration system, with the explicit purpose of ensuring that criminals are able to move here, and settle here permanently, with impunity.”
Carlson calls the NWFA “the most radical single piece of legislation we’ve ever seen,” making the Green New Deal, proposals to end fracking via executive order, and nationalized healthcare seem sane by comparison. The bill’s promotional materials actually state, “Convictions … should not lead to deportation.”
Currently, “legal U.S. immigrants can be deported if they commit an ‘aggravated felony’ or a ‘crime of moral turpitude’ — that is, a vile, depraved act, like molesting a child,” explained Carlson. Yet under the NWFA, “‘crimes of moral turpitude’ are eliminated entirely as a justification for deportation. And the category of ‘aggravated felony’ gets removed as well”.
The NWFA also would:
• end automatic deportation for all crimes, including robbery, fraud, and child sexual abuse;
• make aliens who falsify passports immune from deportation, period;
• raise the minimum sentence triggering deportation, for crimes that still require it, from one year to five. Note that “crimes like car theft, fraud, and weapons offenses all carry average prison sentences of fewer than five years.” Moreover, some rapists, child abusers, and even killers receive less time than that. Yet aliens thus convicted would stay in our country and eventually receive citizenship;
The bill would also:
• allow anti-American immigration judges to nullify deportation orders even for aliens with five-year-plus sentences;
• allow drug addicts and those convicted of drug crimes abroad to immigrate to the U.S;
• decriminalize illegal entry, even for the previously deported. The NWFA asserts that “criminalizing illegal entry … is ‘white supremacist’”;
• effectively abolish all enforcement against illegal migration.
In summary, it would be much harder to arrest illegal aliens than to arrest you. “They’re the protected class here. You’re just some loser who’s paying for it all,” says Carlson.
As for the aforementioned “right to come home,” it would include tens of thousands of aliens expelled from our country for many crimes: “Sexual abuse. Robbery. Assault. Drug trafficking, weapons trafficking, and human trafficking.”
In fact, from “2002 to 2018, 480,000 people were deported for illegal entry or reentry into America.”
The NWFA would force us to buy them all plane tickets back here — costing approximately a billion dollars — “and that’s before Democrats make you start paying for these criminals’ free health care, too. Which they plan as well.”
It’s not surprising the mainstream media have suppressed this story. It reveals, right before an election and in stark terms, the new Democratic Party’s face: radical, anti-American, angry, and bent on our nation’s destruction.
That, folks, is why we need to pay attention to who we vote for in the next election.
If the Democrats win the House and Senate and the White House is occupied by a Bernie Sanders or an Elizabeth Warren, our country will see these types of changes rubber-stamped by Congress and implemented immediately.
Of course, as with most all of today’s illegal and legal immigrants, the vast majority of these “new Americans” would vote Democratic. That’s the whole idea, too: The Left wants an “entirely new country, in which resistance is crushed, and they’re in charge forever,” Carlson concluded.
The NWFA won’t likely pass and be signed into law anytime soon. But know that this is precisely what the Democrats’ AOC/Sanders wing — which will one day control the whole party — has in store for us.
The New Way Forward Act would, if passed as drafted, virtually abolish any sort of control over the people allowed to enter the United States. For example, paragraph (6) says there will be special rules for “vulnerable persons and primary caregivers.”
So what does vulnerable mean? According to the act:
• anyone under 21 or over 60;
• anyone (presumably any woman) who is pregnant;
• anyone who identifies as lesbian, gay, bisexual, transgender, or intersex;
• a victim or witness of a crime (including dropping litter?);
• anyone who has filed a civil rights claim in Federal or State court;
• anyone who has a serious mental or physical illness or disability (give us your poor, your wretched, and your defectives);
• anyone who has been determined by an asylum officer in an interview conducted under section 235(b)(1)(B) to have a credible fear of persecution or a reasonable fear of persecution;
• anyone who has limited English language proficiency and is not provided access to appropriate and meaningful language services in a timely fashion;
• and finally, anyone who has been determined by an immigration judge or the Secretary of Homeland Security to be experiencing severe trauma or to be a survivor of torture or gender-based violence, based on information obtained during intake, from the alien’s attorney or legal service provider, or through credible self-reporting.
That last category alone would lead to the admission of almost anyone because it is a standard ploy of female illegal immigrants to claim they were raped by soldiers or police officers in their own countries.
The purposes of Titles VI and VII of the bill are stated candidly: to repeal the criminal laws against illegal entry and to give those actually deported the right to “come home.”
And if you oppose any of the above, you can only be racist, right? No other country on Earth has or would attempt to introduce such a bill, although Western Europe has been going down the same route for decades.
The New Way Forward Act is not simply laughable, it is insane, and it is further proof, as if further proof were needed, that if Donald Trump is not re-elected later this year, America will be finished.
Democrats are attempting to relegate American citizens to second-class status while elevating illegal aliens, even those convicted of violent felonies, to being virtually untouchable by law enforcement.
It’s understandable why this has gone almost unreported by the Fake News Complex: it would cause an uproar from the public. In short, it’s one of the most anti-American pieces of legislation ever presented.

Let’s review this again.
It does away with all border enforcement, stops the deportation of immigrants who have engaged in serious felonies or acts of moral turpitude, and allows immigration judges to step in and oversee federal criminal trials.
The most shocking thing, though, is that, when it comes to criminals already deported over the last twenty years for serious felonies or acts of moral turpitude, the bill requires that American taxpayers pay the billion or so dollars needed to bring all 480,000 of them back to America.
You read that right: the bill provides that the American taxpayer has to foot the bill for flying back to the US all the illegal aliens who have already been deported.
The good news is that this monstrosity is unlikely to make it out of the House, and if it does it will likely die in the Senate, unless the Democrats take that chamber in November.
In any event, it will take the Democrats controlling both houses of Congress and the presidency, or an unprecedented amount of voter fraud, for it to pass.
If the Democrats were to take Congress and the White House, this bill could well become law. The same Democrat presidential candidates who want to have open borders and provide free health care for illegal aliens will be perfectly happy to add in non-American criminals.
This bill is a slap in the face to those law-abiding people, both citizens and non-citizens, who have to share their communities with the new protected class of criminal foreigners.
It’s not the bill’s sponsors or supporters who have to share an apartment building with the child-molester who can’t be sent away or have to hope their children won’t be shot dead by the MS-13 gang member who’s been flown back to America on the taxpayers’ dime.
Instead, it’s inner-city people — often minorities — who are going to find themselves at the mercy of criminals who were once sent back to the places from whence they came.
The significance of this bill, or at least of the attempt to push it through, is that it shows the Democrats’ disdain for their traditional voting blocs, blacks and Hispanics, the ones who were born here or came legally.
The handwriting is on the wall: blacks and Hispanics are deserting the Democratic Party in droves, and the desperate Dems have to replace them.
There’s another reason for this push as well: 2020 promises to be a violent election, and the Dems are building an army, one that will deliver what the cowards in Antifa can’t, actual violence from people who will attack more than just the elderly and disabled.
The Dems are bent on power, and if they can’t get it at the ballot box, they’ll try to get it in the streets. The Fake News Complex will paint the violent illegals as victims in much the same way that the New Way Forward Act does.
If the Democrats’ “New Way Forward Act” sounds completely insane, it gets even worse.
The act is intended to protect from deportation not just those caught committing petty crimes, but also those found guilty of much more serious crimes such as robbery, child sexual abuse, and even murder.
Essentially, the Democrats’ “New Way Forward Act” removes all criminal criteria currently used to deport people.
Keep in mind, folks, that the bill is currently sponsored by 44 House Democrats, including all four members of the Democrats’ “Squad.”
Representative Garcia, the sponsor of the bill, brags that the bill will break the “prison to deportation pipeline.” How does it do that?
Simple. If this bill passes the House and Senate and is signed into law by the president, there will no longer be any crimes that automatically require deportation. None.
Think about this. Making and possessing fake IDs and stealing other people’s identities would no longer be crimes for which you could be deported.
And one crime – falsifying a passport – will be made immune from deportation, no matter what.
If you just renewed your driver’s license to comply with the Real ID Act, you must feel like an idiot.
The California legislators on the list of co-sponsors of this debacle, who have seen the corrosive effects of letting prisoners out early, are unmoved by the Act’s change to the minimum sentences that require deportation.
Under the proposed legislation, the minimum prison sentence for crimes that still require deportation would rise from one year to five.
Check the Bureau of Justice Statistics: according to federal data, crimes like car theft, fraud, and weapons offenses all carry average prison sentences of fewer than five years. And that’s just looking at averages.
There are people who commit rape, child abuse and even manslaughter and receive sentences of fewer than five years. Lots of them.
If the New Way Forward Act becomes law, immigrants who commit those crimes and receive those sentences would remain in the country.
The decriminalization of illegal immigration was already becoming a more mainstream position within the Democratic Party. In fact, even before the introduction of the bill, Democratic presidential candidates Bernie Sanders and Elizabeth Warren had both come out in support of the proposal.
Sanders, a clear frontrunner in the nomination contest, has also voiced support for virtually ceasing all deportations.
“Make no mistake: House Democrats have embraced open borders policies,” Wyoming Rep. Liz Cheney, chair of the House Republican Conference, recently stated.
“The ideas included in this radical bill, on top of their plans to give free health care to illegal immigrants and abolish ICE, all would jeopardize the security of the nation and reward people who cross the border illegally. On so many important issues, House Democrats have adopted an extreme and socialist agenda that’s wrong for the country,” Cheney continued.
She added: “Every day they show the American people why they cannot be trusted with power.”
A document put out by the office of Representative Garcia, the bill’s sponsor, stated that the laws that made unauthorized entry into the U.S. a federal crime were “born from white supremacist ideology and politics.”
So there you have it, folks.
Are you a white supremacist because you want to keep criminals and illegal aliens from entering the US?
Should we just open our borders and allow anyone to come in?
If you say no, are the Democrats right in assuming you are nothing but a racist and a xenophobe who lacks any compassion for your fellow man?
Callers?