Videos uploaded by user “Way Back”
LBJ Speaks to John F. Kennedy's Mother Rose After the Assassination (1963)
Rose Elizabeth Fitzgerald Kennedy (July 22, 1890 – January 22, 1995) was an American philanthropist and socialite. She was deeply embedded in the "lace curtain" Irish Catholic community in Boston, where her father was mayor. She was the wife of businessman and investor Joseph P. Kennedy, Sr., who served as United States Ambassador to the Court of St James's. Their nine children included President John F. Kennedy, Senator Robert F. Kennedy, and longtime Senator Ted Kennedy. On October 7, 1914, she married Joseph Patrick "Joe" Kennedy at age 24, after a courtship of more than seven years. He was the elder son of businessman/politician P. J. Kennedy (a political rival of Honey Fitz) and Mary Augusta Hickey. She gave birth to the couple's first child, Joseph, Jr., nine months later. They initially lived in a home in Brookline, Massachusetts, that is now the John Fitzgerald Kennedy National Historic Site, and later in a 15-room vacation home at Hyannis Port on Cape Cod, which became the Kennedy family's lasting base. Joseph provided well for the family but was unfaithful; his affairs included one with Gloria Swanson. While eight months pregnant with the couple's fourth child, Kathleen, Rose temporarily went back to her parents, but returned to Joseph. In turning a blind eye to her husband's affairs, Rose depended heavily on medication. Ronald Kessler found records for the prescription tranquilizers Seconal, Placidyl, Librium, and Dalmane to relieve Rose's nervousness and stress, and Lomotil, Bentyl, Librax, and Tagamet for her stomach.[3] Rose Kennedy was a strict Catholic throughout her life. Even after her 100th birthday, she rarely missed Sunday Mass and maintained an "extremely prudish" exterior.
Her strict beliefs often placed her at odds with her children.[4] Jacqueline Kennedy described her mother-in-law in her correspondence with Father Joseph Leonard, an Irish priest: "I don't think Jack's mother is too bright – and she would rather say a rosary than read a book."[5] Rose Kennedy stated that she felt completely fulfilled as a full-time homemaker. In her 1974 autobiography, Times to Remember, she wrote, "I looked on child rearing not only as a work of love and a duty, but as a profession that was fully as interesting and challenging as any honorable profession in the world and one that demanded the best I could bring to it. ... What greater aspiration and challenge are there for a mother than the hope of raising a great son or daughter?" http://en.wikipedia.org/wiki/Rose_Kennedy
Views: 46965 Way Back
Why Do We Spend So Much Money on Defense? Bernie Sanders on Reinvesting in America (1991)
Sanders is a self-described socialist, democratic socialist, and progressive who admires the Nordic model of social democracy and is a proponent of workplace democracy. In November 2015, Sanders gave a speech at Georgetown University about his view of democratic socialism, including its place in the policies of presidents Franklin D. Roosevelt and Lyndon B. Johnson. In defining what democratic socialism means to him, Sanders said: "I don't believe government should take over the grocery store down the street or own the means of production, but I do believe that the middle class and the working families who produce the wealth of America deserve a decent standard of living and that their incomes should go up, not down. I do believe in private companies that thrive and invest and grow in America, companies that create jobs here, rather than companies that are shutting down in America and increasing their profits by exploiting low-wage labor abroad." Many commentators have noted the consistency of his views throughout his political career. Sanders voted against NAFTA, CAFTA, and PNTR with China, calling them a "disaster for the American worker," and has spoken against them for years, saying that they have resulted in American corporations moving abroad.
He is also against the Trans-Pacific Partnership, which he says was "written by corporate America and the pharmaceutical industry and Wall Street."[170][171] Sanders focuses on economic issues such as income and wealth inequality,[15][172] raising the minimum wage,[173] universal healthcare,[174] reducing the burden of student debt,[175] making public colleges and universities tuition-free by taxing financial transactions,[176] and expanding Social Security benefits by eliminating the cap on the payroll tax on all income above $250,000.[177][178] He has become a prominent supporter of laws requiring companies to give their workers parental leave, sick leave, and vacation time, noting that such laws have been adopted by almost every other developed country.[179] He also supports legislation that would make it easier for workers to join or form a union.[180][181] Sanders has advocated for greater democratic participation by citizens, campaign finance reform, and the overturn of Citizens United v. FEC.[182][183] He also advocates comprehensive financial reforms,[184] such as breaking up "too big to fail" financial institutions, restoring Glass–Steagall legislation, reforming the Federal Reserve Bank and allowing the Post Office to offer basic financial services in economically marginalized communities.[185][186][187][188] Sanders strongly opposed the U.S. 
invasion of Iraq and has criticized a number of policies instituted during the War on Terror, particularly mass surveillance and the USA PATRIOT Act.[189][190] Sanders holds liberal stances on social issues: he has advocated for LGBT rights and against the Defense of Marriage Act, is pro-choice on abortion, and opposes the defunding of Planned Parenthood.[191][192] He has denounced institutional racism and called for criminal justice reform to reduce the number of people in prison, advocates a crackdown on police brutality, and supports abolishing private, for-profit prisons[193][194][195] and the death penalty.[196] Sanders supports legalizing marijuana at the federal level.[197] On November 15, 2015, in response to ISIS's attacks in Paris, Sanders cautioned against "Islamophobia," said "We gotta be tough, not stupid" in the war against ISIS, and said the U.S. should continue to welcome Syrian refugees.[198] Sanders advocates bold action to reverse global warming and substantial investment in infrastructure, with "energy efficiency and sustainability" and job creation as prominent goals.[199][200] Sanders regards climate change as the greatest threat to national security. https://en.wikipedia.org/wiki/Bernie_Sanders
Views: 4773 Way Back
What Are the Arguments Against Religion? A. C. Grayling on the Case for Humanism (2013)
The God Argument: The Case against Religion and for Humanism is a 2013 book by the English philosopher and humanist A. C. Grayling, which counters the arguments for the existence of God and puts forward humanism as an alternative to religion. About the book: https://www.amazon.com/gp/product/1620401924/ref=as_li_tl?ie=UTF8&camp=1789&creative=9325&creativeASIN=1620401924&linkCode=as2&tag=ub066-20&linkId=c1552cb601c26a65278bc28ba7e408bf Grayling is concerned with tone. He claims to have written the first book "thoroughly and calmly to examine all the arguments offered in support of religious beliefs", such as the ontological argument for the existence of God, which he argues against in the first half of the book. In the second half, he proposes humanism as a suitable substitute for religion as the basis of a moral life, or what he calls a "good life". According to his definition of humanism, if you believe that moral choices should be grounded in "the responsible use of reason" and "human experience in the real world", then you are a humanist. http://en.wikipedia.org/wiki/The_God_Argument Anthony Clifford "A. C." Grayling (born 3 April 1949) is an English philosopher. In 2011 he founded and became the first Master of New College of the Humanities, an independent undergraduate college in London. Until June 2011, he was Professor of Philosophy at Birkbeck, University of London, where he had taught from 1991. He is also a supernumerary fellow of St Anne's College, Oxford. Grayling is the author of about 30 books on philosophy, including The Refutation of Scepticism (1985), The Future of Moral Values (1997), The Meaning of Things (2001), The Good Book (2011), and The God Argument (2013). He is a Trustee of the London Library, a Fellow of the Royal Society of Literature, and a Fellow of the Royal Society of Arts.[1] He is a director and contributor at Prospect Magazine.
His main academic interests lie in epistemology, metaphysics, and philosophical logic.[1] He has described himself as "a man of the left"; in Britain he is associated with the new atheism movement[2] and is sometimes described as the "Fifth Horseman of New Atheism". http://en.wikipedia.org/wiki/A._C._Grayling Image By Ian Scott (Own work) [CC-BY-SA-3.0 (http://creativecommons.org/licenses/by-sa/3.0)], via Wikimedia Commons
Views: 75033 Way Back
How We Think Without Thinking: Malcolm Gladwell on Great Decision Makers (2005)
Gladwell has written four books. More about his book Blink: The Power of Thinking Without Thinking: https://www.amazon.com/gp/product/0316010669/ref=as_li_tl?ie=UTF8&camp=1789&creative=9325&creativeASIN=0316010669&linkCode=as2&tag=ub066-20&linkId=a40cb8908184ea6942a795c2a1dc31e5 When asked about the process behind his writing, he said, "I have two parallel things I'm interested in. One is, I'm interested in collecting interesting stories, and the other is I'm interested in collecting interesting research. What I'm looking for is cases where they overlap." The initial inspiration for his first book, The Tipping Point, came from the sudden drop in crime in New York City. He wanted the book to have a broader appeal than just crime, however, and sought to explain similar phenomena through the lens of epidemiology. While Gladwell was a reporter for The Washington Post, he covered the AIDS epidemic. He began to take note of "how strange epidemics were," saying that epidemiologists have a "strikingly different way of looking at the world." The term "tipping point" comes from the moment in an epidemic when the virus reaches critical mass and begins to spread at a much higher rate. After The Tipping Point, Gladwell wrote Blink in 2005. The book explains how the human subconscious interprets events or cues and how past experiences can lead people to make informed decisions very rapidly, using examples like the Getty kouros and psychologist John Gottman's research on the likelihood of divorce in married couples.
Gladwell's hair was the inspiration for Blink.[25] He stated that he started to get speeding tickets all the time, an oddity considering that he had never got one before, and that he started getting pulled out of airport security lines for special attention.[26] In one incident, he was accosted by three police officers while walking in downtown Manhattan because his curly hair matched the profile of a rapist, even though the suspect otherwise looked nothing like him.[27] Gladwell's third book, Outliers, published in 2008, examines how a person's environment, in conjunction with personal drive and motivation, affects his or her possibility and opportunity for success. Gladwell's original question revolved around lawyers: "We take it for granted that there's this guy in New York who's the corporate lawyer, right? I just was curious: Why is it all the same guy?", in reference to the comparable family histories of many early corporate lawyers.[9] In another example given in the book, Gladwell noticed that people ascribe Bill Gates's success to being "really smart" or "really ambitious." He noted that he knew a lot of people who are really smart and really ambitious, but not worth 60 billion dollars. "It struck me that our understanding of success was really crude—and there was an opportunity to dig down and come up with a better set of explanations."[28] Gladwell's fourth book, What the Dog Saw: And Other Adventures, was published on October 20, 2009.[29] What the Dog Saw bundles together Gladwell's favorite articles from The New Yorker since he joined the magazine as a staff writer in 1996.[30] The stories share a common theme, namely that Gladwell tries to show us the world through the eyes of others, even if that other happens to be a dog.[31][32] Gladwell's books The Tipping Point (2000) and Blink (2005) were international bestsellers. The Tipping Point sold over two million copies in the United States.
Blink sold equally well.[13][33] As of November 2008, the two books had sold a combined 4.5 million copies.[34] Gladwell's next book, entitled David and Goliath, is scheduled to be released in 2013 and will examine the struggle of underdogs versus favorites. The book is partially inspired by an article Gladwell wrote for the New Yorker in 2009 entitled "How David Beats Goliath". http://en.wikipedia.org/wiki/Malcolm_gladwell
Views: 135435 Way Back
Animated Cartoon: Atomic Bombing of Hiroshima and Nagasaki, Japan
During the final stages of World War II in 1945, the Allies conducted two atomic bombings against the cities of Hiroshima and Nagasaki in Japan. These two events are the only uses of nuclear weapons in war to date. Following a firebombing campaign that destroyed many Japanese cities, the Allies prepared for a costly invasion of Japan. The war in Europe ended when Nazi Germany signed its instrument of surrender on 8 May, but the Pacific War continued. Together with the United Kingdom and the Republic of China, the United States called for a surrender of Japan in the Potsdam Declaration on 26 July 1945, threatening Japan with "prompt and utter destruction". The Japanese government ignored this ultimatum, and two nuclear weapons developed by the Manhattan Project were deployed. Little Boy was dropped on the city of Hiroshima on 6 August 1945, followed by Fat Man over Nagasaki on 9 August. Within the first two to four months of the bombings, the acute effects killed 90,000–166,000 people in Hiroshima and 60,000–80,000 in Nagasaki, with roughly half of the deaths in each city occurring on the first day. The Hiroshima prefecture health department estimated that, of the people who died on the day of the explosion, 60% died from flash or flame burns, 30% from falling debris, and 10% from other causes. During the following months, large numbers died from the effects of burns, radiation sickness, and other injuries, compounded by illness. In a US estimate of the total immediate and short-term causes of death, 15–20% died from radiation sickness, 20–30% from burns, and 50–60% from other injuries, compounded by illness. In both cities, most of the dead were civilians, although Hiroshima had a sizeable garrison. On 15 August, six days after the bombing of Nagasaki, Japan announced its surrender to the Allies, signing the Instrument of Surrender on 2 September, officially ending World War II.
The bombings led, in part, to post-war Japan's adopting Three Non-Nuclear Principles, forbidding the nation from nuclear armament. The role of the bombings in Japan's surrender and their ethical justification are still debated. http://en.wikipedia.org/wiki/Atomic_bombing_of_japan
Views: 163673 Way Back
A Former Clinton Adviser Exposes Hillary's History of Lies and Obfuscations (2004)
Morris first worked with Bill and Hillary Clinton during Bill Clinton's successful 1978 bid for Governor of Arkansas. About the book: https://www.amazon.com/gp/product/0060736690/ref=as_li_tl?ie=UTF8&camp=1789&creative=9325&creativeASIN=0060736690&linkCode=as2&tag=ub066-20&linkId=f65cd0c66c29a19aae2546f1b635d8b9 Morris did not have a role in Clinton's successful 1992 presidential campaign, which was instead headed by David Wilhelm, James Carville, George Stephanopoulos, and Paul Begala. After the 1994 mid-term election, in which Republicans took control of both houses of the United States Congress and gained considerable power in the states, Clinton once again sought Morris' help to prepare for the 1996 presidential election. As of August 2009, Morris lends his name and assistance to the League of American Voters, an advocacy group for seniors working to defeat the Patient Protection and Affordable Care Act. He has been described as "America's most ruthless political consultant" in the BBC documentary Century of the Self,[30] which chronicled how he brought lifestyle marketing to politics for the first time. Morris has consulted for candidates in other countries of the Western Hemisphere, including the campaigns of Fernando de la Rua for President of Argentina (1999), Jorge Batlle for President of Uruguay (1999), Vicente Fox for President of Mexico (2000), and Raphael Trotman for President of Guyana (2006). Regarding the 2004 Democratic presidential nomination, he initially stated that Howard Dean's candidacy could be written off right away. He had earlier discussed the likelihood of Dean defeating John Kerry after early strong showings by the former Vermont governor. Kerry defeated Dean and all his other rivals and won the nomination.
In a column in The Hill on June 22, 2005, Morris predicted that Hillary Clinton would face her "worst nightmare" in her 2006 Senate race against moderate Republican candidate Jeanine Pirro. Pirro's campaign collapsed within two months, after repeated crushing defeats in the opinion polls and reports of her husband's alleged Mafia ties. Morris even went so far as to suggest that Clinton would drop out of the race to focus on her 2008 presidential campaign. https://en.wikipedia.org/wiki/Dick_Morris
Views: 279309 Way Back
Aaron McGruder on The Boondocks: Cast, Characters, Cartoon, Quotes, Comic Strip (1999)
The Boondocks was a daily syndicated comic strip written and originally drawn by Aaron McGruder that ran from 1996 to 2006. His books: https://www.amazon.com/gp/search?ie=UTF8&tag=ub066-20&linkCode=ur2&linkId=160270564a0ef182cd900d8ef0fc5b94&camp=1789&creative=9325&index=books&keywords=Aaron%20McGruder Created by McGruder in 1996 for Hitlist.com, an early online music website, it was printed in the monthly hip hop magazine The Source in 1997. As it gained popularity, the comic strip was picked up by the Universal Press Syndicate and made its national debut on April 19, 1999. A popular and controversial strip, The Boondocks satirizes African American culture and American politics as seen through the eyes of the young, black radical Huey Freeman. McGruder's syndicate said it was among the biggest launches the company ever had.[2] McGruder sold the television and film rights for the strip to Sony Pictures Entertainment. The Boondocks animated TV series premiered on Cartoon Network's Adult Swim programming block on November 6, 2005. In 2013, McGruder launched an unsuccessful Kickstarter campaign for a live-action movie featuring the Boondocks character Uncle Ruckus. https://en.wikipedia.org/wiki/The_Boondocks_(comic_strip) Aaron McGruder (born May 29, 1974)[1] is an American writer, producer, and cartoonist best known for writing and drawing The Boondocks, a Universal Press Syndicate comic strip about two young African-American brothers from inner-city Chicago, Huey (named after Huey P. Newton) and his younger, wannabe-gangsta brother Riley,[2] who now live with their grandfather in a sedate suburb. McGruder was also the creator, executive producer, and head writer of The Boondocks animated TV series based on his strip, a screenwriter on Red Tails, and co-author, with Reginald Hudlin, of the 2004 graphic novel Birth of a Nation: A Comic Novel, drawn by cartoonist Kyle Baker.
Other projects include variety comedy series The Super Rumble Mix Show and Black Jesus, the latter on Adult Swim. He is a frequent public speaker on political and cultural issues. https://en.wikipedia.org/wiki/Aaron_McGruder
Views: 65221 Way Back
Shelby Foote & Walker Percy: Correspondence, Civil War, Quotes, Biography (1997)
Shelby Dade Foote, Jr. (November 17, 1916 – June 27, 2005) was an American historian and novelist who wrote The Civil War: A Narrative, a massive, three-volume history of the war. About the book of correspondence: https://www.amazon.com/gp/product/0393040313/ref=as_li_tl?ie=UTF8&camp=1789&creative=9325&creativeASIN=0393040313&linkCode=as2&tag=ub066-20&linkId=93e120f1c457b9f903787d5665aac796 With geographic and cultural roots in the Mississippi Delta, Foote's life and writing paralleled the radical shift from the agrarian planter system of the Old South to the Civil Rights era of the New South. Foote was relatively unknown to the general public for most of his life until his appearance in Ken Burns's PBS documentary The Civil War in 1990, where he introduced a generation of Americans to a war that he believed was "central to all our lives." Foote did all his writing by hand with an old-fashioned nib pen, disdaining the typewriter. Horton Foote, the playwright and screenwriter (To Kill a Mockingbird, Baby the Rain Must Fall, and Tender Mercies), was the voice of Jefferson Davis in the PBS series. The two Footes are third cousins; their great-grandfathers were brothers. "And while we didn't grow up together, we have become friends; I was the voice of Jefferson Davis in that TV series," Horton Foote added proudly. In 1992, Foote received an honorary doctorate from the University of North Carolina. In the early 1990s, Foote was interviewed by journalist Tony Horwitz for a project on American memory of the Civil War, which Horwitz eventually published as Confederates in the Attic (1998). Foote was also a member of The Modern Library's editorial board for the re-launch of the series in the mid-1990s. (This series published two books excerpted from his Civil War narrative. Foote also contributed a long introduction to their edition of Stephen Crane's The Red Badge of Courage, giving a narrative biography of the author.)
https://en.wikipedia.org/wiki/Shelby_Foote Walker Percy, Obl.S.B. (May 28, 1916 – May 10, 1990) was a Southern author from Covington, Louisiana, whose interests included philosophy and semiotics. Percy is known for his philosophical novels set in and around New Orleans, Louisiana, the first of which, The Moviegoer, won the U.S. National Book Award for Fiction. He devoted his literary life to the exploration of "the dislocation of man in the modern age." His work displays a combination of existential questioning, Southern sensibility, and deep Catholic faith. Percy was born in 1916 in Birmingham, Alabama, the first of three sons of LeRoy Pratt Percy and Martha Susan Phinizy. His father's Mississippi Protestant family included his uncle LeRoy Percy, a U.S. Senator, and LeRoy Pope Percy, a Civil War hero. In February 1917, Percy's grandfather committed suicide. This seemed to set a family pattern of emotional struggle and deaths that would haunt Percy throughout his life. In 1929, when Percy was 13, his father committed suicide. His mother took the family to live at her own mother's home in Athens, Georgia. Two years later, Percy's mother died when she drove a car off a country bridge and into Deer Creek near Leland, Mississippi, where they were visiting. Percy regarded this death as another suicide. Walker and his two younger brothers, LeRoy (Roy) and Phinizy (Phin), were taken in by their second cousin William Alexander Percy, a bachelor lawyer and poet in Greenville, Mississippi. Percy was raised as an agnostic, though he was nominally affiliated with a theologically liberal Presbyterian church. William Percy introduced him to many writers and poets, and to a neighboring youth his own age, Shelby Foote, who became his lifelong best friend. As young men, Percy and Foote decided to pay their respects to William Faulkner by visiting him in Oxford, Mississippi.
But when they arrived at his home, Percy was so in awe of the literary giant that he could not bring himself to speak to him. He later recounted how he could only sit in the car and watch while Foote and Faulkner had a lively conversation on the porch. Percy attended the University of North Carolina at Chapel Hill, where he joined the Xi chapter of Sigma Alpha Epsilon fraternity. He received a medical degree from Columbia University in New York City in 1941. There he underwent psychotherapy to deal with the legacy of suicides and depression in his family. After contracting tuberculosis while performing an autopsy at Bellevue Hospital Center, Percy spent several years recuperating at the Trudeau Sanatorium in Saranac Lake, New York. At the time, there was no known treatment for TB other than rest. During this period, Percy read the works of the Danish existentialist writer Søren Kierkegaard and the Russian novelist Fyodor Dostoevsky. He began to question the ability of science to explain the basic mysteries of human existence. https://en.wikipedia.org/wiki/Walker_Percy
Views: 25184 Way Back
How to Be Accountable: Jamie Dimon University Commencement Address (2010)
Dimon resigned as CEO of First Chicago Corp. In March 2000, Dimon became CEO of Bank One, the nation's fifth largest bank. When JPMorgan Chase purchased Bank One in July 2004, Dimon became president and chief operating officer of the combined company. On December 31, 2005, he was named chief executive officer of JPMorgan Chase and one year later, on December 31, 2006, he was named chairman of the board.[20] In March 2008 he was a board member of the New York Federal Reserve Bank and CEO of JPMorgan. Under Dimon's leadership, with the acquisitions during his tenure, JPMorgan Chase has become the leading U.S. bank in domestic assets under management, market capitalization value, and publicly traded stock value. JPMorgan Chase is also the No. 1 credit card provider in the U.S. In 2009, Dimon was considered one of "The TopGun CEOs" by Brendan Wood International, an advisory agency.[21][22] On September 26, 2011, Dimon was involved in a high-profile heated exchange with Mark Carney, the governor of the Bank of Canada, in which Dimon said provisions of the Basel III international financial regulations discriminate against U.S. banks and are "anti-American".[23] On May 10, 2012, JPMorgan Chase initiated an emergency conference call to report a loss of at least $2 billion in trades that Dimon said were "designed to hedge the bank's overall credit risks". The strategy was, in Dimon's words, "flawed, complex, poorly reviewed, poorly executed, and poorly monitored".[24] The episode is being investigated by the Federal Reserve, the SEC, and the FBI.[25] Dimon commented on the Volcker Rule in January 2012, "Part of the Volcker Rule I agreed with, which is no prop trading. But market making is an essential function. And the public should recognize that we have the widest, the deepest, the most transparent capital markets in the world. And part of that is because we have enormous market making. 
If the rules were written as they originally came out (I suspect they'll be changed), it would really make it hard to be a market maker in the United States."[26] He served as Chairman of the Executive Committee of The Business Council for 2011 and 2012.[27] In October 2013, it was reported that Dimon had given up his role as chairman of JP Morgan's main banking business, to be succeeded by former Johnson & Johnson chief executive William Weldon. Dimon retained his roles as chairman and CEO of the parent company.[28] On January 24, 2014, it was announced that Dimon would receive $20 million for his work in 2013, despite what was reported as the bank's worst year under Dimon's reign. The award was a 74% raise, which included over $18 million in restricted stock. Forbes reported that, in a statement following news of Dimon's compensation, the bank said, "Under Mr. Dimon's stewardship, the Company has fortified its control infrastructure and processes and strengthened each of its key businesses while continuing to focus on strengthening the Company's leadership capabilities across all levels." Dimon donates primarily to the Democratic Party.[35] In May 2012, he described himself as "barely a Democrat,"[36] stating, "I've gotten disturbed at some of the Democrats' anti-business behavior, the attacks on work ethic and successful people. I think it's very counterproductive. ... It doesn't mean I don't have their values. I want jobs. I want a more equitable society. I don't mind paying higher taxes. ... I do think we're our brother's keeper, but I think that attacking that which creates all things is not the right way to go about it."[36] After Obama won the 2008 presidential election, there was speculation that Dimon would serve in the Obama Administration as Secretary of the Treasury.
Obama eventually named the president of the Federal Reserve Bank of New York, Timothy Geithner, to the position.[37] Following the acquisition of Washington Mutual by JPMorgan Chase, Obama commented on Dimon's handling of the real-estate crash, credit crisis, and the banking collapse affecting corporations nationwide, including major financial institutions like Bank of America, Citibank, and Wachovia (now Wells Fargo): "You know, keep in mind, though, there are a lot of banks that are actually pretty well managed, JPMorgan being a good example. Jamie Dimon, the CEO there, I don't think should be punished for doing a pretty good job managing an enormous portfolio."[38] Dimon has had close ties to some people in the Obama White House, including former Chief of Staff Rahm Emanuel.[39] Dimon was one of three CEOs—along with Lloyd Blankfein and Vikram Pandit—said by the Associated Press to have had liberal access to former Treasury Secretary Timothy Geithner.[40][41] Nonetheless, Dimon has often publicly disagreed with some of Obama's policies. http://en.wikipedia.org/wiki/Jamie_Dimon
Views: 1819 Way Back
Why Thomas Paine's Common Sense Is Important: Chris Hedges & Cornel West (2014)
Thomas Paine (February 9, 1737 [O.S. January 29, 1736] – June 8, 1809) was an English and American political activist, philosopher, political theorist, and revolutionary. More from Cornel West: https://www.amazon.com/gp/search?ie=UTF8&tag=ub066-20&linkCode=ur2&linkId=782141bdd4fe3cbc002bc60543958ffe&camp=1789&creative=9325&index=books&keywords=cornel%20west More from Chris Hedges: https://www.amazon.com/gp/search?ie=UTF8&tag=ub066-20&linkCode=ur2&linkId=e3808bdb42d7439c1bbce65341f817e1&camp=1789&creative=9325&index=books&keywords=chris%20hedges As the author of the two most influential pamphlets at the start of the American Revolution, he inspired the rebels in 1776 to declare independence from Britain. His ideas reflected Enlightenment-era rhetoric of transnational human rights. He has been called "a corsetmaker by trade, a journalist by profession, and a propagandist by inclination". Born in Thetford, England, in the county of Norfolk, Paine emigrated to the British American colonies in 1774 with the help of Benjamin Franklin, arriving just in time to participate in the American Revolution. Virtually every rebel read (or listened to a reading of) his powerful pamphlet Common Sense (1776), which crystallized the rebellious demand for independence from Great Britain. His The American Crisis (1776–83) was a pro-revolutionary pamphlet series. Common Sense was so influential that John Adams said, "Without the pen of the author of Common Sense, the sword of Washington would have been raised in vain." Paine lived in France for most of the 1790s, becoming deeply involved in the French Revolution. He wrote Rights of Man (1791), in part a defense of the French Revolution against its critics. His attacks on the British writer Edmund Burke led to a trial and conviction in absentia in 1792 for the crime of seditious libel. In 1792, despite not being able to speak French, he was elected to the French National Convention. The Girondists regarded him as an ally.
Consequently, the Montagnards, especially Robespierre, regarded him as an enemy. In December 1793, he was arrested and imprisoned in Paris, then released in 1794. He became notorious because of his pamphlet The Age of Reason (1793–94), in which he advocated deism, promoted reason and free thought, and argued against institutionalized religion in general and Christian doctrine in particular. He also wrote the pamphlet Agrarian Justice (1795), discussing the origins of property, and introduced the concept of a guaranteed minimum income. In 1802, he returned to the U.S., where he died on June 8, 1809. Only six people attended his funeral, as he had been ostracized for his ridicule of Christianity. Thomas Paine has a claim to the title The Father of the American Revolution because of Common Sense, the pro-independence pamphlet he anonymously published on January 10, 1776; signed "Written by an Englishman", the pamphlet became an immediate success.[21] It quickly spread among the literate, and, in three months, 100,000 copies (an estimated 500,000 total, including unauthorized editions, sold during the course of the Revolution)[22] sold throughout the American British colonies (with only two million free inhabitants), making it the best-selling American title of the period.[22][23] Paine's original title for the pamphlet was Plain Truth; Paine's friend, pro-independence advocate Benjamin Rush, suggested Common Sense instead. The pamphlet came into circulation in January 1776, after the Revolution had started. It was passed around, and often read aloud in taverns, contributing significantly to spreading the idea of republicanism, bolstering enthusiasm for separation from Britain, and encouraging recruitment for the Continental Army. Paine provided a new and convincing argument for independence by advocating a complete break with history. Common Sense is oriented to the future in a way that compels the reader to make an immediate choice. 
It offers a solution for Americans disgusted with and alarmed at the threat of tyranny.[24] Paine was not, on the whole, expressing original ideas in Common Sense, but rather employing rhetoric as a means to arouse resentment of the Crown. To achieve these ends, he pioneered a style of political writing suited to the democratic society he envisioned, with Common Sense serving as a primary example. Part of Paine's work was to render complex ideas intelligible to average readers of the day, with clear, concise writing unlike the formal, learned style favored by many of Paine's contemporaries.[25] Scholars have put forward various explanations to account for its success, including the historic moment, Paine's easy-to-understand style, his democratic ethos, and his use of psychology and ideology. http://en.wikipedia.org/wiki/Thomas_Paine
Views: 92658 Way Back
George Carlin: Quotes, Stand-Up, Stuff, Advertising, Books, Education, Politics (1999)
George Denis Patrick Carlin (May 12, 1937 – June 22, 2008) was an American stand-up comedian, social critic, actor, author, and philosopher. His books: https://www.amazon.com/gp/search?ie=UTF8&tag=ub066-20&linkCode=ur2&linkId=e1e277ae9b6ddd7a28309d805ee9b60c&camp=1789&creative=9325&index=books&keywords=george%20carlin Carlin was noted for his black comedy and his thoughts on politics, the English language, psychology, religion, and various taboo subjects. Carlin and his "Seven dirty words" comedy routine were central to the 1978 U.S. Supreme Court case F.C.C. v. Pacifica Foundation, in which a 5–4 decision affirmed the government's power to regulate indecent material on the public airwaves. He is widely regarded as one of the most important and influential stand-up comedians: One newspaper called Carlin "the dean of counterculture comedians."[2] In 2004, Carlin was placed second on the Comedy Central list of "Top 10 Comedians of US Audiences" compiled for an April 2004 special.[3] The first of his 14 stand-up comedy specials for HBO was filmed in 1977. From the late 1980s, Carlin's routines focused on sociocultural criticism of American society. He often commented on contemporary political issues in the United States and satirized the excesses of American culture. He was a frequent performer and guest host on The Tonight Show during the three-decade Johnny Carson era, and hosted the first episode of Saturday Night Live. His final HBO special, It's Bad for Ya, was filmed less than four months before his death. In 2008, he was posthumously awarded the Mark Twain Prize for American Humor. 
Carlin's influences included Danny Kaye,[8][51] Jonathan Winters,[8] Lenny Bruce,[31][52][53] Richard Pryor,[31] Jerry Lewis,[8][31] the Marx Brothers,[8][31] Mort Sahl,[53] Spike Jones,[31] Ernie Kovacs,[31] and the Ritz Brothers.[8] Comedians who have claimed Carlin as an influence include Bill Burr,[54] Chris Rock,[55] Jerry Seinfeld,[56] Louis C.K.,[57] Lewis Black,[58] Jon Stewart,[59] Stephen Colbert,[60] Bill Maher,[61] Patrice O'Neal,[62] Adam Carolla,[63] Colin Quinn,[64] Steven Wright,[65] Mitch Hedberg,[66] Russell Peters,[67] Jay Leno,[68] Ben Stiller,[68] Kevin Smith,[69] Chris Rush[70] and Rob McElhenney.[71] Carlin's acting career began in earnest with a major supporting role in the 1987 comedy hit Outrageous Fortune, starring Bette Midler and Shelley Long; it was his first notable screen role after a handful of previous guest roles on television series. His role as the drifter Frank Madras poked fun at the lingering effects of the 1960s counterculture. In 1989, he gained popularity with a new generation of teens when he was cast as Rufus, the time-traveling mentor of the titular characters in Bill & Ted's Excellent Adventure; he reprised the role in the film sequel Bill & Ted's Bogus Journey as well as the first season of the cartoon series. He also played "Mr. Conductor" on the PBS show Shining Time Station and narrated the show's Thomas the Tank Engine & Friends segments in the American version from 1991 to 1995, replacing Ringo Starr. In 1996 he continued to play "Mr. Conductor" in Mr. Conductor's Thomas Tales and Storytime with Thomas. After the cancellation of Shining Time Station, Mr. Conductor's Thomas Tales, and Storytime with Thomas, he left the role and was replaced in 1998 by Alec Baldwin, from the fifth season onwards. Also in 1991, Carlin had a major supporting role in the movie The Prince of Tides, which starred Nick Nolte and Barbra Streisand. He portrayed the gay neighbor of the main character's suicidal sister. 
https://en.wikipedia.org/wiki/George_Carlin
Views: 187672 Way Back
Inside the Wealthiest Black Families in America: Social Clubs, Elites, History (1999)
Lawrence Otis Graham is a nationally known corporate and labor attorney as well as a New York Times bestselling author of 14 non-fiction books on the subjects of politics, education, race, and class in America. About the book: https://www.amazon.com/gp/product/0060984384/ref=as_li_tl?ie=UTF8&camp=1789&creative=9325&creativeASIN=0060984384&linkCode=as2&tag=ub066-20&linkId=1d4b8065ed34252e2ed83f2ff157c7d6 His work has appeared in such publications as The New York Times, Glamour, U.S. News & World Report, and Reader's Digest, where he has served as a contributing editor. His book Our Kind of People: Inside America’s Black Upper Class (HarperCollins) was a New York Times, Los Angeles Times and Essence Magazine bestseller, as well as a selection of the Book of the Month Club. Graham’s book The Senator and The Socialite: The Story of America’s First Black Political Dynasty (HarperCollins) is a biography of U.S. Senator Blanche Bruce, the first black person to serve a full term in the U.S. Senate. Graham is also the author of such books as The Best Companies for Minorities (Penguin Books) and Proversity: Getting Past Face Value (John Wiley & Sons)—two guides on diversity in the workplace—as well as Member of the Club, which focused on his now-famous experience of leaving his New York law firm and going undercover as a busboy to expose racism, sexism and anti-Semitism at a segregated country club in Connecticut during the 1990s. That piece was originally a cover story in New York Magazine, and was later optioned for a feature film by Warner Brothers. The article led to the Professional Golfers' Association of America's decision to no longer host events at segregated clubs. 
Upon the article's publication, Graham was named Young Lawyer of the Year by the National Bar Association, and several city bar associations around the nation adopted policies that discouraged member firms from hosting events or conducting business with clubs that did not permit women, minorities or Jews. Graham has appeared on numerous TV programs including Charlie Rose, The Oprah Winfrey Show, Today Show, The View, Hardball with Chris Matthews, and Good Morning America, and has been profiled in USA Today, TIME Magazine, Ebony, and People Magazine. He is a popular speaker at colleges, corporations and other institutions where he has addressed the issues of education, diversity and American culture. His audiences have included Duke University, UCLA, Howard University, Yale University, Kraft Foods, Corning, Xerox, Disney, American Jewish Committee, the American Library Association, and other organizations around the U.S. and Japan. His research and advice have appeared in The Wall Street Journal. He launched a campaign to get the U.S. Postal Service to honor Senator Blanche Bruce on a postage stamp, since the nation has never placed a black elected official on a stamp. A former adjunct professor at Fordham University, Graham has taught African American Studies as well as American Government. Graham appears weekly as a political commentator on News 12,[citation needed] and he writes Westchester Magazine's online political column, Point of View. He is chairman of the Westchester County Police Board and has served on such other boards as Red Cross of Westchester, the Boy Scouts of America, Princeton Center for Leadership Training, Jack & Jill Foundation, and Council on Economic Priorities. Graham is Editor at Large of Uptown Magazine. Graham is also a trustee of SUNY Purchase College Foundation and the Horace Mann School. https://en.wikipedia.org/wiki/Lawrence_Otis_Graham
Views: 40747 Way Back
The Importance of Understanding Your Enemy: Tom Clancy on Politics (2005)
By 1988, Clancy had earned $1.3 million for The Hunt for Red October and had signed a $3 million contract for his next three books. By 1997, it was reported that Penguin Putnam Inc. (part of Pearson Education) would pay Clancy $50 million for world rights to two new books and another $25 million to Red Storm Entertainment for a four-year book/multimedia deal. Clancy followed this up with an agreement with Penguin's Berkley Books for 24 paperbacks to tie in with the ABC television miniseries Tom Clancy's Net Force, which aired in the fall/winter of 1998. The Op-Center universe laid the groundwork for a series of books written by Jeff Rovin under an agreement worth $22 million, bringing the total value of the package to $97 million. In 1993, Clancy joined a group of investors that included Peter Angelos and bought the Baltimore Orioles from Eli Jacobs.[13][14] In 1998, he reached an agreement to purchase the Minnesota Vikings but had to abandon the deal because of the cost of a divorce settlement.[15][16] In 2008, the French video game manufacturer Ubisoft purchased the use of Clancy's name for an undisclosed sum. It has been used in conjunction with video games and related products such as movies and books.[17] Based on his interest in private spaceflight and his US$1 million investment in the launch vehicle company Rotary Rocket,[18] Clancy was interviewed in 2007 for the documentary film Orphans of Apollo (2008). A longtime holder of conservative and Republican views, Clancy dedicated several of his books to American conservative political figures, most notably Ronald Reagan. A week after the September 11, 2001 attacks, on The O'Reilly Factor, Clancy suggested that left-wing politicians in the United States were partly responsible for September 11 due to their "gutting" of the Central Intelligence Agency.[19] On September 11, 2001, Clancy was interviewed by Judy Woodruff on CNN.[20] During the interview, he asserted that "Islam does not permit suicide." 
Among other observations during this interview, Clancy cited discussions he had had with military experts on the lack of planning to handle a hijacked plane being used in a suicide attack and criticized the news media's treatment of the United States Intelligence Community. Clancy appeared again on PBS's Charlie Rose, to discuss the implications of the day's events with Richard Holbrooke, New York Times journalist Judith Miller, and Senator John Edwards, among others.[21] Clancy was interviewed on these shows because his book Debt of Honor (1994) included a scenario wherein a disgruntled Japanese airline pilot crashes a fueled Boeing 747 into the U.S. Capitol dome during an address by the President to a joint session of Congress, killing the President and most of Congress. This plot device bore strong similarities to the attacks of September 11, 2001. In later years, Clancy associated himself with General Anthony Zinni, a critic of the George W. Bush administration; Clancy was also critical of former Defense Secretary Donald Rumsfeld. https://en.wikipedia.org/wiki/Tom_Clancy
Views: 3018 Way Back
The Federalist Papers Explained: Authors, Hamilton, Important Quotes, Summary (2000)
The Federalist Papers is a collection of 85 articles and essays written (under the pseudonym Publius) by Alexander Hamilton, James Madison, and John Jay promoting the ratification of the United States Constitution. About the book: https://www.amazon.com/gp/product/B00486U8MC/ref=as_li_tl?ie=UTF8&camp=1789&creative=9325&creativeASIN=B00486U8MC&linkCode=as2&tag=ub066-20&linkId=62307a5e603abb32c5347629e6ae43e3 Seventy-seven were published serially in The Independent Journal and The New York Packet between October 1787 and August 1788. A compilation of these and eight others, called The Federalist: A Collection of Essays, Written in Favour of the New Constitution, as Agreed upon by the Federal Convention, September 17, 1787, was published in two volumes in 1788 by J. and A. McLean.[1] The collection's original title was The Federalist; the title The Federalist Papers did not emerge until the 20th century. Though the authors of The Federalist Papers foremost wished to influence the vote in favor of ratifying the Constitution, in Federalist No. 1 they explicitly set that debate in broader political terms: It has been frequently remarked, that it seems to have been reserved to the people of this country, by their conduct and example, to decide the important question, whether societies of men are really capable or not, of establishing good government from reflection and choice, or whether they are forever destined to depend, for their political constitutions, on accident and force.[2] Federalist No. 10, in which Madison discusses the means of preventing rule by majority faction and advocates a large, commercial republic, is generally regarded as the most important of the 85 articles from a philosophical perspective; it is complemented by Federalist No. 
14, in which Madison takes the measure of the United States, declares it appropriate for an extended republic, and concludes with a memorable defense of the constitutional and political creativity of the Federal Convention.[3] In Federalist No. 84, Hamilton makes the case that there is no need to amend the Constitution by adding a Bill of Rights, insisting that the various provisions in the proposed Constitution protecting liberty amount to a "bill of rights". Federalist No. 78, also written by Hamilton, lays the groundwork for the doctrine of judicial review by federal courts of federal legislation or executive acts. Federalist No. 70 presents Hamilton's case for a one-man chief executive. In Federalist No. 39, Madison presents the clearest exposition of what has come to be called "Federalism". In Federalist No. 51, Madison distills arguments for checks and balances in an essay often quoted for its justification of government as "the greatest of all reflections on human nature." According to historian Richard B. Morris, they are an "incomparable exposition of the Constitution, a classic in political science unsurpassed in both breadth and depth by the product of any later American writer." https://en.wikipedia.org/wiki/The_Federalist_Papers
Views: 16398 Way Back
Why Is Marshall McLuhan Important? Tom Wolfe on Media, Advertising, Technology (1999)
Herbert Marshall McLuhan, CC (July 21, 1911 – December 31, 1980) was a Canadian philosopher of communication theory and a public intellectual. His work is viewed as one of the cornerstones of the study of media theory, as well as having practical applications in the advertising and television industries. He was educated at the University of Manitoba and Cambridge University and began his teaching career as a Professor of English at several universities in the U.S. and Canada, before moving to the University of Toronto where he would remain for the rest of his life. McLuhan is known for coining the expressions the medium is the message and the global village, and for predicting the World Wide Web almost thirty years before it was invented. Although he was a fixture in media discourse in the late 1960s, his influence began to wane in the early 1970s. In the years after his death, he continued to be a controversial figure in academic circles. With the arrival of the internet, however, interest in his work and perspective has renewed. During his years at Saint Louis University (1937–1944), McLuhan worked concurrently on two projects: his doctoral dissertation and the manuscript that was eventually published in 1951 as the book The Mechanical Bride: Folklore of Industrial Man, which included only a representative selection of the materials that McLuhan had prepared for it. McLuhan's 1942 Cambridge University doctoral dissertation surveys the history of the verbal arts (grammar, logic, and rhetoric—collectively known as the trivium) from the time of Cicero down to the time of Thomas Nashe.[38] In his later publications, McLuhan at times uses the Latin concept of the trivium to outline an orderly and systematic picture of certain periods in the history of Western culture. McLuhan suggests that the Middle Ages, for instance, was characterized by the heavy emphasis on the formal study of logic. 
The key development that led to the Renaissance was not the rediscovery of ancient texts but a shift in emphasis from the formal study of logic to rhetoric and language. Modern life is characterized by the reemergence of grammar as its most salient feature—a trend McLuhan felt was exemplified by the New Criticism of Richards and Leavis.[39] In The Mechanical Bride, McLuhan turned his attention to analysing and commenting on numerous examples of persuasion in contemporary popular culture. This followed naturally from his earlier work, since both dialectic and rhetoric in the classical trivium aimed at persuasion. At this point his focus shifted dramatically, turning inward to study the influence of communication media independent of their content. His famous aphorism "the medium is the message" (elaborated in his 1964 book, Understanding Media: The Extensions of Man) calls attention to this intrinsic effect of communications media.[40] McLuhan also started the journal Explorations with anthropologist Edmund "Ted" Carpenter. In a letter to Walter Ong dated May 31, 1953, McLuhan reported that he had received a two-year grant of $43,000 from the Ford Foundation to carry out a communication project at the University of Toronto involving faculty from different disciplines, which led to the creation of the journal. Tom Wolfe suggests that a hidden influence on McLuhan's work is the Catholic philosopher Teilhard de Chardin, whose ideas anticipated those of McLuhan, especially the evolution of the human mind into the "noosphere". Wolfe theorizes that McLuhan may have thought that associating his ideas with those of a Catholic theologian, albeit one suppressed by Rome, might have denied him the intellectual audience he wanted to reach, and so omitted all reference to de Chardin from his published work while privately acknowledging his influence. https://en.wikipedia.org/wiki/Marshall_McLuhan
Views: 11986 Way Back
Fred Rogers on Children's Television: Quotes, Education, History, Life, Legacy (1990)
Fred McFeely Rogers (March 20, 1928 – February 27, 2003) was an American television personality, educator, Presbyterian minister, composer, songwriter, author, and activist. Rogers was most famous for creating, hosting, and composing the theme music for the educational preschool television series Mister Rogers' Neighborhood (1968–2001), which featured his gentle, soft-spoken personality and directness to his audiences.[1] Initially educated to be a minister, Rogers was displeased with the way television addressed children and made an effort to change this when he began to write for and perform on local Pittsburgh-area shows dedicated to youth. Pittsburgh's WQED developed his own show in 1968, and it was distributed nationwide by the Eastern Educational Television Network. Over the course of three decades on television, Fred Rogers became an indelible American icon of children's entertainment and education, as well as a symbol of compassion, patience, and morality.[2] He was also known for his advocacy of various public causes. His testimony before a lower court in favor of fair-use recording of television shows to play at another time (now known as time shifting) was cited in the U.S. Supreme Court's decision in the Betamax case, and he gave now-famous testimony to a U.S. Senate committee advocating government funding for children's television.[3] Rogers received the Presidential Medal of Freedom, some forty honorary degrees,[4] and a Peabody Award. He was inducted into the Television Hall of Fame, was recognized by two Congressional resolutions, and was ranked No. 35 among TV Guide's Fifty Greatest TV Stars of All Time.[5] Several buildings and artworks in Pennsylvania are dedicated to his memory, and the Smithsonian Institution displays one of his trademark sweaters as a "Treasure of American History". 
Rogers was born in Latrobe, Pennsylvania, 40 miles (65 km) southeast of Pittsburgh, to James and Nancy Rogers; he had one sister, Elaine.[6] Early in life he spent much of his free time with his maternal grandfather, Fred McFeely, who had an interest in music. He would often sing along as his mother played the piano, and he himself began playing at five.[4] Rogers graduated from Latrobe High School (1946).[7] He studied at Dartmouth College (1946–48),[8] then transferred to Rollins College in Winter Park, Florida, where he earned a B.A. in Music Composition in 1951.[9] Rogers was also a trained general aviation pilot.[10] At Rollins, he met Sara Joanne Byrd, an Oakland, Florida, native; they married on June 9, 1952.[11] They had two sons, James (b. 1959) and John (b. 1961).[12] In 1963, Rogers graduated from Pittsburgh Theological Seminary and was ordained a minister in the United Presbyterian Church in the U.S.A. Rogers had an apartment in New York City and a summer home on Nantucket Island in Massachusetts.[12][13] Rogers was red–green color blind,[14] swam every morning, and neither smoked nor drank.[15] He was a vegetarian on ethical grounds, stating "I don't want to eat anything that has a mother."[16] Despite recurring rumors, he never served in the military.[17][18][19] His office at WQED Pittsburgh famously did not have a desk, only sofa chairs, because Rogers thought a desk was "too much of a barrier". http://en.wikipedia.org/wiki/Fred_Rogers
Views: 40314 Way Back
Why Authors Are Important: Tom Wolfe On His Favorite Books, Writing, Art, Style  (1996)
Thomas Kennerly "Tom" Wolfe, Jr. (born March 2, 1931)[1] is an American author and journalist, best known for his association with and influence over the New Journalism literary movement, in which literary techniques are used in objective even-handed journalism. He began his career as a regional newspaper reporter in the 1950s, but achieved national prominence in the 1960s following the publication of such best-selling books as The Electric Kool-Aid Acid Test (a highly experimental account of Ken Kesey and the Merry Pranksters), and two collections of articles and essays, Radical Chic & Mau-Mauing the Flak Catchers and The Kandy-Kolored Tangerine-Flake Streamline Baby. His first novel, The Bonfire of the Vanities, released in 1987, was met with critical acclaim, became a commercial success, and was adapted as a major motion picture (directed by Brian De Palma). https://en.wikipedia.org/wiki/Tom_Wolfe Philip Milton Roth (born March 19, 1933) is an American novelist. He first gained attention with the 1959 novella Goodbye, Columbus, an irreverent and humorous portrait of American Jewish life for which he received the U.S. National Book Award for Fiction. Roth's fiction, regularly set in Newark, New Jersey, is known for its intensely autobiographical character, for philosophically and formally blurring the distinction between reality and fiction, for its "supple, ingenious style" and for its provocative explorations of Jewish and American identity. His profile rose significantly in 1969 after the publication of the controversial Portnoy's Complaint, the humorous and sexually explicit psychoanalytical monologue of "a lust-ridden, mother-addicted young Jewish bachelor," filled with "intimate, shameful detail, and coarse, abusive language." Roth is one of the most awarded U.S. writers of his generation: his books have twice received the National Book Award, twice the National Book Critics Circle award, and three times the PEN/Faulkner Award. 
He received a Pulitzer Prize for his 1997 novel, American Pastoral, which featured one of his best-known characters, Nathan Zuckerman, the subject of many other of Roth's novels. The Human Stain (2000), another Zuckerman novel, was awarded the United Kingdom's WH Smith Literary Award for the best book of the year. In 2001, in Prague, Roth received the inaugural Franz Kafka Prize. https://en.wikipedia.org/wiki/Philip_Roth Anton Pavlovich Chekhov (/ˈtʃɛkɔːf, -ɒf/;[1] Russian: Анто́н Па́влович Че́хов, pronounced [ɐnˈton ˈpavləvʲɪtɕ ˈtɕɛxəf]; 29 January 1860[2] – 15 July 1904)[3] was a Russian physician, playwright and author who is considered to be among the greatest writers of short stories in history. His career as a playwright produced four classics and his best short stories are held in high esteem by writers and critics.[4][5] Chekhov practiced as a medical doctor throughout most of his literary career: "Medicine is my lawful wife", he once said, "and literature is my mistress."[6] Along with Henrik Ibsen and August Strindberg, Chekhov is often referred to as one of the three seminal figures in the birth of early modernism in the theater.[7] Chekhov renounced the theatre after the disastrous reception of The Seagull in 1896, but the play was revived to acclaim in 1898 by Constantin Stanislavski's Moscow Art Theatre, which subsequently also produced Chekhov's Uncle Vanya and premiered his last two plays, Three Sisters and The Cherry Orchard. 
These four works present a challenge to the acting ensemble[8] as well as to audiences, because in place of conventional action Chekhov offers a "theatre of mood" and a "submerged life in the text".[9] Chekhov had at first written stories only for financial gain, but as his artistic ambition grew, he made formal innovations which have influenced the evolution of the modern short story.[10] He made no apologies for the difficulties this posed to readers, insisting that the role of an artist was to ask questions, not to answer them. https://en.wikipedia.org/wiki/Anton_Chekhov
Views: 9355 Way Back
Recollections of Fighting for the Confederacy: One of the Richest Personal Accounts (2000)
Criticism from authors in the Lost Cause movement attacked Longstreet's war career for many years after his death. Knudsen maintains that because Longstreet became a "reconstructed rebel" who embraced equal rights for blacks, reunification of the nation, and Reconstruction, he became the target of those who wanted to maintain racist policies and otherwise could not accept the verdict of the battlefield.[81] The attacks formally began on January 19, 1872, the anniversary of Robert E. Lee's birth and less than two years after his death. Jubal Early, in a speech at Washington College, exonerated Lee of his failure at Gettysburg and falsely accused Longstreet of attacking late on the second day and of being responsible for the debacle on the third. The following year William N. Pendleton, Lee's artillery chief, claimed in the same venue that Longstreet had disobeyed an explicit order to attack at sunrise on July 2. Both of these allegations were fabrications;[81] however, Longstreet failed to challenge them publicly until 1875. The delay was damaging to his reputation, as the Lost Cause mythology had taken hold in popular opinion by this time. 
In the 20th century, Douglas Southall Freeman kept criticism of Longstreet foremost in Civil War scholarship in his biography of Lee.[82] Clifford Dowdey, a Virginia newspaperman and novelist, was noted for his severe criticism of Longstreet in the 1950s and 1960s.[83] After Longstreet's death, his second wife, Helen, privately published Lee and Longstreet at High Tide in his defense, in which she stated that "the South was seditiously taught to believe that the Federal Victory was wholly the fortuitous outcome of the culpable disobedience of General Longstreet."[84] The publication of Michael Shaara's novel The Killer Angels in 1974, based in part on Longstreet's memoirs, and its 1993 film adaptation, Gettysburg, have been credited with helping to restore Longstreet's reputation as a general and to dramatically raise his public visibility.[85] The 1982 work by Thomas L. Connolly and Barbara L. Bellows, God and General Longstreet, provided a "further upgrading of Longstreet through an attack on Lee, the Lost Cause, and the Virginia revisionists."[86] Jeffry D. Wert wrote that "Longstreet ... was the finest corps commander in the Army of Northern Virginia; in fact, he was arguably the best corps commander in the conflict on either side."[87] Richard L. DiNardo wrote: "Even Longstreet's most virulent critics have conceded that he put together the best staff employed by any commander, and that his de facto chief of staff, Lieutenant Colonel G. Moxley Sorrel, was the best staff officer in the Confederacy." DiNardo cited the effective way in which Longstreet delegated responsibility for controlling battlefield movements to his staff, and how his staff was able to communicate with him more effectively during battles than the staffs of other Confederate generals. Longstreet plays a prominent role in Michael Shaara's Pulitzer Prize-winning 1974 novel The Killer Angels. 
He is portrayed in the 1993 film Gettysburg (based on The Killer Angels) by Tom Berenger, and in the prequel, Gods and Generals (2003), by Bruce Boxleitner. He was portrayed by Brian Amidei onstage in the world premiere of The Killer Angels at the Lifeline Theatre in Chicago.[93] Longstreet is a character in a number of prominent alternate history novels: Robert Skimin's Gray Victory (1988), Robert Conroy's 1901 (1995), and Harry Turtledove's Southern Victory series, beginning with How Few Remain (1997). Longstreet also appears as a character in Row After Row, a full-length one-act play by American playwright Jessica Dickey. The action of the play takes place one evening after a Gettysburg re-enactment. One re-enactor, Cal, plays Longstreet in the battle. In parts of the play, the action moves to the moments leading up to Pickett's Charge. The play ends with a tormented Longstreet addressing the future, as he wonders if we will ever form a "more perfect union." Longstreet is also a character in the alternate history novels Gettysburg: A Novel of the Civil War (2003), Grant Comes East (2004), and Never Call Retreat: Lee and Grant: The Final Victory (2005) by Newt Gingrich and William Forstchen. https://en.wikipedia.org/wiki/James_Longstreet
Views: 7069 Way Back
Why the U.S. Left the Gold Standard: Origins, Benefits, Drawbacks (2012)
The gold standard is supported by many followers of the Austrian School of Economics, free-market libertarians, and some supply-siders. In the United States, strict constitutionalists object to the government issuing fiat currency through central banks. Some gold-standard advocates also call for a mandated end to fractional-reserve banking. Many similar alternatives have been suggested, including energy-based currencies and baskets of currencies or commodities, with gold as one component. A return to the gold standard was considered by the US Gold Commission in 1982, but found only minority support. In 2001 Malaysian Prime Minister Mahathir bin Mohamad proposed a new currency to be used initially for international trade among Muslim nations: the Islamic gold dinar, defined as 4.25 grams of pure (24-carat) gold. Mohamad claimed it would be a stable unit of account and a political symbol of unity between Islamic nations, purportedly reducing dependence on the dollar and establishing a non-debt-backed currency in accordance with Sharia law, which prohibits the charging of interest. As of 2013 the global monetary system continued to rely on the dollar as a reserve currency. Former U.S. 
Federal Reserve Chairman Alan Greenspan acknowledged he was one of "a small minority" within the central bank that had some positive view of the gold standard.[87] Greenspan once famously argued the case for returning to a 'pure' gold standard in his 1966 paper "Gold and Economic Freedom", in which he described supporters of fiat currencies as "welfare statists" intending to use monetary policy to finance deficit spending.[88] More recently he claimed that by focusing on targeting inflation "central bankers have behaved as though we were on the gold standard", rendering a return to the standard unnecessary.[89] Similarly, economists like Robert Barro have argued that whilst some form of "monetary constitution" is essential for stable, depoliticized monetary policy, the form this constitution takes—for example, a gold standard, some other commodity-based standard, or a fiat currency with fixed rules for determining the quantity of money—is considerably less important.[90] Congressman Ron Paul is a long-term, high-profile advocate of a gold standard, but has also expressed support for a standard based on a basket of commodities that better reflects the state of the economy.[91] In 2011 the Utah legislature passed a bill to accept federally issued gold and silver coins as legal tender for paying taxes.[92] As federally issued currency, the coins were already legal tender for taxes, although the market price of their metal content currently exceeds their face value. Similar legislation is under consideration in other US states.[93] The bill was initiated by newly elected Republican Party legislators associated with the Tea Party movement and was driven by anxiety over the policies of President Barack Obama.[94] In 2013, the Arizona Legislature passed SB 1439, which would have made gold and silver coin legal tender in payment of debt, but the bill was vetoed by the Governor. As of 2013, no countries use a gold standard. 
From 1936 until 2000 the Swiss franc was backed by a 40% gold reserve.[96] Gold reserves are held in significant quantity by many nations as a means of defending their currency and hedging against the dollar, which forms the bulk of liquid currency reserves.[97] Both gold coins and gold bars are traded in liquid markets and serve as a private store of wealth. In 1999 the European Central Bank and 11 European national banks signed the Washington Agreement on Gold, declaring that "gold will remain an important element of global monetary reserves"; the Agreement was later amended and extended. http://en.wikipedia.org/wiki/Gold_standard
Views: 27950 Way Back
Communists on Campus: The Weather Underground
The Weather Underground Organization (WUO), commonly known as the Weather Underground, was an American radical left organization founded on the Ann Arbor campus of the University of Michigan. Originally called Weatherman, the group became known colloquially as the Weathermen. Weatherman first organized in 1969 as a faction of Students for a Democratic Society (SDS), composed for the most part of the national office leadership of SDS and their supporters. Their goal was to create a clandestine revolutionary party for the overthrow of the US government. With revolutionary positions characterized by Black liberation rhetoric,[2] the group conducted a campaign of bombings through the mid-1970s and aided the 1970 jailbreak and escape of Timothy Leary. The "Days of Rage", their first public demonstration on October 8, 1969, was a riot in Chicago timed to coincide with the trial of the Chicago Seven. In 1970 the group issued a "Declaration of a State of War" against the United States government, under the name "Weather Underground Organization" (WUO).[4] The bombing attacks mostly targeted government buildings, along with several banks. Most were preceded by evacuation warnings, along with communiqués identifying the particular matter the attack was intended to protest. No one was killed in any of their acts of property destruction, although three members of the group died in the Greenwich Village townhouse explosion. For the bombing of the United States Capitol on March 1, 1971, they issued a communiqué saying it was "in protest of the U.S. invasion of Laos". For the bombing of the Pentagon on May 19, 1972, they stated it was "in retaliation for the U.S. bombing raid in Hanoi". For the January 29, 1975 bombing of the United States Department of State building, they stated it was "in response to escalation in Vietnam".[4] The Weathermen grew out of the Revolutionary Youth Movement (RYM) faction of SDS. 
It took its name from the lyric "You don't need a weatherman to know which way the wind blows", from the Bob Dylan song "Subterranean Homesick Blues". That lyric was also the title of a position paper the group distributed at an SDS convention in Chicago on June 18, 1969. This founding document called for a "white fighting force" to be allied with the "Black Liberation Movement" and other radical movements[5] to achieve "the destruction of US imperialism and achieve a classless world: world communism".[6] The Weathermen disintegrated after the United States reached a peace accord in Vietnam in 1973, after which the New Left declined. http://en.wikipedia.org/wiki/Weather_underground
Views: 37020 Way Back
Alain de Botton: A Serious But Intellectually Wild Ride
Alain de Botton, FRSL (/dəˈbɒtən/; born 20 December 1969) is a Swiss-born British philosopher and author. His books discuss various contemporary subjects and themes, emphasizing philosophy's relevance to everyday life. He published Essays in Love (1993), which went on to sell two million copies. Other bestsellers include How Proust Can Change Your Life (1997), Status Anxiety (2004) and The Architecture of Happiness (2006). He co-founded The School of Life in 2008 and Living Architecture in 2009. In 2015, he was awarded "The Fellowship of Schopenhauer", an annual writers' award from the Melbourne Writers Festival, for his work. He was sent to the Dragon School, a boarding school in Oxford, where English became his primary language. Describing himself as a shy child, he boarded at Harrow School before going up to Gonville and Caius College, Cambridge, where he read History (1988–1991), and subsequently completed a master's degree (MPhil) in Philosophy at King's College London (1991–1992).[5] He began studying for a PhD in French philosophy at Harvard University,[6] but gave up this research to write books for the general public. In 1997 he published his first non-fiction book, How Proust Can Change Your Life, based on the life and works of Marcel Proust.[9] It was a bestseller in both the US and UK.[10] This was followed by The Consolations of Philosophy in 2000. The title of the book is a reference to Boethius's Consolation of Philosophy, in which philosophy appears as an allegorical figure to console Boethius in the period leading up to his impending execution. In The Consolations of Philosophy, de Botton attempts to demonstrate how the teachings of philosophers such as Epicurus, Montaigne, Nietzsche, Schopenhauer, Seneca, and Socrates can be applied to modern everyday woes. The book has been both praised and criticized for its therapeutic approach to philosophy. In 2004, he published Status Anxiety. 
In The Architecture of Happiness[11] (2006), he discusses the nature of beauty in architecture and how it is related to the well-being and general contentment of the individual and society. He describes how architecture affects people every day, though people rarely pay particular attention to it. A good portion of the book discusses how human personality traits are reflected in architecture. He defends Modernist architecture, and chastises the pseudo-vernacular architecture of housing, especially in the UK. "The best modern architecture," he argues, "doesn't hold a mirror up to nature, though it may borrow a pleasing shape or expressive line from nature's copybook. It gives voice to aspirations and suggests possibilities. The question isn't whether you'd actually like to live in a Le Corbusier home, but whether you'd like to be the kind of person who'd like to live in one." https://en.wikipedia.org/wiki/Alain_de_Botton
Views: 3234 Way Back
Zora Neale Hurston: One of the Most Intriguing Cultural Figures of the 20th Century (2003)
Zora Neale Hurston (January 7, 1891 – January 28, 1960) was an American novelist, short story writer, folklorist, and anthropologist. Of Hurston's four novels and more than 50 published short stories, plays, and essays, she is best known for her 1937 novel Their Eyes Were Watching God. In addition to new editions of her work being published after a revival of interest in her in 1975, her manuscript Every Tongue Got to Confess (2001), a collection of folktales gathered in the 1920s, was published posthumously after being discovered in the Smithsonian archives. Hurston's hometown of Eatonville, Florida, is home to the Zora Neale Hurston Museum of Fine Arts, named in her honor, and celebrates her life and legacy every year at the Zora Neale Hurston Festival of the Arts and Humanities.[37] A library named for her opened in January 2004. Hurston's house in Fort Pierce has been designated a National Historic Landmark. The city celebrates Hurston annually through various events such as Hattitudes, birthday parties, and a several-day festival at the end of April known as Zora Fest.[38][39] Author Alice Walker sought out Hurston's grave in 1973 and placed a grave marker calling her "A Genius of the South."[40][41] Walker then published "In Search of Zora Neale Hurston" in the March 1975 issue of Ms. magazine, reviving interest in Hurston's work.[42] The renewal of attention to Hurston was related also to the rise of new African-American authors such as Maya Angelou, Toni Morrison, and Walker, whose works are centered on African-American experiences and include, but do not necessarily focus upon, racial struggle.[citation needed] In 2002, scholar Molefi Kete Asante included Zora Neale Hurston in his list of 100 Greatest African Americans.[43] Barnard College dedicated its 2003 Virginia C. Gildersleeve Conference to Hurston. 
"Jumpin’ at the Sun: Reassessing the Life and Work of Zora Neale Hurston" focused on her work and influence.[44] Alice Walker's Gildersleeve lecture detailed her work on discovering and publicizing Hurston's legacy.[45] The Zora Neale Hurston Award was established in 2008; it is awarded to an American Library Association member who has "demonstrated leadership in promoting African American literature".[46] She was inducted as a member of the inaugural class of the New York Writers Hall of Fame in 2010. On January 7, 2014, the 123rd anniversary of Hurston's birthday was commemorated by a Google Doodle.[47][48] She was one of twelve inaugural inductees to the Alabama Writers Hall of Fame on June 8, 2015. Film and TV: In 1989 PBS aired a drama based on Hurston's life entitled Zora is My Name! The 2004 film Brother to Brother, set in part during the Harlem Renaissance, featured Hurston (portrayed by Aunjanue Ellis). Their Eyes Were Watching God was adapted for a 2005 film of the same title by Oprah Winfrey's Harpo Productions, with a teleplay by Suzan-Lori Parks. The film starred Halle Berry as Janie Starks. On April 9, 2008, PBS broadcast a 90-minute documentary, Zora Neale Hurston: Jump at the Sun,[64] written and produced by filmmaker Kristy Andersen,[65] as part of the American Masters series.[66] In 2009, Hurston was featured in a 90-minute documentary about the WPA Writers' Project titled Soul of a People: Writing America's Story,[67] which premiered on the Smithsonian Channel. Her work in Florida during the 1930s is highlighted in the companion book, Soul of a People: The WPA Writers' Project Uncovers Depression America. https://en.wikipedia.org/wiki/Zora_Neale_Hurston
Views: 8893 Way Back
The Federalist Papers' Ideals, Arguments, and Enduring Effects on American Political Life (1999)
Charles R. Kesler (born 1956) is a professor of government at Claremont McKenna College and Claremont Graduate University. He holds a Ph.D. in Government from Harvard University, from which he also received his A.B. in 1978. He is the editor of the Claremont Review of Books, a quarterly political magazine, and the author of Keeping the Tablets: Readings in American Conservatism. A senior fellow of the conservative Claremont Institute, he directs its Publius Fellows Program, a summer institute, and formerly directed the Henry Salvatori Center for the Study of Individual Freedom in the Modern World at Claremont McKenna College. Kesler describes the purpose of the Institute as follows: Some conservatives start, as it were, from Edmund Burke; others from Friedrich Hayek. While we respect both thinkers and their schools of thought, we begin instead from America, the American political tradition in all its genius and profundity, and the relation of our tradition to revealed wisdom and to what the elderly Jefferson once called, rather insouciantly, "the elementary books of public right, as Aristotle, Cicero, Locke, Sidney, etc." We think conservatism should take its bearings from the founders' statesmanship, our citizens' loyalty to the Declaration and Constitution, and the scenes, both tender and proud, of our national history. This kind of approach clears the air. It concentrates the mind. It engages and informs the ordinary citizen's patriotism. 
And it introduces a new, sharper view of liberalism as descended not from the French Revolution, the Industrial Revolution, nor (God forbid) Abraham Lincoln, but from that movement which, a century ago, criticized George Washington's and Lincoln's Constitution as outmoded and, as we'd say today, racist, sexist, and antidemocratic. The Progressives broke with the old Constitution and its postulates, and set out to make a new, living constitution and a new, unlimited state, and the Obama Administration's programs are merely the latest, and worst, installment of that purported evolution. Kesler has edited several widely used books: Saving the Revolution: The Federalist Papers and the American Founding (Free Press, 1987), held in over 500 American libraries; Keeping the Tablets: Readings in American Conservatism (HarperCollins, 1988), together with William F. Buckley, Jr.; and The Federalist Papers (Signet Classics, 2003), the best-selling edition of The Federalist Papers.[3] He has published many peer-reviewed articles, political essays, and reviews in publications of the Claremont Institute and elsewhere. Kesler was a delegate to the International Youth Year Conference in Jamaica in 1985. https://en.wikipedia.org/wiki/Charles_R._Kesler
Views: 6650 Way Back
The Untold Story of the American Women Trapped on Bataan (1999)
Elizabeth M. Norman earned a Ph.D. and M.A. from New York University and a B.A. from Rutgers University. She is a registered nurse. Norman has served as director of the doctoral program at New York University's Division of Nursing in the School of Education. As an author, Norman has made significant contributions to the field of women's military history. Her work brings to light the often-neglected experiences of women during wartime. Her first book, Women at War, examines the previously untold experience of fifty women who served as nurses during the Vietnam War. Her second book, We Band of Angels, is based on interviews with female nurses who were held captive by the Japanese for three years on Bataan in the Philippines during World War II. Norman was the first to speak to these women, known as the Angels of Bataan, about the tragedy they endured.[3] She described the experience of conducting these interviews as "women talking candidly about women swept up in a lethal enterprise of men."[4] Her third book, Tears in the Darkness, is a history of the Bataan Death March and the American, Filipino, and Japanese combatants who were involved.[5] Her inspiration to write about military nurses came from her experience as a nurse as well as the fact that both her mother and her husband served in the U.S. military. Works: Norman, Elizabeth M. (1999). We Band of Angels: The Untold Story of American Nurses Trapped on Bataan by the Japanese. New York: Random House. ISBN 0671787187. OCLC 39930499. Norman, Elizabeth M. (1990). Women at War: The Story of Fifty Military Nurses Who Served in Vietnam. Philadelphia: University of Pennsylvania Press. ISBN 0812282493. OCLC 21332836. Norman, Michael; Norman, Elizabeth M. (2009). Tears in the Darkness: The Story of the Bataan Death March and Its Aftermath. New York: Farrar, Straus and Giroux. ISBN 9780374272609. OCLC 263984541. 
We Band of Angels was well received and has been reviewed by forty American newspapers, including the New York Times and the Washington Post.[6] The Publishers Weekly review of the book read, "[Norman] captures moments of great courage...but the true highlights come in the evocation of tears and sweat that went into the nurses' daily struggle."[7] Her book Tears in the Darkness, co-written with her husband Michael Norman, was listed number nine on the New York Times Best Sellers list for non-fiction in July 2009. The New York Times said of the book, "'Tears in the Darkness' is a book about heroism and survival...If you aren't weeping openly by the book's final scenes...then you have a hard crust of salt around your soul." Awards: Rutgers Living History Society's Stephen E. Ambrose Oral History Award (2011); Dayton Literary Peace Prize for Tears in the Darkness (2010); Lavinia Dock Award for historical scholarship; American Academy of Nursing National Media Award; Agnes Dillon Randolph Award. https://en.wikipedia.org/wiki/Elizabeth_Norman
Views: 55310 Way Back
The Epic Story of Dutch Manhattan and the Forgotten Colony That Shaped America (2004)
New Netherland (Dutch: Nieuw Nederland; Latin: Nova Belgica or Novum Belgium) was a 17th-century colony of the Dutch Republic located on the East Coast of North America. The claimed territories extended from the Delmarva Peninsula to extreme southwestern Cape Cod, while the more limited settled areas are now part of the Mid-Atlantic states of New York, New Jersey, Delaware, and Connecticut, with small outposts in Pennsylvania and Rhode Island. The colony was conceived by the Dutch West India Company (WIC) in 1621 to capitalise on the North American fur trade. During its first decades, New Netherland was settled rather slowly, owing both to policy mismanagement by the WIC and to conflicts with American Indians. The settlement of New Sweden, founded by the Swedish South Company, encroached on its southern flank, while its northern border was redrawn to accommodate an expanding New England Confederation. During the 1650s, the colony experienced dramatic growth and became a major port for trade in the North Atlantic. Fort Amsterdam surrendered to England in 1664, an event that contributed to the Second Anglo-Dutch War; the transfer was formalized in 1667. In 1673 the Dutch retook the area, but relinquished it the following year under the Second Treaty of Westminster, which ended the Third Anglo-Dutch War. The inhabitants of New Netherland were American Indians, European colonists, and Africans, the last chiefly imported as enslaved laborers. The colony had an estimated population of between 7,000 and 8,000 people in 1664, at the time of the transfer to England, half of whom were not of Dutch descent.[3] Descendants of the original settlers played a prominent role in colonial America. For two centuries, New Netherland Dutch culture characterized the region of today's Capital District around Albany, the Hudson Valley, western Long Island, northeastern New Jersey, and New York City. The concept of tolerance was a mainstay of the province's Dutch mother country. 
The Dutch Republic was a haven for many religious and intellectual refugees fleeing oppression, as well as home to the world's major ports in the newly developing global economy. Concepts of religious freedom and free trade (including a stock market) were Netherlands imports. In 1682, visiting Virginian William Byrd commented of New Amsterdam that "they have as many sects of religion there as at Amsterdam". The Dutch Republic was one of the first nation-states of Europe where citizenship and civil liberties were extended to large segments of the population. The framers of the U.S. Constitution were influenced by the Constitution of the Republic of the United Provinces, though that influence was more as an example of things to avoid than of things to imitate.[59] In addition, the Act of Abjuration, essentially the declaration of independence of the United Provinces from the Spanish throne, is strikingly similar to the later American Declaration of Independence,[60] though there is no concrete evidence that one influenced the other. John Adams went so far as to say that “the origins of the two Republics are so much alike that the history of one seems but a transcript from that of the other.”[61] The Articles of Capitulation of 1664 (outlining the terms of transfer to the English)[53] provided for the right to worship as one wished; they were incorporated into subsequent city, state, and national constitutions in the United States, and form the legal and cultural code that lies at the root of New York Tri-State traditions.[62] Many prominent U.S. citizens are Dutch Americans directly descended from the Dutch families of New Netherland.[63] The Roosevelt family, which produced two Presidents, descends from Claes van Roosevelt, who emigrated around 1650.[64] The Van Buren family of President Martin Van Buren also originated in New Netherland.[4] Through Flora Sheldon, the Bush family likewise descends from the Schuyler family. 
https://en.wikipedia.org/wiki/New_Netherland
Views: 8705 Way Back
How the Largest Corporate Merger in History Became an Epic Disaster (2003)
In 2000, AOL purchased Time Warner for US$164 billion. The deal, announced on January 10, 2000 and officially filed on February 11, 2000, employed a merger structure in which each original company merged into a newly created entity. The Federal Trade Commission cleared the deal on December 14, 2000, and gave final approval on January 11, 2001; the company completed the merger later that day. The deal was approved on the same day by the Federal Communications Commission, and had already been cleared by the European Commission on October 11, 2000. Due to the larger market capitalization of AOL, its shareholders would own 55% of the new company while Time Warner shareholders owned only 45%, so in actual practice AOL had acquired Time Warner, even though Time Warner had far more assets and revenues. Time Warner chief Jeff Bewkes later called the merger "the biggest mistake in corporate history".[53] AOL Time Warner Inc., as the company was then called, was supposed to be a merger of equals, with top executives from both sides. Gerald Levin, who had served as CEO of Time Warner, was CEO of the new company. Steve Case served as Executive Chairman of the board of directors, Robert W. Pittman (President and COO of AOL) and Dick Parsons (President of Time Warner) served as Co-Chief Operating Officers, and J. Michael Kelly (the CFO from AOL) became the Chief Financial Officer.[54] According to AOL President and COO Bob Pittman, the slow-moving Time Warner would now take off at Internet speed, accelerated by AOL: "All you need to do is put a catalyst to [Time Warner], and in a short period, you can alter the growth rate. The growth rate will be like an Internet company." When the AOL Time Warner deal was announced, the vision for its future seemed clear and straightforward: by tapping into AOL, Time Warner would reach deep into the homes of tens of millions of new customers. 
AOL would use Time Warner's high-speed cable lines to deliver Time Warner's branded magazines, books, music, and movies to its subscribers. This would have created 130 million subscription relationships. Unfortunately, the growth and profitability of the AOL division stalled due to advertising and subscriber slowdowns, caused in part by the burst of the dot-com bubble and the economic recession after September 2001. The value of the America Online division dropped significantly, much as the market valuations of similar independent internet companies had drastically fallen, forcing a goodwill write-off that caused AOL Time Warner to report a loss of $99 billion in 2002 — at the time, the largest loss ever reported by a company. The total value of AOL stock subsequently went from $226 billion to about $20 billion.[55] An outburst by Vice Chairman Ted Turner at a board meeting prompted Steve Case to contact each of the directors and push for CEO Gerald Levin's ouster. Although Case's coup attempt was rebuffed by Parsons and several other directors, Levin became frustrated with being unable to "regain the rhythm" at the combined company and announced his resignation in the fall of 2001, effective May 2002.[56] Although Co-COO Bob Pittman was the strongest supporter of Levin and largely seen as the heir apparent, Dick Parsons was instead chosen as CEO. AOL Time Warner CFO J. Michael Kelly was demoted to COO of the AOL division and replaced as CFO by Wayne Pace. AOL Chairman and CEO Barry Schuler was removed from his position and placed in charge of a new "content creation division", being replaced on an interim basis by Pittman, who was already serving as the sole COO after Parsons' promotion. Many expected synergies between AOL and other Time Warner divisions never materialized, as most Time Warner divisions were considered independent fiefs that rarely cooperated prior to the merger. 
A new incentive program that granted options based on the performance of AOL Time Warner, replacing cash bonuses tied to the results of their own divisions, caused resentment among Time Warner division heads, who blamed the AOL division for failing to meet expectations and dragging down the combined company. AOL Time Warner COO Pittman, who had expected the divisions to work closely towards convergence, instead found heavy resistance from many division executives, who also criticized Pittman for adhering to optimistic growth targets for AOL Time Warner that were never met. Some of the attacks on Pittman were reported to come from the print media in the Time Inc. division under Don Logan.[57] Furthermore, CEO Parsons' democratic style prevented Pittman from exercising authority over the "old-guard" division heads who resisted Pittman's synergy initiatives. https://en.wikipedia.org/wiki/Time_Warner
Views: 2898 Way Back
Who Killed JFK and Why? America's Biggest Cover-Up Exposed After 50 Years! (2013)
A 2003 Gallup poll indicated that nearly 20% of Americans suspected Lyndon B. Johnson of being involved in the assassination of Kennedy. About the book: https://www.amazon.com/gp/product/1629144894/ref=as_li_tl?ie=UTF8&camp=1789&creative=9325&creativeASIN=1629144894&linkCode=as2&tag=ub066-20&linkId=3566e412f845e787ef50c212b9ceac5f Most current theories put forth a criminal conspiracy involving parties as varied as the CIA, the Mafia, anti-Castro Cuban exile groups, the military-industrial complex, the Israeli Mossad, sitting Vice President Lyndon B. Johnson, Cuban President Fidel Castro, FBI director J. Edgar Hoover, the KGB, or some combination of those entities. In an article published prior to the 50th anniversary of Kennedy's assassination, author Vincent Bugliosi estimated that a total of 42 groups, 82 assassins, and 214 people have been accused in conspiracy theories challenging the "lone gunman" theory. Critics of the Warren Commission have accused Johnson of plotting the assassination because he "disliked" the Kennedys and feared that he would be dropped from the Democratic ticket for the 1964 election. With his 1968 book, The Dark Side of Lyndon Baines Johnson, Joachim Joesten is credited as being the first conspiracy author to accuse Johnson of having a role in the assassination. According to Joesten, Johnson "played the leading part" in a conspiracy that involved "the Dallas oligarchy and ... local branches of the CIA, the FBI, and the Secret Service." Other assassination authors who have alleged complicity on the part of Johnson include Jim Marrs, Ralph D. Thomas, J. Gary Shaw, Larry Harris, Walt Brown, Noel Twyman, Barr McClellan, Craig Zirbel, Penn Jones, Jr., and Madeleine Brown. In 2003, researcher Barr McClellan published the book Blood, Money & Power. 
McClellan claims that Johnson, motivated by the fear of being dropped from the Kennedy ticket in 1964 and the need to cover up various scandals, masterminded Kennedy's assassination with the help of his friend, Austin attorney Edward A. Clark. The book suggests that a smudged partial fingerprint from the sniper's nest likely belonged to Johnson's associate Malcolm "Mac" Wallace, and that Mac Wallace was, therefore, on the sixth floor of the Depository at the time of the shooting. The book further claims that the killing of Kennedy was paid for by oil magnates, including Clint Murchison and H. L. Hunt. McClellan states that the assassination of Kennedy allowed the oil depletion allowance to be kept at 27.5 percent. It remained unchanged during the Johnson presidency. According to McClellan, this resulted in a saving of over $100 million to the American oil industry. McClellan's book subsequently became the subject of an episode of Nigel Turner's ongoing documentary television series, The Men Who Killed Kennedy. The episode, "The Guilty Men", drew angry condemnation from the Johnson family, Johnson's former aides, and former Presidents Gerald Ford and Jimmy Carter following its airing on The History Channel. The History Channel assembled a committee of historians who concluded the accusations in the documentary were without merit, and The History Channel apologized to the Johnson family and agreed not to air the series in the future. Madeleine Brown, who alleged she was the mistress of Johnson, also implicated him in a conspiracy to kill Kennedy. In 1997, Brown said that Johnson, along with H. L. Hunt, had begun planning Kennedy's demise as early as 1960. Brown claimed that by its fruition in 1963, the conspiracy involved dozens of persons, including the leadership of the FBI and the Mafia, as well as prominent politicians and journalists. 
In the documentary The Men Who Killed Kennedy, Madeleine Brown and May Newman (an employee of Texas oilman Clint Murchison) both placed J. Edgar Hoover at a social gathering at Murchison's mansion the night before the assassination. Also in attendance, according to Brown, were John McCloy, Richard Nixon, George Brown, R. L. Thornton, and H. L. Hunt. Madeleine Brown claimed that Johnson arrived at the gathering late in the evening and, in a "grating whisper," told her that the "...Kennedys will never embarrass me again—that's no threat—that's a promise." In addition, Brown said that on New Year's Eve 1963, she met Johnson at the Adolphus Hotel in Dallas and that he confirmed the conspiracy to kill Kennedy, insisting that "the fat cats of Texas and [U.S.] intelligence" had been responsible. Brown reiterated her allegations against Johnson in the 2006 documentary Evidence of Revision. In the same documentary, several other Johnson associates also voiced their suspicions of Johnson. http://en.wikipedia.org/wiki/JFK_assassination_theories
Views: 159449 Way Back
Mercury & Apollo 13: A Fascinating Firsthand Account by a Mission Controller (2000)
Eugene Francis "Gene" Kranz (born August 17, 1933) is an engineer and retired NASA Flight Director and manager. Kranz served as a Flight Director, succeeding NASA's founding Flight Director Chris Kraft, during the Gemini and Apollo programs. He is best known for directing the successful Mission Control effort to save the crew of Apollo 13, later the subject of a major motion picture of the same name in which he was portrayed by actor Ed Harris, and for serving as flight director during Apollo 11, the first lunar landing. He is also noted for his trademark close-cut flattop hairstyle and for wearing dapper "mission" vests (waistcoats) of different styles and materials, made by his wife, Marta Kranz, during missions for which he acted as Flight Director. A personal friend of the American astronauts of his time, Kranz remains a prominent and colorful figure in the history of U.S. manned space exploration and the embodiment of the "tough and competent" ethos of the Kranz Dictum. Kranz has been the subject of movies, documentary films, books, and periodical articles. He is a recipient of the Presidential Medal of Freedom, and in a 2010 Space Foundation survey he was ranked the #2 most popular space hero. Kranz has appeared as a character in several dramatizations of the Apollo program. The first portrayal was in the 1974 TV movie Houston, We've Got a Problem, where he is played by Ed Nelson. Ed Harris played him in the 1995 film Apollo 13 and received an Oscar nomination for Best Performance by an Actor in a Supporting Role. Matt Frewer portrays him in the 1996 TV movie Apollo 11, and Dan Butler portrays him in the 1998 HBO miniseries From the Earth to the Moon.
Kranz has also been featured in several documentaries using NASA film archives, including the 2004 History Channel production Failure Is Not an Option and its 2005 follow-up Beyond the Moon: Failure Is Not an Option 2, recurring History Channel broadcasts based on the book The Right Stuff, and the 2008 Discovery Channel production When We Left Earth. The independent video game Kerbal Space Program features a non-playable flight controller named Gene Kerman, who has a flattop haircut and white vest. An archive audio clip including Kranz's name is included in the track "Go!" on the 2015 Public Service Broadcasting album, The Race for Space, a track inspired by the Apollo 11 moon landing. https://en.wikipedia.org/wiki/Gene_Kranz
Views: 11119 Way Back
A Captivating History of Money: Gold, Debt and the Impact on Free Markets (2014)
Kwasi Alfred Addo Kwarteng (born 26 May 1975) is a British politician and historian. A member of the Conservative Party, he has served as a Member of Parliament (MP) since 2010, representing the constituency of Spelthorne in Surrey. Kwarteng was born in London.[3] His parents migrated to the UK from Ghana as students in the 1960s.[4] Kwarteng attended Eton College as a King's Scholar, and then went on to Cambridge University, where he read Classics and History at Trinity College.[5] He was a member of the team which won University Challenge in 1995 (in the first series after the programme was revived by the BBC in 1994).[4][6] He attended Harvard University on a Kennedy Scholarship, and then earned a PhD in Economic History at Cambridge University.[5] Prior to becoming an MP, Kwarteng worked as an analyst in financial services. He has written a book, Ghosts of Empire, about the legacy of the British Empire, published by Bloomsbury in 2011.[4] He has also co-authored (with Jonathan Dupont) the book Gridlock Nation, on the causes of and solutions to traffic congestion in Britain.[7] In 2014 he also published War and Gold: A Five-Hundred-Year History of Empires, Adventures and Debt, a history of capital and the enduring ability of money, when combined with speculation, to ruin societies.[8] Kwarteng was the Conservative candidate in the constituency of Brent East at the 2005 general election. He finished in third place behind the incumbent Liberal Democrat MP Sarah Teather (who had won the seat in a 2003 by-election) and Yasmin Qureshi of the Labour Party. Kwarteng was chairman of the Bow Group in 2005–06. In 2006, The Times suggested that he could become the first black Conservative cabinet minister.[9] He was sixth on the Conservative list of candidates for the London Assembly in the 2008 London Assembly election, but was not elected, as the Conservatives claimed only three London-wide list seats.
Kwarteng was selected as the Conservative candidate for Spelthorne at an open primary in January 2010 after the incumbent Conservative MP, David Wilshire, became mired in controversy arising from the Parliamentary expenses scandal and announced that he would be retiring from Parliament at the next general election. Kwarteng was described by a local paper as a "black Boris".[3] At the 2010 general election, Kwarteng won the seat with 22,261 votes (claiming a majority of 10,019).[10] In August 2012, Kwarteng co-authored a book, Britannia Unchained. In it, the authors claim that "Once they enter the workplace, the British are among the worst idlers in the world". http://en.wikipedia.org/wiki/Kwasi_Kwarteng
Views: 13409 Way Back
Fred Rogers Tribute: Quotes, Biography, Facts, Education, History (2003)
Fred McFeely Rogers (March 20, 1928 – February 27, 2003) was an American television personality, puppeteer, educator, Presbyterian minister, composer, songwriter, author, and activist. Rogers was most famous for creating, hosting, and composing the theme music for the educational preschool television series Mister Rogers' Neighborhood (1968–2001), which featured his gentle, soft-spoken personality and directness to his audiences.[1] Initially educated to be a minister, Rogers was displeased with the way television addressed children and made an effort to change this when he began to write for and perform on local Pittsburgh-area shows dedicated to youth. WQED developed his own show in 1968 and it was distributed nationwide by Eastern Educational Television Network. Over the course of three decades on television, Fred Rogers became an indelible American icon of children's entertainment and education, as well as a symbol of compassion, patience, and morality.[2] He was also known for his advocacy of various public causes. His testimony before a lower court in favor of fair use recording of television shows to play at another time (now known as time shifting) was cited in a U.S. Supreme Court decision on the Betamax case, and he gave now-famous testimony to a U.S. Senate committee, advocating government funding for children's television.[3] Rogers received the Presidential Medal of Freedom, some forty honorary degrees,[4] and a Peabody Award. He was inducted into the Television Hall of Fame, was recognized by two Congressional resolutions, and was ranked No. 35 among TV Guide's Fifty Greatest TV Stars of All Time.[5] Several buildings and artworks in Pennsylvania are dedicated to his memory, and the Smithsonian Institution displays one of his trademark sweaters as a "Treasure of American History." https://en.wikipedia.org/wiki/Fred_Rogers
Views: 9655 Way Back
Michio Kaku: Are Teleportation, Time Machines & Force Fields Possible with Technology? (2008)
Kaku uses discussion of speculative technologies to introduce topics of fundamental physics to the reader. The topic of invisibility becomes a discussion of why the speed of light is slower in water than in vacuum and of how electromagnetism is similar to ripples in a pond, and Kaku discusses newly developed composite materials. The topic of Star Trek "phasers" becomes a lesson on how lasers work and how laser-based research is conducted. The cover of his book depicts a TARDIS, a device used in the British science fiction television show Doctor Who to travel in space and time, in its disguise as a police box, continuously passing through a time loop. With each discussion of science fiction technology topics he also "explains the hurdles to realizing these science fiction concepts as reality". According to Kaku, technological advances that we take for granted today were declared impossible 150 years ago. William Thomson, Lord Kelvin (1824–1907), a mathematical physicist and creator of the Kelvin scale, said publicly that “heavier than air” flying machines were impossible. “He thought X-rays were a hoax, and that radio had no future.”[4] Likewise, Ernest Rutherford (1871–1937), the physicist who experimentally described the atom, thought the atom bomb was impossible, comparing it to moonshine (a crazy or foolish idea). Televisions, computers, and the Internet would have seemed incredibly fantastic to people at the turn of the 20th century. Black holes were considered science fiction, and even Einstein argued that black holes could not exist. 19th-century science had determined that it was impossible for the earth to be billions of years old. Even in the 1920s and 1930s, Robert Goddard was scoffed at because it was believed that rockets would never be able to go into space.[4] Such advances were considered impossible because the basic laws of physics and science were not understood as well as they would subsequently be.
Kaku states that “as a physicist [he] learned that the impossible is often a relative term.” By this definition of "impossible", he poses the question "Is it not plausible to think that we might someday build space ships that can travel distances of light years, or think that we might teleport ourselves from one place to the other?" Each chapter is named for a possible, or improbable, technology of the future. After a look at the development of today's technology, there is discussion as to how this advanced technology might become a reality. Chapters become somewhat more general towards the end of the book. Some of our present-day technologies are explained and then extrapolated into futuristic applications. In the future, current technologies are still recognizable, but in a slightly altered form. For example, when discussing force fields of the future, Dr. Kaku writes about cutting-edge laser technology and newly developed plasma windows. These are two of several technologies that he sees as required for creating a force field; to create one, they would be combined in a slightly altered form, made more precise or more powerful. Furthermore, this discussion of force fields, as well as of the pantheon of highly advanced technologies, remains as true to the original concepts (as the public generally imagines advanced technologies) as possible, while remaining practical.[5][6] Kaku concludes his book with a short epilogue detailing the newest frontiers in physics and how there is still much more to be learned about physics and our universe. Kaku writes that since scientists understand the basic laws of physics today, they are able to perceive or imagine a basic outline of future technologies that might work.
Kaku writes: "Physicists today understand the basic laws [of physics] extending over a staggering forty-three orders of magnitude, from the interior of the proton out to the expanding universe."[5] He goes on to say that physicists can discern between future technologies that are merely improbable and those technologies that are truly impossible. He uses a system of Class I, Class II, and Class III to classify these science-fictional future technologies that are believed to be impossible today. Class I Impossibilities are "technologies that are impossible today, but that do not violate the known laws of physics." Kaku speculates that these technologies may become available in some limited form in a century or two. http://en.wikipedia.org/wiki/Physics_of_the_Impossible
Views: 3555 Way Back
Millions of High School Graduates Can Barely Read or Write: American Culture (2000)
Morris Berman (born 1944) is an American historian and social critic. He was born in Rochester, New York, and went on to earn his BA in mathematics at Cornell University in 1966 and his PhD in the history of science at The Johns Hopkins University in 1972. As an academic humanist cultural critic, Berman specializes in Western cultural and intellectual history. Berman has served on the faculties of a number of universities in the U.S., Canada, and Europe. He emigrated from the U.S. to Mexico in 2006, where he was a visiting professor at the Tecnologico de Monterrey in Mexico City from 2008 to 2009. During this period he continued writing for various publications, including Parteaguas, a quarterly magazine.[1] Although an academic, Berman has written several books for a general audience.[2] They deal with the state of Western civilization and with an ethical, historically responsible, or enlightened approach to living within it. His work emphasizes the legacies of the European Enlightenment and the historical place of present-day American culture. As book reviewer George Scialabba points out, Berman's work is generally discussed in terms of the two trilogies he produced over a thirty-year span (between 1981 and 2011): "Most historians would be content to have written one deeply researched and interpretively wide-ranging trilogy on a large and important subject. Berman has written two: one on alternative forms of consciousness and spirituality (The Re-enchantment of the World, Coming to Our Senses, Wandering God) and one on the decline of American civilization (The Twilight of American Culture, Dark Ages America, Why America Failed). The second trilogy, a grimly fascinating inventory of the pathologies of contemporary America and an unsparing portrait of American history and national character, is a masterpiece."
In 1990, Morris Berman received the Governor's Writers Award (Washington State) for his book Coming to Our Senses.[4] In 1992, he was the recipient of the first annual Rollo May Center Grant for Humanistic Studies. In 2000, Berman's book The Twilight of American Culture was named one of the ten most recommended books of the year by the Christian Science Monitor[5] and was named a "Notable Book" by The New York Times Book Review.[6] In 2013 he received the "Neil Postman Award for Career Achievement in Public Intellectual Activity" from the Media Ecology Association.[7] As of 2014 Berman continues to live in Mexico.
Books:
Social Change & Scientific Organization: The Royal Institution 1799–1844 (1978) – nonfiction
The Reenchantment of the World (1981) – nonfiction
Coming to Our Senses: Body and Spirit in the Hidden History of the West (1989) – nonfiction
Wandering God: A Study in Nomadic Spirituality (2000) – nonfiction
The Twilight of American Culture (2000) – nonfiction
Dark Ages America: The Final Phase of Empire (2006) – nonfiction
A Question of Values (2010) – essay collection, nonfiction
Destiny (2010) – fiction (a collection of three novellas)
Counting Blessings (2011) – poetry
Why America Failed: The Roots of Imperial Decline (2011) – nonfiction[9]
Spinning Straw Into Gold: Straight Talk for Troubled Times (2013) – a philosophical memoir, nonfiction
Neurotic Beauty: An Outsider Looks At Japan (2015) – nonfiction[3]
The Man Without Qualities (2016) – fiction (a novel)
https://en.wikipedia.org/wiki/Morris_Berman
Views: 7621 Way Back
Texas: The Best, Most Comprehensive Account of the Lone Star State's History (2003)
Texas is the second largest state in the United States by both area and population. Geographically located in the South Central region of the country, Texas shares borders with the U.S. states of Louisiana to the east, Arkansas to the northeast, Oklahoma to the north, New Mexico to the west, and the Mexican states of Chihuahua, Coahuila, Nuevo León, and Tamaulipas to the southwest, while the Gulf of Mexico is to the southeast. Houston is the most populous city in Texas and the fourth largest in the U.S., while San Antonio is the second most populous in the state and seventh largest in the U.S. Dallas–Fort Worth and Greater Houston are the fourth and fifth largest metropolitan statistical areas in the country, respectively. Other major cities include Austin, the second most populous state capital in the U.S., and El Paso. Texas is nicknamed "The Lone Star State" to signify its former status as an independent republic, and as a reminder of the state's struggle for independence from Mexico. The "Lone Star" can be found on the Texan state flag and on the Texan state seal.[9] The origin of Texas's name is from the word "Tejas," which means "friends" in the Caddo language.[10] Due to its size and geologic features such as the Balcones Fault, Texas contains diverse landscapes that resemble both the U.S. Southern and Southwestern regions.[11] Although Texas is popularly associated with the U.S. southwestern deserts, less than 10% of Texas' land area is desert.[12] Most of the population centers are located in areas of former prairies, grasslands, forests, and the coastline. Traveling from east to west, one can observe terrain that ranges from coastal swamps and piney woods, to rolling plains and rugged hills, and finally the desert and mountains of the Big Bend. The term "six flags over Texas"[note 1] refers to several nations that have ruled over the territory. Spain was the first European country to claim the area of Texas. France held a short-lived colony. 
Mexico controlled the territory until 1836, when Texas won its independence, becoming an independent republic. In 1845,[13] Texas joined the union as the 28th state. The state's annexation set off a chain of events that led to the Mexican–American War in 1846. A slave state before the American Civil War, Texas declared its secession from the U.S. in early 1861 and officially joined the Confederate States of America on March 2 of that year. After the Civil War and the restoration of its representation in the federal government, Texas entered a long period of economic stagnation. One Texan industry that thrived after the Civil War was cattle. Due to its long history as a center of the industry, Texas is associated with the image of the cowboy. The state's economic fortunes changed in the early 20th century, when oil discoveries initiated an economic boom in the state. With strong investments in universities, Texas developed a diversified economy and high-tech industry in the mid-20th century. As of 2015, it ranks second among the states in number of Fortune 500 companies, with 54.[14] With a growing base of industry, the state leads in many industries, including agriculture, petrochemicals, energy, computers and electronics, aerospace, and biomedical sciences. Texas has led the nation in export revenue since 2002 and has the second-highest gross state product. https://en.wikipedia.org/wiki/Texas
Views: 15645 Way Back
The Origins of the Black Panther Party: History, Facts, Goals, Platform (2006)
David Hilliard (born May 15, 1942) is a former member of the Black Panther Party, in which he served as Chief of Staff. He is currently a visiting instructor at the University of New Mexico. More on the Black Panther Party: https://www.amazon.com/gp/search?ie=UTF8&tag=ub066-20&linkCode=ur2&linkId=5628ca7d9dfa084496bf684cbcf43796&camp=1789&creative=9325&index=books&keywords=black%20panther%20party Hilliard was convicted on two counts of assault with a deadly weapon for his part in a 1968 ambush of the Oakland Police in retribution for the assassination of Martin Luther King. The April 6, 1968 ambush resulted in the death of Panther Bobby Hutton and the capture of Panther Eldridge Cleaver, who masterminded the botched operation. In July 1971, Hilliard was sentenced to one to ten years and incarcerated at Vacaville Prison. In January 1973, while serving a sentence of six months to 10 years, he was denied parole. In his autobiography Revolutionary Suicide, Huey P. Newton claimed the district attorney of Alameda County was attempting to send Hilliard to prison on "trumped up charges". With Fredrika Newton, Hilliard later formed the Dr. Huey P. Newton Foundation. https://en.wikipedia.org/wiki/David_Hilliard The Black Panther Party or BPP (originally the Black Panther Party for Self-Defense) was a revolutionary black nationalist and socialist organization[1][2] active in the United States from 1966 until 1982, with its only international chapter operating in Algeria from 1969 until 1972.[3] At its inception in October 1966, the Black Panther Party's core practice was its armed citizens' patrols to monitor the behavior of police officers and challenge police brutality in Oakland, California. In 1969, community social programs became a core activity of party members.[4] The Black Panther Party instituted a variety of community social programs, most extensively the Free Breakfast for Children Programs and community health clinics.[5][6][7]
Federal Bureau of Investigation Director J. Edgar Hoover called the party "the greatest threat to the internal security of the country",[8] and he supervised an extensive program (COINTELPRO) of surveillance, infiltration, perjury, police harassment, and many other tactics designed to undermine Panther leadership, incriminate party members, discredit and criminalize the Party, and drain the organization of resources and manpower. The program was also accused of using assassination against Black Panther members.[9][10][11][12] Government oppression initially contributed to the growth of the party, as killings and arrests of Panthers increased support for the party within the black community and on the broad political left, both of which valued the Panthers as a powerful force opposed to de facto segregation and the military draft. Black Panther Party membership reached a peak in 1970, with offices in 68 cities and thousands of members, then suffered a series of contractions. After being vilified by the mainstream press, public support for the party waned, and the group became more isolated.[13] In-fighting among Party leadership, caused largely by the FBI's COINTELPRO operation, led to expulsions and defections that decimated the membership.[14] Popular support for the Party declined further after reports appeared detailing the group's involvement in illegal activities such as drug dealing and extortion schemes directed against Oakland merchants.[15] By 1972 most Panther activity centered on the national headquarters and a school in Oakland, where the party continued to influence local politics. Party contractions continued throughout the 1970s. By 1980 the Black Panther Party had just 27 members.[16] The history of the Black Panther Party is controversial.
Scholars have characterized the Black Panther Party as the most influential black movement organization of the late 1960s, and "the strongest link between the domestic Black Liberation Struggle and global opponents of American imperialism".[17] Other commentators have described the Party as more criminal than political, characterized by "defiant posturing over substance". https://en.wikipedia.org/wiki/Black_Panther_Party
Views: 60758 Way Back
How to Take Over a Company: T. Boone Pickens, Jr. on Corporate Decisions & Finance (1987)
On the local level, Pickens chaired the Board of Regents of West Texas State University (now West Texas A&M University) in Canyon and in 1987–1988 contributed to the restoration of the administration building known as "Old Main". He was also active in the Republican Party in Potter County. Pickens organized a campaign in the mid-1980s against the Amarillo Globe-News newspaper for what he claimed was inaccurate reporting about his deals and Mesa. Although the newspaper owner, Morris Communications, replaced its publisher twice during the conflict, Pickens' attempts to have the paper change its editorial policy failed. Shortly thereafter, in 1989, Pickens and Mesa moved to a suburb of Dallas.[10] Pickens sold Mesa to Richard Rainwater in 1996.[11] Mesa merged with Parker & Parsley Petroleum in 1997 to form Pioneer Natural Resources.[12] In 1997, Pickens founded BP Capital Management (then called BP Energy Fund), the initials standing for "Boone Pickens" and not related to British Petroleum. He holds a 46% interest in the company, which runs two hedge funds, Capital Commodity and Capital Equity, both of which invest primarily in traditional energy companies such as oil, natural gas, and nuclear power corporations like Halliburton, Schlumberger, and Shaw Group. In 2006, Pickens earned $990 million from his equity in the two funds and $120 million from his share of the 20% fees applied to fund profits.[13] In 2007, Pickens earned $2.7 billion, as the BP Capital Equity Fund grew by 24% after fees and the then $590 million Capital Commodity fund grew 40%, thanks to, among others, large positions in the stocks of Suncor Energy, ExxonMobil and Occidental Petroleum.[14]
Pickens' most recent recognition came from The Franklin Institute in Philadelphia: T. Boone Pickens received the 2009 Bower Award for Business Leadership for 50 years of visionary leadership in oil and other types of energy production, including domestic renewable energy, and for his philanthropic leadership contributing to education, medical research, and wildlife conservation. In his 2008 book, The First Billion is the Hardest, he noted a belief in the "peak oil" theory. He has since altered that position, noting the technical achievements of the domestic oil and natural gas industries in using horizontal drilling and fracking to unlock shale oil and gas reserves. He has called for the construction of more nuclear power plants, the use of natural gas to power the country's transportation systems, and the promotion of alternative energy. Pickens's involvement with the natural gas fueling campaign is long-running. He formed Pickens Fuel Corp. in 1997 and began promoting natural gas as the best vehicular fuel alternative because it is a domestic resource that, among many advantages, is cleaner-burning (natural gas vehicles, or NGVs, emit up to 30% less pollution than gasoline or diesel vehicles) and reduces foreign oil consumption. Reincorporated as Clean Energy Fuels Corp. in 2001, the company now owns and operates natural gas fueling stations from British Columbia to the Mexican border. http://en.wikipedia.org/wiki/T._Boone_Pickens
Views: 5183 Way Back
T. Boone Pickens on Making His First Billion Dollars: Capital Management Investment Firm (2008)
Thomas Boone Pickens, Jr. (born May 22, 1928), known as T. Boone Pickens, is an American business magnate and financier. Pickens chairs the hedge fund BP Capital Management. He was a well-known takeover operator and corporate raider during the 1980s. As of September 2014, Pickens has a net worth of $1 billion. Pickens was born in Holdenville, Oklahoma, the son of Grace (née Molonson) and Thomas Boone Pickens. His father worked as an oil and mineral landman (rights leaser). During World War II, his mother ran the local Office of Price Administration, rationing gasoline and other goods in three counties.[2] Pickens was the first child born via Caesarean section in the history of Holdenville hospital.[3] At age 12, Pickens delivered newspapers, quickly expanding his paper route from 28 papers to 156.[4] He later cited his boyhood job as an early introduction to "expanding quickly by acquisition", a business practice he favored later in life.[4] When the oil boom in Oklahoma ended in the late 1930s, Pickens' family moved to Amarillo, Texas.[4] Pickens never served in the military;[4] he attended Texas A&M on a basketball scholarship but lost the scholarship[4] and transferred to Oklahoma A&M (now Oklahoma State University), where he majored in geology. He is a member of the Sigma Alpha Epsilon Fraternity. He graduated from Oklahoma A&M with a degree in geology in 1951. Following his graduation, Pickens was employed by Phillips Petroleum, where he worked until 1954.[5] In 1956, following his period as a wildcatter, he founded the company that would later become Mesa Petroleum.[5] By 1981, Mesa had grown into one of the largest independent oil companies in the world.
Pickens led Mesa's first major acquisition, a takeover of the Hugoton Production Company, which was 30 times the size of Mesa.[6] He then shifted his focus to acquiring other oil and gas companies by making solicited and unsolicited buyout bids and other merger and acquisition activity. Pickens' corporate acquisitions made him a celebrity during the 1980s, an era of vigorous and extensively reported takeover activity. His most publicized deals included attempted buyouts of Cities Service, Gulf Oil, Phillips Petroleum, and Unocal.[7] It was during this period that Pickens led Mesa's successful acquisitions of Pioneer Petroleum and the mid-continent assets of Tenneco. These as well as other deals placed Pickens at the center of controversy during the 1980s. His celebrity rose so quickly after the Gulf Oil takeover bid that Time magazine[8] put Pickens on the cover of its March 1985 issue. He briefly considered running for president in the 1988 elections.[9] During this period, he was often characterized as a corporate raider and greenmailer, because many of his deals were never completed, although Pickens and the shareholders he represented received substantial profits through the eventual sale of their stock. His later takeover targets included Newmont Mining, a New York-based firm; Diamond Shamrock; and Koito Mfg., Ltd., a Japanese auto-parts manufacturer; he made substantial gains in the process.[10] He was also involved in the creation of the United Shareholders Association (USA), which from 1986 to 1993 attempted to influence the governance of several large companies. After nearly two years of periodic hearings and debate, in July 1988 the Securities and Exchange Commission voted 4–1 to approve a one-share, one-vote rule, a primary USA objective. http://en.wikipedia.org/wiki/T._Boone_Pickens
Views: 5025 Way Back
JFK Assassination: Air Force One Transmission on November 22, 1963 - Part 2
During the flight back to Andrews Air Force Base, Johnson made several phone calls on the radio telephone, including to Rose Kennedy (JFK's mother) and Nellie Connally (wife of John Connally). In addition, he made the decision to request all cabinet members to stay in their posts and asked to meet soon with both parties' leaders in Congress. Johnson also asked Jack Valenti, Bill Moyers, and Liz Carpenter to write a brief statement for him to read on the day's events, which he then edited slightly himself. At 6:10 pm, after landing at Andrews amid a crowd of Congressional leaders, he walked to an already prepared set of microphones and began his first public statement as president: "This is a sad time for all people. We have suffered a loss that cannot be weighed. For me, it is a deep personal tragedy. I know that the world shares the sorrow that Mrs. Kennedy and her family bear. I will do my best. That is all I can do. I ask for your help, and God's." Afterwards Johnson was said to have regretted delivering the remarks, believing he sounded harsh and strident. http://en.wikipedia.org/wiki/First_inauguration_of_Lyndon_B._Johnson Under John F. Kennedy, presidential air travel officially entered the jet age.[17] He had used the Eisenhower-era jets for trips to Canada, France, Austria and the United Kingdom.[18] However, in October 1962, the administration purchased a Boeing C-137 Stratoliner, a modified long-range 707 designated Special Air Mission (SAM) 26000.[19] The Air Force had attempted a special presidential livery of its own design: a scheme in red and metallic gold, with the nation's name in block letters.
Kennedy felt the aircraft appeared too regal, and, on advice from his wife, First Lady Jacqueline Kennedy, he contacted the French-born American industrial designer Raymond Loewy for help in designing a new livery and interiors for the VC-137 jet.[2] Loewy met with the president, and his earliest research on the project took him to the National Archives, where he looked at the first printed copy of the United States Declaration of Independence, and saw the country's name set widely spaced and in upper case in a typeface called Caslon. He chose to expose the polished aluminum fuselage on the bottom side, and used two blues: a slate blue associated with the early republic and the presidency, and a more contemporary cyan to represent the present and future. The presidential seal was added to both sides of the fuselage near the nose, a large American flag was painted on the tail, and the sides of the aircraft read "United States of America" in all capital letters. Loewy's work won immediate praise from the president and the press. The VC-137 markings were adapted for the larger VC-25 when it entered service in 1990.[20] SAM 26000 was in service from 1962 to 1998, serving Presidents Kennedy to Clinton. On 22 November 1963, SAM 26000 carried President Kennedy to Dallas, Texas, where it served as the backdrop as the Kennedys greeted well-wishers at Dallas' Love Field. Later that afternoon, Kennedy was assassinated, and Vice President Lyndon Johnson assumed the office of president and took the oath of office aboard SAM 26000. At Johnson's request, the plane carried Kennedy's body back to Washington.[21] A decade later, SAM 26000 brought Johnson's own body home to Texas after his state funeral in Washington.[22][23] President Lyndon B. Johnson used SAM 26000 to travel extensively domestically, and used it to visit troops in South Vietnam during the Vietnam War.
SAM 26000 served President Nixon on several groundbreaking overseas voyages, including his famous visit to the People's Republic of China in February 1972 and trip to the Soviet Union later that year, both firsts for an American president.[24] Nixon dubbed the plane the "Spirit of '76" in honor of the upcoming bicentennial of the United States, and that logo was painted on both sides of the plane's nose.[25] http://en.wikipedia.org/wiki/Air_Force_One
Views: 3250 Way Back
The Life of One of America's Great and Overlooked Revolutionaries: Benjamin Rush (2004)
Benjamin Rush (January 4, 1746 [O.S. December 24, 1745] – April 19, 1813) was a Founding Father of the United States. Rush was a civic leader in Philadelphia, where he was a physician, politician, social reformer, humanitarian, and educator as well as the founder of Dickinson College. Rush attended the Continental Congress and signed the Declaration of Independence. His later self-description there was: "He aimed right." He served as Surgeon General of the Continental Army and became a professor of chemistry, medical theory, and clinical practice at the University of Pennsylvania. Rush was a leader of the American Enlightenment and an enthusiastic supporter of the American Revolution. He was a leader in Pennsylvania's ratification of the Constitution in 1788. He was prominent in many reforms, especially in the areas of medicine and education. He opposed slavery, advocated free public schools, and sought improved education for women and a more enlightened penal system. As a leading physician, Rush had a major impact on the emerging medical profession. As an Enlightenment intellectual, he was committed to organizing all medical knowledge around explanatory theories, rather than relying on empirical methods. Rush argued that illness was the result of imbalances in the body's physical system and was caused by malfunctions in the brain. His approach prepared the way for later medical research, but Rush himself undertook none of it. He promoted public health by advocating a clean environment and stressing the importance of personal and military hygiene. His study of mental disorder made him one of the founders of American psychiatry. Benjamin Rush was born to John Rush and Susanna Hall on January 4, 1746 (December 24, 1745 O.S.). The family, of English descent,[4] lived on a plantation in the Township of Byberry in Philadelphia County, about 14 miles outside of Philadelphia (the township was incorporated into Philadelphia in 1854). Benjamin was the fourth of seven children.
John Rush died in July 1751 at age thirty-nine, leaving Benjamin's mother, who ran a country store, to care for the family. At age eight, Benjamin was sent to live with an aunt and uncle to receive an education.[5] Benjamin and his older brother Jacob[6] attended a school in Cecil County, Maryland, run by Reverend Samuel Finley, which later became West Nottingham Academy. In 1760, after further studies at the College of New Jersey (now Princeton University), Rush graduated with a Bachelor of Arts degree at age fourteen. From 1761 to 1766, Rush apprenticed under Dr. John Redman in Philadelphia. Redman encouraged him to further his studies at the University of Edinburgh in Scotland, where Rush studied from 1766 to 1768 and earned an M.D. degree.[7] Rush became fluent in French, Italian, and Spanish as a result of his studies and European tour. While at Edinburgh, he became a friend of the Earl of Leven and his family, including William Leslie.[8] Returning to the Colonies in 1769, Rush opened a medical practice in Philadelphia and became Professor of Chemistry at the College of Philadelphia.[9] Rush ultimately published the first American textbook on chemistry and several volumes on medical student education, and wrote influential patriotic essays. https://en.wikipedia.org/wiki/Benjamin_Rush
Views: 1777 Way Back
Making Gold and Silver Legal Tender: Should the Gold Standard Be Reinstated? (2011)
Metallism is the economic principle that money derives its value from the purchasing power of the commodity upon which it is based. The currency in a metallist monetary system may be made from the commodity itself (commodity money) or use tokens such as national banknotes redeemable in that commodity. The term was coined by Georg Friedrich Knapp to describe monetary systems using coin minted in silver, gold or other metals. In metallist economic theory, the value of the currency derives from the market value of the commodity upon which it is based, independent of its monetary role. Carl Menger theorized money came about when buyers and sellers in a market agreed on a common commodity as a medium of exchange in order to reduce the costs of barter. The intrinsic value of that commodity must be sufficient to make it highly “saleable”, or readily accepted as payment. In this system, buyers and sellers of real goods and services establish the medium of exchange, not a sovereign state. Metallists view the state's role in the minting or official stamping of coins as one of authenticating the quality and quantity of metal used in making the coin. Knapp distinguished metallism from chartalism (or antimetallism), a monetary system in which the state has monopoly power over its own currency and creates a unique market and demand for that currency by imposing taxes or other such legally enforceable debts upon its people which can only be paid in that currency. Joseph Schumpeter distinguished between "theoretical" and "practical" metallism. Schumpeter categorized the Menger position, that a commodity link is essential to understanding the origins and nature of money, as "theoretical metallism". He defined "practical metallism" as the theory that although a sovereign state has unfettered power to create non-backed currencies, money with no intrinsic or redeemable commodity value, it is more prudent to adopt a backed currency system.
http://en.wikipedia.org/wiki/Metallism Silver, in the form of electrum (a gold–silver alloy), was coined to produce money around 700 BC by the Lydians. Later, silver was refined and coined in its pure form. Many nations used silver as the basic unit of monetary value. In the modern world, silver bullion has the ISO currency code XAG. The name of the pound sterling (£) reflects the fact it originally represented the value of one pound Tower weight of sterling silver; other historical currencies, such as the French livre, have similar etymologies. During the 19th century, the bimetallism that prevailed in most countries was undermined by the discovery of large deposits of silver in the Americas; fearing a sharp decrease in the value of silver and thus the currency, most states switched to a gold standard by 1900. In some languages, such as Sanskrit, Spanish, French, and Hebrew, the same word means both silver and money. The 20th century saw a gradual movement to fiat currency, with most of the world's monetary system losing its link to precious metals after Richard Nixon took the United States dollar off the gold standard in 1971; the last currency backed by gold was the Swiss franc, which became a pure fiat currency on 1 May 2000. During this same period, silver gradually ceased to be used in circulating coins. In 1964, the United States stopped minting its silver dime and quarter; it minted its last circulating silver coin, the 40% silver half-dollar, in 1970.[21] In 1968, Canada minted its last circulating silver coins, the 50% silver dime and quarter. The Royal Canadian Mint still makes many collectible silver coins with various dollar denominations. In addition to Canada, the United States and many other countries continue to mint silver coins that are collected for their bullion and numismatic value. The U.S. coin is known as the "Silver Eagle".
Silver is used as a currency by many individuals, and is legal tender in the US state of Utah.[22] Silver coins and bullion are also used as an investment to guard against inflation and devaluation. http://en.wikipedia.org/wiki/Silver
Views: 2019 Way Back
How the U.S. Dollar Impacts Other Currencies, Commodities, Oil & Gold - Forex (2009)
The sixth paragraph of Section 8 of Article 1 of the U.S. Constitution provides that the U.S. Congress shall have the power to "coin money" and to "regulate the value" of domestic and foreign coins. Congress exercised those powers when it enacted the Coinage Act of 1792. That Act provided for the minting of the first U.S. dollar and it declared that the U.S. dollar shall have "the value of a Spanish milled dollar as the same is now current". The table to the right shows the equivalent amount of goods that, in a particular year, could be purchased with $1. The table shows that from 1774 through 2012 the U.S. dollar has lost about 97.0% of its buying power.[60] The decline in the value of the U.S. dollar corresponds to price inflation, which is a rise in the general level of prices of goods and services in an economy over a period of time.[61] A consumer price index (CPI) is a measure estimating the average price of consumer goods and services purchased by households. The United States Consumer Price Index, published by the Bureau of Labor Statistics, is a measure estimating the average price of consumer goods and services in the United States.[62] It reflects inflation as experienced by consumers in their day-to-day living expenses.[63] A graph showing the U.S. CPI relative to 1982–1984 and the annual year-over-year change in CPI is shown at right. The value of the U.S. dollar declined significantly during wartime, especially during the American Civil War, World War I, and World War II.[64] The Federal Reserve, which was established in 1913, was designed to furnish an "elastic" currency subject to "substantial changes of quantity over short periods", which differed significantly from previous forms of high-powered money such as gold, national bank notes, and silver coins.[65] Over the very long run, the prior gold standard kept prices stable—for instance, the price level and the value of the U.S. dollar in 1914 were not very different from the price level in the 1880s.
The Federal Reserve initially succeeded in maintaining the value of the U.S. dollar and price stability, reversing the inflation caused by the First World War and stabilizing the value of the dollar during the 1920s, before presiding over a 30% deflation in U.S. prices in the 1930s.[66] Under the Bretton Woods system established after World War II, the value of gold was fixed to $35 per ounce, and the value of the U.S. dollar was thus anchored to the value of gold. Rising government spending in the 1960s, however, led to doubts about the ability of the United States to maintain this convertibility, gold stocks dwindled as banks and international investors began to convert dollars to gold, and as a result the value of the dollar began to decline. Facing an emerging currency crisis and the imminent danger that the United States would no longer be able to redeem dollars for gold, President Nixon terminated gold convertibility in 1971, resulting in the "Nixon shock".[67] The value of the U.S. dollar was therefore no longer anchored to gold, and it fell upon the Federal Reserve to maintain the value of the U.S. currency. The Federal Reserve, however, continued to increase the money supply, resulting in stagflation and a rapidly declining value of the U.S. dollar in the 1970s. This was largely due to the prevailing economic view at the time that inflation and real economic growth were linked (the Phillips curve), and so inflation was regarded as relatively benign.[67] Between 1965 and 1981, the U.S. dollar lost two-thirds of its value.[60] In 1979, President Carter appointed Paul Volcker Chairman of the Federal Reserve. The Federal Reserve tightened the money supply and inflation was substantially lower in the 1980s, and hence the value of the U.S. dollar stabilized.[67] Over the thirty-year period from 1981 to 2009, the U.S.
dollar lost over half its value.[60] This is because the Federal Reserve has targeted not zero inflation, but a low, stable rate of inflation—between 1987 and 1997, the rate of inflation was approximately 3.5%, and between 1997 and 2007 it was approximately 2%. The so-called "Great Moderation" of economic conditions since the 1970s is credited to monetary policy targeting price stability.[67] There is ongoing debate about whether central banks should target zero inflation (which would mean a constant value for the U.S. dollar over time) or low, stable inflation (which would mean a continuously but slowly declining value of the dollar over time, as is the case now). Although some economists are in favor of a zero inflation policy and therefore a constant value for the U.S. dollar,[66] others contend that such a policy limits the ability of the central bank to control interest rates and stimulate the economy when needed. http://en.wikipedia.org/wiki/United_States_dollar#Value
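The figures above all follow from simple compound-discounting arithmetic: a dollar's purchasing power after n years of inflation at rate r is 1/(1 + r)^n. A minimal Python sketch, assuming a constant average inflation rate for simplicity (the `purchasing_power` helper is illustrative, not from the source):

```python
def purchasing_power(rate, years):
    """Remaining value of one dollar after `years` of constant annual inflation at `rate`."""
    return 1 / (1 + rate) ** years

# At roughly 2.5% average annual inflation, the dollar loses about half its
# buying power over the 28 years from 1981 to 2009, consistent with the
# "lost over half its value" figure cited above.
print(round(purchasing_power(0.025, 28), 2))  # → 0.5
```

The same formula shows why a low, stable inflation target implies a continuously but slowly declining dollar: even 2% annual inflation compounds to a loss of nearly a fifth of the dollar's value per decade.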
Views: 11189 Way Back
Thurgood Marshall: The Definitive Biography of the Great Lawyer and Supreme Court Justice (1999)
Thurgood Marshall (July 2, 1908 – January 24, 1993) was an Associate Justice of the Supreme Court of the United States, serving from October 1967 until October 1991. Marshall was the Court's 96th justice and its first African-American justice. Before becoming a judge, Marshall was a lawyer who was best known for his high success rate in arguing before the Supreme Court and for the victory in Brown v. Board of Education, a decision that desegregated public schools. He served on the United States Court of Appeals for the Second Circuit after being appointed by President John F. Kennedy and then served as the Solicitor General after being appointed by President Lyndon Johnson in 1965. President Johnson nominated him to the United States Supreme Court in 1967. Although best remembered for jurisprudence in the fields of civil rights and criminal procedure, Marshall made significant contributions to other areas of the law as well. In Teamsters v. Terry, he held that the Seventh Amendment entitled the plaintiff to a jury trial in a suit against a labor union for breach of duty of fair representation. In TSC Industries, Inc. v. Northway, Inc., he articulated a formulation for the standard of materiality in United States securities law that is still applied today. In Cottage Savings Association v. Commissioner of Internal Revenue, he weighed in on the income tax consequences of the Savings and Loan crisis, permitting a savings and loan association to deduct a loss from an exchange of mortgage participation interests. In Personnel Administrator of Massachusetts v. Feeney, Marshall wrote a dissent saying that a law that gave hiring preference to veterans over non-veterans was unconstitutional because of its inequitable impact on women. Among his many law clerks were attorneys who went on to become judges themselves, such as Judge Douglas Ginsburg of the D.C.
Circuit Court of Appeals; Judge Ralph Winter of the United States Court of Appeals for the Second Circuit; Supreme Court Justice Elena Kagan; as well as notable law professors Susan Low Bloch, Elizabeth Garrett (President of Cornell University), Paul Gewirtz, Dan Kahan, Randall L. Kennedy, Eben Moglen, Rick Pildes, Louis Michael Seidman,[25] Cass Sunstein, and Mark Tushnet (editor of Thurgood Marshall: His Speeches, Writings, Arguments, Opinions and Reminiscences); and law school deans Paul Mahoney of University of Virginia School of Law, Martha Minow of Harvard Law School, and Richard Revesz of New York University School of Law. Marshall retired from the Supreme Court in 1991 due to declining health. In his retirement press conference on June 28, 1991, he expressed his view that race should not be a factor in choosing his successor, and he denied the circulating claims that he was retiring because of frustration or anger over the conservative direction in which the Court was heading.[26] He was reportedly unhappy that it would fall to President George H. W. Bush to name his replacement.[27] Bush nominated Clarence Thomas to replace Marshall. In 2006, Thurgood, a one-man play written by George Stevens, Jr., premiered at the Westport Country Playhouse, starring James Earl Jones and directed by Leonard Foglia.[38] It later opened on Broadway at the Booth Theatre on April 30, 2008, starring Laurence Fishburne.[39] On February 24, 2011, HBO screened a filmed version of the play which Fishburne performed at the John F. Kennedy Center for the Performing Arts.
The production was described by the Baltimore Sun as "one of the most frank, informed and searing discussions of race you will ever see on TV."[40][41] On February 16, 2011, a screening of the film was hosted by the White House as part of its celebrations of Black History Month.[42][43] A painting of Justice Marshall by Chaz Guest currently hangs at the White House.[44][45] A new film, titled Marshall, is also being made, directed by Reginald Hudlin and starring actor Chadwick Boseman as Thurgood Marshall. https://en.wikipedia.org/wiki/Thurgood_Marshall
Views: 8665 Way Back
Secret Archive of Top-Level KGB Documents Smuggled Out of the Soviet Union (1999)
Christopher Maurice Andrew (born 23 July 1941) is an historian at the University of Cambridge with a special interest in international relations and in particular the history of intelligence services. About the book: https://www.amazon.com/gp/product/0465003125/ref=as_li_tl?ie=UTF8&camp=1789&creative=9325&creativeASIN=0465003125&linkCode=as2&tag=ub066-20&linkId=67f0dfc948745a0e0f992e150a3d02e9 Andrew is Professor of Modern and Contemporary History, former Chairman of the History Faculty at Cambridge University, Official Historian of the Security Service (MI5), Honorary Air Commodore of 7006 (VR) Intelligence Squadron in the Royal Auxiliary Air Force, Chairman of the Cambridge Intelligence Seminar, and former Visiting Professor at Harvard, Toronto and Canberra. Professor Andrew is also co-editor of Intelligence and National Security, and a regular presenter of BBC Radio and TV documentaries, including the Radio Four series What If?. His twelve previous books include a number of path-breaking studies on the use and abuse of secret intelligence in modern history. He is currently a governor of Norwich School, where he was a pupil in the 1950s, and has recently retired from his post as President of Corpus Christi College, Cambridge. Andrew studied under the historian and wartime cryptanalyst Sir Harry Hinsley, in common with fellow historian Peter Hennessy.[1] Former students of Andrew, including Peter Jackson, Richard Aldrich, Tim Edwards and Wesley Wark, now staff the intelligence studies and intelligence history posts in universities around the English-speaking world. Professor Andrew's reputation as an historian of intelligence studies was cemented with two studies completed in collaboration with two defectors and former KGB officers, Oleg Gordievsky and Vasili Mitrokhin.
The first of these works, KGB: The Inside Story, was a scholarly work on the history of KGB actions against Western governments produced from archival and open sources, with the critical addition of information from the KGB defector Gordievsky. His two most detailed works about the KGB were produced in collaboration with KGB defector and archivist Vasili Mitrokhin, who over the course of several years recopied vast numbers of KGB archive documents as they were being moved into long-term storage. Exfiltrated by the Secret Intelligence Service in 1992, Mitrokhin and his documents were made available to Andrew after an initial and thorough review by the security services. Both volumes, 1999's The Sword and the Shield: The Mitrokhin Archive and the Secret History of the KGB and 2005's The World Was Going Our Way: The KGB and the Battle for the Third World (both volumes simply titled The Mitrokhin Archive in UK publication), resulted in some public scandal as they revealed the names of former KGB agents and collaborators in government, industry and private life around the world.[2] Most famous amongst these was the revelation in 1999 of the "Grandmother Spy", 87-year-old Melita Norwood, who had passed industrial information and other intelligence to the KGB for more than 50 years. The Cambridge Intelligence Seminar, chaired by Professor Andrew (and founded by his late mentor Harry Hinsley), convenes regularly in rooms at Corpus Christi College, Cambridge. Active and former senior members of various intelligence services around the world participate in the discussions, alongside Andrew's graduate students, fellow historians and other academics. At these meetings, detailed analysis of various past and present intelligence affairs is discussed under the Chatham House Rule, with the confidence that it will not be attributed to a person or organisation. https://en.wikipedia.org/wiki/Christopher_Andrew_(historian)
Views: 6858 Way Back
Winston Churchill on America: His Personal Vision of U.S. History (1999)
In 1895 Winston Churchill was commissioned as a cornet (second lieutenant) in the 4th Queen's Own Hussars. About the book: https://www.amazon.com/gp/product/037550320X/ref=as_li_tl?ie=UTF8&camp=1789&creative=9325&creativeASIN=037550320X&linkCode=as2&tag=ub066-20&linkId=15eb2df3ba2dc63b46d173e0f63d0c4f His annual pay was £300, and he calculated he needed an additional £500 to support a style of life equal to that of other officers of the regiment. To earn the required funds, he gained his colonel's agreement to observe the Cuban War of Independence; his mother, Lady Randolph Churchill, used her influence to secure a contract for her son to send war reports to The Daily Graphic.[4] He was subsequently posted back to his regiment, then based in British India, where he took part in, and reported on, the Siege of Malakand; the reports were published in The Pioneer and The Daily Telegraph.[5][4] The reports formed the basis of his first book, The Story of the Malakand Field Force, which was published in 1898.[6] To relax he also wrote his only novel, Savrola, which was published in 1898.[7] That same year he was transferred to the Sudan to take part in the Mahdist War (1881–99), where he participated in the Battle of Omdurman in September 1898. He published his recollections in The River War (1899).[8][6] In 1899 Churchill resigned his commission and travelled to South Africa as the correspondent with The Morning Post, on a salary of £250 a month plus all expenses, to report on the Second Boer War.[9][b] He was captured by the Boers in November that year, but managed to escape. He remained in the country and continued to send in his reports to the newspaper. He subsequently published his despatches in two works, London to Ladysmith via Pretoria and Ian Hamilton's March (both 1900).[4] He returned to Britain in 1900 and was elected as the Member of Parliament for the Oldham constituency at that year's general election.
As a serving MP he began publishing pamphlets containing his speeches or answers to key parliamentary questions. Beginning with Mr Winston Churchill on the Education Bill (1902), over 135 such tracts were published over his career.[11] Many of these were subsequently compiled into collections, several of which were edited by his son Randolph, and others by Charles Eade, the editor of the Sunday Dispatch.[12][13] In addition to his parliamentary duties, Churchill wrote a two-volume biography of his father, Lord Randolph Churchill, published in 1906, in which he "presented his father as a tory with increasingly radical sympathies", according to the historian Paul Addison.[9] In the 1923 general election Churchill lost his parliamentary seat and moved to the south of France, where he wrote The World Crisis, a six-volume history of the First World War, published between 1923 and 1931. The book was well-received, although the former Prime Minister Arthur Balfour dismissed the work as "Winston's brilliant autobiography, disguised as world history".[14] At the 1924 general election Churchill returned to the Commons.[9] In 1930 he wrote his first autobiography, My Early Life, after which he began his researches for Marlborough: His Life and Times (1933–38), a four-volume biography of his ancestor, John Churchill, 1st Duke of Marlborough.[15] Before the final volume was published, Churchill wrote a series of biographical profiles for newspapers, which were later collected together and published as Great Contemporaries (1937).[9] In May 1940, eight months after the outbreak of the Second World War, Churchill became Prime Minister.
He wrote no histories during his tenure, although several collections of his speeches were published.[16][17] At the end of the war he was voted out of office at the 1945 election; he returned to writing and, with a research team headed by the historian William Deakin, produced a six-volume history, The Second World War (1948–53). The books became best-sellers in both the UK and US.[17][18] Churchill served as Prime Minister for a second time between October 1951 and April 1955 before resigning the premiership; he continued to serve as an MP until 1964. His final major work was the four-volume A History of the English-Speaking Peoples (1956–58).[19] In 1953 Churchill was awarded the Nobel Prize in Literature "for his mastery of historical and biographical description as well as for brilliant oratory in defending exalted human values".[1] Churchill was almost always well paid as an author and, for most of his life, writing was his main source of income. He produced a huge portfolio of written work; the journalist and historian Paul Johnson estimates that Churchill wrote eight to ten million words in more than forty books, thousands of newspaper and magazine articles, and at least two film scripts. https://en.wikipedia.org/wiki/Winston_Churchill_as_writer
Views: 7158 Way Back
Hemingway vs. Fitzgerald: Admiration, Jealousy, Friendship & Literature (2000)
Ernest Miller Hemingway (July 21, 1899 – July 2, 1961) was an American novelist, short story writer, and journalist. His economical and understated style had a strong influence on 20th-century fiction, while his life of adventure and his public image influenced later generations. Hemingway produced most of his work between the mid-1920s and the mid-1950s, and won the Nobel Prize in Literature in 1954. He published seven novels, six short story collections, and two non-fiction works. Additional works, including three novels, four short story collections, and three non-fiction works, were published posthumously. Many of his works are considered classics of American literature. Hemingway was raised in Oak Park, Illinois. After high school, he reported for a few months for The Kansas City Star, before leaving for the Italian front to serve as an ambulance driver in World War I. In 1918, he was seriously wounded and returned home. His wartime experiences formed the basis for his novel A Farewell to Arms (1929). In 1921, he married Hadley Richardson, the first of his four wives. The couple moved to Paris, where he worked as a foreign correspondent and fell under the influence of the modernist writers and artists of the 1920s "Lost Generation" expatriate community. He published his first novel, The Sun Also Rises, in 1926. After his 1927 divorce from Hadley Richardson, Hemingway married Pauline Pfeiffer; they divorced after he returned from the Spanish Civil War where he had been a journalist, and after which he wrote For Whom the Bell Tolls (1940). Martha Gellhorn became his third wife in 1940; they separated when he met Mary Welsh in London during World War II. He was present at the Normandy landings and the liberation of Paris. Shortly after the publication of The Old Man and the Sea (1952), Hemingway went on safari to Africa, where he was almost killed in two successive plane crashes that left him in pain or ill health for much of his remaining life.
Hemingway maintained permanent residences in Key West, Florida (1930s), and Cuba (1940s and 1950s), and in 1959, he bought a house in Ketchum, Idaho, where he committed suicide in the summer of 1961. https://en.wikipedia.org/wiki/Ernest_Hemingway Francis Scott Key Fitzgerald (September 24, 1896 – December 21, 1940) was an American novelist and short story writer, whose works are the paradigmatic writings of the Jazz Age. He is widely regarded as one of the greatest American writers of the 20th century. Fitzgerald is considered a member of the "Lost Generation" of the 1920s. He finished four novels: This Side of Paradise, The Beautiful and Damned, The Great Gatsby (his best known), and Tender Is the Night. A fifth, unfinished novel, The Love of the Last Tycoon, was published posthumously. Fitzgerald also wrote numerous short stories, many of which treat themes of youth and promise, and age and despair. https://en.wikipedia.org/wiki/F._Scott_Fitzgerald
Views: 4154 Way Back
One of the US' Most Closely Guarded Secrets: What Happened at the Top Secret Site 85? (1999)
Just before midday on 11 March, the USAF turned its reconnaissance effort from searching for its missing personnel to destroying the captured radar, along with all the documentation and operational information left behind at Lima Site 85. About the book: https://www.amazon.com/gp/product/0231103174/ref=as_li_tl?ie=UTF8&camp=1789&creative=9325&creativeASIN=0231103174&linkCode=as2&tag=ub066-20&linkId=6ed05b0e020ce58ded70877ff131bac1 Between 12–18 March, the USAF conducted a total of 95 strike sorties against the radar site, and on 19 March an A-1 fighter-bomber destroyed every building at the old facility. In addition to destroying the radar equipment, the USAF bombing of Lima Site 85 may also have had the effect of obliterating the bodies of U.S. personnel left behind at the site (two sets of remains were found in 2013). In the days following the loss of Phou Pha Thi, Sullivan reflected on the disaster at Lima Site 85 and commented that U.S. technicians operating there should have been evacuated on 10 March, when it became amply clear the North Vietnamese were preparing to launch an assault.[27] For the USAF, the loss at Phou Pha Thi was not a result of intelligence failure, because it had been provided with accurate information from the very start. Instead, it was clearly a failure of command and control, as the U.S. personnel and their Hmong allies were not permitted to freely organize their own defense to hold the radar facility.[27] The Battle of Lima Site 85 resulted in the largest ground combat loss of USAF personnel during the Vietnam War.[28] A total of 12 U.S.
personnel were missing or killed in the fighting on Phou Pha Thi; 11 were killed or missing on the ground and one was shot dead during the evacuation.[29] (The single fatality during the evacuation was Air Force Chief Master Sergeant Richard Etchberger, who was posthumously awarded the Medal of Honor in September 2010 for his role in helping four injured airmen into the evacuation helicopter's lift sling.)[30] The total casualty figures for the North Vietnamese, Pathet Lao, Hmong, and Thai units are unknown. According to official Vietnamese history, the VPA 41st Special Forces Battalion lost one soldier killed and two wounded in the fight for Lima Site 85. Against those losses, the Vietnamese claimed that a total of 42 Hmong and Thai soldiers were killed and a number of others wounded. A large number of weapons were captured by the NVA, including one 105 mm howitzer, one 85 mm artillery piece, four recoilless rifles, four heavy mortars, nine heavy machine guns, and vast amounts of ammunition.[31] The North Vietnamese victory proved to be a significant one, as they had succeeded in knocking out a major asset of the USAF, which had inflicted heavy damage on North Vietnam's limited industrial infrastructure.[32] The fight at Phou Pha Thi, part of a larger military campaign waged by the North Vietnamese and their Pathet Lao allies, marked the beginning of the Communist dry-season offensive against Laotian government forces in northeastern Laos. By September 1968, the strength of North Vietnamese and Pathet Lao forces in the Sam Neua area was estimated at more than 20 battalions.[32] Against such heavy odds, General Vang Pao insisted on recapturing Phou Pha Thi, which the U.S. Embassy believed was unnecessary.
On 1 November 1968, Vang Pao launched Operation Pig Fat in an attempt to retake Phou Pha Thi, but the operation quickly turned into a rout of the Royal Lao Army and the Hmong guerrillas, and Phou Pha Thi was never retaken.[33] Although airpower was to be a major factor in the defense of Lima Site 85, it could not be applied without limitations and restrictions. The defense of Lima Site 85 was not the sole claim on limited air resources at the time: the 1968 Tet Offensive was underway in South Vietnam, the Marine outpost at Khe Sanh was under siege, and an unprecedented flow of enemy logistical traffic had to be interdicted. Lima Site 85 had provided direction to about a quarter of USAF missions over North Vietnam and Barrel Roll from November 1967 to 11 March 1968, and no other facility existed to provide similar coverage of these areas. While the loss was a serious blow to the USAF air effort, it was not crippling. Eleven of the twelve USAF personnel lost on the day of the battle were listed first as missing in action (MIA), then later as KIA/body not recovered.[34] Between 1994 and 2004, 11 investigations were conducted, both by the Joint POW/MIA Accounting Command (JPAC) and unilaterally by Lao and Vietnamese investigators on both sides of the border.[35] In 2002, two former VPA soldiers who had taken part in the attack told investigators that they had thrown the bodies of the Americans off the mountain after the attack, as they were unable to bury them on the rocky surface. https://en.wikipedia.org/wiki/Battle_of_Lima_Site_85
Views: 21134 Way Back
When Affirmative Action Was For White People: Its Twisted Origins (2005)
The first appearance of the term 'affirmative action' was in the National Labor Relations Act, better known as the Wagner Act, of 1935.[25]:15 Proposed and championed by U.S. Senator Robert F. Wagner of New York, the Wagner Act was in line with President Roosevelt's goal of providing economic security to workers and other low-income groups.[26] During this period it was not uncommon for employers to blacklist or fire employees associated with unions. The Wagner Act allowed workers to unionize without fear of being discriminated against, and empowered a National Labor Relations Board to review potential cases of worker discrimination. In the event of discrimination, employees were to be restored to an appropriate status in the company through 'affirmative action'.[27] While the Wagner Act protected workers and unions, it did not protect minorities, who, with the exception of the Congress of Industrial Organizations, were often barred from union ranks.[25]:11 This original coining of the term therefore has little to do with affirmative action policy as it is seen today, but it helped set the stage for all policy meant to compensate for or address an individual's unjust treatment.[citation needed] FDR's New Deal programs often contained equal opportunity clauses stating that "no discrimination shall be made on account of race, color or creed",[25]:11 but the true forerunner of affirmative action was the Interior Secretary of the time, Harold L. Ickes. Ickes prohibited discrimination in hiring for Public Works Administration funded projects and oversaw not only the institution, by Robert C. Weaver and Clark Foreman,[25]:12 of a quota system under which contractors were required to employ a fixed percentage of Black workers, but also the equal pay for women proposed by Harry Hopkins.[25]:14 FDR's largest contribution to affirmative action, however, lay in his Executive Order 8802, which prohibited discrimination in the defense industry or government.[25]:22 The executive order promoted the idea that if taxpayer funds were accepted through a government contract, then all taxpayers should have an equal opportunity to work through the contractor. To enforce this idea, Roosevelt created the Fair Employment Practices Committee (FEPC) with the power to investigate the hiring practices of government contractors. https://en.wikipedia.org/wiki/Affirmative_action_in_the_United_States
Views: 2072 Way Back
One of the Strangest Shows Ever to Air on TV (1958)
Television game shows descended from similar programs on radio. The very first television game show, Spelling Bee, was broadcast in 1938. Truth or Consequences was the first game show to air on commercially licensed television; its first episode aired in 1941 as an experimental broadcast. Over the course of the 1950s, as television began to pervade the popular culture, game shows quickly became a fixture. Daytime game shows were played for lower stakes to target stay-at-home housewives, while higher-stakes programs aired in prime time. During the late 1950s, high-stakes games such as Twenty One and The $64,000 Question began a rapid rise in popularity. However, the rise of quiz shows proved to be short-lived. In 1959, many of the higher-stakes game shows were discovered to be rigged, and ratings declines led to the cancellation of most of the prime-time games. An evolution of the quiz show, the panel show, gained popularity in the 1950s and survived the quiz show scandals. Panels of celebrities, rather than members of the public, answered the questions on shows such as What's My Line?, I've Got a Secret and To Tell the Truth, each of which had success in primetime until the late 1960s. In the US, panel shows were largely relegated to daytime television by the 1970s, most notably with Match Game and Hollywood Squares. In the UK, however, panel shows have continued to thrive in primetime, having transformed into showcases for the nation's top stand-up comedians on shows such as Have I Got News For You, Would I Lie to You?, Mock the Week, QI and 8 Out of 10 Cats, all of which put a heavy emphasis on comedy, leaving the points as mere formalities. The focus on quick-witted comedians has resulted in strong ratings, which, combined with the low costs of production, have only spurred the growth of the UK panel show phenomenon. Game shows remained a fixture of US daytime television through the 1960s after the quiz show scandals.
Lower-stakes games made a slight comeback in daytime in the early 1960s; examples include Jeopardy!, which began in 1964, and the original version of The Match Game, first aired in 1962. Let's Make a Deal began in 1963, and the 1960s also marked the debuts of Hollywood Squares, Password, The Dating Game and The Newlywed Game. http://en.wikipedia.org/wiki/Game_show
Views: 1444948 Way Back
Learn the Keys to Robert E. Lee's Greatness as a Man and a Leader (1999)
Lee serves as a main character in the Shaara family novels The Killer Angels (Gettysburg), Gods and Generals, and The Last Full Measure, as well as in the film adaptations Gettysburg and Gods and Generals; he is played by Martin Sheen in the former and by his own descendant Robert Duvall in the latter. Lee is portrayed as a hero in the historical children's novel Lee and Grant at Appomattox by MacKinlay Kantor. His part in the Civil War is told from the perspective of his horse in Richard Adams' book Traveller. Lee is an obvious subject for American Civil War alternate histories. Ward Moore's Bring the Jubilee, Kantor's If the South Had Won the Civil War (1960), and Harry Turtledove's The Guns of the South all have Lee ending up as President of a victorious Confederacy and freeing the slaves (or laying the groundwork for the slaves to be freed in a later decade). Although the first two relegate him to a set of passing references, Lee is more of a main character in The Guns of the South. He is also the prime character of Turtledove's "Lee at the Alamo," which sees the opening of the Civil War drastically altered so as to affect Lee's personal priorities considerably. Turtledove's "War Between the Provinces" series is an allegory of the Civil War told in the language of fairy tales, with Lee appearing as a knight named "Duke Edward of Arlington." Lee is also a knight in "The Charge of Lee's Brigade" in Alternate Generals volume 1, written by Turtledove's friend S.M. Stirling and featuring a Lee whose Virginia is still a loyal British colony, fighting for the Crown against the Russians in Crimea. In Lee Allred's "East of Appomattox" in Alternate Generals volume 3, Lee is the Confederate Minister to London circa 1868, desperately seeking help for a CSA which has turned out poorly suited to independence. Robert Skimin's Grey Victory features Lee as a supporting character preparing to run for the presidency in 1867.
On September 18, 1960, the American actor George Macready portrayed Lee in the episode "Johnny Yuma at Appomattox" of the ABC television series The Rebel, starring Nick Adams in the title role.[168] Robert Symonds played Lee in the 1982 miniseries The Blue and the Gray. In the 1986 TV series North and South Book II, Lee was portrayed by actor William Schallert. The Dodge Charger featured in the CBS television series The Dukes of Hazzard was named The General Lee.[169][170] In the 2005 film based on this series, the car drives past a statue of the general, and its drivers salute him. https://en.wikipedia.org/wiki/Robert_E._Lee
Views: 11843 Way Back
The CIA Knew About Oswald and His Cuban Involvement: What the Warren Report Missed (2004)
One of Oswald's Fair Play for Cuba leaflets had the address "544 Camp Street" hand-stamped on it, apparently by Oswald himself. The address was in the "Newman Building" which, from October 1961 to February 1962, housed a militant anti-Castro group, the Cuban Revolutionary Council. Around the corner but located in the same building, with a different entrance, was the address 531 Lafayette Street—the address of "Guy Banister Associates", a private detective agency run by former FBI agent Guy Banister. Banister's office was involved in anti-Castro and private investigative activities in the New Orleans area (a CIA file indicated that in September 1960, the CIA had considered "using Guy Banister Associates for the collection of foreign intelligence, but ultimately decided against it"). In the late 1970s, the House Select Committee on Assassinations (HSCA) investigated the possible relationship of Oswald to Banister's office. While the committee was unable to interview Guy Banister (who died in 1964), the committee did interview his brother Ross Banister. Ross "told the committee that his brother had mentioned seeing Oswald hand out Fair Play for Cuba literature on one occasion. Ross theorized that Oswald had used the 544 Camp Street address on his literature to embarrass Guy."[136] Guy Banister's secretary, Delphine Roberts, told author Anthony Summers that she saw Oswald at Banister's office, and that he filled out one of Banister's "agent" application forms. She said, "Oswald came back a number of times. 
He seemed to be on familiar terms with Banister and with the office."[137] The House Select Committee on Assassinations investigated Roberts' claims and said that "because of contradictions in Roberts' statements to the committee and lack of independent corroboration of many of her statements, the reliability of her statements could not be determined."[138] Oswald's 1963 New Orleans activities were later investigated by New Orleans District Attorney Jim Garrison as part of his prosecution of Clay Shaw in 1967–1969. Garrison was particularly interested in David Ferrie, an associate of Guy Banister,[139] and in Ferrie's possible connection to Oswald, which Ferrie himself denied.[140] Ferrie died before Garrison could complete his investigation.[141] Charged with conspiracy in the JFK assassination, Shaw was found not guilty. The Warren Commission examined Oswald's involvement with a New Orleans Civil Air Patrol troop he briefly attended in 1955 with high school friend Edward Voebel. Several witnesses testified that David Ferrie was the Civil Air Patrol unit's commander during at least some of the time that Oswald attended C.A.P. meetings.[35][142][143][144][145] However, the FBI interviewed Ferrie shortly after the assassination and concluded there was no significant relationship between him and Oswald.[146] A more extensive investigation was done by the House Select Committee on Assassinations, which interviewed several of Oswald's former fellow cadets and others, none of whom recalled Ferrie and Oswald interacting. These fellow cadets said that Oswald attended some 8 to 10 C.A.P. meetings over a two-month period.[33][34][35] In 1993, the PBS television program Frontline obtained a photograph taken in 1955 showing Oswald and Ferrie at a C.A.P. cookout with other cadets.
Marina's friend, Ruth Paine, transported Marina and her child by car from New Orleans to the Paine home in Irving, Texas, near Dallas, on September 23, 1963.[110][149] Oswald stayed in New Orleans at least two more days to collect a $33 unemployment check. It is uncertain when he left New Orleans; he is next known to have boarded a bus in Houston on September 26—bound for the Mexican border, rather than Dallas—and to have told other bus passengers that he planned to travel to Cuba via Mexico.[150][151] He arrived in Mexico City on September 27, where he applied for a transit visa at the Cuban Embassy,[152] claiming he wanted to visit Cuba on his way to the Soviet Union. The Cuban embassy officials insisted Oswald would need Soviet approval, but he was unable to get prompt co-operation from the Soviet embassy. After five days of shuttling between consulates—that included a heated argument with an official at the Cuban consulate, impassioned pleas to KGB agents, and at least some CIA scrutiny[153]—Oswald was told by a Cuban consular officer that he was disinclined to approve the visa, saying "a person like [Oswald] in place of aiding the Cuban Revolution, was doing it harm." http://en.wikipedia.org/wiki/Lee_Harvey_Oswald
Views: 5427 Way Back
Larry Flynt v. Jerry Falwell: Supreme Court Case - Hustler Magazine (1987)
Hustler Magazine, Inc. v. Falwell, 485 U.S. 46 (1988), was a United States Supreme Court case in which the Court held, in a unanimous 8–0 decision (Justice Anthony Kennedy took no part in the consideration or decision of the case), that the First Amendment's free-speech guarantee prohibits awarding damages to public figures to compensate for emotional distress intentionally inflicted upon them. Thus, Hustler magazine's parody of Jerry Falwell was deemed to be within the law, because the Court found that reasonable people would not have interpreted the parody to contain factual claims, leading to a reversal of the jury verdict in favor of Falwell, who had previously been awarded $150,000 in damages by a lower court. While Hustler magazine has always been known for its explicit pictures of nude women and for what many consider crude humor,[2] the prominent fundamentalist Protestant minister Jerry Falwell objected to a parody ad that the magazine printed in 1983 targeting him, in which "Falwell" related having an incestuous encounter with his mother in an outhouse.[3] The satire at issue was a takeoff on an advertising campaign for Campari, an Italian apéritif.[4] The real ads were tongue-in-cheek interviews with celebrities talking about their "first time".[5] The ads, which played off the double entendre in the headline (“X talks about his first time”), initially appeared to discuss the star’s first sexual experience before revealing that the discussion actually concerned the subject's first time drinking Campari.
The Hustler parody, created by writer Terry Abrahamson and art director Mike Salisbury,[6] featured a picture of Falwell, and a fictional "interview" in which "Falwell" describes his first sexual experience as occurring "with Mom" in an outhouse while both were "drunk off our God-fearing asses on Campari."[7] In the spoof interview, "Falwell" goes on to say that he was so intoxicated that "Mom looked better than a Baptist whore with a $100 donation," that he decided to have sex with his mother since she had "showed all the other guys in town such a good time", and that they had intercourse regularly afterwards.[8] Finally, when asked if he had tried Campari since, "Falwell" answered, "I always get sloshed before I go out to the pulpit. You don’t think I could lay down all that bullshit sober, do you?" The ad carried a disclaimer in small print at the bottom of the page, reading "ad parody—not to be taken seriously."[9] The magazine's table of contents also listed the ad as "Fiction; Ad and Personality Parody."[10] Falwell sued Larry Flynt, Hustler magazine, and Flynt's distribution company in the United States District Court for the Western District of Virginia for libel, invasion of privacy, and intentional infliction of emotional distress.[11] Before trial, the court granted Flynt's motion for summary judgment on the invasion of privacy claim, and the remaining two charges proceeded to trial. A jury found in favor of Flynt on the libel claim, stating that the parody could not "reasonably be understood as describing actual facts about [Falwell] or actual events in which [he] participated."[12] On the claim of intentional infliction of emotional distress, the jury ruled in favor of Falwell and awarded him $150,000 in damages.[12] Flynt appealed to the Fourth Circuit. The Fourth Circuit affirmed, rejecting Flynt's argument that the actual-malice standard of New York Times Company v. Sullivan, 376 U.S. 
254 (1964), applied in cases of intentional infliction of emotional distress where the plaintiff was a public figure, as Falwell concededly was. In the Fourth Circuit's view, the New York Times standard focused too heavily on the truth of the statement at issue; it was enough that Virginia law required the defendant to act intentionally. After the Fourth Circuit declined to rehear the case en banc, the U.S. Supreme Court granted Flynt's request to hear the case. https://en.wikipedia.org/wiki/Hustler_Magazine_v._Falwell Image by Jo Anna Barber [CC BY-SA 2.0 (http://creativecommons.org/licenses/by-sa/2.0)], via Wikimedia Commons
Views: 5580 Way Back
