"Family is the tie that binds, but while one end undergirds human civilization, the other is a noose, choking away our individuation." -- Sherlock to Joan, from Elementary
In regard to the recent Orlando massacre:
Comment section:
Misscreant, 19 hours ago (#1)
Well, I was going to write, "So, this is how liberty dies, with thunderous applause," but I'm thinking that liberty actually died some time ago.
In reply to: Misscreant #1
Liberty died for 49 people in Orlando just a few days ago to a thunderous sound, but it wasn't applause.
That applause will arrive from the vast majority of the American people when these common-sense bills are passed, signaling the end of the NRA chokehold we've so deeply despised.
Thank you Connecticut Sen. Chris Murphy for your lengthy filibuster and courageous stand against the insanity.
Clinton and Obama Team Up Against Trump
Speaking 250 miles apart Tuesday, but as if reading from the same hymnal, President Barack Obama and presumptive Democratic nominee Hillary Clinton delivered simultaneous withering critiques of Donald Trump's response to the Orlando terror attack.
The seemingly coordinated salvo from the Democratic Party's two biggest heavyweights is a preview of the months to come, when Clinton will have at her disposal at least two popular presidents, the vice president, her Democratic primary opponent, and a slew of other high-profile Democrats.
Clinton's campaign would not comment specifically on whether the two speeches were coordinated, but spokesperson Nick Merrill said, "Obviously we speak regularly with the White House."
As the Clinton campaign musters a coordinated communications strategy, Trump has been left more or less to defend himself, with few high-profile surrogates to back him up.
"At this point, Trump is like an army column advancing with no armor on either side of him," said Robert Shrum, a former top strategist to two Democratic presidential campaigns. "He put himself in a very vulnerable position."
Related: Angry President Tears into Trump Like Never Before
In its response to Obama's evisceration Tuesday, the Republican National Committee, which Trump has leaned on to supplement his under-developed campaign, made no effort to defend its presumptive nominee. The committee's press release criticized Obama's terror strategy and linked it to Clinton without even mentioning Trump.
The disparity between the two sides has been especially noteworthy since the Orlando mass shooting upended the campaign script.
During Trump's speech in New Hampshire on Monday, when he expanded his call for a temporary ban on Muslims entering the U.S., the state's top Republican officials did not attend -- the audience was filled instead with people invited by Republican operatives.
According to an NBC News survey, no members of the North Carolina GOP congressional delegation were expected to attend Trump's campaign event Tuesday night in Greensboro, N.C., either, most citing congressional business.
The tandem rebuke of the Republican on Tuesday by former rivals Obama and Clinton, whether or not the White House and the campaign coordinated directly, reinforced the party line.
Obama, speaking in Washington, mocked Trump's fixation on Democrats' refusal to say the words "radical Islam," a refusal that has become a key line of attack on Obama's and Clinton's approach to terrorism.
"There's no magic to the phrase 'radical Islam.' It's a political talking point," Obama said. "Not once has an advisor said, 'Man, if we only used that phrase, we'd really turn this thing around.' "
Related: Republicans Run From Donald Trump's Orlando Response
Clinton, at a union hall in Pittsburgh, echoed the sentiment. "Is Donald Trump saying that somehow there are magic words that once uttered will stop terrorists from coming after us?" she asked.
Obama also knocked "yapping" from "politicians who tweet" -- Twitter being one of Trump's primary means of communication -- adding that "loose talk and sloppiness" is dangerous.
Clinton, for her part, slammed Trump's response to the Orlando shooting as reckless and lacking substance. "He went on TV and suggested that President Obama is on the side of the terrorists. Now just think about that for a second," she said. "Even in a time of divided politics, this is way beyond anything that should be said by someone running for President of the United States."
Hamburg, Germany -- We Germans can never escape the trauma of our recent history. That has rarely been clearer than today, as we look around our Continent and across the Atlantic. There are almost too many differences to mention between what happened in the 1930s over here and what is going on today. And it goes without saying that Donald J. Trump and Austria's Norbert Hofer are not Adolf Hitler. Still, Germany's slide into a popular embrace of authoritarianism in the 1930s offers a frame for understanding how liberal democracies can suddenly turn toward anti-liberalism.
Setting aside debate about whether the rise of Nazism was built into the German DNA, there were four trends that led the country to reject its post-World War I constitutional, parliamentary democracy, known as the Weimar Republic: economic depression, loss of trust in institutions, social humiliation and political blunder. To a certain degree, these trends can be found across the West today.
First, the history. The Black Friday stock-market collapse of 1929 set off a global depression. As bad as things were in America, they were even worse in Germany, where industrial production shrank by half in the following three years. Stocks lost two-thirds of their value. Inflation and unemployment skyrocketed. The Weimar government, already held in low esteem by many Germans, seemed to have no clue about what to do.
All this happened as traditional ways of life and values were being shaken by the modernization of the 1920s. Women suddenly went to work, to vote, to party and to sleep with whomever they wanted. This produced a widening cultural gap between the tradition-oriented working and middle classes and the cosmopolitan avant-garde -- in politics, business and the arts -- that reached a peak just when economic disaster struck. The elites were blamed for the resulting chaos, and the masses were ripe for a strongman to return order to society.
Some people today imagine that Hitler sneaked up on Germany, that too few people understood the threat. In fact, many mainstream politicians recognized the danger but they failed to stop him. Some didn't want to: The conservative parties and the nobility believed the little hothead could serve as their useful idiot, that as chancellor he would be contained by a squad of reasonable ministers. Franz von Papen, a nobleman who was Hitler's first vice chancellor, said of the new leader, "We've hired him."
At the same time, even the imminent threat of a fascist dictatorship couldn't persuade the left-wing parties to join forces. Instead of being conciliatory for the sake of the national interest, Ernst Thälmann, the head of the German Communist Party, branded the center-left Social Democrats the "moderate wing of fascism." No wonder Hitler had an easy time uniting broad sections of the German public.
Are we at another Weimar moment now?
This piece first appeared at Mike Krauss' blog.
The other night I watched The Greatest Cable News Program That Absolutely Ever Was. The host was extolling the virtues of capitalism, repeating the claims you can read in The Economist or the Wall Street Journal: that capitalism has lifted many millions out of poverty worldwide.
The same broadcast also reported that most Americans "could not lay their hands on $1,000" in an emergency. That figure may be on the high side. Other published reports put it at $400, including what may be available on a credit card.
The program host missed the contradiction. You can't be a capitalist without capital. The overwhelming majority of Americans don't have any, and are completely excluded from the "benefits" of the capitalist system he extolled.
Capitalism is not a form of government. It is a system of wealth management. It does not create wealth, but only allocates it. It is indifferent to the welfare of people. It has no social purpose. Private profit is everything.
Over several decades, as millions in Asia and elsewhere have seen living standards rise, tens of millions of Americans have seen theirs fall dramatically - low wages and lost jobs - in a massive re-allocation of wealth abroad, away from the once large and prosperous American middle class.
In order to mask the growing poverty in America, the capitalists introduced massive credit, debt and propaganda to sustain the illusion of prosperity among enough Americans to head off a revolt against an economic system that clearly no longer works for them.
Americans are now drowning in debt: families, students, businesses, state and local governments and school districts.
20th-century capitalism is like a sun burning out: collapsing in on itself, consolidating into global monopolies to reduce competition and maintain private profit.
In order to form and protect monopolies, capitalists must dominate governments. These monopolies were once national in scope. Now they are global. The form of government that capitalists have always favored is fascism - the integration and primacy of corporate interests in the government, for which the military is an agent.
Think Nazi Germany. Its purpose was not military domination or even control of individual liberty. These were incidental to the first purpose: the global primacy of German corporations and the German 1 percent.
The point of World War II, from the German perspective, was that after the war, Daimler-Benz would be the world's largest car manufacturer, Krupp and Thyssen would be the dominant steel manufacturers, IG Farben would be the dominant chemical and pharmaceutical company and Deutsche Bank would lead world banking and finance.
It didn't work out that way. The U.S. destroyed the physical plant of both Germany and Japan - our two main commercial rivals - and U.S. manufacturers and banks had a field day. The American middle class boomed.
But the Germans and Japanese rebuilt with modern technology and began to compete with the outdated American physical plant. U.S. unions resisted modernization that cost any jobs and U.S. manufacturers began to relocate overseas into modern and more efficient facilities. Then modern container shipping slashed the cost of distribution from foreign to U.S. markets and American manufacturers brought other nations into the game.
As the accumulated wealth of the American middle class was re-allocated abroad by the capitalist system, the capitalists began the drive to eliminate the drag on profits of global competition by consolidating into global monopolies.
That is the purpose of the Trans-Pacific and Trans-Atlantic "trade" deals promoted by U.S. President Obama, British Prime Minister Cameron and the global cartel of banksters they represent, who provide the financing (debt) that lets the capitalists compensate one another for lost future profits when one is aggregated into a new and larger monopoly by another.
But the serfs on the neo-feudal debt manor are finally in revolt. One battle is the vote in the United Kingdom on an exit from the European Union (EU). A dissolution of the EU undermines the Trans-Atlantic deal and threatens the Trans-Pacific deal and the entire fascist future of the New World Order.
Another battle is the U.S. presidential elections and Donald Trump's assault on these deals.
So Obama went to the U.K. to lay down the law and explain the dire consequences of any resistance to that New World Order. Then he went to Asia to deliver the same message. He will push for a vote in Congress on the Trans Pacific deal as soon as possible, while he still has the support of the pre-Trump GOP of Paul Ryan.
Capitalists of the world unite! The fascist future is in reach.
Mike Krauss is chair of The Pennsylvania Project, a non-partisan public policy advocacy organization. His byline has appeared in the Wall Street Journal and regularly as a guest columnist in the Bucks County Courier Times.
Countries with the best standard of living are turning atheist. That shift offers a glimpse into the world's future.
Religious people are annoyed by claims that belief in God will go the way of horse-drawn transportation, and for much the same reason: an improved standard of living.
The view that religious belief will give way to atheism is known as the secularization thesis. The specific version that I favor (1) is known as the existential security hypothesis. The basic idea is that as people become more affluent, they are less worried about lacking for basic necessities, or dying early from violence or disease. In other words they are secure in their own existence. They do not feel the need to appeal to supernatural entities to calm their fears and insecurities.
The notion that improving living conditions are associated with a decline in religion is supported by a mountain of evidence (1,2,3).
That does not prevent some serious scholars, like political scientist Eric Kaufmann (4), from making the opposite case that religious fundamentalists will outbreed the rest of us. Yet, noisy as they can be, such groups are tiny minorities of the global population and they will become even more marginalized as global prosperity increases and standards of living improve.
Moreover, as religious fundamentalists become economically integrated, young women go to work and produce smaller families, as is currently happening for Utah's Mormons.
The most obvious approach to estimating when the world will switch over to being majority atheist is based on economic growth. This is logical because economic development is the key factor responsible for secularization. In deriving this estimate, I used the nine most godless countries as my touchstone (excluding Estonia as a formerly communist country).
The countries were Belgium, Czech Republic, Denmark, France, Germany, Japan, Netherlands, Sweden and the United Kingdom. These nine countries averaged out at the atheist transition in 2004 (5) with exactly half of the populations disbelieving in God. Their gross domestic product (GDP) averaged $29,822 compared to $10,855 for the average country in the world. How long will it take before the world economy has expanded sufficiently that the GDP of the average country has caught up to the average for the godless countries in 2004?
Using the average global growth rate of GDP for the past 30 years of 3.33 percent (based on International Monetary Fund data from their website), the atheist transition would occur in 2035.
Belief in God is not the only relevant measure of religion, of course. A person might believe in God in a fairly superficial way without religion affecting his or her daily life. One way of assessing the depth of religious commitment is to ask survey participants whether they think that religion is important in their daily lives as the Gallup Organization has done in worldwide nationally representative surveys.
If fewer than 50 percent of the population agreed that religion was important to them, then the country has effectively crossed over to a secular majority. By this measure, the godless countries were Spain, South Korea, Canada, Switzerland, Uruguay, Germany and France. At a growth rate of 3.33 percent per year, it would be 2041 before the average country in the world reached a level of affluence equivalent to that of these godless nations.
If national wealth drives secularization, the global population will cross an atheist threshold where the majority see religion as unimportant by 2041.
Averaging across the two measures of atheism, the entire world population would cross the atheist threshold by about 2038 (average of 2035 for disbelief and 2041 for religiosity). Although 2038 may seem improbably fast, this requires only a shift of approximately 1 percent per year whether in religiosity or belief in God. Using the Human Development Index as a clock suggests an even earlier arrival for the atheist transition (1).
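The compound-growth arithmetic behind these dates can be checked directly. A minimal sketch, using only the figures given above (the $10,855 average-country GDP, the $29,822 average for the nine godless countries in 2004, and the 3.33 percent growth rate); the target GDP for the religiosity measure is not stated in the text, so only the 2035 estimate is reproduced here:

```python
import math

def years_to_catch_up(current_gdp, target_gdp, growth_rate):
    """Years until current_gdp, compounding annually at growth_rate,
    reaches target_gdp: solve current * (1 + r)^t = target for t."""
    return math.log(target_gdp / current_gdp) / math.log(1 + growth_rate)

# Figures from the text: average-country GDP vs. the nine godless
# countries' average at their 2004 atheist transition.
years = years_to_catch_up(10855, 29822, 0.0333)
print(round(2004 + years))  # prints 2035
```

Run with the article's numbers, this lands on 2035, matching the estimate for the disbelief measure; the same function would yield the 2041 figure given the (unstated) GDP average of the low-religiosity countries.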
Is the loss of religious belief something to fear? Contrary to the claims of religious leaders, godless countries are highly moral nations with an unusual level of social trust, economic equality, low crime and a high level of civic engagement (5). We could do with some of that.
1. Barber, N. (2012). Why atheism will replace religion: The triumph of earthly pleasures over pie in the sky. E-book, available at: http://www.amazon.com/Atheism-Will-Replace-Religion-ebook/dp/B00886ZSJ6/
2. Norris, P., & Inglehart, R. (2004). Sacred and secular: Religion and politics worldwide. Cambridge: Cambridge University Press.
3. Barber, N. (2011). A cross-national test of the uncertainty hypothesis of religious belief. Cross-Cultural Research, 45, 318-333.
4. Kaufmann, E. (2010). Shall the religious inherit the earth? London: Profile Books.
5. Zuckerman, P. (2008). Society without God: What the least religious nations can tell us about contentment. New York: New York University Press.
The most important musical form of the 20th century will be nearly forgotten one day. People will probably learn about the genre through one figure -- so who might that be?
By Chuck Klosterman
Classifying anyone as the "most successful" at anything tends to reflect more on the source than the subject. So keep that in mind when I make the following statement: John Philip Sousa is the most successful American musician of all time.
Marching music is a maddeningly durable genre, recognizable to pretty much everyone who has lived in the United States for any period. It works as a sonic shorthand for any filmmaker hoping to evoke the late 19th century and serves as the auditory backdrop for national holidays, the circus and college football. It's not "popular" music, but it's entrenched within the popular experience. It will be no less fashionable tomorrow than it is today.
And this entire musical idiom is now encapsulated in one person: John Philip Sousa. Even the most cursory two-sentence description of marching music inevitably cites him by name. I have no data on this, but I would assert that if we were to ask the entire population of the United States to name every composer of marching music they could think of, 98 percent of the populace would name either one person (Sousa) or no one at all. There's just no separation between the awareness of this person and the awareness of this music, and it's hard to believe that will ever change.
Now, the reason this happened -- or at least the explanation we've decided to accept -- is that Sousa was simply the best at this art. He composed 136 marches over a span of six decades and is regularly described as the most famous musician of his era. The story of his life and career has been shoehorned into the U.S. education curriculum at a fundamental level. (I first learned of Sousa in fourth grade, a year before we memorized the state capitals.) And this, it seems, is how mainstream musical memory works. As the timeline moves forward, tangential artists in any field fade from the collective radar, until only one person remains; the significance of that individual is then exaggerated, until the genre and the person become interchangeable. Sometimes this is easy to predict: I have zero doubt that the worldwide memory of Bob Marley will eventually have the same tenacity and familiarity as the worldwide memory of reggae itself.
But envisioning this process with rock music is harder. Almost anything can be labeled "rock": Metallica, ABBA, Mannheim Steamroller, a haircut, a muffler. If you're a successful tax lawyer who owns a hot tub, clients will refer to you as a "rock-star C.P.A." when describing your business to less-hip neighbors. The defining music of the first half of the 20th century was jazz; the defining music of the second half of the 20th century was rock, but with an ideology and saturation far more pervasive. Only television surpasses its influence.
And pretty much from the moment it came into being, people who liked rock insisted it was dying. The critic Richard Meltzer supposedly claimed that rock was already dead in 1968. And he was wrong to the same degree that he was right. Meltzer's wrongness is obvious and does not require explanation, unless you honestly think "Purple Rain" is awful. But his rightness is more complicated: Rock is dead, in the sense that its "aliveness" is a subjective assertion based on whatever criteria the listener happens to care about.
This is why the essential significance of rock remains a plausible thing to debate, as does the relative value of major figures within that system (the Doors, R.E.M., Radiohead). It still projects the illusion of a universe containing multitudes. But it won't seem that way in 300 years.
The symbolic value of rock is conflict-based: It emerged as a byproduct of the post-World War II invention of the teenager, soundtracking a 25-year period when the gap between generations was utterly real and uncommonly vast. That dissonance gave rock music a distinctive, nonmusical importance for a long time. But that period is over. Rock -- or at least the anthemic, metaphoric, Hard Rock Cafe version of big rock -- has become more socially accessible but less socially essential, synchronously shackled by its own formal limitations. Its cultural recession is intertwined with its cultural absorption. As a result, what we're left with is a youth-oriented music genre that a) isn't symbolically important; b) lacks creative potential; and c) has no specific tie to young people. It has completed its historical trajectory. Which means, eventually, it will exist primarily as an academic pursuit. It will exist as something people have to be taught to feel and understand.
I imagine a college classroom in 300 years, in which a hip instructor is leading a tutorial filled with students. These students relate to rock music with no more fluency than they do the music of Mesopotamia: It's a style they've learned to recognize, but just barely (and only because they've taken this specific class). Nobody in the room can name more than two rock songs, except the professor. He explains the sonic structure of rock, its origins, the way it served as cultural currency and how it shaped and defined three generations of a global superpower. He shows the class a photo, or perhaps a hologram, of an artist who has been intentionally selected to epitomize the entire concept. For these future students, that singular image defines what rock was.
So what's the image?
Matthew Harwood and Jay Stanley / TomDispatch
This piece first appeared at TomDispatch. Read Tom Engelhardt's introduction here.
Can't you see the writing on the touchscreen? A techno-utopia is upon us. We've gone from smartphones at the turn of the twenty-first century to smart fridges and smart cars. The revolutionary changes to our everyday life will no doubt keep barreling along. By 2018, so predicts Gartner, an information technology research and advisory company, more than three million employees will work for "robo-bosses" and soon enough we--or at least the wealthiest among us--will be shopping in fully automated supermarkets and sleeping in robotic hotels.
With all this techno-triumphalism permeating our digitally saturated world, it's hardly surprising that law enforcement would look to technology--"smart policing," anyone?--to help reestablish public trust after the 2014 death of Michael Brown in Ferguson, Missouri, and the long list of other unarmed black men killed by cops in Anytown, USA. The idea that technology has a decisive role to play in improving policing was, in fact, a central plank of President Obama's policing reform task force.
In its report, released last May, the Task Force on 21st Century Policing emphasized the crucial role of technology in promoting better law enforcement, highlighting the use of police body cameras in creating greater openness. "Implementing new technologies," it claimed, "can give police departments an opportunity to fully engage and educate communities in a dialogue about their expectations for transparency, accountability, and privacy."
Indeed, the report emphasized ways in which the police could engage communities, work collaboratively, and practice transparency in the use of those new technologies. Perhaps it won't shock you to learn, however, that the on-the-ground reality of twenty-first-century policing looks nothing like what the task force was promoting. Police departments nationwide have been adopting powerful new technologies that are remarkably capable of intruding on people's privacy, and much of the time these are being deployed in secret, without public notice or discussion, let alone permission.
And while the task force's report says all the right things, a little digging reveals that the feds not only aren't putting the brakes on improper police use of technology, but are encouraging it--even subsidizing the misuse of the very technology the task force believes will keep cops honest. To put it bluntly, a techno-utopia isn't remotely on the horizon, but its flipside may be.
Getting Stung and Not Even Knowing It
Shemar Taylor was charged with robbing a pizza delivery driver at gunpoint. The police got a warrant to search his home and arrested him after learning that the cell phone used to order the pizza was located in his house. How the police tracked down the location of that cell phone is what Taylor's attorney wanted to know.
The Baltimore police detective called to the stand in Taylor's trial was evasive. "There's equipment we would use that I'm not going to discuss," he said. When Judge Barry Williams ordered him to discuss it, he still refused, insisting that his department had signed a nondisclosure agreement with the FBI.
"You don't have a nondisclosure agreement with the court," replied the judge, threatening to hold the detective in contempt if he did not answer. And yet he refused again. In the end, rather than reveal the technology that had located Taylor's cell phone to the court, prosecutors decided to withdraw the evidence, jeopardizing their case.
And don't imagine that this courtroom scene was unique or even out of the ordinary these days. In fact, it was just one sign of a striking nationwide attempt to keep an invasive, constitutionally questionable technology from being scrutinized, whether by courts or communities.
The technology at issue is known as a "Stingray," a brand name for what's generically called a cell site simulator or IMSI catcher. By mimicking a cell phone tower, this device, developed for overseas battlefields, gets nearby cell phones to connect to it. It operates a bit like the children's game Marco Polo. "Marco," the cell-site simulator shouts out and every cell phone on that network in the vicinity replies, "Polo, and here's my ID!"
Thanks to this call-and-response process, the Stingray knows both what cell phones are in the area and where they are. In other words, it gathers information not only about a specific suspect, but any bystanders in the area as well. While the police may indeed use this technology to pinpoint a suspect's location, by casting such a wide net there is also the potential for many kinds of constitutional abuses--for instance, sweeping up the identities of every person attending a demonstration or a political meeting. Some Stingrays are capable of collecting not only cell phone ID numbers but also numbers those phones have dialed and even phone conversations. In other words, the Stingray is a technology that potentially opens the door for law enforcement to sweep up information that not so long ago wouldn't have been available to them.
All of this raises the sorts of constitutional issues that might normally be settled through the courts and public debate... unless, of course, the technology is kept largely secret, which is exactly what's been happening.
After the use of Stingrays was first reported in 2011, the American Civil Liberties Union (ACLU) and other activist groups attempted to find out more about how the technology was being used, only to quickly run into heavy resistance from police departments nationwide. Served with "open-records requests" under Freedom of Information Act-like state laws, they almost uniformly resisted disclosing information about the devices and their uses. In doing so, they regularly cited nondisclosure agreements they had signed with the Harris Corporation, maker of the Stingray, and with the FBI, prohibiting them from telling anyone (including other government outfits) about how--or even that--they use the devices.
Sometimes such evasiveness reaches near-comical levels. For example, police in the city of Sunrise, Florida, served with an open-records request, refused to confirm or deny that they had any Stingray records at all. Under cover of a controversial national security court ruling, the CIA and the NSA sometimes resort to just this evasive tactic (known as a "Glomar response"). The Sunrise Police Department, however, is not the CIA, and no provision in Florida law would allow it to take such a tack. When the ACLU pointed out that the department had already posted purchase records for Stingrays on its public website, it generously provided duplicate copies of those very documents and then tried to charge the ACLU $20,000 for additional records.
One of the good guys takes his leave.
Born in Toronto, Safer began his career writing for various Canadian newspapers before he jumped to the Canadian Broadcasting Corporation and embraced the then-new technology of television.
He joined CBS as a London correspondent in 1964, and the very next year he was sent to Saigon to open the network's new bureau and cover the war.
Not long after he got to Vietnam, Safer followed a group of Marines -- who had dubbed themselves "The Zippo Brigade" after the then-ubiquitous cigarette lighters -- on a "search and destroy" mission to the village of Cam Ne.
"They moved into the village and they systematically began torching every house -- every house as far as I could see, getting people out in some cases, using flame throwers in others," Safer recalled years later in a PBS special report.
Safer's report, broadcast on CBS News on Aug. 5, 1965, horrified Americans and helped turn public opinion against the Vietnam War.
An outraged President Lyndon Johnson rousted CBS chief Frank Stanton from his bed and told him they had "s--- on the American flag" and ordered a security check to see if Safer was a communist.
Safer later distinguished himself covering another horror show -- the bloody Nigerian Civil War -- before joining 60 Minutes in December 1970, the third year of the groundbreaking news show.
There, Safer interviewed everyone from presidents and potentates to Miss Piggy. He was one of the brightest stars in a constellation of TV reporting talent that included Mike Wallace and Dan Rather.
During his career, Safer won a dozen Emmys and numerous other awards for his reporting.
He is survived by his wife, Jane Fearer, and their daughter, Sarah Alice Anne Safer.
Wright plays first lady Claire Underwood, an equally devious partner for Spacey's President Frank Underwood, on the Netflix series.
The Huffington Post reports Wright said during a Tuesday interview at the Rockefeller Foundation in New York that she demanded equal pay after seeing statistics showing her character was more popular than Spacey's for a period of time. She says she threatened to "go public" if she didn't get paid.
While washing her hands in a Connecticut Walmart bathroom, a woman named Aimee Toms, 22, was harassed by another customer who said:
"You're disgusting!" "You don't belong here!"
Toms replied, "Oh, yes I do!" She soon figured out that the woman's outburst had to do with new state laws like North Carolina's HB2 (dubbed the "Hate Bill" by critics), which bars transgender people from using the public restroom that aligns with their gender identity.
Aimee Toms has a short haircut. She said she keeps it short because she donates her hair to a program that creates wigs for child cancer patients. But it shouldn't matter why anyone has short hair. This was a clear case of ugly, hateful discrimination, a direct result of Republican extremist lawmakers like Governor Pat McCrory. Toms adds:
"I've had people call me all sorts of names for having short hair. I've had people call me a boy, I've had people call me a dyke, I've had people call me gay." Toms said. "I'm grateful that that woman only called me disgusting and didn't physically attack me ... I was a victim of transphobia today as a cisgender female because my hair is short."
When she got home, Toms posted about her experience on Facebook. She says after experiencing the hatred, she can't imagine the discrimination transgender people must face during their lives, on a daily basis.
"Can you imagine going out every day and having people tell you should not be who you are or that people will not accept you as who you are?"
Here is the YouTube video by Aimee Toms:
Go slow dudes.
When part of the process being touted is the idea of making it more efficient by deleting stretches of DNA that (apparently) "do not have any function" ... it sets off alarms in this brain.
By Andrew Pollack
Scientists are now contemplating the fabrication of a human genome, meaning they would use chemicals to manufacture all the DNA contained in human chromosomes.
The prospect is spurring both intrigue and concern in the life sciences community because it might eventually be possible, through cloning, to use a synthetic genome to create human beings without biological parents.
While the project is still in the idea phase, and also involves efforts to improve DNA synthesis in general, it was discussed at a closed-door meeting on Tuesday at Harvard Medical School in Boston. The nearly 150 attendees were told not to contact the news media or to post on Twitter during the meeting.
Organizers said the project could have a big scientific payoff and would be a follow-up to the original Human Genome Project, which was aimed at reading the sequence of the three billion chemical letters in the DNA blueprint of human life. The new project, by contrast, would involve not reading, but rather writing the human genome -- synthesizing all three billion units from chemicals.
But such an attempt would raise numerous ethical issues. Could scientists create humans with certain kinds of traits, perhaps people born and bred to be soldiers? Or might it be possible to make copies of specific people?
"Would it be O.K., for example, to sequence and then synthesize Einstein's genome?" Drew Endy, a bioengineer at Stanford, and Laurie Zoloth, a bioethicist at Northwestern University, wrote in an essay criticizing the proposed project. "If so how many Einstein genomes should be made and installed in cells, and who would get to make them?"
Dr. Endy, though invited, said he deliberately did not attend the meeting at Harvard because it was not being opened to enough people and was not giving enough thought to the ethical implications of the work.
George Church, a professor of genetics at Harvard Medical School and an organizer of the proposed project, said there had been a misunderstanding. The project was not aimed at creating people, just cells, and would not be restricted to human genomes, he said. Rather it would aim to improve the ability to synthesize DNA in general, which could be applied to various animals, plants and microbes.
"They're painting a picture which I don't think represents the project," Dr. Church said in an interview.
He said the meeting was closed to the news media, and people were asked not to tweet because the project organizers, in an attempt to be transparent, had submitted a paper to a scientific journal. They were therefore not supposed to discuss the idea publicly before publication. He and other organizers said ethical aspects have been amply discussed since the beginning.
The project was initially called HGP2: The Human Genome Synthesis Project, with HGP referring to the Human Genome Project. An invitation to the meeting at Harvard said that the primary goal "would be to synthesize a complete human genome in a cell line within a period of 10 years."
By Eugene Robinson
Save us all the faux drama. We already know how this star-crossed courtship is going to end: House Speaker Paul Ryan will decide that Donald Trump isn't such an ogre after all, and they'll live unhappily ever after.
Ryan will be unhappy, at least. Trump has stolen his party, and there's nothing Ryan can do in the short term to get it back.
"I heard a lot of good things from our presumptive nominee," Ryan told reporters after his much-ballyhooed Thursday meeting with Trump. "I do believe we are now planting the seeds to get ourselves unified to bridge the gaps and differences."
Translation: Ryan may still not be "there yet," in terms of a formal endorsement, but we should have no doubt about where he's headed.
Trump came to Washington for meetings with Ryan and other GOP establishment figures as a conqueror, not a supplicant. His populism, xenophobia, isolationism, bigotry and evident love of big government may be anathema to the Republican elite, but the party's base clearly feels otherwise. Anyone choosing self-interest over principle--a habit I have observed among politicians--would think twice about opposing a man who received more primary votes than any previous GOP nominee.
Thus we witness a shameful parade of quislings. The most galling surrender may have been that of Sen. John McCain of Arizona, who says he will support the nominee even though Trump cruelly ridiculed him for being shot down and captured during the Vietnam War.
McCain's military service was a profile in courage; what he's doing now is not. Leaving aside the personal insult, McCain has spent his career advocating a muscular foreign policy. His has been one of the loudest and most persistent voices arguing that more U.S. troops be sent to Syria and Iraq. Trump, by contrast, has proclaimed an "America first" doctrine that focuses resources on solving problems at home. Trump has even expressed deep skepticism about NATO, which has been the cornerstone of the West's security architecture for more than half a century.
Sen. Lindsey Graham of South Carolina, McCain's closest soul mate on national security issues, is one of the few leading Republicans who remain in the "never Trump" camp. He vowed this week that "no re-education camp" would change his mind.
What's the difference between the two amigos? Graham doesn't have to face South Carolina voters again until 2020. McCain is running for re-election this year--and watched as Trump scored a blowout victory in Arizona's presidential primary in March.
Ryan is, or perhaps was, the last great hope of those Republicans who oppose Trump on ideological and historical grounds. The party of Lincoln has a storied past--the landmark civil rights laws of the 1960s, for example, never could have made it through Congress without GOP support. This heritage has been dishonored in recent years; among other transgressions, Republican governors and state legislatures across the country are trying to discourage minority voters with restrictive voter-identification laws. But there are those, such as Ryan, who profess to believe that the party can still be compassionate and inclusive.
Not with Trump in charge, however. Trump's appeal has been built on anger, grievance and nostalgia for a golden age that never was (at least for women and people of color). To the extent he has any coherent political philosophy, it is one of exclusion. His one unwavering promise involves the building of a wall.
Everything else, it seems, is negotiable. Having sewn up the nomination, Trump has entered the "three-card Monte" phase of his campaign in which he shuffles his positions so quickly that the gullible patsy loses track. His proposed ban on Muslim immigration? That was a mere "suggestion," he said the other day. His view that wages are too high? He now wants to see the minimum wage raised, but by the states, not the federal government. His view on whether the rich should pay more in taxes? Yes, no and maybe.
Ryan acknowledged after his meeting with Trump that "differences" remain. But Senate Majority Leader Mitch McConnell of Kentucky has endorsed Trump, as has most of Ryan's leadership team in the House. If Ryan were to announce at this point that he deems Trump unfit for the presidency and therefore cannot support him, he would become the leader of a movement with few followers.
The Republican Party will not be united this fall. In what promises to be a display of cravenness on an epic scale, it will pretend to be.
Another great leaves us.
Lena Horne Dies At 92
Lena Horne, the enchanting jazz singer and actress who reviled the bigotry that allowed her to entertain white audiences but not socialize with them, slowing her rise to Broadway superstardom, has died. She was 92.
Horne died Sunday at NewYork-Presbyterian Hospital, according to hospital spokeswoman Gloria Chin. Chin would not release any other details.
Horne, whose striking beauty and magnetic sex appeal often overshadowed her sultry voice, was remarkably candid about the underlying reason for her success.
"I was unique in that I was a kind of black that white people could accept," she once said. "I was their daydream. I had the worst kind of acceptance because it was never for how great I was or what I contributed. It was because of the way I looked."
In the 1940s, she was one of the first black performers hired to sing with a major white band, the first to play the Copacabana nightclub and among a handful with a Hollywood contract.
In 1943, MGM Studios loaned her to 20th Century-Fox to play the role of Selina Rogers in the all-black movie musical "Stormy Weather." Her rendition of the title song became a major hit and her signature piece.
On screen, on records and in nightclubs and concert halls, Horne was at home vocally with a wide musical range, from blues and jazz to the sophistication of Rodgers and Hart in songs like "The Lady Is a Tramp" and "Bewitched, Bothered and Bewildered."
In her first big Broadway success, as the star of "Jamaica" in 1957, reviewer Richard Watts Jr. called her "one of the incomparable performers of our time." Songwriter Buddy de Sylva dubbed her "the best female singer of songs."
But Horne was perpetually frustrated with the public humiliation of racism.
"I was always battling the system to try to get to be with my people. Finally, I wouldn't work for places that kept us out ... it was a damn fight everywhere I was, every place I worked, in New York, in Hollywood, all over the world," she said in Brian Lanker's book "I Dream a World: Portraits of Black Women Who Changed America."
While at MGM, she starred in the all-black "Cabin in the Sky," in 1943, but in most of her other movies, she appeared only in musical numbers that could be cut in the racially insensitive South without affecting the story. These included "I Dood It," a Red Skelton comedy, "Thousands Cheer" and "Swing Fever," all in 1943; "Broadway Rhythm" in 1944; and "Ziegfeld Follies" in 1946.
"Metro's cowardice deprived the musical of one of the great singing actresses," film historian John Kobal wrote.
Early in her career, Horne cultivated an aloof style out of self-preservation, becoming "a woman the audience can't reach and therefore can't hurt," she once said.
Later she embraced activism, breaking loose as a voice for civil rights and as an artist. In the last decades of her life, she rode a new wave of popularity as a revered icon of American popular music.
Her 1981 one-woman Broadway show, "Lena Horne: The Lady and Her Music," won a special Tony Award. In it, the 64-year-old singer used two renditions -- one straight and the other gut-wrenching -- of "Stormy Weather" to give audiences a glimpse of the spiritual odyssey of her five-decade career.
A sometimes savage critic, John Simon, wrote that she was "ageless. ... tempered like steel, baked like clay, annealed like glass; life has chiseled, burnished, refined her."
On May 9, the IOP welcomed Jon Stewart for a special live taping of "The Axe Files" podcast, hosted by IOP Director David Axelrod. Make sure to subscribe to the podcast at iTunes.com/AxeFiles to listen to this episode and more!
Minnesota Senator Amy Klobuchar (D) talks with Rachel Maddow about the peculiar line of attack Donald Trump has taken on Hillary Clinton, and whether Clinton will have better success running against Trump than his Republican rivals did.
Finally, we learn that what we intuited about Trump's lack of policy is actually, and horrifyingly, true.
Truly, Donald Trump knows nothing. He is more ignorant about policy than you can possibly imagine, even when you take into account the fact that he is more ignorant than you can possibly imagine. But his ignorance isn't as unique as it may seem: In many ways, he's just doing a clumsy job of channeling nonsense widely popular in his party, and to some extent in the chattering classes more generally.
Last week the presumptive Republican presidential nominee -- hard to believe, but there it is -- finally revealed his plan to make America great again. Basically, it involves running the country like a failing casino: he could, he asserted, "make a deal" with creditors that would reduce the debt burden if his outlandish promises of economic growth don't work out.
The reaction from everyone who knows anything about finance or economics was a mix of amazed horror and horrified amazement. One does not casually suggest throwing away America's carefully cultivated reputation as the world's most scrupulous debtor -- a reputation that dates all the way back to Alexander Hamilton.
The Trump solution would, among other things, deprive the world economy of its most crucial safe asset, U.S. debt, at a time when safe assets are already in short supply.
Of course, we can be sure that Mr. Trump knows none of this, and nobody in his entourage is likely to tell him. But before we simply ridicule him -- or, actually, at the same time that we're ridiculing him -- let's ask where his bad ideas really come from.
First of all, Mr. Trump obviously believes that America could easily find itself facing a debt crisis. But why? After all, investors, who are willing to lend to America at incredibly low interest rates, are evidently not worried by our debt. And there's good reason for their calmness: federal interest payments are only 1.3 percent of G.D.P., or 6 percent of total outlays.
These numbers mean both that the burden of the debt is fairly small and that even complete repudiation of that debt would have only a minor impact on the government's cash flow.
So why is Mr. Trump even talking about this subject? Well, one possible answer is that lots of supposedly serious people have been hyping the alleged threat posed by federal debt for years. For example, Paul Ryan, the speaker of the House, has warned repeatedly about a "looming debt crisis." Indeed, until not long ago the whole Beltway elite seemed to be in the grip of Bowles-Simpsonism, with its assertion that debt was the greatest threat facing the nation.
A lot of this debt hysteria was really about trying to bully us into cutting Social Security and Medicare, which is why so many self-proclaimed fiscal hawks were also eager to cut taxes on the rich. But Mr. Trump apparently wasn't in on that particular con, and takes the phony debt scare seriously. Sad!
Still, even if he misunderstands the fiscal situation, how can he imagine that it would be O.K. for America to default? One answer is that he's extrapolating from his own business career, in which he has done very well by running up debts, then walking away from them.
But it's also true that much of the Republican Party shares his insouciance about default. Remember, the party's congressional wing deliberately set about extracting concessions from President Obama, using the threat of gratuitous default via a refusal to raise the debt ceiling.
And quite a few Republican lawmakers defended that strategy of extortion by arguing that default wouldn't be that bad, that even with its access to funds cut off the U.S. government could "prioritize" payments, and that the financial disruption would be no big deal.
Given that history, it's not too hard to understand why candidate Trump thinks not paying debts in full makes sense.
The important thing to realize, then, is that when Mr. Trump talks nonsense, he's usually just offering a bombastic version of a position that's widespread in his party. In fact, it's remarkable how many ridiculous Trumpisms were previously espoused by Mitt Romney in 2012, from his claim that the true unemployment rate vastly exceeds official figures to his claim that he can bring prosperity by starting a trade war with China.
None of this should be taken as an excuse for Mr. Trump. He really is frighteningly uninformed; worse, he doesn't appear to know what he doesn't know. The point, instead, is that his blithe lack of knowledge largely follows from the know-nothing attitudes of the party he now leads.
Oh, and just for the record: No, it's not the same on the other side of the aisle. You may dislike Hillary Clinton, you may disagree sharply with her policies, but she and the people around her do know their facts. Nobody has a monopoly on wisdom, but in this election, one party has largely cornered the market in raw ignorance.
TOKYO -- An artist was found not guilty of obscenity Monday for displaying figurines modeled on her vagina but received a fine for distributing digital data that could be used to make a realistic three-dimensional recreation of her genitalia.
Image: Megumi Igarashi in 2013 Megumi Igarashi paddles in her kayak modeled on her vagina in Tokyo's Tama river on Oct. 19, 2013. Eigo Shimojo / Handout via Reuters, file
A court in Tokyo dismissed prosecutors' charge that Megumi Igarashi, who works under the name "Rokudenashiko" -- or "good-for-nothing girl" -- had displayed obscene objects. It ruled her figurines decorated with fake fur and glitter could be considered "pop art."
However, the judges said the data, from a scan of her own vagina, could be used with a three-dimensional printer to create a realistic shape that could sexually arouse viewers.
Related: Is This Vagina-Inspired Kayak Obscene?
The court found Igarashi guilty of distributing digital data of indecent material and fined her 400,000 yen ($3,700). Prosecutors had sought a fine of more than $7,400.
Igarashi said she was "20 percent happy" that the court acknowledged her figurines as art, but stressed she was "completely innocent."
"I am of course indignant. I will appeal and continue to fight in court," she told a news conference.
Scott is not only repugnant on every level as a human being ... he's also a freaking dolt.
By Steve Benen
Last week, Florida Gov. Rick Scott (R) decided to pick a curious fight with California. In retrospect, he probably should have thought this through a little better.
To briefly recap, the Republican governor announced that he was on his way to the Golden State, where he'd try to convince business leaders to relocate. As part of the pitch, Scott, relying on his economic development organization, launched radio ads touting Florida's lower minimum wage, non-existent state income tax, and weaker regulations.
The response from California Gov. Jerry Brown's (D) press secretary was pretty compelling in its own right, but the governor himself took some time to reply to Rick Scott directly yesterday. The L.A. Times reported:
Brown's letter to Gov. Rick Scott was billed as a plea for the Florida Republican to get engaged on the issue of climate change. But he also made it clear that he sees nothing wrong with California's economic health these days.
"Rick," Brown wrote, "a fact you'd like to ignore: California is the 7th largest economic power in the world. We're competing with nations like Brazil and France, not states like Florida."
As it turns out, Brown didn't just send correspondence to his cross-country critic. The Californian also sent a report from a non-partisan climate initiative: "Come Heat and High Water: Climate Risk in the Southeastern U.S. and Texas."
As Brown seemed eager to note, the report said, "Florida faces more risk than any other state that private, insurable property could be inundated by high tide, storm surge and sea level rise. By 2030 up to $69 billion in coastal property will likely be at risk of inundation at high tide that is not at risk today. By 2050, the value of property below local high tide levels will increase to up to about $152 billion."
The Democratic governor concluded, "So while you're enjoying a stroll on one of California's beautiful beaches this week, don't stick your head in the sand. Take a few minutes to read the rest of this report. There's no time to waste."
As best as I can tell, Scott, a climate denier, has not yet responded. Here's hoping it's because the Florida governor is still reading the report and coming to terms with its findings.
from The Root
Shakur was in the middle of wrapping up a biopic about her son's life.
By: Yesha Callahan
According to the New York Daily News, Afeni Shakur died Monday in Sausalito, Calif., after being rushed to the hospital for a possible cardiac arrest.
In her early days, Shakur was active in the Black Panther movement and gave birth to the future rap legend just one year after being acquitted in a bombing conspiracy.
After Tupac's death in 1996, she took over his estate and vowed to keep his legacy alive. A year later, Afeni Shakur founded the Tupac Amaru Shakur Foundation, which champions youth arts programs and oversaw his unreleased work. Shakur was serving as an executive producer for the upcoming Tupac biopic, All Eyez on Me, with Danai Gurira cast to portray her.
Here's how the wealthiest nation in the world deals with their poorest: