The litter-filled streets of Sunset Park are an unlikely setting for a housing comeback. Flanked by an elevated six-lane highway on one side and a 478-acre cemetery on the other, the Brooklyn neighbourhood has seen house prices climb since the economic downturn.
For that it can thank immigrants such as Liu Song Yan, whose dreams of home ownership are driving the neighbourhood's housing market recovery - an increasingly common pattern across the US.
"It is important for me to own my own home," says 66-year old Mr Liu, who started life in New York as a clothing factory worker after arriving from Taishan, China in 1985. After 20 years of long working days, scrimping and saving to build a cash stash, he put down a deposit on a modest two-storey brick house in Sunset Park, an area that has long been a magnet for working-class immigrants.
"It is about financial security. Even if you have nothing to eat, at least you have a roof over your head. For me, this is the American dream."
A specter is haunting university history departments: the specter of capitalism.
After decades of "history from below," focusing on women, minorities and other marginalized people seizing their destiny, a new generation of scholars is increasingly turning to what, strangely, risked becoming the most marginalized group of all: the bosses, bankers and brokers who run the economy.
Even before the financial crisis, courses in "the history of capitalism" -- as the new discipline bills itself -- began proliferating on campuses, along with dissertations on once deeply unsexy topics like insurance, banking and regulation. [...]
The new history of capitalism is less a movement than what proponents call a "cohort": a loosely linked group of scholars who came of age after the end of the cold war cleared some ideological ground, inspired by work that came before but unbeholden to the questions -- like, why didn't socialism take root in America? -- that animated previous generations of labor historians.
Instead of searching for working-class radicalism, they looked at office clerks and entrepreneurs.
"Earlier, a lot of these topics would've been greeted with a yawn," said Stephen Mihm, an associate professor of history at the University of Georgia and the author of "A Nation of Counterfeiters: Capitalists, Con Men and the Making of the United States." "But then the crisis hit, and people started asking, 'Oh my God, what has Wall Street been doing for the last 100 years?' "
In 1996, when the Harvard historian Sven Beckert proposed an undergraduate seminar called the History of American Capitalism -- the first of its kind, he believes -- colleagues were skeptical. "They thought no one would be interested," he said.
But the seminar drew nearly 100 applicants for 15 spots and grew into one of the biggest lecture courses at Harvard, which in 2008 created a full-fledged Program on the Study of U.S. Capitalism. That initiative led to similar ones on other campuses, as courses and programs at Princeton, Brown, Georgia, the New School, the University of Wisconsin and elsewhere also began drawing crowds -- sometimes with the help of canny brand management.
After Seth Rockman, an associate professor of history at Brown, changed the name of his course from Capitalism, Slavery and the Economy of Early America to simply Capitalism, students concentrating in economics and international relations started showing up alongside the student labor activists and development studies people.
...but it's been decades since the students had any interest in the nonsense radicals want to teach.
Ultimately the question of growth revolves around the preferences of consumers. Despite predictions that the rise of singles, an aging population and the changing preferences of millennials will create a glut of 22 million unwanted large-lot homes by 2025, it seems more likely that three critical groups will fuel demand for more suburban housing.
Between 2000 and 2011, there was a net increase of 9.3 million in the foreign-born population, largely from Asia and Latin America, with these newcomers accounting for about two out of every five new residents of the nation's 51 largest metropolitan areas. And these immigrants show a growing preference for more "suburbanized" cities such as Nashville, Charlotte, Houston and Dallas-Fort Worth. An analysis of census data shows only New York--with nearly four times the population--drew (barely) more foreign-born arrivals over the past decade than sprawling Houston. Overwhelmingly suburban Riverside-San Bernardino expanded its immigrant population by nearly three times as many people as the much larger and denser Los Angeles-Orange County metropolitan area.
Clearly, immigrants aren't looking for the density and crowding of Mexico City, Seoul, Shanghai, or Mumbai. Since 2000, about two-thirds of Hispanic household growth was in detached housing. The share of Asian arrivals in detached housing is up 20 percent over the same span. Nearly half of all Hispanics and Asians now live in single-family homes, even in traditionally urban places like New York City, according to the census's American Community Survey.
Nowhere are these changes more marked than among Asians, who now make up the nation's largest wave of new immigrants. Over the last decade, the Asian population in suburbs grew by about 2.8 million, or 53 percent, while that of core cities grew by 770,000, or 28 percent.
Aging boomers, too, continue to show a preference for space, despite the persistent urban legend that they will migrate back to the core city. Again, the numbers tell a very different story.
A National Association of Realtors survey last year of buyers over 65 found that the vast majority looked for suburban homes. Of the remaining seniors, only one in 10 looked for a place in the city--less than the share that wanted a rural home. When demographer Wendell Cox examined the cohort that was 54 to 65 in 2000 to see where they were a decade later, the share that lived in the suburbs was stable, while many had left the city--the real growth was people moving to the countryside. Within metropolitan areas, more than 99 percent of the increase in population among people aged 65 and over between 2000 and 2010 was in low-density counties with less than 2,500 people per square mile.
With the over-65 population expected to double by 2050, making it by far America's fastest-growing age group, they appear poised to be a significant source of demand for suburban housing.
But arguably the most critical element to future housing demand is the rising millennial generation. It has been widely asserted by retro-urbanists that young people prefer urban living. Urban theorists such as Peter Katz have maintained that millennials (the generation born after 1983) have little interest in "returning to the cul-de-sacs of their teenage years."
To bolster their assertions, retro-urbanists point to stated-preference research showing that more than three quarters of millennials say they "want to live in urban cores." But looking at where millennials actually live now--and where they see themselves living in the future--tells a very different story. In the nation's major metropolitan areas, only 8 percent of residents aged 20 to 24 (the only millennial adult age group for which census data is available) live in the highest-density counties--and that share has declined from a decade earlier. What's more, 43 percent of millennials describe the suburbs as their "ideal place to live"--a greater share than their older peers--and 82 percent of adult millennials say it's "important" to them to have an opportunity to own their home.
And, of course, as people get older and take on commitments and start families, they tend to look for more settled, and less dense, environments. A 2009 Pew study found that 45 percent of Americans 18 to 34 would like to live in New York City, compared with just 14 percent of those over 35. As about 7 million more millennials--a group that Pew surveys show desires children and places a premium on being good parents--hit their 30s by 2020, expect their remaining attachment to the city to wane.
This family connection has always eluded the retro-urbanists. "Suburbs," Jane Jacobs once wrote, "must be difficult places to raise children." Yet suburbs have served for three generations now as the nation's nurseries. Jacobs's treatment of the old core city--particularly her Greenwich Village in the early 1960s--lovingly portrayed these places as they once were, characterized by class, age, and some ethnic diversity along with strong parental networks, often based on ethnic solidarity.
To say the least, this is not what characterizes Greenwich Village or Manhattan today. In fact, many of the most vibrant and high-priced urban cores--including Manhattan, San Francisco, Chicago, and Seattle--have remarkably few children living there. Certainly, the 300-square-foot "micro-units" now all the rage among the retro-urbanist set seem unlikely to attract more families, or even married couples.
The United States has twice as much oil and three times as much natural gas as previously thought, stored deep under the states of North Dakota, South Dakota, and Montana, according to new data the Obama administration released Tuesday.
In announcing the new data in a conference call, Interior Secretary Sally Jewell also said the administration will release within weeks draft rules to regulate hydraulic fracturing, technology that has come under scrutiny for its environmental impact but that is essential to developing all of this energy.
"These world-class formations contain even more energy-resource potential than previously understood, which is important information as we continue to reduce our nation's dependence on foreign sources of oil," Jewell said in a statement.
The formations, called Bakken and Three Forks, span much of western North Dakota, the northern tip of South Dakota and the northeastern tip of Montana. The last time the United States Geological Survey assessed this area for its oil and gas reserves was in 2008. But that assessment did not include the Three Forks formation, which explains the substantial increase in the estimates. USGS estimates that these two formations together hold 7.4 billion barrels of undiscovered--but technically recoverable--oil and 6.7 trillion cubic feet of natural gas.
A president is in trouble when he's forced to defend his relevancy, as Bill Clinton did 18 years ago, or to quote Mark Twain, as Barack Obama did Tuesday. "Rumors of my demise," he said at a news conference, "may be a little exaggerated at this point."
Not wrong--just "exaggerated." Not forever--just "at this point."
Parsing aside, Obama channeled Clinton's April 18, 1995, news conference by projecting a sense of helplessness--or even haplessness--against forces seemingly out of a president's control.
...at least he was never relevant in the first place, so it's not like he's lost anything. Just as Bill Clinton's only historical legacies are the trade bills he inherited from GHWB and the Welfare Reform he was handed by Newt Gingrich, so too is the Obama legacy going to be the continuation of W's Middle East liberalization and passage of the Heritage Foundation's health insurance mandate.
The Treasury Department said that it expects to retire a net $35 billion in bonds, notes and bills from April to the end of June. That compares with its estimate from earlier this year that it would rack up an additional $103 billion in marketable debt in the second quarter.
"The paydown this quarter, the first since 2007 is emblematic of the turn in budget finances from horrible, to grim on their way to steadily better," Eric Green, global head of research at TD Securities, said in a note.
Schizophrenic. Killer. My Cousin. : It's insanity to kill your father with a kitchen knife. It's also insanity to close hospitals, fire therapists, and leave families to face mental illness on their own. (Mac McClelland | May/June 2013, Mother Jones)
Psychiatrist E. Fuller Torrey calls a crime like Houston's "a predictable tragedy." That's what he has also called the Gabrielle Giffords shooting; he says the same thing about the Virginia Tech massacre, the Aurora movie theater shooting, the Sandy Hook Elementary shooting, and dozens of other recent homicides, some of them famous mass killings or subway platform shovings, but many of them less publicized. Ten percent of US homicides, he estimates based on an analysis of the relevant studies, are committed by the untreated severely mentally ill--like my schizophrenic cousin. And, he says: "I'm thinking that's a conservative estimate."
Saying that the severely mentally ill are disproportionately responsible for homicides has made Torrey, author of The Insanity Offense and the forthcoming American Psychosis, unpopular in some circles. "[My critics'] argument is you can't talk about these things because it causes stigma," he says. In the aftermath of the Newtown tragedy, some mental-illness advocates insisted that even if Adam Lanza had Asperger's or any mental-health issues, it would be totally inappropriate to cite that as a factor in his actions. But other administrators and caretakers think it's vital to bring up. "We have to think about mental-health care in a public health framework," says Dee Roth, who is on the National Advisory Council of the federal Substance Abuse and Mental Health Services Administration (SAMHSA). "Public health measures solved rickets, cholera, people dying when they're 30." But when it comes to mental illness, she says, "we're not treating the sick people." And while the details of Lanza's diagnosis or any attempts to treat it remain unconfirmed, what is known, as Torrey pointed out in a piece he coauthored in the Wall Street Journal, is that Connecticut is "among the worst states to seek such treatment. It has among the weakest involuntary treatment laws and is one of only six states that doesn't have a law permitting court-ordered 'assisted outpatient treatment,'" which, Torrey notes, "has been shown to decrease re-hospitalizations, incarcerations and, most importantly, episodes of violence among severely mentally ill individuals." Although even Torrey, who founded the Treatment Advocacy Center, an organization that pushes for fewer restrictions on involuntary commitment, admits that such measures would hardly plug all the holes in our mental-health-care system.
For three decades, we have debated what causes homelessness and how to deal with it. Is homelessness a mental health problem? A substance abuse problem? A problem caused by gentrification and urban redevelopment? Or something else again?
The Bush administration substituted a much simpler idea -- an idea that happened to work. Whatever the cause of homelessness, the solution is ... a home.
In 2002, Bush appointed a new national homeless policy czar, Philip Mangano. A former music agent imbued with the religious philosophy of St. Francis of Assisi, Mangano was seized by an idea pioneered by New York University psychiatrist Sam Tsemberis: "housing first."
The "housing first" concept urges authorities to concentrate resources on the hardest cases -- to move them into housing immediately -- and only to worry about the other problems of the homeless after they first have a roof over their heads. A 2004 profile in The Atlantic nicely summarized Tsemberis' ideas: "Offer them (the homeless) the apartment first, he believes, and you don't need to spend years, and service dollars, winning their trust."
We conservatives are all for a world that's benefited from both premodern and modern experiences, although we don't think that there's anything historically inevitable or even likely about the emergence of such a world. A genuinely postmodern world avoids the spiritual and aristocratic excesses of the medieval world and the material and democratic excesses of the modern world. It's a place where human beings can flourish as material and spiritual beings or, more precisely, as whole persons. We think that true human progress is personal and relational. It takes place over the course of particular human lives in the direction of living responsibly in light of the truth.
For this understanding of postmodernism, I refer you to the work of the great anticommunist dissidents Aleksandr Solzhenitsyn and Vaclav Havel, as well as to the American philosopher-novelist Walker Percy. For a genuinely postmodern thinker, a conservative criticism of the excessively technological orientation of the contemporary West doesn't mean a rejection of what we've learned that's true about our freedom and our productive capabilities from modern developments. It does mean acknowledging that our mistaken identification of progress with techno-productivity has been at the expense of who we are as relational and purposeful beings.
...that we no longer need to focus on material things, but can focus on matters of the soul.
[N]ew estimates from the Environmental Protection Agency show that the leakage of methane - a greenhouse gas - from wells, pipelines and other infrastructure is much lower than previously believed, thanks in large part to better pollution controls implemented by the industry itself, according to the Associated Press. Recently released EPA estimates of methane emissions between 1990 and 2010 are 20 percent less than previous estimates, the AP reported, even as natural gas production has grown by almost 40 percent during the same period.
AT ABORTION clinics, the presence of awnings, the width of doorways and the dimensions of janitorial closets have little to do with the health of patients. But by requiring that Virginia's 20 abortion clinics conform to strict licensing standards designed for new hospitals, the state has ensured that many or most of them will be driven out of business in the coming months.
So if women's health isn't important enough for abortion clinics to meet health standards, then what was wrong with back alleys in the first place?
It was David Hume, in the 18th century, who showed how to bring scepticism back to life. The first step is to keep in mind what Hume called the "strange infirmities" of human understanding, and the "universal perplexity and confusion, which is inherent in human nature". Armed with this knowledge--for our ignorance is the one thing of which we can be certain--we should be sure to exercise the "degree of doubt, and caution, and modesty, which, in all kinds of scrutiny and decision, ought for ever to accompany a just reasoner". Apart from anything else, this would help to cure people of their "haughtiness and obstinacy".
To Walk Together... (Francis Cardinal George, O.M.I., April 28 - May 11, 2013, Catholic New World)
When I was Bishop of Yakima, Wash., a small diocese in the Northwest with an economy dependent on harvesting fruits and vegetables, many workers would regularly come up from Mexico, work most of the year in Washington State and then return to Mexico for a month or so. The borders were porous, for we needed their labor on the farms and in construction. We failed to enforce our own laws. Those who are here now are our neighbors and friends, and our economy still depends upon many of them. If they are forced to leave, they will not be able to return for 10 years, their families will be abandoned and all of us will suffer.
Bipartisan legislation introduced last week in Congress by eight Senators reflects many of the provisions that the bishops have been asking for these past 15 years. [...]
No law is perfect, but this is a good start. It restores due process protections and weeds out criminals, for there are dishonest people born elsewhere as there are dishonest people born here. It provides a path to citizenship and supports families and children. It will help us to live and walk together more justly and peacefully.
South Sudanese today are thinking more about another U.S. president: that would be Obama's predecessor, Bush 43. As a liberal Democrat and Obama supporter, I was particularly struck by this. Yes, Bush is a hero in Africa, and Americans, too, should know why.
No American president, before or since, has had Bush's vision and determination to save so many millions of lives.
For Africans, that vision traces back to the early years of his presidency. In his 2003 State of the Union Address, Bush introduced the "President's Emergency Plan for AIDS Relief" (PEPFAR).
And that proposal had real meat: $15 billion over five years, as well as a serious look at African health problems, beyond HIV/AIDS.
Bush proposed it, and his proposal wasn't just a few throw-away lines in a speech; even as the Iraq war raged, Bush spent precious political capital to get PEPFAR enacted.
The result was the largest upfront contribution ever made by any country to fight HIV. And the numbers are staggering.
Five million children, women and men have received antiretroviral treatment under PEPFAR. In 2010 alone, 600,000 pregnant mothers received treatment so their newborn children would not be infected.
Yes, millions of people live productive, healthy lives due to Bush 43.
1. Put frozen chicken, wing sauce, and ranch dip mix into Crock-Pot. Cook on low for 6-7 hours.
2. Using two forks, shred chicken, then return to Crock-Pot (you can shred it while it's still in the pot too, as long as you don't plunge your hands into the hot lava). Add butter. Cook on low for an additional hour.
3. Serve on toasted deli rolls with pepper jack cheese or bleu cheese dressing (if desired).
IN the spring of 2010, a new patient came to see me to find out if he had attention-deficit hyperactivity disorder. He had all the classic symptoms: procrastination, forgetfulness, a propensity to lose things and, of course, the inability to pay attention consistently. But one thing was unusual. His symptoms had started only two years earlier, when he was 31.
Though I treat a lot of adults for attention-deficit hyperactivity disorder, the presentation of this case was a violation of an important diagnostic criterion: symptoms must date back to childhood. It turned out he first started having these problems the month he began his most recent job, one that required him to rise at 5 a.m., despite the fact that he was a night owl.
The patient didn't have A.D.H.D., I realized, but a chronic sleep deficit. I suggested some techniques to help him fall asleep at night, like relaxing for 90 minutes before getting in bed at 10 p.m. If necessary, he could take a small amount of melatonin. When he returned to see me two weeks later, his symptoms were almost gone. I suggested he call if they recurred. I never heard from him again.
Many theories are thrown around to explain the rise in the diagnosis and treatment of A.D.H.D. in children and adults. According to the Centers for Disease Control and Prevention, 11 percent of school-age children have now received a diagnosis of the condition. I don't doubt that many people do, in fact, have A.D.H.D.; I regularly diagnose and treat it in adults. But what if a substantial proportion of cases are really sleep disorders in disguise?
For some people -- especially children -- sleep deprivation does not necessarily cause lethargy; instead they become hyperactive and unfocused. Researchers and reporters are increasingly seeing connections between dysfunctional sleep and what looks like A.D.H.D., but those links are taking a long time to be understood by parents and doctors.
What If We Never Run Out of Oil? : New technology and a little-known energy source suggest that fossil fuels may not be finite. This would be a miracle--and a nightmare. (Charles C. Mann | April 24, 2013, The Atlantic)
In the 1970s, geologists discovered crystalline natural gas--methane hydrate, in the jargon--beneath the seafloor. Stored mostly in broad, shallow layers on continental margins, methane hydrate exists in immense quantities; by some estimates, it is twice as abundant as all other fossil fuels combined. Despite its plenitude, gas hydrate was long subject to petroleum-industry skepticism. These deposits--water molecules laced into frigid cages that trap "guest molecules" of natural gas--are strikingly unlike conventional energy reserves. Ice you can set on fire! Who could take it seriously? But as petroleum prices soared, undersea-drilling technology improved, and geological surveys accumulated, interest rose around the world. The U.S. Department of Energy has been funding a methane-hydrate research program since 1982.
Nowhere has the interest been more serious than Japan. Unlike Britain and the United States, the Japanese failed to become "the owners, or at any rate, the controllers" of any significant amount of oil. (Not that Tokyo didn't try: it bombed Pearl Harbor mainly to prevent the U.S. from blocking its attempted conquest of the oil-rich Dutch East Indies.) Today, Churchill's nightmare has come true for Japan: it is a military and industrial power almost wholly dependent on foreign energy. It is the world's third-biggest net importer of crude oil, the second-biggest importer of coal, and the biggest importer of liquefied natural gas. Not once has a Japanese politician expressed happiness at this state of affairs.
Japan's methane-hydrate program began in 1995. Its scientists quickly focused on the Nankai Trough, about 200 miles southwest of Tokyo, an undersea earthquake zone where two pieces of the Earth's crust jostle each other. Step by step, year by year, a state-owned enterprise now called the Japan Oil, Gas, and Metals National Corporation (JOGMEC) dug test wells, made measurements, and obtained samples of the hydrate deposits: 130-foot layers of sand and silt, loosely held together by methane-rich ice. The work was careful, slow, orderly, painstakingly analytical--the kind of process that seems intended to snuff out excited newspaper headlines. But it progressed with the same remorselessness that in the 1960s and '70s had transformed offshore oil wells from Waterworld-style exoticisms to mainstays of the world economy.
In January, 18 years after the Japanese program began, the Chikyu left the Port of Shimizu, midway up the main island's eastern coastline, to begin a "production" test--an attempt to harvest usefully large volumes of gas, rather than laboratory samples. Many questions remained to be answered, the project director, Koji Yamamoto, told me before the launch. JOGMEC hadn't figured out the best way to mine hydrate, or how to ship the resultant natural gas to shore. Costs needed to be brought down. "It will not be ready for 10 years," Yamamoto said. "But I believe it will be ready." What would happen then, he allowed, would be "interesting."
Already the petroleum industry has been convulsed by hydraulic fracturing, or "fracking"--a technique for shooting water mixed with sand and chemicals into rock, splitting it open, and releasing previously inaccessible oil, referred to as "tight oil." Still more important, fracking releases natural gas, which, when yielded from shale, is known as shale gas. (Petroleum is a grab-bag term for all nonsolid hydrocarbon resources--oil of various types, natural gas, propane, oil precursors, and so on--that companies draw from beneath the Earth's surface. The stuff that catches fire around stove burners is known by a more precise term, natural gas, referring to methane, a colorless, odorless gas that has the same chemical makeup no matter what the source--ordinary petroleum wells, shale beds, or methane hydrate.) Fracking has been attacked as an environmental menace to underground water supplies, and may eventually be greatly restricted. But it has also unleashed so much petroleum in North America that the International Energy Agency, a Paris-based consortium of energy-consuming nations, predicted in November that by 2035, the United States will become "all but self-sufficient in net terms." If the Chikyu researchers are successful, methane hydrate could have similar effects in Japan. And not just in Japan: China, India, Korea, Taiwan, and Norway are looking to unlock these crystal cages, as are Canada and the United States.
Not everyone thinks JOGMEC will succeed. But methane hydrate is being developed in much the same methodical way that shale gas was developed before it, except by a bigger, more international group of researchers. Shale gas, too, was subject to skepticism wide and loud. The egg on naysayers' faces suggests that it would be foolish to ignore the prospects for methane hydrate--and more foolish still not to consider the potential consequences.
With the end of the Cold War, however, the Marxian element started to subside. The radicals started to get less radical, and the newcomers weren't very radical at all. "I think a number of my colleagues on the left end of the department were taken with the idea that the great twentieth-century battle between capitalism and socialism had ended. Capitalism had won and socialism had lost," Wolff remembers. "So that project struck them as suddenly out of date or even anachronistic. It wasn't relevant anymore."
"The younger economists were much more empirically oriented, a bit more policy oriented," Epstein says. "We're a much more diverse group now." [...]
"There was now, in a sense, two economics departments. There's the regular one and PERI," Wolff says. "You might want to call them left Keynesians, but the Keynesianism is the theoretical frame. Marxism, for sure, is not." Pollin insists he'd be more than happy to hire Marxists; it's just that economics departments don't churn them out anymore.
To be sure, some of the economists in the department today wouldn't be out of place working as applied macroeconomists anywhere else. But they still tend to focus on issues that other departments neglect.
"If the ObamaCare exchanges are good enough for the hardworking Americans and small businesses the law claims to help, then they should be good enough for the president, vice president, Congress, and federal employees," said Camp's spokeswoman in a statement.
The political principle is straightforward, but it would come at a price. Putting all federal employees on the exchanges would obliterate the most market-oriented insurance program run by the government, the Federal Employee Health Benefits Program, or FEHBP. Indeed, the FEHBP has long been considered a model for market-based reform of the Medicare and Medicaid programs.
Nothing says moonbat quite like trying to kill your own policy preferences in order to score partisan points.
"The conference will unite around tax reform," said House Majority Whip Kevin McCarthy (R-Calif.), who hosted the first "listening session" on the issue Thursday in his first-floor Capitol office. "The window is now."
House Ways and Means Committee Chairman Dave Camp (R-Mich.) led the session, offering polling and focus-group data showing that voters are hungry for simpler, fairer tax laws. Camp has started drafting legislation that would wipe out the current welter of exemptions and deductions and replace them with sharply lower rates, an approach championed by Erskine Bowles and Alan Simpson, co-chairmen of Obama's fiscal commission.
"We're not going to take the current code and see what comes out. We're going to take a blank piece of paper and see what goes back in," said Camp, who advocates a streamlined code with just two brackets and a top rate of 25 percent.
...and trying to explain why it's good policy to punish income?
Even Lady Thatcher's government did not attempt the root-and-branch welfare reform that Mr Duncan Smith is introducing. He is, in many people's eyes, the ideal man for the job: a former Conservative leader who was effectively ditched by his own party in 2003 without even fighting a general election, but who found political redemption through a drive to fight poverty and reform the welfare system which began when he toured the Easterhouse estate in Glasgow more than a decade ago.
It is virtually impossible to imagine him doing any other government job than his current post. "I don't have any particular ambitions...I see [politics] as a vocation, not a career," he says.
His work on welfare reform is clearly underpinned by a sense of morality which must surely derive from his religious beliefs (he is sometimes cited as the Cabinet's most senior Roman Catholic). [...]
Tomorrow sees the launch of pilot projects (the DWP prefers calling them "pathways") of Universal Credit, the single welfare payment that will, in time, wrap up all the main working-age benefits. It is the flagship change, which has seen Mr Duncan Smith come under fire for imposing a gradualist approach, amid predictions it will be a major failure because of IT problems.
He is unapologetic about moving slowly, claiming that Labour's big-bang approach to introducing its own major welfare change, tax credits, made the system a "disaster" which had to be relaunched several times and which was crippled by fraud and error.
The "big cultural change" by contrast under Universal Credit is that people have to sign a "claim of commitment" under which they will pledge to make themselves available for work, search for a job, take interviews, take the first job that becomes available and "work hard." If they fail to do this, they can lose their benefits for up to three years on a sliding scale.
Mr Duncan Smith says: "People will know from day one, for the first time ever, what's expected of them. They'll have a sheet of paper which is their contract.... We want to say to people, you're claiming unemployment benefit but you're actually in work paid for by the state: you're in work to find work. That's your job from now on: to find work."
As Universal Credit rolls out, other work goes on, of course. Mr Duncan Smith is charged with finding an extra £6.5 billion of savings from his department's budget for the year 2015-16, most of which will be after the next general election. Cabinet colleagues, themselves under pressure to make substantial savings, have publicly claimed welfare should be pruned even more to spare cuts to the military, the police or the justice system.
"Instead of always talking about it, it would be nice if occasionally they'd talk about what we've actually achieved," Mr Duncan Smith says, in a grumble at militant ministers whose numbers include Theresa May, the Home Secretary, Chris Grayling, the Justice Secretary and Philip Hammond, the Defence Secretary.
One battle which it appears Mr Duncan Smith might have lost is his fight to end the "universal benefits" regime which sees a range of payments - such as the winter fuel allowance, free bus passes and TV licences - made to all pensioners, no matter how well off they are. In the past he has described this as an "anomaly" and a "matter for debate" and has suggested they could be up for review when the Conservatives write their manifesto for the 2015 general election.
Elton's new BBC One sitcom, The Wright Way, has suffered some of the worst reviews of any television comedy in decades. It has been called "dated", "desperate", "laboured" and "groan-inducing". The social media reaction has been even harsher, and from within the chastened Beeb-hive comes the familiar buzz - when things go wrong - that many of the Corporation's creative types didn't really want the show anyway.
It was too old-fashioned, too middle-Englandy, too predictable, and its premise of exposing the follies of a local council's health and safety department, run by a clipboard-carrying despot called Mr Wright (David Haig), played too readily to the prejudices of curtain-twitching suburbanites.
In all of this, one theme was unmissable: whatever happened to Ben Elton, the Tory-bashing, mockney-spouting, stand-up radical who helped transform British comedy in the Eighties? Most of the people who today run, and for that matter, write about, television grew up thinking of Ben as a prime adornment of the anti-Thatch resistance, and believed that his routines would have the masses laughing their way out of their chains.
When it turned out that Elton wasn't quite as seditious as the comrades had hoped, and, by his own admission, was slightly bewildered to find himself being cast as any kind of political figurehead, his reputation in right-on circles began to wither. The process was accelerated by the discovery that he came from a well-to-do family, the son of a prominent physicist, had been educated at a top Surrey grammar school, and that his street punk London accent had been adopted in the hope that comedy circuit audiences would find it funnier.
Many in the artsy-liberal crowd wondered if they hadn't been conned. The accusations of outright treachery would come later. "Politically," shrugged Elton, "I'm kind of a pub bore going on about what's wrong with the world. It's just that I have a bigger pub." How could the man who created The Young Ones, the most anarchic, insubordinate show on television, do this to his fans?
As it happened, he only became truly funny when he ditched all the tribune-of-the-people schtick and turned his mind to comedy that might appeal to a wider audience.
That's the simple reality of Friday's vote to ease the pain for the Federal Aviation Administration. By assenting to it, Democrats have agreed to sequestration for the foreseeable future.
Recall the Democrats' original theory of the case: Sequestration was supposed to be so threatening that Republicans would agree to a budget deal that included tax increases rather than permit it to happen. That theory was wrong. The follow-up theory was that the actual pain caused by sequestration would be so great that it would, in a matter of months, push the two sides to agree to a deal. Democrats just proved that theory wrong, too.
President Obama has a less than stellar reputation with corporate America, but he gave businesses a sweetheart tax deal that tacked several billions of dollars onto the national debt.
As part of his 2009 stimulus package, the president tried to jumpstart the economy for part of 2010 and 2011 by letting companies deduct the full cost of new computers, machinery and office equipment--what accountants call "bonus depreciation." The plan halved the size of the deduction for 2012 and 2013, even though the results appear to be lackluster.
Icelandic voters have dumped the Social Democrats from power, returning a centre-right government that ruled over the country's disastrous financial collapse just five years ago.
Once a European financial hub, the windswept north Atlantic island has been limping along for years, still crippled from a crash that brought it to its knees in just a matter of days.
"We are offering a different road, a road to growth, protecting social security, better welfare and job creation," said Bjarni Benediktsson, the favourite to become the next prime minister after his Independence party took first place in the vote, as the results were coming in.
"What we won't compromise about is cutting taxes and lifting the living standards of people," said Benediktsson, 43, a former professional footballer.
Why are we still talking as if it were 1982? Can't everybody see how utterly pointless it is to frame every policy debate in terms of Right and Left as if the ideological purity and fervour of those polar positions were still a living political reality? The most pernicious aspect of this simple-minded formula is that it forces every debate on to tram lines. A policy or a position will not be judged by whether it is socially constructive, fair-minded, compassionate, or economically productive. It is only discussed as a move in an outdated game in which nobody any longer has a clear idea of the rules. Encouraging social mobility can be either Left-wing or Right-wing depending on whether you sell it wrapped in the packaging of Labour egalitarianism or Thatcherite individual aspiration. So how do we judge the Tory education reforms? The fact that they are hated by the teaching unions suggests that they are Right-wing, but what if their effect is to enable more working-class children to do well at school? Isn't that a Left-wing goal? Either you want more disadvantaged children to succeed or you don't. We will soon have pretty clear empirical evidence of whether the new system produces that result. If it does, where will that leave the relevance of Right vs Left in this discussion?
Everyone presumably wants to promote the fulfilment of individual potential and to encourage social cooperation, but any measure that is suggested for accomplishing these things that is thought to emanate from the wrong source - or to spring from the wrong political sympathies - may be condemned out of hand before it has even got its boots on. We are getting nowhere with this silly, reductionist vocabulary. In fact, the most interesting and worthwhile policy initiatives of the moment achieve the miraculous feat of appealing to the Right (which effectively means most real people) while being so truly progressive that they are almost impossible for the Left to attack.
...is a function of the absence of any meaningful policy differences? Having arrived at the Third Way consensus, the parties of the First and Second Way can only distinguish themselves by partisan opposition to even their own ideas if proposed by the other, as with the health mandate.
[L]ast quarter I had an intriguing thought while preparing my Game Theory lectures. Tests are really just measures of how the Education Game is proceeding. Professors test to measure their success at teaching, and students take tests in order to get a good grade. Might these goals be maximized simultaneously? What if I let the students write their own rules for the test-taking game? Allow them to do everything we would normally call cheating?
A week before the test, I told my class that the Game Theory exam would be insanely hard--far harder than any that had established my rep as a hard prof. But as recompense, for this one time only, students could cheat. They could bring and use anything or anyone they liked, including animal behavior experts. (Richard Dawkins in town? Bring him!) They could surf the Web. They could talk to each other or call friends who'd taken the course before. They could offer me bribes. (I wouldn't take them, but neither would I report it to the dean.) Only violations of state or federal criminal law such as kidnapping my dog, blackmail, or threats of violence were out of bounds.
Gasps filled the room. The students sputtered. They fretted. This must be a joke. I couldn't possibly mean it. What, they asked, is the catch?
"None," I replied. "You are UCLA students. The brightest of the bright. Let's see what you can accomplish when you have no restrictions and the only thing that matters is getting the best answer possible."
Once the shock wore off, they got sophisticated. In discussion section, they speculated, organized, and plotted. What would be the test's payoff matrix? Would cooperation be rewarded or counter-productive? Would a large group work better, or smaller subgroups with specified tasks? What about "scroungers" who didn't study but were planning to parasitize everyone else's hard work? How much reciprocity would be demanded in order to share benefits? Was the test going to play out like a dog-eat-dog Hunger Games? In short, the students spent the entire week living Game Theory. It transformed a class where many did not even speak to each other into a coherent whole focused on a single task--beating their crazy professor's nefarious scheme.
On the day of the hour-long test they faced a single question: "If evolution through natural selection is a game, what are the players, teams, rules, objectives, and outcomes?" One student immediately ran to the chalkboard, and she began to organize the outputs for each question section. The class divided tasks. They debated. They worked on hypotheses. Weak ones were rejected, promising ones were developed. Supportive evidence was added. A schedule was established for writing the consensus answers. (I remained in the room, hoping someone would ask me for my answers, because I had several enigmatic clues to divulge. But nobody thought that far afield!) As the test progressed, the majority (whom I shall call the "Mob") decided to share one set of answers. Individuals within the Mob took turns writing paragraphs, and they all signed an author sheet to share the common grade. Three out of the 27 students opted out (I'll call them the "Lone Wolves"). Although the Wolves listened and contributed to discussions, they preferred their individual variants over the Mob's joint answer.
In the end, the students learned what social insects like ants and termites have known for hundreds of millions of years. To win at some games, cooperation is better than competition. Unity that arises through a diversity of opinion is stronger than any solitary competitor.
What the rest of us learned is that what remains of evolution theory is entirely dependent on intelligent decision making.
On the other hand, what we should take away from this is the massive structural flaw it illustrates in our entire educational system. We continue to teach and test as if every student were Robinson Crusoe or Jeremiah Johnson and would grow up to be completely isolated from any source of knowledge outside his own brain. This has obviously been outdated for quite some time, but never more so than today, when the student's phone provides access to everything humankind knows. A good education would prepare students to act as these did, to arrive at correct answers collaboratively.
To see the truth of this we can offer a simple thought experiment: you suffer from a complicated medical condition. You are offered two choices: (1) you can go to the most renowned physician in the field, Dr. Lone Wolf, graduated first in his class at Harvard, Chairman of the Department at the Mayo Clinic, blah, blah, blah, and he will make a decision about what is wrong with you and the course of treatment you will receive on the basis of his memory; or (2) you can go to a team of recent graduates, the Mob, of a middling medical school who will base their decision on a review of current literature, consultations with other doctors, etc. It ought not even be called a choice.
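For the quantitatively inclined, here is a minimal sketch of the same point--why the Mob beats the Lone Wolf when knowledge is patchy and pooling is allowed. The numbers are entirely made up (not from the actual exam or any real study); it simply assumes each student knows a random 40 percent of the material and compares one student's coverage against a pooled answer.

```python
import random

# Toy model (made-up numbers): the test covers 100 topics and each
# student independently "knows" each topic with probability 0.4.
# The Mob pools what its members know; a Lone Wolf works alone.
TOPICS = 100
P_KNOW = 0.4
MOB_SIZE = 24   # 24 of the 27 students joined the Mob

def student_knowledge():
    """Topics a single student happens to know."""
    return {t for t in range(TOPICS) if random.random() < P_KNOW}

def trial():
    lone_wolf = student_knowledge()
    mob = set().union(*(student_knowledge() for _ in range(MOB_SIZE)))
    return len(lone_wolf), len(mob)

random.seed(0)
results = [trial() for _ in range(1_000)]
avg_wolf = sum(w for w, _ in results) / len(results)
avg_mob = sum(m for _, m in results) / len(results)
print(f"Lone Wolf covers ~{avg_wolf:.0f}/100 topics; the Mob covers ~{avg_mob:.0f}/100")
```

Under those assumptions the Wolf tops out around 40 topics while the Mob's pooled answer approaches full coverage--the cooperation-beats-competition payoff in miniature.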
It's no mystery why markets and politicians track the GDP figures so zealously. Compiled by 2,000 economists and statisticians at the Bureau of Economic Analysis (BEA), GDP pulls together everything they can measure about how much American households and industries earn, consume and invest, and for what purposes. But earlier in the week, the BEA also tacitly acknowledged that its current GDP measure lags pretty badly behind the actual economy: it released a set of changes in how it will calculate GDP, which will take effect in the second-quarter report to be released on July 31. The main point of those changes is to take better account of the economic value of ideas and intangible assets. Few among us today question the notion that new ideas can have great economic value. And some 15 years ago -- long before smart phones, tablets and protein-based medications -- the BEA started to study how to revise the GDP measure to better capture that value. Finally, this week, the Bureau announced that starting soon, and for the first time, when a company undertakes research and development or creates a new book, music, or movie, BEA will count those costs as investments that add to GDP, rather than as ordinary business expenses, which do not.
BEA also will apply these changes to its GDP numbers going back more than a half-century. In the present, the revisions, in an instant, will add some $400 billion to the official accounting of the economy's current total product. Voila! Business profits also will look larger -- now and for the past half-century -- because ordinary business expenses reduce reported profits, while investments do not. Most important, the revisions tell us that American businesses and government, together, now invest just 2.1 percent of GDP in R&D -- that's less investment than in the 1990s, especially by businesses.
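To make the mechanics concrete, here is a toy sketch with invented numbers for a single hypothetical firm (nothing from BEA's actual methodology) of why moving R&D from the expense column to the investment column raises both measured GDP and reported profits:

```python
# Toy illustration (made-up numbers). For simplicity the R&D is treated
# as a purchased service, and wages, depreciation and taxes are ignored.
sales       = 1_000.0   # value of the firm's final output
input_costs =   700.0   # materials, energy and other purchased inputs
rnd_outlay  =    50.0   # spending on research and development

# Old treatment: R&D is an ordinary business expense (an intermediate
# input), so it nets out of the firm's contribution to GDP and is
# deducted from profit right away.
gdp_old    = sales - input_costs - rnd_outlay
profit_old = sales - input_costs - rnd_outlay

# New treatment: the same outlay is capitalized as investment, so the
# R&D counts as final output in its own right and no longer reduces
# current profit.
gdp_new    = sales - input_costs
profit_new = sales - input_costs

print(f"Contribution to GDP: {gdp_old:.0f} -> {gdp_new:.0f} (+{rnd_outlay:.0f})")
print(f"Reported profit:     {profit_old:.0f} -> {profit_new:.0f} (+{rnd_outlay:.0f})")
```

Scale that +50 up across every firm, studio and lab in the economy and you get the one-time $400 billion bump the article describes.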
...that it took ten years to determine that there had been no recession either after the 1987 market crash or under GHWB.
The pending Senate immigration bill would bring a minimum of 33 million people into the country during its first decade of operation, according to an analysis by NumbersUSA, a group that wants to slow the current immigration rate.
By 2024, the inflow would include an estimated 9.2 million illegal immigrants, plus 2.5 million illegals who arrived as children -- dubbed 'Dreamers' -- plus roughly 3.4 million company-sponsored employees with university degrees, said the unreleased analysis.
The majority of the inflow, or roughly 17 million people, would consist of family members of illegals, recent immigrants and of company-sponsored workers, according to the NumbersUSA analysis provided to The Daily Caller.
Those numbers seem high, but we can always hope. It's the sort of population growth other developed nations are desperate for. We're going to need a heck of a lot more houses though.
At more than $5 billion a year in humanitarian aid to Africa, President Bush has given more assistance to the continent than any other president. His administration's aid was largely targeted to fight the major global health issues facing the continent, HIV/AIDS and malaria.
In 2003 Bush founded the President's Emergency Plan for AIDS Relief (PEPFAR), which guaranteed $15 billion to be spent over the course of five years on prevention, treatment and research on HIV/AIDS. Under the Bush administration, the U.S. was also a leader in contributing to the Global Fund on AIDS.
Though there was controversy over some of the qualifications for PEPFAR funds -- up to 20% was to be spent on abstinence-focused prevention programs, and the funds could not be used for needle-exchange programs -- most HIV/AIDS activists credit the program for being instrumental in turning the tide on AIDS.
Before PEPFAR, an estimated 100,000 people were on anti-retroviral drugs in sub-Saharan Africa. By the time Bush left office in early 2009, that number had increased to about 2 million.
In 2005 Bush started a $1.2 billion initiative to fight malaria. He defended the request for funding in 2007, saying, "There's no reason for little babies to be dying of mosquito bites around the world."
At Thursday's ceremony, President Clinton said in his travels throughout Africa he had "personally seen the faces of some of the millions of people who are alive today" because of Bush's policies.
Even some of Bush's most ardent critics have admitted that his foreign policy legacy on Africa continues to have a lasting effect.
U2 front-man and activist Bono, who criticized Bush on the Iraq War, nonetheless expressed his admiration for the Republican president in an appearance on The Daily Show last year, telling host Jon Stewart that Bush did an "amazing" job in the fight against the spread of HIV/AIDS in Africa.
"I know that's hard for you to accept," Bono said to a surprised crowd and host, "but George kind of knocked it out of the park. I can tell you, and I'm actually here to tell you that America now has 5 million people being kept alive by these drugs. That's something that everyone should know."
SCHOOLS ARE RUN FOR THE CONVENIENCE OF PARENTS, NOT THE GOOD OF KIDS:
The Science of Sleepy Teenagers : School schedules make them grouchy, impulsive, and humorless. (Russell Foster, April 27, 2013, Slate)
The biology of human sleep timing, like that of other mammals, changes as we age. This has been shown in many studies. As puberty begins, bedtimes and waking times get later. This trend continues until 19.5 years in women and 21 in men. Then it reverses. At 55 we wake at about the time we woke prior to puberty. On average this is two hours earlier than adolescents. This means that for a teenager, a 7 a.m. alarm call is the equivalent of a 5 a.m. start for people in their 50s.
Precisely why this is so is unclear, but the shifts correlate with hormonal changes at puberty and the decline in those hormones as we age.
However, biology is only part of the problem. Additional factors include a more relaxed attitude to bedtimes by parents, a general disregard for the importance of sleep, and access to TVs, DVDs, PCs, gaming devices, cellphones, and so on, all of which promote alertness and eat into time available for sleep.
The amount of sleep teenagers get varies between countries, geographic regions, and social classes, but all studies show they are going to bed later and not getting as much sleep as they need because of early school starts.
Mary Carskadon at Brown University, who is a pioneer in the area of adolescent sleep, has shown that teenagers need about nine hours a night to maintain full alertness and academic performance. My own recent observations at a U.K. school in Liverpool suggested many were getting just five hours on a school night. Unsurprisingly, teachers reported students dozing in class.
Evidence that sleep is important is overwhelming. Elegant research has demonstrated its critical role in memory consolidation and our ability to generate innovative solutions to complex problems. Sleep disruption increases the level of the stress hormone cortisol. Impulsive behaviors, lack of empathy, sense of humor, and mood are similarly affected.
All in all, a tired adolescent is a grumpy, moody, insensitive, angry, and stressed one. Perhaps less obviously, sleep loss is associated with metabolic changes. Research has shown that blood-glucose regulation was greatly impaired in young men who slept only four hours on six consecutive nights, with their insulin levels comparable to the early stages of diabetes.
Similar studies have shown higher levels of the hormone ghrelin, which promotes hunger, and lower levels of leptin, which creates a sense of feeling full. The suggestion is that long-term sleep deprivation might be an important factor in predisposing people to conditions such as diabetes, obesity, and hypertension.
Hanover High School shares the Dartmouth boathouse, so the 25% of the student body that participates in crew has to be there at 5am for practice. I drop three zombies there every morning on my way to work.
ALONG WITH FOLLOWING W'S LEAD ON MIDDLE EAST LIBERALIZATION....
Trade Winds : It's taken four years, but President Obama is finally coming around to a pro-trade economic agenda. And it could be his greatest legacy. (JAMES K. GLASSMAN | APRIL 26, 2013, Foreign Policy)
During the first three years of his first term, Barack Obama talked about boosting exports, but did little to expand trade. Unlike every president since Franklin Roosevelt, he declined to pursue trade promotion authority, necessary for any significant trade deal because it forces Congress to take an up-or-down vote, without amendments. Unlike his recent predecessors, he didn't push for multilateral agreements like the Doha Round, which focused on increasing trade links with developing countries. And he took nearly three years to get approval for the bilateral deals with Panama, Colombia, and South Korea that had been negotiated during President George W. Bush's tenure.
But in a dramatic about-face, Obama has embraced two large agreements that would open new markets to U.S. exporters. The Transatlantic Trade and Investment Partnership (TTIP) would remove tax and regulatory barriers with the European Union, while the Trans-Pacific Partnership (TPP) would increase trade with 11 Asian and Latin American countries. In February, President Obama and European leaders announced they would pursue a sweeping free-trade agreement, and on April 12, the United States approved Japan's entry into TPP negotiations, where it joins Australia, Brunei, Canada, Chile, Malaysia, Mexico, New Zealand, Peru, and Vietnam; the deal will likely be completed by December.
These are trade openings on a grand scale. The E.U. is the largest economy in the world, the United States is second, and Japan is fourth. The dozen nations in the Trans-Pacific Partnership account for some 40 percent of global GDP. This joint lowering of trade barriers would be the most powerful step taken to restore economic growth since the 2008 financial crisis, both for the United States and its partner countries. "Countries that liberalized their trade regimes experienced average annual growth rates that were about 1.5 percentage points higher than before liberalization," according to an often-cited study by Stanford's Romain Wacziarg and Karen Horn Welch in 2008.
More open trade lets Americans reap additional revenues from foreign sales, while profiting from the lower costs that imports provide -- both for finished consumer goods and for inputs into U.S. manufacturing.
...and getting the Heritage Foundation's health insurance mandate, trade'll be nearly his only legacy. Historically, his term will be indistinguishable from that of Clinton, W and his successors.
Tallahassee remains Boston's indie-rock-meets-alt-country, Providence-transplanted quartet. Their second album, Old Ways, which the band describes as "a close friend you've just met," will be released May 7, with a release show at The Sinclair on May 3. Allston Pudding recently sat down with Tallahassee's Scott Thompson, Brian Barthelmes, and Matt Raskopf to talk about the adventurous new record and their release show.
Was there any sort of sound that you guys were going for with Old Ways? Is it different than your previous album, Jealous Hands?
BB: Yeah, big time.
ST: We have lots of different sounds! In some ways, the album's all over the place. We were reacting to the live show, which kept evolving after we released our last record. We were changing all the songs to try to make them more of a transaction with the audience, picking up on their energy and trying to give back in kind. When we were writing the songs, I think we subconsciously reacted to that experience of playing live.
BB: Not even just subconsciously, though. I think we realized later on when we started playing with Jealous Hands that we hadn't so much fleshed out those songs live. As we were writing Old Ways, we were playing these songs live because we were excited about them and people were excited. We had a show at the Middle East maybe two days before we headed down to do the instrumental recordings and the hope and the idea was that what people were responding to was so much directing what we were reacting to. It was like, "Okay, what is it you are reacting to before we hit the studio to make sure those parts are really fresh." It was a lot more communal than Jealous Hands. Jealous Hands was a little more intimate. We were writing a lot and getting to know ourselves. Old Ways was a big reaction to the community that has supported us, that we have played for, that have played for us.
So would you guys say that this is an album best heard live? Is your preference that these are songs you'd like to play live for people?
BB: Yeah, I think so. And loud. Maybe half of the songs on Jealous Hands were songs we wouldn't play live. They were these great intimate moments where Matt might be on the piano and there might be really hot electric guitars with light fingerpicking. They were these really intimate exchanges that, live, it would take a really peculiar set up to do well.
ST: Yeah, the songs on Jealous Hands might not work at Great Scott.
BB: But this record, the songs were meant to be live songs. When we had done Jealous Hands, Scott played a lot of different instruments, so there were banjo overdubs, organ overdubs, lap steel. On Old Ways, we focused on what we do live because it's been so pleasurable. So this record needed to stand on just the four parts we have on stage. In that regard, it's the best capture we could do of the live show that had come to be. We didn't expect to become a rocking and rolling band that loved doing live stuff. And when that happened, we were like, "We need to capture that!"
[T]he band spin stories with lush arrangements that take the best elements from folk, rock and alt-country.
Old Brown Shoes, from the album Old Ways (out May 7), perfectly expresses the desire to escape the confines of a small town. With poignant guitar and fervent vocal delivery, the band pursues its dreams of moving beyond the cultural vacuum of growing up in a small town, capturing that ache for big opportunities and bigger spaces.
Florida Sen. Marco Rubio's push to overhaul immigration laws would once have spelled peril for a potential Republican presidential candidate. But that is much less so today.
One big reason: Many of his potential 2016 rivals--Kentucky Sen. Rand Paul, Wisconsin Rep. Paul Ryan and former Florida Gov. Jeb Bush--are marching in the same direction.
Despite differences in emphasis, all are supporting changes to immigration law that would offer an eventual pathway to citizenship for those now in the country illegally. That distinguishes them from many of the party's rank-and-file members in the House and Senate.
It's a Christian party, so amnesty is a 70-30 issue.
Rashid Perkins still remembers that first day. One summer morning, when he was 7, he left his home in Jamaica, Queens. At the Port Authority Bus Terminal, he said goodbye to his mother and boarded a bus that would take him to Needham, Mass., for a two-week stay with a host family, a trip sponsored by the Fresh Air Fund.
He wasn't nervous when he greeted the family, Beverlie and Morris Marks and their son Andrew, who was 5. "I had a big smile on my face," Mr. Perkins, now 22, recalled recently. He spent the rest of his first day outside, learning to play baseball with Andrew. That night, the quiet around him made it difficult for him to fall asleep.
Those two weeks changed his life. Mr. Perkins returned to visit the family every summer for more than a decade and often spent winter breaks with them. "I became part of the family and the community," he said.
Camp Sankaty serves the same purpose, although we got to stay for 10 weeks.
Government should help facilitate healthy lifestyles, while striving not to induce anxiety into the general population. The emphasis should be on promoting health, not "dis-ease."
This distinction is particularly important with body weight, where there is no single "ideal" weight--and where what is labeled overweight (a BMI between 25 and 30) is actually associated with lower mortality than what is labeled normal weight (a BMI between 18.5 and 25). In the spirit of full disclosure: I am overweight.
To be sure, obesity (BMI between 30 and 35)--and, particularly, severe obesity (BMI greater than 35)--is a problem to be dealt with. But to avoid the downsides of labeling, government should be focused on promoting a healthy lifestyle for all of us.
It's what your grandmother would have told you: Don't smoke, eat your fruits and vegetables, and go play outside. [...]
Healthy weight is largely the product of diet and physical activity. Government can facilitate better diets both by educating people about smaller portions and less calorie-dense foods (whole grains, fruits and vegetables) and by enhancing access to those foods--both in so-called food deserts and in schools.
Government can similarly facilitate more physical activity by educating people about two things: 1) it doesn't have to be complicated (e.g., walk more, take the stairs), and 2) it has multiple benefits (the reason I exercise is less to maintain my weight--or extend my life--and more because I sleep better and feel better).
The heads and hearts of atheists may not be on precisely the same page. That's the implication of recently published research from Finland, which finds avowed non-believers become emotionally aroused when daring God to do terrible things.
"The results imply that atheists' attitudes toward God are ambivalent, in that their explicit beliefs conflict with their affective response," concludes a research team led by University of Helsinki psychologist Marjaana Lindeman.
Stapel's fraud may shine a spotlight on dishonesty in science, but scientific fraud is hardly new. The rogues' gallery of academic liars and cheats features scientific celebrities who have enjoyed similar prominence. The once-celebrated South Korean stem-cell researcher Hwang Woo Suk stunned scientists in his field a few years ago after it was discovered that almost all of the work for which he was known was fraudulent. The prominent Harvard evolutionary biologist Marc Hauser resigned in 2011 during an investigation by the Office of Research Integrity at the Department of Health and Human Services that would end up determining that some of his papers contained fabricated data.
Every year, the Office of Research Integrity uncovers numerous instances of bad behavior by scientists, ranging from lying on grant applications to using fake images in publications. A blog called Retraction Watch publishes a steady stream of posts about papers being retracted by journals because of allegations or evidence of misconduct.
Each case of research fraud that's uncovered triggers a similar response from scientists. First disbelief, then anger, then a tendency to dismiss the perpetrator as one rotten egg in an otherwise-honest enterprise. But the scientific misconduct that has come to light in recent years suggests at the very least that the number of bad actors in science isn't as insignificant as many would like to believe. And considered from a more cynical point of view, figures like Hwang and Hauser are not outliers so much as one end of a continuum of dishonest behaviors that extend from the cherry-picking of data to fit a chosen hypothesis -- which many researchers admit is commonplace -- to outright fabrication. Still, the nature and scale of Stapel's fraud sets him apart from most other cheating academics. "The extent to which I did it, the longevity of it, makes it extreme," he told me. "Because it is not one paper or 10 but many more."
Stapel did not deny that his deceit was driven by ambition. But it was more complicated than that, he told me. He insisted that he loved social psychology but had been frustrated by the messiness of experimental data, which rarely led to clear conclusions. His lifelong obsession with elegance and order, he said, led him to concoct sexy results that journals found attractive. "It was a quest for aesthetics, for beauty -- instead of the truth," he said. [...]
Stapel stayed in Amsterdam for three years after his Ph.D., writing papers that he says got little attention. Nonetheless, his peers viewed him as having made a solid beginning as a researcher, and he won an award from the European Association of Experimental Social Psychology. In 2000, he became a professor at Groningen University.
While there, Stapel began testing the idea that priming could affect people without their being aware of it. He devised several experiments in which subjects sat in front of a computer screen on which a word or an image was flashed for one-tenth of a second -- making it difficult for the participants to register the images in their conscious minds. The subjects were then tested on a task to determine if the priming had an effect.
In one experiment conducted with undergraduates recruited from his class, Stapel asked subjects to rate their individual attractiveness after they were flashed an image of either an attractive female face or a very unattractive one. The hypothesis was that subjects exposed to the attractive image would -- through an automatic comparison -- rate themselves as less attractive than subjects exposed to the other image.
The experiment -- and others like it -- didn't give Stapel the desired results, he said. He had the choice of abandoning the work or redoing the experiment. But he had already spent a lot of time on the research and was convinced his hypothesis was valid. "I said -- you know what, I am going to create the data set," he told me.
Sitting at his kitchen table in Groningen, he began typing numbers into his laptop that would give him the outcome he wanted. He knew that the effect he was looking for had to be small in order to be believable; even the most successful psychology experiments rarely yield large effects. The math had to be done in reverse order: the individual attractiveness scores that subjects gave themselves on a 0-7 scale needed to be such that Stapel would get a small but significant difference in the average scores for each of the two conditions he was comparing. He made up individual scores like 4, 5, 3, 3 for subjects who were shown the attractive face. "I tried to make it random, which of course was very hard to do," Stapel told me.
Doing the analysis, Stapel at first ended up getting a bigger difference between the two conditions than was ideal. He went back and tweaked the numbers again. It took a few hours of trial and error, spread out over a few days, to get the data just right.
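To make that reverse-order tinkering concrete, here is a minimal sketch in Python -- invented numbers and a deliberately crude nudging rule, not a reconstruction of Stapel's actual spreadsheet -- of how two columns of 0-7 ratings can be fabricated so that a standard two-sample t-test reports a small but "significant" difference.

```python
# Illustrative only: fabricating 0-7 self-ratings until a two-sample
# t-test calls the difference "significant" while the gap between the
# group means stays small enough to look plausible. Numbers are invented.
from scipy import stats

# "shown the attractive face" condition (hypothesized to rate themselves lower)
group_a = [3, 4, 3, 2, 4, 3, 3, 5, 2, 3, 4, 3, 3, 2, 4, 3, 3, 4, 2, 3]
# start the other condition identical, then nudge it upward
group_b = list(group_a)

i = 0
while stats.ttest_ind(group_a, group_b).pvalue > 0.05:
    group_b[i % len(group_b)] = min(7, group_b[i % len(group_b)] + 1)  # bump one score
    i += 1

diff = sum(group_b) / len(group_b) - sum(group_a) / len(group_a)
print(f"mean difference: {diff:.2f}, "
      f"p = {stats.ttest_ind(group_a, group_b).pvalue:.3f}")
```

With enough hand-tweaking of individual scores, numbers like these yield exactly the kind of tidy, publishable effect the passage describes.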
He said he felt both terrible and relieved. The results were published in The Journal of Personality and Social Psychology in 2004. "I realized -- hey, we can do this," he told me.
Mr. Jones married Dorothy Bonvillion when he was 17, but divorced her before the birth of their daughter. He served in the Marines from 1950 to 1953, then signed to Starday Records, whose co-owner Pappy Daily became Mr. Jones's producer and manager. Mr. Jones's first single, "No Money in This Deal," was released in 1954, the year he married his second wife, Shirley Corley. They had two sons before they divorced in 1968.
"Why Baby Why," released in 1955, became Mr. Jones's first hit. During the 1950s he wrote or collaborated on many of his songs, including hits like "Just One More," "What Am I Worth" and "Color of the Blues," though he later gave up songwriting. In the mid-'50s he had a brief fling with rockabilly, recording as Thumper Jones and as Hank Smith. But under his own name he was a country hit maker. He began singing at the Grand Ole Opry in 1956.
He had already become a drinker. "White Lightning," a No. 1 country hit in 1959, required 83 takes because Mr. Jones was drinking through the session. On the road, playing one-night stands, he tore up hotel rooms and got into brawls. He also began missing shows because he was too drunk to perform.
But onstage and on recordings, his career was advancing. In 1962 he recorded one of his signature songs, "She Thinks I Still Care," which was nominated for a Grammy Award. Another of his most lasting hits, "The Race Is On," appeared in 1964. He was part of the first country concert at Madison Square Garden, a four-show, 10-act package in 1964 that also included Ernest Tubb, Bill Monroe and Buck Owens. Each act was allotted two songs per show, but on the opening night Mr. Jones played five before he was carried offstage.
In 1966, Mr. Jones tried to start a country theme park in Vidor, the East Texas suburb where he lived. Called the George Jones Rhythm Ranch, it was the first of many shaky business ventures. Mr. Jones gave only one performance. After singing, he disappeared for a month, rambling across Texas. His drinking had gotten worse. At one point his wife hid the keys to all his cars, so he drove his lawn mower into Beaumont to a liquor store -- an incident he would later commemorate in a song and in music videos. They were divorced not long afterward.
Mr. Jones had his next No. 1 country single in 1967 with "Walk Through This World With Me." He moved to Nashville and opened a nightclub there, Possum Holler, which lasted a few months.
He had met a rising country singer, Tammy Wynette, in 1966, and they fell in love while on tour. She was married at the time to Don Chapel, a songwriter whose material had appeared on both of their albums. One night in 1968, Mr. Jones recalled, Ms. Wynette and Mr. Chapel were arguing in their dining room when Mr. Jones arrived; he upended the dining room table and told Ms. Wynette he loved her. She took her three children and left with Mr. Jones.
They were married in 1969 and settled in Lakeland, Fla. There, on the land around his plantation-style mansion, Mr. Jones built another country-themed park, the Old Plantation Music Park.
Mr. Jones severed his connection with Mr. Daily and later maintained that he had not received proper royalties. In 1971 he signed a contract with Epic Records, which was also Ms. Wynette's label, and the couple began recording duets produced by Billy Sherrill, whose elaborate arrangements helped reshape the sound of Nashville. Three of those duets -- "We're Gonna Hold On," "Golden Ring" and "Near You" -- were No. 1 country hits, an accomplishment made more poignant by the singers' widely reported marital friction.
"Mr. and Mrs. Country Music" was painted on their tour bus. But the marriage was falling apart, unable to withstand bitter quarrels and Mr. Jones's drinking and amphetamine use. After one fight, he was put in a straitjacket and hospitalized for 10 days. The Lakeland music park was shut down.
The couple divorced in 1975; the next year Mr. Jones released two albums, titled "The Battle" and "Alone Again." But duets by Mr. Jones and Ms. Wynette continued to be released until 1980, the year they rejoined to make a new album, "Together Again," which included the hit "Two Story House." They would reunite to tour and record again in the mid-1990s. Mr. Jones grew increasingly erratic after the divorce, drinking heavily and losing weight. His singles slipped lower on the charts. His management bounced his band members' paychecks. At times he would sing in a Donald Duck voice onstage. And he began using cocaine and brandishing a gun. In 1977 he fired at a friend's car and was charged with attempted murder, but the charges were dropped.
His nickname No-Show Jones gained national circulation as he missed more engagements than he kept. When he was scheduled to play a 1977 showcase at the Bottom Line in New York, he disappeared for three weeks instead. In 1979, he missed 54 concert dates. (Later, the license plates on his cars ran from "NOSHOW1" to "NOSHOW7.")
But as his troubles increased, so did his fame and his album sales. "I was country music's national drunk and drug addict," Mr. Jones wrote in his autobiography, "I Lived to Tell It All," published in 1996.
He had music industry fans outside country circles. James Taylor wrote "Bartender's Blues" for him, and sang it with him as a duet. In 1979, on the album "My Very Special Guests," Mr. Jones sang duets with Willie Nelson, Linda Ronstadt, Elvis Costello and Emmylou Harris. But he missed many of the recording sessions, and had to add his vocal tracks later.
Running From Debts
By then Mr. Jones had moved to Florence, Ala., in part to get away from arrest warrants for nonpayment of child support to Ms. Wynette and other debts in Tennessee. In Florence, he had a girlfriend, Linda Welborn, from 1975 to 1981. When they broke up, she sued and won a divorce settlement under Alabama's common-law marriage statutes.
In 1979 Mr. Jones declared bankruptcy. His manager was arrested and charged with selling cocaine. That December, Mr. Jones was committed for 30 days to a drug and alcohol rehabilitation center. After his release, he went back to cocaine and whiskey.
Yet he still had hits. "He Stopped Loving Her Today," a song about a man whose love ends only when his life does, was released in April 1980 and reached No. 1 on the country charts, beginning Mr. Jones's resurgence. The Country Music Association named it the song of the year, the award going to its songwriters, Bobby Braddock and Curly Putman, and the recording won the Grammy for best male country performance.
With a renewed contract from Epic Records, Mr. Jones became a hit maker again, with No. 1 songs including "Still Doin' Time" in 1981 and "I Always Get Lucky With You" in 1983. He made an album with Johnny Paycheck, a former member of his band, in 1980 and one with Merle Haggard in 1982; he recorded a single, "We Didn't See a Thing," with Ray Charles in 1983. And in 1984 he released "Ladies' Choice," an album of duets with Loretta Lynn, Brenda Lee, Emmylou Harris and other female singers.
In 1983 he married Nancy Sepulvedo, who straightened out his business affairs and then Mr. Jones himself. He gave up cocaine and whiskey. The couple moved to East Texas, near Mr. Jones's birthplace, and opened the Jones Country Music Park, which they operated for six years. In 1988 he changed labels again, to MCA, and soon moved to Franklin, Tenn.
By then, younger, more telegenic singers had come along with vocal styles learned largely from Mr. Jones and Merle Haggard. Now treated as an elder statesman, Mr. Jones sang duets with some of his musical heirs, including Randy Travis and Alan Jackson. Garth Brooks, Vince Gill, Travis Tritt, Clint Black, Patty Loveless and other country stars joined Mr. Jones on the single "I Don't Need Your Rocking Chair" in 1992. That same year he was named to the Country Music Hall of Fame.
Jones is easy to caricature as a hypocrite, to be sure. He performed some of the greatest songs in country music history. I would fight anyone, metaphorically speaking, who denies that "He Stopped Loving Her Today" is the greatest country song of all time, but Jones was known for more than his songs. His failed marriages, most notably to fellow country music star Tammy Wynette, and his lifelong skirmish with substance abuse were always in the headlines. [...]
But Jones did not present a light picture of his frailties. His songs demonstrated that he did not think of these things as frailties at all, as our therapeutic culture would have us do. Yes, Jones often sang with a wink in his eye about liquor and pills and loneliness and divorce, but then he would turn around and sing of these things as Hell. The raw emotion of Jones' vocal cords communicated the anguish of a father who has lost his family in "The Grand Tour," as he takes a stranger through every room in the house, including the empty nursery where the baby of a broken home once lay.
Yes, he could sing about alcohol in a playful song comparing his love to the smoothness of Tennessee whiskey, the sweetness of strawberry wine, but he would then sing of living his life "Still Doing Time" in a "Honky Tonk Prison." He would sing honestly of the prison of alcoholism born of his broken relationships: "If Drinking Don't Kill Me, Her Memory Will." This is not a glorification of alcohol; it is a man in pain scratching against the door.
Some may see hypocrisy in the fact that Jones sang gospel songs. The same emotion with which he sang of drunkenness and honky-tonking, he turned to sing of "Just a Little Talk with Jesus Makes Things Right." He often in concerts led the crowd in old gospel favorites, such as "Amazing Grace" or "I'll Fly Away." But I don't think this is hypocrisy. This is not a man branding himself with two different and contradictory impulses. This was a man who sang of the horrors of sin, with a longing for a gospel he had heard and, it seemed, he hoped could deliver him. In Jones' songs, you hear the old Baptist and Pentecostal fear that maybe, horrifically, one has passed over into the stage of Esau who, as the Bible puts it, "could not find repentance though he sought it with tears."
I'm not sure whether Jones sought repentance with tears, but he certainly sang of the longing for it with a quavering voice. In that sense, Jones communicated exactly what Flannery O'Connor wrote of when she spoke of a "Christ-haunted South, a region with a ubiquitous gospel, but without the ubiquity of gospel power." Jones communicated what all of us, left to ourselves, seek to suppress. Life without Christ is leading us to a lonely grave. This is why of all of Jones' corpus, I find most powerful his rendition of "The Cup Of Loneliness," a song about Jesus' agony in Gethsemane. This song still speaks to the hellishness of hedonism.
Bush understood the need for civility. I joined him despite my frustration because the need was too great for finger-pointing and blame-making. He flew to New Orleans and addressed the nation: "Tonight I also offer this pledge to the American people: Throughout the area hit by the hurricane, we will do what it takes. We will stay as long as it takes to help citizens rebuild their communities and their lives."
George W. Bush was as good as his word. He visited the Gulf states 17 times and went 13 times to New Orleans. Laura Bush made 24 trips. Bush saw that $126 billion in aid was sent to the Gulf's residents, even as some members of his own party in Congress balked.
Bush put a special emphasis on rebuilding schools and universities. He didn't forget African-Americans: Bush provided $400 million to the historically black colleges, now integrated, that remain a source of pride and a magnet for African-American students. Laura Bush, a librarian, saw to it that thousands of books ruined by the floods were replaced. To this day, there are many local libraries with tributes devoted to her efforts.
My biggest kitchen catastrophe to date struck two weeks before my wedding day in 2011. I was elbow-deep in flour and cocoa powder, daydreaming of crafting my own three-tiered cake. At a particularly grim moment -- and sporting a nasty burn on my forearm -- I drank a glass of wine and ate a fistful of crumpled chocolate cake for dinner. Then I straightened my apron, buckled down and muscled through on my own.
What I didn't know at the time was that a squadron of bakers was just a phone call away, ready to coach me through my ambitious undertaking. I disappeared down the rabbit hole of online cake-baking forums when instead I could have had someone like the delightful Irene Shover on the other end of the phone line.
Shover is one of nine bakers who staff the King Arthur Flour baking hotline in Norwich. They're on standby every weekday between 8 a.m. and 9 p.m., and on weekends from 9 a.m. to 5 p.m. (The call volume is typically highest in the run-up to Thanksgiving and Christmas, they say.) Home cooks can also fire off questions by email or live chat. Got a nagging concern about yeast breads that won't rise, or a pizza crust that isn't quite right? Shover and her compatriots have your back.
The US economy will officially become 3 per cent bigger in July as part of a shake-up that will see government statistics take into account 21st century components such as film royalties and spending on research and development. [...]
The changes will affect everything from the measured GDP of different US states to the stability of the inflation measure targeted by the Federal Reserve. They will force economists to revisit policy debates about everything from corporate profits to the causes of economic growth.
The revision, equivalent to adding a country as big as Belgium to the estimated size of the world economy, will make the US one of the first adopters of a new international standard for GDP accounting.
"We're capitalising research and development and also this category referred to as entertainment, literary and artistic originals, which would be things like motion picture originals, long-lasting television programmes, books and sound recordings," said Mr Moulton.
At present, R&D counts as a cost of doing business, so the final output of Apple iPads is included in GDP but the research done to create them is not. R&D will now count as an investment, adding a bit more than 2 per cent to the measured size of the economy.
Erwin Schrödinger's famous thought experiment involves a cat in a box that is simultaneously alive and dead until an observer looks at it. This is an extreme example of a quantum effect called superposition in which a physical system such as an atom or photon can exist in two or more quantum states until a measurement is made on it. While superposition is a regular feature of the microscopic world, it is never seen in our everyday lives.
In 1886, Pliny Earle, then the superintendent of the state hospital for the insane in Northampton, Massachusetts, complained to his fellow psychiatrists that "in the present state of our knowledge, no classification of insanity can be erected upon a pathological basis." Doctors in other specialties were using microscopes and chemical assays to discern the material causes of illness and to classify diseases accordingly. But psychiatrists, confronted with the impenetrable complexities of the brain, were "forced to fall back upon the symptomatology of the disease--the apparent mental condition, as judged from the outward manifestations." The rest of medicine may have been galloping into modernity on the back of science, but Earle and his colleagues were being left in the dust.
Thirty years later, they had not caught up. In 1917, Thomas Salmon, another leading psychiatrist, echoed Earle's worry in an address to his colleagues, drawing their attention to the way that their reliance on appearances had resulted in a "chaotic" diagnostic system, which, he said, "discredits the science of psychiatry and reflects unfavorably upon our association." Psychiatry, Salmon continued, needed a nosology that would "meet the scientific demands of the day" if it was to command public trust.
In the century that has passed since Salmon's lament, doctors in most medical specialties have only gotten better at sorting our suffering according to its biochemical causes. They have learned how to turn symptoms into clues and, like Sherlock Holmes stalking a criminal, to follow the evidence to the culprit. With a blood test or tissue culture, they can determine whether a skin rash is poison ivy or syphilis, or whether a cough is a symptom of a cold or of lung cancer. Sure-footed diagnosis is what we have come to expect from our physicians. It gives us some comfort, and the confidence to submit to their treatments.
But psychiatrists still cannot meet this demand. A detailed understanding of the brain, with its hundred billion neurons and trillions of synapses, remains elusive, leaving psychiatry dependent on outward manifestations for its taxonomy of mental illnesses. Indeed, it has been doubling down on appearances since 1980, which is when the American Psychiatric Association created a Diagnostic and Statistical Manual of Mental Disorders (D.S.M.) that intentionally did not strive to go beyond the symptom. In place of biochemistry, the D.S.M. offers expert consensus about which clusters of symptoms constitute particular mental illnesses, and about which mental illnesses are real, or at least real enough to warrant a name and a place in the medical lexicon. But this approach hasn't really worked to establish the profession's credibility. In the four revisions of the D.S.M. since 1980, diagnoses have appeared and disappeared, and symptom lists have been tweaked and rejiggered with troubling regularity, generally after debate that seems more suited to the floors of Congress than the halls of science. The inevitable and public chaos--diagnostic epidemics, prescription-drug fads, patients labelled and relabelled--has only deepened psychiatry's inferiority complex.
There are a number of reasons to join a presidential campaign, not least of which is the main-stage, high-wire excitement. But I can recall the day I decided that my guy was the guy. Bush, campaigning at a town-hall meeting in Gaffney, S.C., got a question demanding to know how he would stop the flow of illegal immigrants. He took the opportunity to remind his rural, conservative audience that "family values don't stop at the Rio Grande" and that as long as "moms and dads" in Mexico couldn't feed their children at home, they would seek opportunity in America.
Not "illegals." Moms and dads and children. It was classic Bush: direct, decent, human.
Mortgage rates continued to drop, with the 15-year fixed-rate loan hitting a record low, according to a weekly report from mortgage financier Freddie Mac.
In interviews and op-eds, aides to the former president have sought to redirect attention to lasting Bush accomplishments that don't get as much attention these days.
They're touting the Medicare prescription drug program, the bipartisan No Child Left Behind education law, and the President's Emergency Plan for AIDS Relief. These are achievements that were expensive, and some remain controversial, but they are now widely praised as having saved and improved lives.
Two big items of unfinished Bush business also figure in. Bush allies are arguing that the former president was ahead of demographic and political curves by pushing for immigration reform and a remaking of the nation's Social Security system - initiatives that are now or will soon be revisited by Washington.
Though Bush himself likes to say he's not fighting for history's judgment, he's made clear he's eager for a reevaluation. He's predicting a revival of interest in "compassionate conservatism," which he described to The Dallas Morning News this month as "the idea that articulating and implementing conservative ideas leads to a better life for all."
"The best way for people to understand what I meant by 'compassionate conservative' is to look at the programs we implemented and look at the results," Bush told his - and his new museum's - hometown newspaper.
I'm not talking about the objects they make. Their real art is to con us into accepting the works as authentic. They do so, inevitably, by finding our blind spots, and by exploiting our common-sense assumptions. When they're caught (if they're caught), the scandal that ensues is their accidental masterpiece. Learning that we've been defrauded makes us anxious -- much more so than any painting ever could -- provoking us to examine our poor judgment. This effect is inescapable, since we certainly didn't ask to be duped. A forgery is more direct, more powerful, and more universal than any legitimate artwork.
Consider the work of Han van Meegeren, one of the foremost forgers of the 1930s and '40s. A leading Dutch scholar named Bredius theorized that Jan Vermeer once made religious paintings, but no one could find them. So van Meegeren produced the work that Bredius described. The fact that these paintings looked nothing like any Vermeer in existence only added to their credibility. Van Meegeren found one of the feedback loops that generate unjustified belief. We should all take heed.
But isn't forgery like plagiarism?
Technically speaking, it's the opposite. (Plagiarists take credit for other people's work, whereas forgers attribute their own work to others.)
It helps that they are forging painters who worked when the aim of art was beauty. Future generations of forgers will ape modern artists and their work will be, likewise, crappy.
Breast cancer in your breast doesn't kill you; the disease becomes deadly when it metastasizes, spreading to other organs or the bones. Early detection is based on the theory, dating back to the late 19th century, that the disease progresses consistently, beginning with a single rogue cell, growing sequentially and at some invariable point making a lethal leap. Curing it, then, was assumed to be a matter of finding and cutting out a tumor before that metastasis happens.
The thing is, there was no evidence that the size of a tumor necessarily predicted whether it had spread. According to Robert Aronowitz, a professor of history and sociology of science at the University of Pennsylvania and the author of "Unnatural History: Breast Cancer and American Society," physicians endorsed the idea anyway, partly out of wishful thinking, desperate to "do something" to stop a scourge against which they felt helpless. So in 1913, a group of them banded together, forming an organization (which eventually became the American Cancer Society) and alerting women, in a precursor of today's mammography campaigns, that surviving cancer was within their power. By the late 1930s, they had mobilized a successful "Women's Field Army" of more than 100,000 volunteers, dressed in khaki, who went door to door raising money for "the cause" and educating neighbors to seek immediate medical attention for "suspicious symptoms," like lumps or irregular bleeding.
The campaign worked -- sort of. More people did subsequently go to their doctors. More cancers were detected, more operations were performed and more patients survived their initial treatments. But the rates of women dying of breast cancer hardly budged. All those increased diagnoses were not translating into "saved lives." That should have been a sign that some aspect of the early-detection theory was amiss. Instead, surgeons believed they just needed to find the disease even sooner.
Mammography promised to do just that. The first trials, begun in 1963, found that screening healthy women along with giving them clinical exams reduced breast-cancer death rates by about 25 percent. Although the decrease was almost entirely among women in their 50s, it seemed only logical that, eventually, screening younger (that is, finding cancer earlier) would yield even more impressive results. Cancer might even be cured.
That hopeful scenario could be realized, though, only if women underwent annual mammography, and by the early 1980s, it is estimated that fewer than 20 percent of those eligible did. Nancy Brinker founded the Komen foundation in 1982 to boost those numbers, convinced that early detection and awareness of breast cancer could have saved her sister, Susan, who died of the disease at 36. Three years later, National Breast Cancer Awareness Month was born. The khaki-clad "soldiers" of the 1930s were soon displaced by millions of pink-garbed racers "for the cure" as well as legions of pink consumer products: pink buckets of chicken, pink yogurt lids, pink vacuum cleaners, pink dog leashes. Yet the message was essentially the same: breast cancer was a fearsome fate, but the good news was that through vigilance and early detection, surviving was within women's control.
By the turn of the new century, the pink ribbon was inescapable, and about 70 percent of women over 40 were undergoing screening. The annual mammogram had become a near-sacred rite, so precious that in 2009, when another federally financed independent task force reiterated that for most women, screening should be started at age 50 and conducted every two years, the reaction was not relief but fury. After years of bombardment by early-detection campaigns (consider: "If you haven't had a mammogram, you need more than your breasts examined"), women, surveys showed, seemed to think screening didn't just find breast cancer but actually prevented it.
At the time, the debate in Congress over health care reform was at its peak. Rather than engaging in discussion about how to maximize the benefits of screening while minimizing its harms, Republicans seized on the panel's recommendations as an attempt at health care rationing. The Obama administration was accused of indifference to the lives of America's mothers, daughters, sisters and wives. Secretary Kathleen Sebelius of the Department of Health and Human Services immediately backpedaled, issuing a statement that the administration's policies on screening "remain unchanged."
Even as American women embraced mammography, researchers' understanding of breast cancer -- including the role of early detection -- was shifting. The disease, it has become clear, does not always behave in a uniform way. It's not even one disease. There are at least four genetically distinct breast cancers. They may have different causes and definitely respond differently to treatment. Two related subtypes, luminal A and luminal B, involve tumors that feed on estrogen; they may respond to a five-year course of pills like tamoxifen or aromatase inhibitors, which block cells' access to that hormone or reduce its levels. In addition, a third type of cancer, called HER2-positive, produces too much of a protein called human epidermal growth factor receptor 2; it may be treatable with a targeted immunotherapy called Herceptin. The final type, basal-like cancer (often called "triple negative" because its growth is not fueled by the most common biomarkers for breast cancer -- estrogen, progesterone and HER2), is the most aggressive, accounting for up to 20 percent of breast cancers. More prevalent among young and African-American women, it is genetically closer to ovarian cancer. Within those classifications, there are, doubtless, further distinctions, subtypes that may someday yield a wider variety of drugs that can isolate specific tumor characteristics, allowing for more effective treatment. But that is still years away.
Those early mammography trials were conducted before variations in cancer were recognized -- before Herceptin, before hormonal therapy, even before the widespread use of chemotherapy. Improved treatment has offset some of the advantage of screening, though how much remains contentious. There has been about a 25 percent drop in breast-cancer death rates since 1990, and some researchers argue that treatment -- not mammograms -- may be chiefly responsible for that decline. They point to a study of three pairs of European countries with similar health care services and levels of risk: In each pair, mammograms were introduced in one country 10 to 15 years earlier than in the other. Yet the mortality data are virtually identical. Mammography didn't seem to affect outcomes. In the United States, some researchers credit screening with a death-rate reduction of 15 percent -- which holds steady even when screening is reduced to every other year. Gilbert Welch, a professor of medicine at the Dartmouth Institute for Health Policy and Clinical Practice and co-author of last November's New England Journal of Medicine study of screening-induced overtreatment, estimates that only 3 to 13 percent of women whose cancer was detected by mammograms actually benefited from the test.
If Welch is right, the test helps between 4,000 and 18,000 women annually. Not an insignificant number, particularly if one of them is you, yet perhaps less than expected given the 138,000 whose cancer has been diagnosed each year through screening. Why didn't early detection work for more of them? Mammograms, it turns out, are not so great at detecting the most lethal forms of disease -- like triple negative -- at a treatable phase. Aggressive tumors progress too quickly, often cropping up between mammograms. Even catching them "early," while they are still small, can be too late: they have already metastasized. That may explain why there has been no decrease in the incidence of metastatic cancer since the introduction of screening.
At the other end of the spectrum, mammography readily finds tumors that could be equally treatable if found later by a woman or her doctor; it also finds those that are so slow-moving they might never metastasize. As improbable as it sounds, studies have suggested that about a quarter of screening-detected cancers might have gone away on their own. For an individual woman in her 50s, then, annual mammograms may catch breast cancer, but they reduce the risk of dying of the disease over the next 10 years by only .07 percent -- from .53 percent to .46 percent. Reductions for women in their 40s are even smaller, from .35 percent to .3 percent.
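A quick back-of-the-envelope check of those figures, using only the numbers quoted above (the rounding is mine):

```python
# Checking the arithmetic with the article's own figures.
screen_detected_per_year = 138_000      # cancers diagnosed via screening each year
benefit_low, benefit_high = 0.03, 0.13  # Welch's estimate of who actually benefits

print(f"women helped per year: {screen_detected_per_year * benefit_low:,.0f} "
      f"to {screen_detected_per_year * benefit_high:,.0f}")
# -> roughly 4,000 to 18,000, the range quoted above

# Ten-year risk of dying of breast cancer for a woman in her 50s
without_screening, with_screening = 0.53, 0.46  # percent
print(f"absolute reduction: {without_screening - with_screening:.2f} "
      "percentage points")
```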
If screening's benefits have been overstated, its potential harms are little discussed. According to a survey of randomized clinical trials involving 600,000 women around the world, for every 2,000 women screened annually over 10 years, one life is prolonged but 10 healthy women are given diagnoses of breast cancer and unnecessarily treated, often with therapies that themselves have life-threatening side effects. (Tamoxifen, for instance, carries small risks of stroke, blood clots and uterine cancer; radiation and chemotherapy weaken the heart; surgery, of course, has its hazards.)
Many of those women are told they have something called ductal carcinoma in situ (D.C.I.S.), or "Stage Zero" cancer, in which abnormal cells are found in the lining of the milk-producing ducts. Before universal screening, D.C.I.S. was rare. Now D.C.I.S. and the less common lobular carcinoma in situ account for about a quarter of new breast-cancer cases -- some 60,000 a year. In situ cancers are more prevalent among women in their 40s. By 2020, according to the National Institutes of Health's estimate, more than one million American women will be living with a D.C.I.S. diagnosis.
D.C.I.S. survivors are celebrated at pink-ribbon events as triumphs of early detection: theirs was an easily treatable disease with a nearly 100 percent 10-year survival rate. The thing is, in most cases (estimates vary widely between 50 and 80 percent) D.C.I.S. will stay right where it is -- "in situ" means "in place." Unless it develops into invasive cancer, D.C.I.S. lacks the capacity to spread beyond the breast, so it will not become lethal. Autopsies have shown that as many as 14 percent of women who died of something other than breast cancer unknowingly had D.C.I.S.
There is as yet no sure way to tell which D.C.I.S. will turn into invasive cancer, so every instance is treated as if it is potentially life-threatening. That needs to change, according to Laura Esserman, director of the Carol Franc Buck Breast Care Center at the University of California, San Francisco. Esserman is campaigning to rename D.C.I.S. by removing its big "C" in an attempt to put it in perspective and tamp down women's fear. "D.C.I.S. is not cancer," she explained. "It's a risk factor. For many D.C.I.S. lesions, there is only a 5 percent chance of invasive cancer developing over 10 years. That's like the average risk of a 62-year-old. We don't do heart surgery when someone comes in with high cholesterol. What are we doing to these people?" In Britain, where women are screened every three years beginning at 50, the government recently decided to revise its brochure on mammography to include a more thorough discussion of overdiagnosis, something it previously dispatched with in one sentence. That may or may not change anyone's mind about screening, but at least there is a fuller explanation of the trade-offs.
In this country, the huge jump in D.C.I.S. diagnoses potentially transforms some 50,000 healthy people a year into "cancer survivors" and contributes to the larger sense that breast cancer is "everywhere," happening to "everyone." That, in turn, stokes women's anxiety about their personal vulnerability, increasing demand for screening -- which, inevitably, results in even more diagnoses of D.C.I.S. Meanwhile, D.C.I.S. patients themselves are subject to the pain, mutilation, side effects and psychological trauma of anyone with cancer and may never think of themselves as fully healthy again.
Yet who among them would dare do things differently? Which of them would have skipped that fateful mammogram? As Robert Aronowitz, the medical historian, told me: "When you've oversold both the fear of cancer and the effectiveness of our prevention and treatment, even people harmed by the system will uphold it, saying, 'It's the only ritual we have, the only thing we can do to prevent ourselves from getting cancer.' "
On one particularly thorny policy issue on which his advisors had strong and deep disagreements, over the course of two weeks we (his senior advisors) held a series of three 90-minute meetings with the President. Shortly after the third meeting we asked for his OK to do a fourth. He said, "How about rather than doing another meeting on this, I instead tell you now what each person will say." He then ran through half a dozen of his advisors by name and precisely detailed each one's arguments and pointed out their flaws. (Needless to say there was no fourth meeting.)
Every prominent politician has a public caricature, one drawn initially by late-night comedy joke writers and shaped heavily by the press and one's political opponents. The caricature of President Bush is that of a good ol' boy from Texas who is principled and tough, but just not that bright.
That caricature was reinforced by several factors:
The press and his opponents highlighted President Bush's occasional stumbles when giving a speech. President Obama's similar verbal miscues are ignored. Ask yourself: if every public statement you made were recorded and all your verbal fumbles were tweeted, how smart would you sound? Do you ever use the wrong word or phrase, or just botch a sentence for no good reason? I know I do.
President Bush intentionally aimed his public image at average Americans rather than at Cambridge or Upper East Side elites. Mitt Romney's campaign was predicated on "I am smart enough to fix a broken economy," while George W. Bush's campaigns stressed his values, character, and principles rather than boasting about his intellect. He never talked about graduating from Yale and Harvard Business School, and he liked to lower expectations by pretending he was just an average guy. Example: "My National Security Advisor Condi Rice is a Stanford professor, while I'm a C student. And look who's President. <laughter>"
There is a bias in much of the mainstream press and commentariat that people from outside of NY-BOS-WAS-CHI-SEA-SF-LA are less intelligent, or at least less well educated. Many public commenters harbor an anti-Texas (and anti-Southern, and anti-Midwestern) intellectual bias. They mistakenly treat John Kerry as smarter than George Bush because John Kerry talks like an Ivy League professor while George Bush talks like a Texan.
President Bush enjoys interacting with the men and women of our armed forces and with elite athletes. He loves to clear brush on his ranch. He loved interacting with the U.S. Olympic Team. He doesn't windsurf off Nantucket; he does a 100K mountain bike ride outside of Waco with wounded warriors. He is an intense, competitive athlete and a "guy's guy." His hobbies and habits reinforce a caricature of a [dumb] jock, in contrast to cultural sophisticates who enjoy antiquing and opera. This reinforces the other biases against him.
The sharp drop in gas prices over the last month or so could provide America's economy with a much-needed jolt, putting money into consumers' pockets just as the impact of federal spending cuts reverberates through the economy.
In fact, some economists believe they could balance each other out nearly dollar for dollar.
Two things underpinned the upswing in industrial commodities. First, low prices discouraged investment in new oil fields and mines through most of the 1980s and 1990s. Second, demand in emerging markets, especially China, jumped.
Neither factor will hold this decade in the way they did during the last.
High prices have encouraged investment in new supply. The most obvious example is the rebound in U.S. oil-and-gas production. Globally, spending on oil-and-gas resources is forecast by consultancy IHS Herold at almost $700 billion this year, more than four times the level of 10 years ago.
Something similar is happening with industrial metals and minerals. Caterpillar just cut its guidance, citing weak demand for mining equipment. Excess capacity has weighed on aluminum for years and has started hitting iron ore.
She was almost like a member of the family. An employee, but almost one of them.
For three years, Maria Magdalena Romero had tended to the suburban Miami home of Jeb and Columba Bush, had helped to raise their three children, had twined into the fabric of their lives.
Then, with lurching swiftness, she was yanked away. On a mild winter morning in 1991, two immigration agents appeared at the door of the family home looking for the woman Bush's younger son and namesake, then just 10 years old, remembers as "a super nice lady." They carried deportation orders.
It didn't matter that Bush's father was president of the United States at the time or that a Secret Service agent had answered the door. Romero, who was in the country illegally but had a work permit, wasn't getting a reprieve.
"It was a difficult time for all of us, but most of all for Maria," Jeb Bush said in an e-mail about that day. His son, Jeb Jr., hadn't even realized she'd been deported. "I thought she just left," Jeb Jr. said in a recent interview.
That long-ago deportation is one among many inflection points for the elder Bush in what has been a lifetime of intimate proximity to America's Hispanic community, to its searing pain and its buoyant joy, to its mores and its politics. While Republicans cast about for leaders who can connect with Spanish-speaking voters, this tall Texas native with the Mexican American wife has remarkably come to represent a kind of Hispanic consciousness for the party.
Of course, GHWB should have just issued a blanket pardon.
Aaron Sorkin's motor-mouthed idealists, with their adorable enthusiasm for progressive dogma and their courageous battles against anyone so evil as to disagree with them, have been rendered ridiculous by the much funnier and yet far more credible "Veep," which just launched its second season on HBO.
And here's the funniest part: The show that is mercilessly exposing the fatuousness of our government every Sunday night has impeccable left-wing credentials.
...as Friend Driscoll points out, for evidence that all comedy is conservative.
In 1816, the net public debt of the UK reached 240 per cent of gross domestic product. This was the fiscal legacy of 125 years of war against France. What economic disaster followed this crushing burden of debt? The industrial revolution. [...]
As Mark Blyth of Brown University notes in a splendid new book, great economists of the 18th century, such as David Hume and Adam Smith, warned against excessive public debt. Embroiled in frequent wars, the British state ignored them. Yet the warnings must have appeared all too credible. Between 1815 and 1855, for example, debt interest accounted for close to half of all UK public spending.
Nevertheless, the UK grew out of its debt. By the early 1860s, debt had already fallen below 90 per cent of GDP. According to the late Angus Maddison, the economic historian, the compound growth rate of the economy from 1820 to the early 1860s was 2 per cent a year. The rise in GDP per head was 1.2 per cent. By subsequent standards, this may not sound very much. Yet this occurred despite the colossal debt burden in a country with a very limited tax-raising capacity. Moreover, that debt was not accumulated for productive purposes. It was used to fund the most destructive of activities: war. Quite simply, there is no iron law that growth must collapse after debt exceeds 90 per cent of GDP.
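The arithmetic of "growing out of debt" is worth spelling out. Here is a rough sketch under a deliberately simplified assumption -- nominal debt held flat while the economy compounds at the 2 per cent rate Maddison reports -- rather than a reconstruction of the actual fiscal record, which also involved surpluses and some repayment:

```python
# Stylized illustration: a flat stock of debt shrinks relative to an
# economy compounding at 2% a year over roughly 45 years (1816 -> early 1860s).
debt_to_gdp_1816 = 2.40   # 240% of GDP
growth_rate = 0.02
years = 45

ratio_1860s = debt_to_gdp_1816 / (1 + growth_rate) ** years
print(f"implied ratio by the early 1860s: {ratio_1860s:.0%}")
# -> roughly 98%; actual repayment and surpluses account for the rest
# of the fall to below 90% of GDP.
```

Growth alone, in other words, does most of the work in the ratio's decline, which is the point of the passage.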
The guest-worker program is where they go wrong. For the Republican politicians who have in the past been its main supporters, this provision is like a dessert with no calories: Businesses get the benefit of the temporary workers' labor and they get to make some money, but the rest of us don't have to make room for immigrants in our society, and Republicans don't have to worry how they will vote.
That's exactly what's wrong with the idea. One of the worst things about illegal immigration is that it creates a class of people who contribute their labor to this country but aren't full participants in it and lack the rights and responsibilities of everyone else. A guest-worker program doesn't solve this problem. It formalizes it.
Two Tiers
So we would have a two-tier labor market. Most people who work in the U.S. can quit their jobs without worrying that they'll be ejected from the country after 60 days of unemployment. Temporary workers would have no such security. Most people can leave one industry for another. The temporary agricultural workers in the bill would have no such freedom. Some foreigners may choose this fate as better than their alternatives. It seems unfair, though, to ask Americans to compete with workers who will be more willing to put up with bad working conditions because of this artificially precarious situation. [...]
Enforcing the program's limits would involve similarly bad choices. One of the chief arguments for this bill is to stop enforcing immigration laws in ways that break up families. What happens when a guest worker has finished his three-year term and has no job -- but has brought his family here? (Or had a child, who would be a U.S. citizen?) Will we then deport him? Or will we just let him overstay his visa and go into the shadows as an illegal immigrant?
Supporters of the bill should rethink these provisions. Opponents should train their fire on them. Many Americans support legalizing illegal immigrants because it seems more humane and practical than mass deportations. Guest-worker programs seem at odds with those impulses, because they're neither humane nor practical.
Federal prosecutors are trying to piece together the complex web of influences that transformed a young man with no confirmed militant training or links, apparently acting with only the assistance of his younger brother, into a brutal bomber prepared to kill and maim in pursuit of a cause that remained largely unarticulated.
Tamerlan Tsarnaev has become the focal point of a global FBI investigation into whether any organised group or wider conspiracy lay behind last week's Boston Marathon bombings. The 26-year-old, who has been identified through fingerprinting as the man killed in the shootout with police in the Watertown suburb of Boston, is widely assumed to have been the mastermind of the marathon outrage, with his younger brother Dzhokhar Tsarnaev allegedly playing the role of junior partner.
[F]ormer Bush Chief of Staff Joshua Bolten noted that spending during most years of Bush's presidency was below 20 percent of gross domestic product, the target now established by House Republicans in their budget blueprint. No president since Richard M. Nixon, other than Bill Clinton, can make such a claim, he said.
In fact, over the last 40 years and eight presidencies, only two presidents have kept spending below 20 percent of GDP in even a single year: George W. Bush did it in six of his eight fiscal years; Bill Clinton in four. Barack Obama has averaged 24 percent of GDP spending so far; and even his optimistic budget projections don't have the U.S. getting close to 20 percent again. Ever. As another reference point: during fiscal years 1981-88, the Reagan years, federal spending averaged over 22 percent of GDP. Just in case anyone is interested in it.
I don't see much future for American conservatism until we wake up from our dreams of oppression. Conservative worries about "socialism" are patently absurd. At their worst, these worries reflect a strange mental derangement that confuses ideals (the complete absence of barriers and regulatory friction) with realities (the dramatic growth of economic and political freedom over the last thirty years). The milder instances of delusion stem from an inability to see the forest for the trees. Yes, health care in America is being rolled into an almost comprehensive national regulatory scheme, but that's only a small part of the picture. (And a complicated part: American health care in the past has never been thoroughly organized by free-market forces.)
If we wake up from our dreams we'll see what the postmodern left already sees, which is that the explosive growth of economic freedom in recent decades has created two very large social problems.
• Greater economic freedom has brought greater economic inequality, as it always has in the modern era. Those with sufficient capital, talent, and ambition are able to use their greater freedom to exploit opportunities and capture efficiencies. Others fall behind. This is a threat to social solidarity.
• Greater economic freedom accelerates the rate of creative destruction, again, as it always has in the modern era. This means that communities, and even nations, organized around existing industries and types of employment are increasingly vulnerable to dislocation and disintegration. This too is a threat to social stability.
These are not economic problems, and therefore cannot be addressed by increasing economic freedom, as the Romney campaign imagined. In the broadest sense of the term, they are political problems, as the social problems associated with the explosive successes of capitalism have always been in the modern era. Dealing with them will require political and not economic approaches.
...is the one where wealth gets redistributed via employment.
Senate Democrats hope to advance legislation this week that would replace the across-the-board budget cuts known as the sequester for the rest of the current fiscal year by redirecting savings from winding down the wars in Iraq and Afghanistan.
Created by Chris Chibnall, Broadchurch is, in many ways, a homegrown response to the riveting Nordic Noir television trend, which has captured the imagination of U.K. viewers in a very unexpected and palpable way. Like Forbrydelsen before it, Broadchurch focuses on both the police investigation--embodied here by churlish Detective Inspector Alec Hardy (David Tennant) and eager-to-be-liked Detective Sergeant Ellie Miller (Olivia Colman)--and how Danny's family copes in the wake of such monumental grief. ITV's Broadchurch--which was deemed "another jewel in the channel's drama crown" by The Independent--has proven to be a huge success in its native Britain, luring in roughly 9 million consolidated viewers, putting it on par with the massively successful Downton Abbey.
Everyone is a suspect in Danny's death, from the cheerful local vicar (Doctor Who's Arthur Darvill) and the grizzled newsagent (David Bradley) to Danny's own father, Mark (Andrew Buchan). Secrets have a way of spilling out in a murder investigation, and Broadchurch does a fantastic job of charting the numerous atomic explosions that follow in its wake. Everyone in the idyllic seaside town has something to conceal, something they're running from, a terrible past that they're looking to forget. Even Danny, the poor dead boy at the center of the story, seems to have harbored some terrible secret, one worth killing him over. Just what that is--and whodunit--remains the overarching plot that carries an electric current throughout the action.
White House press secretary Ari Fleischer walked into the media cabin of Air Force One on May 24, 2002, and dropped identical envelopes in the laps of two reporters, myself and Steve Holland of Reuters. Inside each was a manila card - marked by a small presidential seal and, in a simple font, "THE PRESIDENT."
Handwritten in the tight script of President George W. Bush, both notes said essentially the same thing: "Thank you for the respect you showed for the office of the President, and, therefore, the respect you showed for our country."
What had we done? Not much, really. An hour earlier, at a rare outdoor news conference in Germany, Steve and I decided to abide by the U.S. media tradition of rising from our seats when the president entered our presence. The snickering German press corps remained seated. "What a contrast!" Bush wrote. "What class." [...]
Bush's note, a simple gesture, spoke volumes about his respect for the office of the presidency. He did not thank us for respecting him. He knew it wasn't about George W. Bush. He was touched instead by the small measure of respect we showed "for our country."
The same sense of dignity compelled Bush to forbid his staff to wear blue jeans in the White House. Male aides were required to wear jackets and ties in the Oval Office.
He was a stickler for punctuality. Long-time adviser Karen Hughes asked him years ago why he was always early for appointments. "Late is rude," Bush replied. He thought that if people were going to take the time to see him, he shouldn't keep them waiting.
He remembered names of the spouses and children of his staff, and insisted that hard work at the White House not be an excuse to let family life suffer. One steamy summer day in 1999, then-Gov. George W. Bush called me with an exclusive interview and interrupted my first question. "What's all that noise in the background, Fournier?" he asked.
"I'm at the pool with my kids, governor."
Bush replied, "Then what the hell are you doing answering your phone?"
Damn good question, sir. We quickly ended the interview.
I envy few people -- maybe Nelson Mandela for his indomitable courage, maybe Philip Roth for his abundant talent, maybe even George Clooney for how much he seems to enjoy being George Clooney. I add, tentatively and for different reasons, George W. Bush. The man has the serene self-confidence of a divine-right monarch. Day or night, he seems to sleep well.
Days before his second term ended in 2009, Bush's approval rating among all adults was 33 percent positive and 66 percent negative. The new poll found 47 percent saying they approve and 50 percent saying they disapprove. Among registered voters, his approval rating today is equal to President Obama's, at 47 percent, according to the latest Post-ABC surveys.
It's not unusual for a former president to advance in public esteem after he's left the fray of partisan politics, but neither is it guaranteed. In polls four to five years after the end of their presidencies, Bush's father gained 18 points in approval, but Bill Clinton slipped by 4 and Ronald Reagan lost 12. (Reagan later improved in retrospect; it just took more time.)
his numbers will be driven up by the Peace Dividend--first, because he'll have been shown to have won the War; second, because the budget will swing back into balance so quickly that winning will be seen to have been worth it.
In Praise of Surveillance Cameras : Boston police have relatively few security cameras. Luckily, Lord & Taylor's were on. (L. Gordon Crovitz, 4/22/13, WSJ)
Boston is one of the less-wired large cities when it comes to surveillance cameras, so authorities relied largely on footage from private parties, such as the Lord & Taylor department store near the scene. The most recent estimate, from 2010, is that Boston and surrounding towns have some 150 police surveillance cameras, plus 400 in the subway. This compares with more than 3,000 government and networked private cameras in New York City's financial district alone, and some 400,000 cameras in London.
The cameras are getting smarter. New software goes beyond passive recording to alerting law enforcement about suspicious activity in real time. Video analytics enable what is called "activity forecasting." By applying artificial intelligence to video, these services issue alerts of what researchers call "anomalous" behavior--such as when cameras detect people leaving bags behind in public places.
The technology, from companies with names like IPVideo Corp. and ObjectVideo, is still new. It might not have been good enough to have identified the bags left behind by the terrorists in time to disarm the Boston bombs. As these systems improve, however, there will be a growing gap between cities that make full use of surveillance technologies and those that don't.
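For a sense of how the "anomalous behavior" alerts work at the simplest level, here is a minimal sketch of static-object detection using background subtraction in OpenCV. It is an illustration of the general technique only; the thresholds, file name, and dual-model approach are assumptions, not how IPVideo Corp. or ObjectVideo actually build their products:

```python
# Minimal sketch of "left-behind bag" detection via dual background models.
# This illustrates the general static-foreground technique, not any vendor's
# proprietary analytics.
import cv2

# The slow model remembers the scene as it was; the fast model adapts quickly,
# so a bag that stops moving soon becomes "background" to the fast model
# while remaining "foreground" to the slow one.
slow_bg = cv2.createBackgroundSubtractorMOG2(history=3000, detectShadows=False)
fast_bg = cv2.createBackgroundSubtractorMOG2(history=100, detectShadows=False)

cap = cv2.VideoCapture("platform_camera.mp4")  # hypothetical input file
MIN_AREA = 800  # pixels; tuning value chosen purely for illustration

while True:
    ok, frame = cap.read()
    if not ok:
        break
    slow_mask = slow_bg.apply(frame, learningRate=0.0005)
    fast_mask = fast_bg.apply(frame, learningRate=0.02)
    # Static foreground: new to the long-term model, already absorbed by the fast one.
    static = cv2.bitwise_and(slow_mask, cv2.bitwise_not(fast_mask))
    static = cv2.medianBlur(static, 5)
    contours, _ = cv2.findContours(static, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) > MIN_AREA:
            x, y, w, h = cv2.boundingRect(c)
            print(f"possible unattended object at ({x},{y}) size {w}x{h}")

cap.release()
```

Real systems layer tracking, person re-identification, and time thresholds on top of something like this, but the core idea -- flagging foreground that stops moving -- is the same.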
Measuring an unreported economy is obviously tricky. But look closely and you can see the traces of a booming informal economy everywhere. As Feige said to me, "The best footprint left in the sand by this economy that doesn't want to be observed is the use of cash." His studies show that, while economists talk about the advent of a cashless society, Americans still hold an enormous amount of cold, hard cash--as much as seven hundred and fifty billion dollars. The percentage of Americans who don't use banks is surprisingly high, and on the rise. Off-the-books activity also helps explain a mystery about the current economy: even though the percentage of Americans officially working has dropped dramatically, and even though household income is still well below what it was in 2007, personal consumption is higher than it was before the recession, and retail sales have been growing briskly (despite a dip in March). Bernard Baumohl, an economist at the Economic Outlook Group, estimates that, based on historical patterns, current retail sales are actually what you'd expect if the unemployment rate were around five or six per cent, rather than the 7.6 per cent we're stuck with.
Is Rise of Jewish Fundamentalism Endangering Israeli Democracy? : Authors See Conflict Between Haredim and Secular World : a review of The War Within: Israel's Ultra-Orthodox Threat to Democracy and the Nation by Yuval Elizur and Lawrence Malkin (Jerome Chanes, April 22, 2013, issue of April 26, 2013, Tablet)
Veteran journalists Elizur and Malkin offer a basic, and bleak, thesis: Absent a separation of religion and state in Israel, the very future of Israel itself, to say nothing of America-Israel relations, will be jeopardized by sectarian/Haredi excesses in crucial areas of domestic public policy in Israel and, by extension, hegemony over the secular state. [...]
Elizur and Malkin ask a basic question: How did it happen that the sectarians in Israel were able to hijack central religious and political processes and thereby secure the ability to sway the destiny of much of the Israeli body politic? Who, in fact, are the "Ultras," who absolutely possess the authors and whose outsized presence in Israeli society is, as the authors assert, morbific, even toxic? [...]
Especially compelling are the two chapters on the question of exemption for yeshiva students from requisite military service in the Israel Defense Forces. What was originally a temporary exemption granted in 1948 to 400 yeshiva students has become a national problem as some half a million sectarian youth do not serve in the army. Exacerbating the problem is the fact that the students do not work, because taking a job would mean forfeiture of their exemptions.
Elizur and Malkin trace the story from 1948 and earlier, through the political deals made over the decades with Orthodox parties and power centers, as the sectarian Orthodox community exploded demographically, to the expiration in 2012 of the Tal Law, which extended on a temporary basis the exemptions.
The obverse side of the sectarian/IDF story is that of those Orthodox who do serve in the army. "The War Within" rehearses this narrative in the context of how the Religious Zionism of an erstwhile centrist and responsible Mizrachi/National Religious Party was hijacked by Religious Zionist extremists who make common cause in some areas with Haredi sectarians.
A substantial chunk of the sectarian/Haredi problem is the fact that the rabbanut -- the Chief Rabbinate of the State of Israel -- has in recent years been under the thumb of a few sectarian rabbis, and has become morally, politically and religiously bankrupt. This issue, not addressed by Elizur and Malkin, is especially rueful for those who recall the Religious Zionism, represented by the centrist, responsible, Mizrachi/National Religious Party of decades past, before the "Religious" got pinched by the sectarians and the "Zionism" got pinched by the settler movement.
At bottom, Elizur and Malkin assert, it's all about politics and purse: Israel's parliamentary system has often insisted that Haredi support is key when building a governing coalition; the sectarian influence has all too often forced the majority to kowtow to the Haredim.
This year the deficit is expected to be half that -- around 5.3% of GDP, the Congressional Budget Office estimates.
And by 2015, it's projected to drop to 2.4%.
What's more, the national debt that has accumulated from annual deficits is also projected to fall to an estimated 73.1% of GDP in 2018 from an estimated 76.3% today.
The International Monetary Fund recently held a conference that should concern most people despite its arcane subject -- "Rethinking Macro Policy II." Macroeconomics is the study of the entire economy, as opposed to the examination of individual markets ("microeconomics"). The question is how much "macro" policies can produce and protect prosperity. Before the 2008-09 financial crisis, there was great confidence that they could. Now, with 38 million unemployed in Europe and the United States -- and recoveries that are feeble or nonexistent -- macroeconomics is in disarray and disrepute. [...]
The economic models that didn't predict the crisis have also repeatedly overstated the recovery. The tendency is to blame errors on one-time events -- say, in 2011, the Japanese tsunami, the Greek bailout and the divisive congressional debate over the debt ceiling. But the larger cause seems to be the models themselves, which reflect spending patterns and behavior by households and businesses since World War II.
"The events [stemming from] the financial crisis were outside the experience of the models and the people running the models," Nigel Gault said in an interview. (Gault, the former chief U.S. economist for the consulting firm IHS, was not at the conference.) The severity of the financial crisis and Great Recession changed behavior. Models based on the past don't do well in the present. Many models assumed that lower interest rates would spur more borrowing. But this wouldn't happen if lenders -- reacting to steep losses -- tightened credit standards and potential borrowers -- already with large loans -- were leery of assuming more debt. Which is what occurred.
...that doesn't say that high inflation favors borrowers and deflation favors lenders?
On Sunday, the Dagestan affiliate of the Caucasus Emirate, a separatist group in Russia that has been tied to al Qaeda by the United Nations, issued a statement denying responsibility for the attacks in Boston. Here's a translation by the jihadist media clearinghouse blog Jihadology:
[T]here are speculative assumptions that [Tamerlan Tsarnaev] may have been associated with the Mujahideen of the Caucasus Emirate, in particular with the Mujahideen of Dagestan.
The Command of the Province of Dagestan indicates in this regard that the Caucasian Mujahideen are not fighting against the United States of America. We are at war with Russia, which is not only responsible for the occupation of the Caucasus, but also for heinous crimes against Muslims.
Start a Family... : And before you know it, you'll be voting for the GOP. (Jonathan V. Last, April 22, 2013, Weekly Standard)
In 2005, Steve Sailer wrote a cover story for the American Conservative theorizing that the divide between red and blue states was driven in large part by the cost of family formation. Sailer dubbed this the "Dirt Gap" (referring to the price of homes with yards), and his general thesis was that affordable family formation--and the attendant bourgeois life which it enabled--was the source of our political divisions.
In February, George Hawley, a political science professor at the University of Houston, decided to test Sailer's theory. In a paper published in the peer-reviewed journal Party Politics, Hawley built a model which ought to be studied by every Republican political operative in the country. Because it shows not only that Sailer was correct--lower median home values are closely linked to Republican voting--but that one of the key factors linking home values and Republican voting is marriage. [...]
In Hawley's final pass through the model, he looked at how median home price and the marriage rate interact with one another. And here he found another powerful relationship: Every $10,000 increase in median home value causes a 0.3 percent decrease in the marriage rate of the 25- to 30-year-old cohort. Which suggests that, all else being equal, increasing home prices delays marriage.
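To make the size of that effect concrete, here is a quick sketch that applies the reported coefficient as a straight linear relationship, treating the "0.3 percent" as 0.3 percentage points. Both of those are assumptions for illustration; Hawley's model controls for other variables and the relationship need not be linear:

```python
# Back-of-the-envelope application of the reported coefficient:
# a 0.3 percentage-point drop in the 25-30 marriage rate per $10,000
# of median home value, treated here as linear purely for illustration.
COEFF_PER_10K = 0.3  # percentage points, as reported

def marriage_rate_gap(home_value_a, home_value_b):
    """Predicted marriage-rate difference (pct points) between two areas."""
    return (home_value_b - home_value_a) / 10_000 * COEFF_PER_10K

# e.g. a metro with a $450k median home versus one with a $150k median
print(marriage_rate_gap(150_000, 450_000))  # -> 9.0 points lower marriage rate
```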
Hawley's research might seem esoteric, but it carries with it an extraordinary amount of practical political guidance for Republicans.
For instance, the GOP is rightly committed to increasing economic prosperity. But Hawley notes that rising incomes don't actually produce any political benefit for Republicans if they require increasing educational attainment and are accompanied by rising land costs. So Republican economic policy should probably be somewhat more populist-minded.
And about those land costs. Whether or not Democrats have intuited that higher housing prices help them, liberal urban planning shibboleths--fealty to mass transit combined with a dogmatic commitment to increasing population density--have the effect of making homes more expensive. Republicans ought to be just as interested in measures which contain housing costs, such as building highways and removing land-use restrictions. In other words, Republicans ought to be every bit as committed to the suburban project as Democrats are to urbanization.
Geography has long proved resistant to policy initiatives, and land costs are malleable only to a point. Sociology, however, is more promising. The Republican party can't lower the cost of real estate in Manhattan but it could plausibly encourage more Americans to get married. In the same way no politician ever misses an opportunity to extol the virtues of college, Republicans should insistently be making the case for marriage.
This isn't a heavy lift. There's an enormous amount of research demonstrating that marriage makes people happier, healthier, and wealthier.
Among the most obvious policy conclusions the study forces are that the GOP should support programs that help the lower classes buy homes and that it should support importing family men who work in the construction trades. Thanks, W.
The number of workers per establishment then plunged during the 2001 recession, and continued slipping through the mid-2000s expansion and Great Recession. As of 2011, there were 15.7 employees per physical office location.
So what explains this slide? This shrinking has occurred in almost all industries, so it can only partly be explained by a change in the types of businesses that the economy comprises, the report says.
The report's researchers also investigated whether technological advances might be encouraging parent companies to open up more offices in more places, splitting the existing work force up among more locations but still retaining the same number of employees over all. That explains part of the phenomenon, but not all of it.
The biggest factor, it seems, is the age of establishments.
In the last decade, the report finds, new establishments have been starting smaller and then staying smaller than their predecessors. In the 1990s, the typical start-up opened shop with 7.6 workers; in 2001, 6.8 workers; and in 2011, just 4.7 workers. As older firms die out, these newer businesses represent a larger and larger fraction of the job market.
It's not entirely clear why new companies are starting out so much leaner than they used to. Presumably they are largely helped by new telecommunications technologies, which allow employees to manage more of their administrative and back-office work on their own (like shared calendars) or to outsource it.
The two young brothers from Cambridge seemed to be on promising paths, one a scholarship student at college, the other fighting for a national title in amateur boxing.
And then, apparently with little warning, they veered violently off track, deep into the darkness, setting off deadly bombs, authorities are convinced, at one of Boston's most iconic and joyful events.
To those who knew them, the apparent transformation of Tamerlan Tsarnaev, 26, and Dzhokhar Tsarnaev, 19 -- ethnic Chechens, born in the former Soviet territory now known as Kyrgyzstan and transplanted to a working-class Inman Square neighborhood -- seemed almost inconceivable.
But as friends and neighbors pieced together recollections of the terrorism suspects and their family, a picture emerged of an older brother who seemed to grow increasingly religious and radical -- and who may have drawn his more easygoing younger brother into a secret plot of violence and hatred.
"I used to warn Dzhokhar that Tamerlan was up to no good," Zaur Tsarnaev, who identified himself as a 26-year-old cousin, said in a phone interview from Makhachkala, Russia, where the brothers briefly lived. "[Tamerlan] was always getting in trouble. He was never happy, never cheering, never smiling. He used to strike his girlfriend. . . . He was not a nice man."
Don't read the piece until you've seen the movie, which we did on Friday. But after you've seen it you will want some clarification, particularly with regard to the ingenious framing device they use: a little black kid from Florida who is inspired by seeing Jackie.
For my money, the best scene in the movie is the one where Kentuckian Pee Wee Reese goes to see Branch Rickey about hate mail he receives prior to a trip to Cincinnati.
Narrowing the focus to international relations and U.S. foreign policy, I started to think about whether one could point to essays that really did affect the contours of world politics. The effect couldn't just be because of who the author was -- say, for example, Hillary Clinton describing the rebalancing strategy, which mattered because she was the U.S. secretary of state -- but rather the content of the ideas. Here's my somewhat obvious short list:
In short, both Somoza and the Shah were, in central ways, traditional rulers of semi-traditional societies. Although the Shah very badly wanted to create a technologically modern and powerful nation and Somoza tried hard to introduce modern agricultural methods, neither sought to reform his society in the light of any abstract idea of social justice or political virtue. Neither attempted to alter significantly the distribution of goods, status, or power (though the democratization of education and skills that accompanied modernization in Iran did result in some redistribution of money and power there).
Both Somoza and the Shah enjoyed long tenure, large personal fortunes (much of which were no doubt appropriated from general revenues), and good relations with the United States. The Shah and Somoza were not only anti-Communist, they were positively friendly to the U.S., sending their sons and others to be educated in our universities, voting with us in the United Nations, and regularly supporting American interests and positions even when these entailed personal and political cost. The embassies of both governments were active in Washington social life, and were frequented by powerful Americans who occupied major roles in this nation's diplomatic, military, and political life. And the Shah and Somoza themselves were both welcome in Washington, and had many American friends.
Though each of the rulers was from time to time criticized by American officials for violating civil and human rights, the fact that the people of Iran and Nicaragua only intermittently enjoyed the rights accorded to citizens in the Western democracies did not prevent successive administrations from granting -- with the necessary approval of successive Congresses -- both military and economic aid. In the case of both Iran and Nicaragua, tangible and intangible tokens of U.S. support continued until the regime became the object of a major attack by forces explicitly hostile to the United States.
But once an attack was launched by opponents bent on destruction, everything changed. The rise of serious, violent opposition in Iran and Nicaragua set in motion a succession of events which bore a suggestive resemblance to one another and a suggestive similarity to our behavior in China before the fall of Chiang Kai-shek, in Cuba before the triumph of Castro, in certain crucial periods of the Vietnamese war, and, more recently, in Angola. In each of these countries, the American effort to impose liberalization and democratization on a government confronted with violent internal opposition not only failed, but actually assisted the coming to power of new regimes in which ordinary people enjoy fewer freedoms and less personal security than under the previous autocracy -- regimes, moreover, hostile to American interests and policies.
The pattern is familiar enough: an established autocracy with a record of friendship with the U.S. is attacked by insurgents, some of whose leaders have long ties to the Communist movement, and most of whose arms are of Soviet, Chinese, or Czechoslovak origin. The "Marxist" presence is ignored and/or minimized by American officials and by the elite media on the ground that U.S. support for the dictator gives the rebels little choice but to seek aid "elsewhere." Violence spreads and American officials wonder aloud about the viability of a regime that "lacks the support of its own people." The absence of an opposition party is deplored and civil-rights violations are reviewed. Liberal columnists question the morality of continuing aid to a "rightist dictatorship" and provide assurances concerning the essential moderation of some insurgent leaders who "hope" for some sign that the U.S. will remember its own revolutionary origins. Requests for help from the beleaguered autocrat go unheeded, and the argument is increasingly voiced that ties should be established with rebel leaders "before it is too late." The President, delaying U.S. aid, appoints a special emissary who confirms the deterioration of the government position and its diminished capacity to control the situation and recommends various measures for "strengthening" and "liberalizing" the regime, all of which involve diluting its power.
The emissary's recommendations are presented in the context of a growing clamor for American disengagement on grounds that continued involvement confirms our status as an agent of imperialism, racism, and reaction; is inconsistent with support for human rights; alienates us from the "forces of democracy"; and threatens to put the U.S. once more on the side of history's "losers." This chorus is supplemented daily by interviews with returning missionaries and "reasonable" rebels.
As the situation worsens, the President assures the world that the U.S. desires only that the "people choose their own form of government"; he blocks delivery of all arms to the government and undertakes negotiations to establish a "broadly based" coalition headed by a "moderate" critic of the regime who, once elevated, will move quickly to seek a "political" settlement to the conflict. Should the incumbent autocrat prove resistant to American demands that he step aside, he will be readily overwhelmed by the military strength of his opponents, whose patrons will have continued to provide sophisticated arms and advisers at the same time the U.S. cuts off military sales. Should the incumbent be so demoralized as to agree to yield power, he will be replaced by a "moderate" of American selection. Only after the insurgents have refused the proffered political solution and anarchy has spread throughout the nation will it be noticed that the new head of government has no significant following, no experience at governing, and no talent for leadership. By then, military commanders, no longer bound by loyalty to the chief of state, will depose the faltering "moderate" in favor of a fanatic of their own choosing.
In either case, the U.S. will have been led by its own misunderstanding of the situation to assist actively in deposing an erstwhile friend and ally and installing a government hostile to American interests and policies in the world. At best we will have lost access to friendly territory. At worst the Soviets will have gained a new base. And everywhere our friends will have noted that the U.S. cannot be counted on in times of difficulty and our enemies will have observed that American support provides no security against the forward march of history.
No particular crisis conforms exactly with the sequence of events described above; there are always variations on the theme.
[A]ll of these people sense dimly that there is some larger process at work, a process that gives coherence and order to the daily headlines. The twentieth century saw the developed world descend into a paroxysm of ideological violence, as liberalism contended first with the remnants of absolutism, then bolshevism and fascism, and finally an updated Marxism that threatened to lead to the ultimate apocalypse of nuclear war. But the century that began full of self-confidence in the ultimate triumph of Western liberal democracy seems at its close to be returning full circle to where it started: not to an "end of ideology" or a convergence between capitalism and socialism, as earlier predicted, but to an unabashed victory of economic and political liberalism.
The triumph of the West, of the Western idea, is evident first of all in the total exhaustion of viable systematic alternatives to Western liberalism. In the past decade, there have been unmistakable changes in the intellectual climate of the world's two largest communist countries, and the beginnings of significant reform movements in both. But this phenomenon extends beyond high politics and it can be seen also in the ineluctable spread of consumerist Western culture in such diverse contexts as the peasants' markets and color television sets now omnipresent throughout China, the cooperative restaurants and clothing stores opened in the past year in Moscow, the Beethoven piped into Japanese department stores, and the rock music enjoyed alike in Prague, Rangoon, and Tehran.
What we may be witnessing is not just the end of the Cold War, or the passing of a particular period of post-war history, but the end of history as such: that is, the end point of mankind's ideological evolution and the universalization of Western liberal democracy as the final form of human government. This is not to say that there will no longer be events to fill the pages of Foreign Affairs's yearly summaries of international relations, for the victory of liberalism has occurred primarily in the realm of ideas or consciousness and is as yet incomplete in the real or material world. But there are powerful reasons for believing that it is the ideal that will govern the material world in the long run. [...]
While it is impossible to rule out the sudden appearance of new ideologies or previously unrecognized contradictions in liberal societies, then, the present world seems to confirm that the fundamental principles of sociopolitical organization have not advanced terribly far since 1806. Many of the wars and revolutions fought since that time have been undertaken in the name of ideologies which claimed to be more advanced than liberalism, but whose pretensions were ultimately unmasked by history. In the meantime, they have helped to spread the universal homogenous state to the point where it could have a significant effect on the overall character of international relations. [...]
The end of history will be a very sad time. The struggle for recognition, the willingness to risk one's life for a purely abstract goal, the worldwide ideological struggle that called forth daring, courage, imagination, and idealism, will be replaced by economic calculation, the endless solving of technical problems, environmental concerns, and the satisfaction of sophisticated consumer demands. In the post-historical period there will be neither art nor philosophy, just the perpetual caretaking of the museum of human history. I can feel in myself, and see in others around me, a powerful nostalgia for the time when history existed. Such nostalgia, in fact, will continue to fuel competition and conflict even in the post-historical world for some time to come. Even though I recognize its inevitability, I have the most ambivalent feelings for the civilization that has been created in Europe since 1945, with its north Atlantic and Asian offshoots. Perhaps this very prospect of centuries of boredom at the end of history will serve to get history started once again.
Not that anyone has ever read any of the five, a truth most evident when you read Mr. Fukuyama's last paragraph and then listen to folks criticize the End of History as triumphalist and optimistic.
I've been writing about the electronic design industry for over 15 years and I've never seen an LED light bulb with a better combination of features than the Cree. At $13, it's dimmable, has high quality color, is long lasting, has a 10 year warranty and makes as much light as a 60-watt incandescent bulb. It's my new favorite LED light bulb.
For years we lived in Lexington, and my wife and I would take the children to see the re-enactment of the battle before dawn, brimming with mixed loyalties but exhilarated by the drama that made history echo with the crack of muskets. It was high-class reconstruction, impeccable to the last detail of uniform, with British regulars looming out of the darkness to a steady drum beat, their major barking out the command to the "damned rebels" to lay down their arms; the volley and the return snap of musket fire; a general flight of the surviving Minutemen. Then, as light came up, we all trooped off for the fried dough that owed more to Boston's Italian history than to 1775.
People thronged to Lexington from professorial Cambridge; from patrician Back Bay; from sylvan Lincoln and Wellesley, and from the gritty, "triple-decker" apartment blocks of Everett and Medford. Brit or Patriot, ferociously Irish (Noraid collection boxes could still be seen in the early 1980s in newsagents in Charlestown) or gently Armenian, we were all there, linked in a communion of memory; explaining to our kids what had been at stake all those centuries before; why it still mattered, why it was about something that changed the world.
The mutilations of Patriots' Day 2013 were enacted in mindless cruelty, not mindful remembrance, and will change nothing; not even, one hopes, the public's state of mind at all the sporting events that constitute the city's true calendar of devotion, with Fenway Park, where the Red Sox baseball team play, its cathedral of light.
Above all, the futile atrocity will not damage Boston's sense of itself as a city that refuses to be flattened into a generic version of urbanism, with the same malls, multiplexes and multistorey car parks controlling the rhythms of city life.
Looking out at the faces in the crowd, people who had come to Fenway Park to escape, to celebrate, and recapture some of the normalcy they had lost, David Ortiz felt what they were feeling.
"This past week, I don't think there was one human being who wasn't affected by what we got going on down here," Ortiz said. "This past week for me, myself, I was very emotional and angry about whole situation and got to get that out of my chest and make sure our fans and everyone in the nation knows that this is a great nation and part of it was supporting each other when everything went down."
At the end of the ceremony that preceded the Red Sox' 4-3 win over Kansas City, Ortiz's microphone was hot and his words were clear:
"This is our [expletive] city, and nobody is going to dictate our freedom. Stay strong."
He later apologized for the swear, but not the sentiment. But in the wake of incomprehensible terror, the words were forceful, defiant, and proud.
"I'm from the Dominican Republic and the one thing that I always say is me and my family are blessed by being in this country,'' Ortiz said. "And I love this country and I would do anything for this country. Everybody was one unit and that's what matters."
Germany channels roughly half of all high-school students into the vocational education stream from the age of 16. In the US that would be seen as too divisive, even un-American. More than 40 per cent of Germans become apprentices. Only 0.3 per cent of the US labour force does so. But with the US participation rate continuing to plummet - last month another 496,000 Americans gave up looking for work - many US politicians are scouring Germany for answers.
...many shouldn't be finishing high school. Skills will serve them better.
As Jack Speer reports for Morning Edition, the opening of the Watch Technicum in Lititz is just the latest signal of a rebirth for the high end of the watch industry. [...]
"The average age of a watchmaker today in the United States is approximately 57, 58 years old," says Bethiaume. "There's plenty of watches out there that will need maintenance, with a shortage of technicians to service these pieces."
Tolstoy's 1904 novel begins with a fifteen-year-old boy staring at the eponymous hero. "Everyone in the mountains knew Hadji Murad, and how he slew the Russian swine." Betrayed by the Chechnyan chieftain, Shamil, Murad is at the novel's beginning a fugitive, wrapped in a burka. The boy can't stop staring at him--indeed, the boy's "sparkling eyes, black as ripe sloes" contain all the sickly-sweet potential of a desperate boy's life. Several chapters later the boy's village, where Murad had taken refuge, will be razed by Russian troops.
The Russians, no less than the Chechnyans, are eager to get a look at Murad. Forced by his feud with Shamil to defect, he arranges to ride over to the Russians: the officer who takes him into custody has no translator, and has to gesture and smile. Murad smiles back, "and that smile struck Poltorátsky by its childlike kindliness. . . . He expected to see a morose, hard-featured man; and here was a vivacious person, whose smile was so kindly that Poltorátsky felt as if he were an old acquaintance. He had but one peculiarity: his eyes, set wide apart, gazed from under their black brows attentively, penetratingly and calmly into the eyes of others." The much-feared Murad charms the Russians. They give him a translator and allow him to pray at the appointed times. "He is delightful, your brigand!" reports an officer's wife. Tolstoy is very sensitive to the way we look at our babyfaced enemies: our outward condescension, our inner relief, our deluded, liberal belief that we already know them.
It is strange that Tolstoy, by this time a guru of peaceful resistance who would inspire Gandhi, wrote his final novel about a hero who kept multiple daggers on his person. To be clear: neither Murad nor the other Chechnyans in Tolstoy's book are terrorists. They are rebel insurgents defending their homeland against Russian invaders, who want to annex the Caucasus in order to connect their empire to Georgia.
The turn in the gold cycle is quickening and investors should sell the metal, Goldman Sachs said in an April 10 recommendation that returned 5.4 percent in three days. Gold retreated as the Standard & Poor's GSCI Index of 24 raw materials fell to a nine-month low, extending a slump that Citigroup Inc. said marks the "death bell" for the supercycle, or longer-than-average period of rising prices. Global equities advanced to the highest since June 2008 as U.S. stocks reached a record.
"Anybody who did some buying before this big drop is probably in some pain," said Donald Selkin, who helps manage about $3 billion of assets as chief market strategist at National Securities Corp. in New York. "The perception is that gold is not really needed as a safe haven. People are looking at the stock market and they're stunned, and there's no inflation. So people are saying 'What do we need gold for?'"
According to a team spokesman, Diamond woke up yesterday morning in Los Angeles and boarded a 4:30 a.m. flight to Boston. The 72-year-old singer landed at Logan International Airport at about 12:30 p.m., at which point he phoned the Red Sox' offices and asked if he could perform Sweet Caroline live on the field during yesterday's game against the Royals.
Unfortunately, he still has not learned how to govern.
How is it that the president won the argument on gun safety with the public and lost the vote in the Senate? It's because he doesn't know how to work the system. And it's clear now that he doesn't want to learn, or to even hire some clever people who can tell him how to do it or do it for him.
It's unbelievable that with 90 percent of Americans on his side, he could get only 54 votes in the Senate. It was a glaring example of his weakness in using leverage to get what he wants. No one on Capitol Hill is scared of him.
Even House Republicans who had no intention of voting for the gun bill marveled privately that the president could not muster 60 votes in a Senate that his party controls.
President Obama thinks he can use emotion to bring pressure on Congress. But that's not how adults with power respond to things.
Perhaps it's not so much that he doesn't know how to govern as that he doesn't know how to be male?
THE German immigrants of the 19th century were so devoted to their native language that Americans wondered if the new arrivals would ever assimilate. The Irish who followed were said to be too devoted to a foreign pope to embrace American democracy.
Many Italians not only were Roman Catholic but also returned home for the winter, when construction work here slowed. The Chinese and Jews, skeptics argued, were of an entirely different race than many successful immigrants who came before them. [...]
They have advantages that previous immigrant waves did not, starting with a national culture less accepting of discrimination than in the past. But they also face new obstacles. Perhaps most important, earlier groups of immigrants were not breaking the law by living in this country.
For the myriad ways that the country accepts illegal immigrants as part of society, their status still brings enormous disadvantages that inhibit climbing the economic ladder. Parents without legal status are less willing to become involved in their children's schools. They are less willing than legal workers to ask for a raise or to leave one job for another that brings more opportunity. They are less easily able to start a business.
Whether you consider those costs too small or too large for people who enter the country without permission, the bipartisan bill introduced in the Senate last week would clearly reduce them. Long before they won citizenship -- which would take years -- many of today's 11 million illegal immigrants would be able to lawfully register as residents.
"The change would be very significant for them, and it would happen immediately after they register," said Doris M. Meissner, a former commissioner of the Immigration and Naturalization Service who is now at the Migration Policy Institute. "They would no longer be clandestine."
New measures to boost share ownership could form a central part of Conservative plans to win back the "aspirational" voters who supported Baroness Thatcher's governments.
The Daily Telegraph understands that advisers to David Cameron are to study ways of making it cheaper and easier for households to invest in British companies, especially newly-founded firms.
Ministers are also considering using the potential privatisations of state-owned firms like RBS and Royal Mail to give voters shares.
Some Conservatives believe that boosting share-ownership should sit alongside helping more people buy their own homes in party promises to create an "aspiration nation".
Increasing the number of households with shares in public companies was a major ambition of the Thatcher government, built around the privatisation of major utilities like British Telecom and British Gas.
The return to Thatcherism/New Labour is a good thing, but not a new thing.
A study from Germany finds that men do a much better job of interpreting one vital set of signals -- the emotions conveyed by the eyes -- when they're communicating with another man, compared to another woman.
"The finding that men are superior in recognizing emotions/mental states of other men, as compared to women, might be surprising," a research team led by psychiatrist Boris Schiffer reports in the journal PLOS ONE.
The United States has given Chinese leaders no incentive to incur the risk of having the North Korean state unravel, which could lead to both a refugee crisis and the prospect of a united Korea allied militarily with the United States.
For Beijing to take such a gamble, either there would have to be a large potential reward for action or an equally large potential downside for inaction. Current U.S. policy includes neither feature, and that has to change. If Washington is not willing to offer Beijing the one "carrot" that might cause Chinese leaders to dump the country's troublesome client--ending the U.S. alliance with Seoul upon Korean reunification--the Obama administration must boost China's anxiety level.
The most effective way to do that is to invoke the specter that South Korea and Japan might decide to build their own nuclear arsenals if North Korea continues its menacing ways, especially its quest for nuclear weapons. Chinese officials would not be happy about a South Korean nuclear arsenal, and the last thing in the world they want to see is a nuclear-armed Japan.
Why not use North Korea as an opportunity for nuclear deterrence instead of proliferation? Just remove the regime and cite its nuclear ambitions as one of the justifications, though, as in Iraq, a secondary one.
When it comes to Chechnya, Russia knows more about the region than anyone else. Has it employed brutal methods to try and subdue it? Absolutely. But it is a hotbed of Islamic militants who also fought in Iraq and Afghanistan. Now they appear to be heading toward America itself in their deluded belief that they're waging a battle against the evil Western empire.
The Obama administration will have to study what went wrong. Part of studying that should include a reassessment of relations with Russia--not an ally under the Putin regime. But surely a country that America can cooperate with on mutual matters of national interests. Halting, as far as possible, terrorism is one such interest.
So we're supposed to connive at the repression of an entire people because of a couple knuckleheads? Way to legitimize the violence.
In 1981, my sitcom "Mork & Mindy" was about to enter its fourth and final season. The show had run its course and we wanted to go out swinging. The producers suggested hiring Jonathan to play my son, who ages backward. That woke me out of a two-year slump. The cavalry was on the way.
Jonathan's improvs on "Mork & Mindy" were legendary. People on the Paramount lot would pack the soundstage on the nights we filmed him. He once did a World War I parody in which he portrayed upper-class English generals, Cockney infantrymen, a Scottish sergeant no one could understand and a Zulu who was in the wrong war. The bit went on so long that all three cameras ran out of film. Sometimes I would join in, but I felt like a kazoo player sitting in with Coltrane. [...]
Earlier in his life, he had a breakdown and spent some time in a mental institution. He joked that the head doctor told him: "You can get out of here. All you need is 57 keys." He also hinted that Eileen wanted him to stay there at least until Christmas because he made great ornaments.
Even in his later years, he exorcised his demons in public. His car had handicap plates. He once parked in a blue lane and a woman approached him and said, "You don't look handicapped to me."
Jonathan said, "Madam, can you see inside my mind?"
We already know that we use more energy from oil, gas and coal than we really need. (America consumes the equivalent of about 48 barrels of oil per person per year; Germany, with the healthiest economy in Europe, consumes just 26.) We know that lower consumption would make us less dependent on other countries for energy, a goal every president since Richard Nixon has pursued. We know that oil and coal produce air pollution, which we'd like to reduce. And we know that those fuels emit carbon dioxide, which contributes to global warming.
Economists call the hidden costs of energy consumption -- the prices of climate change, pollution and national security -- "externalities." They're real costs, but they're not included in the price of the gasoline you put in your car or the electricity you use at home.
Even the federal gasoline tax that's now levied doesn't come anywhere near covering its purpose of paying for highways. The gas tax has been stuck at about 18 cents a gallon since 1993; if it had risen with inflation over those 20 years, it would be about 30 cents.
A federal carbon tax, though, would apply to more than just gasoline. It would be levied on any fuel that produces carbon dioxide emissions. That means it would fall heavily on coal, less heavily on oil and only lightly on natural gas. It would make energy efficiency more valuable and alternative energy (like wind power) more competitive.
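The tilt toward coal follows directly from carbon intensity. A small sketch using approximate emission factors and an assumed $20-per-ton rate shows the gradient; both the factors and the rate are illustrative figures, not anything proposed in the column:

```python
# Illustrative comparison of a carbon tax's burden per unit of energy.
# Emission factors are approximate (kg CO2 per million Btu) and the
# $20-per-ton rate is an assumed number, chosen only for illustration.
EMISSION_FACTORS = {       # kg CO2 / MMBtu, approximate
    "coal (bituminous)": 93,
    "fuel oil":          73,
    "natural gas":       53,
}
TAX_PER_TON_CO2 = 20.0     # dollars per metric ton, assumed

for fuel, kg_per_mmbtu in EMISSION_FACTORS.items():
    tax_per_mmbtu = kg_per_mmbtu / 1000 * TAX_PER_TON_CO2
    print(f"{fuel:18s}: ${tax_per_mmbtu:.2f} per MMBtu of energy")
# Coal ends up taxed roughly 75% more heavily than natural gas for the same
# energy content, which is the "heavily on coal, lightly on gas" gradient
# described above.
```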
First, a WTO agreement on trade facilitation should be concluded so as to reduce red tape and simplify customs and border controls. The resulting improvement in trade flows could generate more than $1 trillion in world export gains.
Second, WTO members should expand coverage of the Information Technology Agreement, a highly progressive measure agreed to in 1996 that committed the signatory nations -- who were responsible for more than 90 percent of the world's IT trade -- to eliminate all tariffs on 180 products in the IT sector. In the intervening years, countless IT products have come onto the market, some of which have resulted in trade disputes, growing out of uncertainty as to whether these products are covered by the ITA. A renewed commitment to refrain from imposing tariffs on IT products is needed.
Third, there is a compelling need to further open markets to international trade in services, given the growing role of services in today's global knowledge-based economy. Twenty-one like-minded economies -- representing nearly two-thirds of the global trade in services -- are about to start negotiating a comprehensive liberalization of trade across a range of services. The initiative, known as the International Services Agreement, will also focus on securing greater transparency and predictability regarding regulatory barriers to trade in services, while addressing the emergence of new services-related issues in the global marketplace.
Expanded trade in services will benefit service providers and their workers in developed and developing countries. It will also benefit consumers of services, including those in the manufacturing and agricultural sectors, the operations of which are increasingly dependent on the cross-border supply of services. And the successful conclusion of an International Services Agreement will provide a template that other nations can use when liberalizing their services sectors.
Fourth, we need to help the least-developed countries by enabling them to export to developed markets on a duty-free and quota-free basis. While some developed markets are already meeting this objective, others should make it a priority. Such a move will help accelerate the integration of the least-developed countries into the global economy.
On April 7, 2013, Kenya's Peter Some won the 37th Paris Marathon with a time of 2:05:38. A surprise winner, Some missed the event record by only 27 seconds, thus depriving him of a place in running history. He need not have worried; unknown to him and thousands of fellow marathoners, they were all nonetheless part of a historic event. As they ran across the Avenue des Champs Élysées and thumped their feet on 176 special tiles laid on a 25-meter stretch, the athletes generated electricity.
These special "energy harvesting tiles" were developed by London-based Pavegen Systems. The power thus generated can be used to run low-voltage equipment such as streetlights and vending machines. The concept is the brainchild of Laurence Kemball-Cook, who founded Pavegen in 2009 to commercialize it. "The Paris Marathon is the first of many such projects that will enable us to realize our goal of taking this technology to retail sites, transport hubs, office blocks and infrastructure spaces," he says.
"There's nothing inherently wrong with people taking part-time jobs if they want them," said Diane Swonk, chief economist at Mesirow Financial in Chicago. "The problem is that people are accepting part-time pay because they have no other choice."
Even for those who have been able to take advantage of the better job market, the opportunities have not been good. Since the economy began to recover almost four years ago, hiring has been concentrated in relatively low-wage service sectors, like retailing, home health care, and food preparation, and in contingent jobs at temporary-hiring companies. For example, nearly one out of every 13 jobs is at a restaurant, bar or other food-service establishment, a record high.
...is to shift to a ten or twenty hour work week with work of any kind triggering government benefits.
Japan will join 11 nations already in talks on the TPP: the United States, Canada, Mexico, Peru, Chile, Vietnam, Malaysia, Singapore, Brunei, Australia and New Zealand. Members hope to reach a deal by the end of this year.
With the world's third-largest economy on board, the final TPP pact would cover nearly 40 percent of global economic output and one-third of all world trade.
The new anti-urban ideology of ruralism : And more lessons from Rod Dreher's excellent memoir The Little Way of Ruthie Leming (Damon Linker, April 19, 2013, The Week)
If you read just one work of serious nonfiction this spring, let it be Rod Dreher's beautiful, moving memoir The Little Way of Ruthie Leming. At the center of the book is the emotionally gripping story of the death of the author's sister from cancer at the age of 42. But that story is embedded in another -- an intellectually and spiritually provocative account of Dreher's youthful flight from and eventual return (after Ruthie's death) to his Louisiana hometown (population 1,700). It is these bracing reflections on place and community, ambition and happiness that transform the book into something far more than a tragic autobiography. Dreher has written a powerful statement about how we live today -- and more importantly, about how we should live. [...]
On visits back home during the 19 months she waged a losing battle with cancer and in the days immediately following her death, Dreher was repeatedly stunned by everyday acts of kindness and love in his hometown. Neighbors cooked meals and cleaned house for the overwhelmed Leming family. The community raised $43,000 in a single night to help them pay their medical bills. At Ruthie's funeral, the pallbearers removed their shoes, carrying her coffin barefoot in tribute to her love of the outdoors. As Mike Leming put it shortly after Ruthie's death, "We're leanin', but we're leanin' on each other."
When Dreher resolved to follow Ruthie's "little way" by giving up his life on the East Coast, returning to rural Louisiana, and writing a book defending the decision, he placed himself firmly in the camp of conservatives who congregate at a website called Front Porch Republic and contribute regularly to The American Conservative (by far the freshest and most intellectually serious magazine on the Right). Unlike the leaders of the mainstream conservative movement, Patrick Deneen, Mark T. Mitchell, Russell Arben Fox, Jeremy Beer, and the other "Porchers" have little interest in engaging with inside-the-Beltway power politics. Instead, they prefer to act as gadflies, denouncing the imperial ethos and influence-peddling that dominates Washington, as well as the boundless greed that drives would-be Masters of the Universe from around the country to seek their fortunes on Wall Street and in Hollywood and Silicon Valley.
Influenced by an eclectic range of thinkers, including sociologists Christopher Lasch and Philip Rieff, political theorist Wilson Carey McWilliams, Catholic philosopher David Schindler, and poet and essayist Wendell Berry, the Porchers see conservatism as a disposition or way of living locally, within moral, religious, economic, and environmental limits, in tightly knit, sustainable community with neighbors and the natural world. If they have a rallying cry, it's "Stay Put!" Or, in Dreher's case, "Go Home!"
While I yield to no one in terms of anti-urbanity, one is struck by the similarity between these crunchy cons and the Catholic Intellectuals our friend James Lothian wrote about in his exceptional book, The Making and Unmaking of the English Catholic Intellectual Community, 1910-1950. Both represent quaint, but fundamentally unserious, reactions to the End of History. They want to enjoy all the benefits of the triumph of the Anglo-American model -- especially capitalism -- but to grouse about its imperfections and to pretend that they aren't implicated in that broader culture.
And, just as friend Lothian delineates the unfortunate attitudes of folks like Belloc and Chesterton towards English Jews, so too do the crunchies dislike our immigrants. This "living within limits" seems to have a tendency among its adherents to extend to the point of limiting who can be part of your community.
Add to all that the element of "anti-imperialism," which really means nothing more than that America should disregard the suffering of peoples living under authoritarian/totalitarian regimes, and you have folks who have basically eschewed the central commandment of Christianity, to love one another. They reject the economics that has eliminated poverty, the politics that requires that government have the consent of the people, and the religion of neighbor love. There is nothing here for conservatives.
Is Boston Like Columbine? : Were the Tsarnaev brothers a "dyad" like Eric Harris and Dylan Klebold, with a charismatic leader and submissive follower? (Dave Cullen, April 19, 2013, Slate)
The first thing I learned covering Columbine all those years ago was that most of the theories that gain traction this week will be wrong.
But we do have an interesting situation developing with a pair of brothers as suspects: potentially, the classic dyad scenario. Notorious dyad examples include Bonnie and Clyde, Leopold and Loeb, and the D.C. snipers. The dyad tends to be a twisted, particular relationship that plays out very differently than the lone gunman or the terrorist team. [...]
The good news: There is a typical dyad pattern. That's in contrast to lone killers, who run the psychological gamut. Every study has drawn the conclusion that there is no typical mass murderer. Mass killers are mostly not loners or outcasts, and the Columbine killers were neither.
Killer dyads are more consistent. And the popular conception of the dominant, charismatic leader roping a submissive follower into his diabolical scheme--surprisingly, that usually turns out to be true. The leader is commonly a sadistic, dehumanizing psychopath--not always, but far more often than is the case with lone gunmen/bombers, where that personality type is relatively rare. The follower is often depressive, submissive, or otherwise dependent.
When there is a significant age difference--as with one killer just out of high school--we can't be certain the older partner plays the lead role, but it usually works that way.
Dyads usually contain contrasting personalities. A psychopathic killer generally does not link up with another psychopath. Nor do depressives pair up. Thrill-seeking psychopaths have been known to pair up, but most are looking for the qualities they lack.
Here, Columbine is highly illuminating. It's a lousy example for understanding most school shootings, because it's so atypical: It wasn't even intended primarily as a shooting--the main event was the failed bombs. But Columbine is a perfect illustration of the classic dyad: Eric Harris wanted a minion to march behind him; Dylan Klebold was looking for someone to lead a parade.
Real interest rates have famously been on the decline for thirty years. A rougher historical record suggests that English real interest rates may have been in decline since at least 1600.
The standard explanation here is better governance and lower systemic risk.
Yet let's imagine a simple model with two sources of risk. There is background risk you cannot avoid, and there is personal risk that you create through your own choices.
Policy makers have, since Thomas Hobbes, been attempting to drive down background risk. They have largely been successful. As a result, our lives are getting more and more stable.
...because there are too many people employed. The defining economic features of the past four or five centuries are that we have far more than we used to and producing it requires far less effort. While this was initially the case in only certain political jurisdictions, it has become the global norm, which only reinforces the trends.
And what are the inevitable results of such trends? Previously unimaginable affluence and lower prices and lower employment (and the massive synergy between the two).
Given such excess wealth and consistently falling prices, how could interest rates be higher? In fact, real interest rates remain unsustainably high.
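A toy version of the two-source risk model quoted above, with invented numbers. The only point is the direction of the effect: if the real rate is read as compensation for total risk, then driving background risk down drags the equilibrium rate down with it.

def required_real_rate(background_risk, personal_risk, floor=0.5):
    """Real rate demanded by savers, modeled as a premium over a small floor."""
    return floor + background_risk + personal_risk

PERSONAL_RISK = 1.0   # assumed constant: the risk you create through your own choices
for year, background_risk in [(1600, 6.0), (1800, 4.0), (1950, 2.5), (2013, 1.0)]:
    rate = required_real_rate(background_risk, PERSONAL_RISK)
    print(f"{year}: background risk {background_risk:.1f} -> real rate about {rate:.1f}%")
# As policy makers suppress background risk, the required rate falls from 7.5%
# toward 2.5%, which is the flavor of the centuries-long decline noted above.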
The Mystery of Original Sin : We don't know why God permitted the Fall, but we know all too well the evil and sin that still plague us. (Marguerite Shuster, 4/19/2013, Christianity Today)
Legend has it that G. K. Chesterton, asked by a newspaper reporter what was wrong with the world, skipped over all the expected answers. He said nothing about corrupt politicians or ancient rivalries between warring nations, or the greed of the rich and the covetousness of the poor. He left aside street crime and unjust laws and inadequate education. Environmental degradation and population growth overwhelming the earth's carrying capacity were not on his radar. Neither were the structural evils that burgeoned as wickedness became engrained in society and its institutions in ever more complex ways.
What's wrong with the world? As the story goes, Chesterton responded with just two words: "I am."
His answer is unlikely to be popular with a generation schooled to cultivate self-esteem, to pursue its passions and chase self-fulfillment first and foremost. After all, we say, there are reasons for our failures and foibles. It's not our fault that we didn't win the genetic lottery, or that our parents fell short in their parenting, or that our third-grade teacher made us so ashamed of our arithmetic errors that we gave up pursuing a career in science. Besides, we weren't any worse than our friends, and going along with the gang made life a lot more comfortable. We have lots of excuses for why things go wrong, and--as with any lie worth its salt--most of them contain some truth.
Still, by adulthood, most of us have an uneasy sense of self. Whatever we try to tell ourselves, something in us knows that we don't measure up to our own standards, let alone anyone else's. Even if we think we've done rather well, all things considered, there remains a looming conclusion to our lives we cannot escape. Death will bring an end to all achievements and all excuses. And who among us can face the reality of final judgment with the conviction that we are altogether blameless?
Maybe there is something to Chesterton's answer after all. In fact, theologian Reinhold Niebuhr was fond of saying that original sin--the idea that every one of us is born a sinner and will manifest that sinfulness in his or her life--is the only Christian doctrine that can be empirically verified. Everyone, whether a criminal or a saint, sins. Insofar as that dismal verdict is true, it's hardly surprising that there is a great deal wrong with the world.
Why Do We Sin?
But how could such a thing be? How could sin invade and pervade the world that God made good? To this great question, like the other great question of how it could be that Christ's death saves us, the Bible gives no theoretical answer. Rather, it only narrates how it came about.
The account comes in Genesis 2 and 3, the second Creation narrative. In the first Creation narrative, Genesis 1, God celebrates what he has made and gives humankind a position of honor and responsibility. The second narrative (probably written earlier than the first) provides an important counterpoint, given the broken world we experience. In Romans 5 and 1 Corinthians 15, Paul takes up this second narrative to point the direction to the doctrine of the Fall. What happened in Eden, Paul implies, didn't stay in Eden. What went wrong in the beginning marks everything that follows. Adam's sin not only brings the judgment of death upon all who come after him, but also makes them sinners. (True, Eve gets blamed in 1 Timothy 2 [see also 2 Cor. 11:3], but only Adam is named in Romans 5. Sin is an equal opportunity employer. In fact, the impulse to say, "I am not to blame, that other one is" flows from our primal disobedience.)
Your brother Jeb has been talked about as a potential candidate for president. Is the country--I mean this respectfully--ready for another Bush?
GB: That's for Jeb to figure out, you know what I mean? [laughs] I would hope that people would judge [him], if Jeb were to run, on his merits and his track record. ... So I hope he will run.
The inevitability that the next Republican president will be indistinguishable from W makes Jeb the obvious choice.
Tsarnaev's uncle experiences an immigrant's shame : Suspect's uncle Ruslan Tsarni tells an old American story: a newcomer's shame when one of his own becomes notorious (JOAN WALSH, 4/19/13, Salon)
To angry, anguished Ruslan Tsarni, his nephews Dzhokhar and Tamerlan Tsarnaev had a simple motive for allegedly bombing the Boston Marathon: "Being losers. Being unable to settle." [...]
"He put a shame on the Tsarni family. He put a shame on the entire Chechen ethnicity," the uncle raged. "It has nothing to do with Chechnya." To reporters questions, he answered: "We're Muslims, we're ethnic Chechens," and he went on: "Of course we're ashamed. They're children of my brother. Who had little influence of them."
This wasn't the classic testimony of a family member, declaring love for the accused and shock at the accusation. Although he hadn't seen his nephews since 2009, Tsarni declared flatly:
"I didn't like them. I just wanted my family to be away from them."
And a second time, he attributed their alleged crime to "being losers. Not being able to settle themselves. And thereby just hating everyone who did. Anything else to do with religion, with Islam, it's a fraud; it's a fake."
By contrast, Tsarni declared his devotion to his new country. "This is the ideal micro-world in the entire world," he said. "I respect this country. I love this country; this country which gives chances to everybody else to be treated as a human being."
Dr. Pete Carril is a bit of a snob. I emphasize the "Dr." because last year Princeton, the school at which he coached basketball for twenty-nine years, awarded him the honorary degree, Doctor of Humanities. I emphasize the "snob" because one of his cardinal rules is, "The ability to rebound is in inverse proportion to the distance your house is from the nearest railroad tracks." He doesn't believe that "three car garage guys" are tough enough to get rebounds or loose balls. Having grown up in a no car garage family in South Bethlehem, Pennsylvania, Pete has never quite gotten over the neighborhood saloon snobbery of the steel mill where his father worked for almost four decades. But despite himself, Pete earned that doctorate, and is today a retired Princeton professor in the best sense.
I read his book (published in 1997, written with Dan White) at the suggestion of a good friend and former student, Kevin Shinkle of Philadelphia, who saw things in it that reminded him of good teachers and good coaches. I am struck by Pete Carril's dedication to the calling of teaching even more than by his obvious ability to coach young men in what is my third or fourth favorite sport. He won 525 games in his career (not counting high school) -- no other Ivy League coach comes close -- and has the universal respect of the top men in his profession (Bob Knight wrote the introduction to the book), but it is as a teacher that he should be remembered.
...of which it can be said that being good at it is exclusively a function of wanting to be.
Following a chaotic night of mayhem and a police shootout, one of the two suspects in the Boston Marathon bombings was shot and killed by police. The second suspect is on the loose and police are actively searching the area of Watertown, Massachusetts in a growing manhunt.
This is a breaking story and we'll continually update throughout the morning.....
THE LATEST (6:46 a.m.): The AP is the first to report a name:
BREAKING: AP: Surviving Boston bomb suspect identified as Dzhokhar A. Tsarnaev, 19, of Cambridge, Mass. -SS
-- The Associated Press (@AP) April 19, 2013
(6:43 a.m.): Pete Williams says the suspects have been identified, but is being told not to reveal their names yet. He does say they are brothers. One is 19, one is 20. The AP says they are from a Russian region near Chechnya.
(6:28 a.m.): And another shot, from the other side of the house. No word from police on what the situation is at this house.
Swat is out on laurel st.#mitshooting #boston #mit twitter.com/AKitz/status/3...
-- Andrew Kitzenberg (@AKitz) April 19, 2013
(6:22 a.m.): There were reports earlier that police on radio scanners mentioned the name Sunil Tripathi, possibly identifying him as the suspect. Tripathi is a student at Brown University who has been missing for over a month, but according to (again) Pete Williams, Tripathi is not one of the two suspects.
According to Alben, the night's outbreak of violence began when police received reports of a robbery of a convenience store in Kendall Square near MIT. A few minutes later, an MIT police officer, who has not been identified, was shot multiple times while in his cruiser at Main and Vassar streets, near Building 32, better known as the renowned Stata Center on the MIT campus.
The officer was pronounced dead at Massachusetts General Hospital. A short time later, two men carjacked a Mercedes SUV at gunpoint, and the owner of that car was able to flee at a gas station on Memorial Drive.
The SUV proceeded out Memorial Drive toward Watertown followed by a long train of police vehicles in pursuit. At one point during the pursuit, the two suspects opened fire on Watertown police and a Transit Police officer, who was shot and who remains in critical condition at a Boston-area hospital this morning.
During the gunfight, the man known as Marathon suspect #1 was wounded and was taken into custody. This morning, Dr. Richard Wolfe said the man was brought to the Beth Israel Deaconess Medical Center emergency room about 1:10 a.m. with multiple traumatic injuries.
"It was more than gunshot wounds,'' Wolfe told reporters about 5:30 a.m. today. "It was a combination of injuries. We believe a combination of of blasts, multiple gunshot wounds."
Wolfe said it looked like the man had been hurt by an "explosive device'' and that the man was struck by "shrapnel.'' The man was pronounced dead at 1:35 a.m. The hospital officials said they did not know his name.
The night's chaos began about six hours after law enforcement released images of suspects in Monday's Boston Marathon bombings that left three people dead and 170 wounded.
After responding to the shooting at MIT, police streamed to Watertown, sirens blaring. There, gunfire and explosions cut through the night air. Police warned that spectators were in danger. At Arsenal Court and Arsenal Street in Watertown, an officer bellowed: "Ya gotta get outta here. There's an active shooter here with an active explosive. Go!"
Dzhokhar Tsarnaev appears to have been educated at a Cambridge high school before receiving a scholarship to pursue higher education in 2011. His name appears on a list of 45 recipients of the Cambridge City Scholarship, handed to students from Cambridge Rindge and Latin School.
Mr Davis earlier described Dzhokhar Tsarnaev as the 'white hat suspect,' referring to a man seen in video footage released by the FBI late last night of people they believed to be involved in the marathon bombing.
In a news conference, Mr Davis said: "We believe this to be a terrorist. We believe this to be a man who has come here to kill people. We need to get him in custody."
Police say Dzhokhar Tsarnaev is believed to be wearing a grey hooded sweatshirt. He is white with dark hair and appears to be in his twenties.
Colonel Timothy Alben, superintendent of the Massachusetts State Police, said: "We believe these are the same individuals that were responsible for the bombing on Monday of the Boston marathon. We believe they are responsible for the death of an MIT police officer and the shooting of an MBTA officer."
Locals are being warned to be extra vigilant after reports of a suspicious device by staff at the Massachusetts Institute of Technology (MIT) in nearby Cambridge. The town is in lock-down, as authorities order businesses not to open and all traffic to be stopped while the operation continues.
The sequence of events began late last night with the armed robbery of a 7-11 in Cambridge, followed by reports of loud shooting on the campus of the Massachusetts Institute of Technology (MIT). A campus police officer was shot after investigating a disturbance in the area and later died in hospital.
Minutes later a black Mercedes was carjacked from Kendall Square, close to the shootout, leading police on a chase through Watertown.
The chase led officers through a residential neighbourhood, where a shootout ensued between officers and the two suspects.
The School : On the first day of school in 2004, a Chechen terrorist group struck the Russian town of Beslan. Targeting children, they took more than eleven hundred hostages. The attack represented a horrifying innovation in human brutality. Here, an extraordinary accounting of the experience of terror in the age of terrorism. (C.J. Chivers, 3/14/07, Esquire)
9:10 a.m. The Schoolyard.
Morning marked a new school year at School No. 1 in Beslan, beginning with rituals of years past. Returning students, second through twelfth graders, had lined up in a horseshoe formation beside the red brick building. They wore uniforms: girls in dark dresses, boys in dark pants and white shirts. The forecast had predicted hot weather; only the day before, the administration had pushed the schedule an hour earlier, to the relative cool of 9:00 a.m. Students fidgeted with flowers, chocolates, and balloons, waiting for the annual presentation, when first graders would march before their schoolmates for the opening of their academic lives.
Zalina Levina took a seat behind the rostrum and greeted the milling parents. Beslan is an industrial and agricultural town of about thirty-five thousand people on the plain beneath the Caucasus ridge, part of the Russian republic of North Ossetia and one of the few places in the region with a modicum of jobs. For the moment, work seemed forgotten. Parents had come to celebrate. Irina Naldikoyeva sat with her daughter, Alana, four, and glimpsed her son, Kazbek, seven, in the formation with his second-grade class. Aida Archegova had two sons in the assembly. Zalina was baby-sitting her two-and-a-half-year-old granddaughter, Amina. They had not planned on attending, but the child had heard music and seen children streaming toward the school. "Grandma," she had said, "let's go dance." Zalina put on a denim dress and joined the flow. Already it was warm. The first graders were about to step forward. The school year had begun.
The terrorists appeared as if from nowhere. A military truck stopped near the school and men leapt from the cargo bed, firing rifles and shouting, "Allahu akhbar!" They moved with speed and certitude, as if every step had been rehearsed. The first few sprinted between the formation and the schoolyard gate, blocking escape. There was almost no resistance. Ruslan Frayev, a local man who had come with several members of his family, drew a pistol and began to fire. He was killed.
The terrorists seemed to be everywhere. Zalina saw a man in a mask sprinting with a rifle. Then another. And a third. Many students in the formation had their backs to the advancing gunmen, but one side did not, and as Zalina sat confused, those students broke and ran. The formation disintegrated. Scores of balloons floated skyward as children released them. A cultivated sense of order became bedlam.
Dzera Kudzayeva, seven, had been selected for a role in which she would be carried on the shoulders of a senior and strike a bell to start the new school year. Her father, Aslan Kudzayev, had hired Karen Mdinaradze, a video cameraman for a nearby soccer team, to record the big day. Dzera wore a blue dress with a white apron and had two white bows in her hair, and was on the senior's shoulders when the terrorists arrived. They were quickly caught.
For many other hostages, recognition came slowly. Aida Archegova thought she was in a counterterrorism drill. Beslan is roughly 950 miles south of Moscow, in a zone destabilized by the Chechen wars. Police actions were part of life. "Is it exercises?" she asked a terrorist as he bounded past.
He stopped. "What are you, a fool?" he said.
The terrorists herded the panicked crowd into a rear courtyard, a place with no outlet. An attached building housed the boiler room, and Zalina ran there with others to hide. The room had no rear exit. They were trapped. The door opened. A man in a tracksuit stood at the entrance. "Get out or I will start shooting," he said.
Zalina did not move. She thought she would beg for mercy. Her granddaughter was with her, and a baby must mean a pass. She froze until only she and Amina remained. The terrorist glared. "You need a special invitation?" he said. "I will shoot you right here."
Speechless with fear, she stepped out, joining a mass of people as obedient as if they had been tamed. The terrorists had forced the crowd against the school's brick wall and were driving it through a door. The people could not file in quickly enough, and the men broke windows and handed children in. Already there seemed to be dozens of the terrorists. They lined the hall, redirecting the people into the gym. "We are from Chechnya," one said. "This is a seizure. We are here to start the withdrawal of troops and the liberation of Chechnya."
As the hostages filed onto the basketball court, more terrorists came in. One fired into the ceiling. "Everybody be silent!" he said. "You have been taken hostage. Calm down. Stop the panic and nobody will be hurt. We are going to issue our demands, and if the demands are implemented, we will let the children out."
Rules were laid down. There would be no talking without permission. All speech would be in Russian, not Ossetian, so the terrorists could understand it, too. The hostages would turn in their cell phones, cameras, and video cameras. Any effort to resist would be met with mass executions, including of women and children.
When the terrorist had finished, Ruslan Betrozov, a father who had brought his two sons to class, stood and translated the instructions into Ossetian. He was a serious man, forty-four years old and with a controlled demeanor. The terrorists let him speak. When he stopped, one approached.
"Are you finished?" he asked. "Have you said everything you want to say?"
Betrozov nodded. The terrorist shot him in the head.
The Russian advance south of the Terek began in earnest after the Wars of Napoleon. This coincided with a profound spiritual movement in Chechnya and other Islamic areas of the north Caucasus which sought to establish a Koran-based social order. Ultimately, the Russian military faced two wars in the North Caucasus: in the west against the Cherkess people and in the east against the peoples of Dagestan, Chechnya and Ingushetia.
Russian rule in the North Caucasus had been imposed by force and was thus maintained. Following the collapse of the Russian Empire, the North Caucasus peoples declared the formation of the Republic of the North Caucasus Federation in 1918, under the sponsorship of the Central Powers. Germany's defeat and the outbreak of civil war in southern Russia turned the North Caucasus into a battleground for Reds and Whites. However, after the civil war the Bolsheviks sent the Red Army into the region, overthrew the existing order, and annexed it in 1922.
Stalinism and Chechnya
Joseph Stalin, the Bolshevik Commissar of Nationalities and a Georgian, adapted the class struggle to the traditional policy of divide and rule. Soviet federalism provided a national veneer to a centralized state, controlled by the Communist Party, where Russians staffed the key party posts within the various republics. The Chechens proved a difficult people to subdue. In 1929 they revolted against collectivization, leading to a decade-long struggle. Russians arrived to manage the oil industry with the development of Chechen oil fields.
During World War II, when the German Army advanced into the Caucasus, there were more signs of Chechen unrest and collaboration with the enemy. In late February 1944, Lavrenti Beria's NKVD carried out Stalin's "solution" to the Chechen Question--the mass deportation of Chechens to Central Asia. Over 70,000 Chechens of the 450,000 expelled died during transit or on arrival. Chechnya ceased to exist. The exile became the defining event for succeeding generations of Chechens. In 1957 Nikita Khrushchev decreed that the Chechens could return to their ancestral homelands. Chechnya and Ingushetia were joined administratively into the Chechen-Ingush Autonomous Republic. This arrangement joined the rebellious Chechens with the traditionally loyal Ingush in a clear continuation of Moscow's policy of divide and rule. Inside Chechnya, Soviet officials made their own arrangements with local clans while keeping an uneasy eye open for signs of resistance to Communist rule.
Chechnya and the Struggle for National Self-Determination
When Mikhail Gorbachev embarked on his ill-fated attempt to save the Soviet system via glasnost and perestroika, Chechen nationalists saw an opportunity for national self-determination. In the chaos and collapse of the Soviet Union, Boris Yeltsin led a resurgent Russian Federation and championed greater self-rule within the Union Republics. In his political struggle for control of Russia, Yeltsin encouraged the national republics within Russia to seek greater autonomy. The Chechens exploited this opportunity. On November 27, 1990, the Soviet of the Chechen‑Ingush Republic unanimously dissolved the union of Chechnya and Ingushetia and declared their independence and sovereignty.
In the aftermath of the August Coup of 1991 and the collapse of efforts to reform the Union, Chechens voted for independence and overwhelmingly elected General Dudayev as their president. The Yeltsin government's ham-handed tactics to thwart independence convinced most Chechens that whoever was in power in Moscow was an enemy of self-determination.
Between Peace and War
At this juncture in the struggle for Chechen independence, Moscow was weak, and Grozny drifted into chaos. Crime and corruption grew at a staggering pace. Although Yeltsin viewed Chechen independence as a threat to Russia's territorial integrity and sovereignty and a magnet for other disgruntled Caucasian peoples chafing under Russian rule, his administration focused its efforts elsewhere. Russia was preoccupied with dissolving the Soviet system, trying to create a viable Russian government, and transforming the economy through privatization and marketization. The Chechens seized arms from corrupt and incompetent Russian officials, but did not create an effective regular military.
In 1994, fearing that a Yeltsin rival would emerge, Russia abandoned efforts to ally with Chechens opposed to their own increasingly arbitrary and corrupt government. Russia then attempted to overthrow the Chechen regime by covert action with disguised Russian military personnel. The attack failed dismally. The Yeltsin government compounded the mistake by then mounting an overt and ill-prepared military intervention. Their failure to take Grozny by coup de main and the resultant protracted struggle reinforced the anti-Russian core of Chechen nationalism and led to an Islamic revival.
Chechnya: From War to War
Following the initial battle for Grozny and other cities, the war in Chechnya became a classic insurgency.
This is largely the approach the White House has taken to immigration reform. Early in the process, Obama gave some speeches and made his support clear. Since then, he has let the Senate's bipartisan "Gang of Eight" take the lead. As a result, if an immigration bill is passed, much credit will go to Senator John McCain, who opposed Obama in 2008, Senator Marco Rubio, who appears likely to run for the Republican presidential nomination in 2016, and Democratic Senator Charles Schumer. Insofar as Obama leads at all on the issue, he leads from behind. Yet immigration reform keeps grinding forward.
Indeed, of these three issues, immigration is in the best shape. The centerpiece of the gun-control bill, expanded background checks, has fallen apart in the Senate. It's too early to say what the final outcome will be on a budget deal. Obama's efforts have led to encouraging comments from Republicans who were happy to be invited to dinner and glad to see entitlement cuts in the president's budget, but who are nowhere near proposing concessions of their own. Immigration, meanwhile, is moving forward, and insiders are more optimistic today than they were a month ago.
Of course, this isn't a real A/B/C test. Immigration, gun control and the budget are different issues subject to different political dynamics. It's certainly easier to lead from behind on something like immigration reform, where crucial Republicans have decided they have a strong incentive to step up.
King Arthur Flour is one of America's oldest brands, dating from 1790 when Henry Wood began importing fine English flour into Boston to provide the very highest quality flour to American bakers.
Actually, the Colonies didn't need English flour. In fact, as early as 1700, when fields were still being harvested with hand sickles, Pennsylvania was exporting 350,000 bushels of wheat and 18,000 tons of flour annually.
By 1750, agricultural innovators began replacing the backbreaking hand sickles with the cradle scythe, a tool with wooden fingers, which arranged the stalks of grain for easy collection while allowing the farmer to stand upright and swing the tool in an efficient arc. The cradle scythe tripled the speed with which a field could be harvested, and combined with new farming methods, including fertilizing fields and crop rotation, the colonies were not just net exporters of grain, but net exporters of tonnage in grain. And, by the mid 1700s, they were perfectly capable of producing sufficient flour, as anyone who has ever seen a mill pond or grinding stones can attest.
But fine white flour does not pour off a grinding mill. Fine flour is obtained by sifting, shaking out the finest and lightest flour, separating it from the coarser grind (and little particles of grit!). That's likely not something done in any quantity in early Boston, hence the desirability of English white flour.
Shipping grain and flour back and forth across the ocean was nothing new. The flour traveled in barrels; there is one on the display floor at King Arthur's Vermont headquarters. There is a beautiful complexity in the shipping of a barrel of flour from a mill in England to a dock in Boston. Someone hand carved oak pegs to pin a ship together. Someone grew acres upon acres of flax. For perspective, an acre of flax yields a bed sheet and a shirt. How many acres did it take to equip a tall ship with sails? Miles upon miles upon lifetimes of linen thread were spun to weave those sails.
All so trade could roll back and forth, up and down, across the ocean from London to Boston. And Boston bakers could have the finest of flour.
Bread requires yeast. And yeast, in colonial America, came from beer brewers or from a sourdough starter, where wild yeast grows alongside lactobacillus bacteria. Before pen was put to paper to write the words, "We The People, in order to form a more perfect union..." the Wood family of Boston was feeding a little sourdough starter.
Likely, their starter was alive when Boston held its tea party, and struggled along during the siege of Boston, when flour was perilously scarce. The Wood family started selling their yeast along with their flour in 1790, and they've been doing it ever since. Tucked into the refrigerators at King Arthur is not just a sourdough starter, but over two centuries of New England pride, thrift and stubborn determination -- in a little plastic jar.
Escape, if you must, into a novel, to find peace of mind. I prefer to find hope in the magic that is a little company from Boston, and the yeast that has survived with them for over two centuries.
When asked which best describes how they feel about the bombings, 58 percent of voters say: angry. That's double the number who feels worried (27 percent). [...]
By a wide 62-20 percent margin, voters think homegrown terrorists are more likely than Islamic terrorists to have been behind the Boston attacks. Nineteen percent are unsure.
[J]ust consider this observation from Warren Buffett in last year's letter to Berkshire Hathaway shareholders:
Today the world's gold stock is about 170,000 metric tons. If all of this gold were melded together, it would form a cube of about 68 feet per side. (Picture it fitting comfortably within a baseball infield.) At $1,750 per ounce - gold's price as I write this - its value would be $9.6 trillion. Call this cube pile A.
Let's now create a pile B costing an equal amount. For that, we could buy all U.S. cropland (400 million acres with output of about $200 billion annually), plus 16 Exxon Mobils (the world's most profitable company, one earning more than $40 billion annually). After these purchases, we would have about $1 trillion left over for walking-around money (no sense feeling strapped after this buying binge). Can you imagine an investor with $9.6 trillion selecting pile A over pile B?
Today all that gold would be worth only about $7 trillion, so we can just ask: would you rather have a cube of gold 68 feet on a side, or all the cropland in the United States plus thirteen Exxon Mobils? The answer is, obviously, that you take the farms and the oil.
And in Westeros, the Lannisters have the cube of gold while the Tyrells, with the rich farmland of the Reach, have the real resources.
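The arithmetic behind the cube is easy to check. A short sketch, using the standard physical constants for gold and the prices the passage itself cites (Buffett's $1,750 and roughly the post-crash $1,280):

TONNES = 170_000                  # world gold stock in metric tons, per the quote
KILOGRAMS = TONNES * 1_000
TROY_OUNCES_PER_KG = 32.1507
GOLD_DENSITY_KG_M3 = 19_300
FEET_PER_METER = 3.28084

ounces = KILOGRAMS * TROY_OUNCES_PER_KG
for price in (1_750, 1_280):
    print(f"At ${price}/oz the stock is worth ${ounces * price / 1e12:.1f} trillion")

side_meters = (KILOGRAMS / GOLD_DENSITY_KG_M3) ** (1 / 3)
print(f"Melded into a cube, each side is about {side_meters * FEET_PER_METER:.0f} feet")
# Prints roughly $9.6 trillion, $7.0 trillion, and a 68-foot cube, matching
# both Buffett's figures and the $7 trillion estimate above.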
The benefits of being stupid at work : Why employees who aren't the sharpest knives in the drawer -- at least not that anyone can tell -- might do best of all. (Megan Hustad, 4/17/13, Forbes)
Stupidity can increase efficiency, claims Mats Alvesson, professor of organization studies at Lund University in Sweden. In a Journal of Management Studies article titled "A Stupidity-Based Theory of Organisations," Alvesson and colleague André Spicer explain how what they call "functional stupidity" generally helped get things done. "Critical reflection and shrewdness" were net positives, but when too many clever individuals in an organization raised their hands to suggest alternative courses of action or to ask "disquieting questions about decisions and structures," work slowed.
The study's authors found that stupidity, on the other hand, seemed to have a unifying effect. It boosted productivity. People content in an atmosphere of functional stupidity came to consensus more easily, and with that consensus came greater roll-up-our-sleeves enthusiasm for concentrating on the job.
In the winter of 1972, Steven Noll and three sports-junkie friends at the College of William & Mary in Williamsburg, Va., were nursing two gripes.
One was the proliferation of All-America teams in college basketball--the annual awards given to standout players by sports magazines and journalist groups.
The other was the belief they shared that none of those teams would ever recognize the star player in their midst--William & Mary's own Mike Arizin, 6-foot-5 shooting guard and small forward.
So the college buddies hatched a plan. They made up a fictitious professional organization and called it the National Association of Collegiate Basketball Writers. They then created an All-America team of their very own, naming the top 15 rookie players in the nation.
In March 1973, they mailed official-looking certificates to the universities where the winners played. Finally, the four men, long-haired juniors who had never published a word about college basketball, told the Associated Press about the award. Just like that, the news went out on the wire and appeared in newspapers all over the country.
For 40 years, the men told almost nobody about the fake award, which they gave out just once. Now, as this year's NCAA Final Four starts Saturday, the men are telling their story with heads held high.
Thatcher is venerated by today's Tories for standing up to the EU to "get our money back", but her underlying view of Europe is best revealed in her fierce and deeply misguided opposition to German reunification after the fall of the Berlin Wall. Chancellor Helmut Kohl recalls in his memoirs that Thatcher told a gathering of European leaders: "We beat the Germans twice and now they are back." French diplomatic notes reveal conversations with President François Mitterrand in which both leaders envisioned a united Germany that would exercise more influence in Europe than Hitler ever had. Although both Thatcher and Mitterrand ultimately came around, and Thatcher stood firmly for the expansion of the European Community to former Soviet states in eastern and central Europe, her instincts were completely out of touch with modern Europe.
Thatcher supported the European Community, an economic union, but described the more political EU as "perhaps the greatest folly of the modern era". That is the legacy she has bequeathed to her party. Prime Minister David Cameron's pledge to hold a referendum on British membership in the EU and his plan to renegotiate the terms of that membership risk relegating Britain to the status of little more than a bit player in global politics.
When you think Europe remains significant you've misunderstood the world.
Cue the flashback to summer 2010. Ben Bernanke and other officials at the Federal Reserve were warning that inflation was approaching dangerous lows, perhaps even flirting with the dreaded "D" word -- deflation. Bernanke gave a key speech in Jackson Hole that August hinting that more Fed stimulus might be in the pipeline. Sure enough, it was. The Fed launched QE2 about two months later.
A similar murmur is starting up again: Could inflation be getting too low? St. Louis Fed President James Bullard thinks so.
"Inflation is pretty low right now, and it's been drifting down," he told reporters at a Levy Economics Institute event Wednesday morning.
"If it doesn't start to turn around soon, I think we'll have to rethink where we stand on our policy," he added.
So-called e-fairness legislation addresses the inequitable treatment of retailers based on whether they are located in-state (either a traditional brick-and-mortar store or an Internet retailer with a physical presence in the state) or out of state (again as a brick-and-mortar establishment or on the Internet).
In-state retailers collect sales taxes at the time of purchase. When residents purchase from retailers out of state (including over the Internet) they are supposed to report these purchases and pay the sales taxes owed--which are typically referred to as a "use tax." As you can imagine, few people do.
The result is to narrow a state's sales-tax base. It also leads to several inefficiencies that, on net, diminish potential job and economic growth.
Exempting Internet purchases from the sales tax naturally encourages consumers to buy goods over the Web; worse, the exemption incentivizes consumers to use in-state retailers as a showroom before they do so. This increases in-state retailers' overall costs and reduces their overall productivity.
The exemption of Internet and out-of-state retailers from collecting state sales taxes reduced state revenues by $23.3 billion in 2012 alone, according to an estimate by the National Conference of State Legislatures. The absence of these revenues has not served to put a lid on state-government spending. Instead, it has led to higher marginal rates in the 43 states that levy income taxes.
Therefore--as with any pro-growth tax reform--the sales tax base in the states should be broadened by treating Internet retailers similarly to in-state retailers, and the marginal income-tax rate should be reduced such that the total static revenue collected by the state government is held constant.
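What "holding static revenue constant" means in practice is a straightforward swap, sketched below with entirely hypothetical numbers for one state's bases and rates: broaden the sales-tax base to include remote sellers, then trim the income-tax rate until total revenue is unchanged.

SALES_TAX_RATE = 0.06
IN_STATE_SALES = 100e9      # hypothetical taxable in-state sales
REMOTE_SALES = 10e9         # hypothetical currently untaxed Internet / out-of-state sales
INCOME_TAX_RATE = 0.05
TAXABLE_INCOME = 200e9      # hypothetical taxable income

revenue_before = SALES_TAX_RATE * IN_STATE_SALES + INCOME_TAX_RATE * TAXABLE_INCOME

# Broaden the base, then solve for the income-tax rate that leaves revenue flat.
broadened_sales_revenue = SALES_TAX_RATE * (IN_STATE_SALES + REMOTE_SALES)
new_income_rate = (revenue_before - broadened_sales_revenue) / TAXABLE_INCOME

print(f"Static revenue before and after: ${revenue_before / 1e9:.1f}B in both cases")
print(f"Income-tax rate falls from {INCOME_TAX_RATE:.2%} to {new_income_rate:.2%}")
# With these numbers the extra $0.6B of sales-tax collections lets the income
# tax drop from 5.00% to 4.70% without changing total static revenue.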
One of the most fearsome statistics in the war against the federal deficit has always been the country's ratio of debt to gross domestic product. When this ratio reaches 90%, the argument goes, watch out -- lower economic growth is on the horizon. And that's scary, because that's where the U.S. has been heading.
This idea comes from Harvard economists Ken Rogoff and Carmen Reinhart, who featured it in a 2010 paper and popularized it in a book entitled "This Time is Different: Eight Centuries of Financial Folly."
Since then, the stat has been cited countless times, including by Rep. Paul D. Ryan in rationalizing the draconian spending cuts in his proposed budgets. Now it turns out the authors may have counted wrong.
A new study by three researchers at the University of Massachusetts finds that Rogoff and Reinhart made several mistakes that invalidate their thesis.
...but I do know that British debt after defeating Napoleon in the first phase of the Long War was 260% of GDP and US debt after WWII, just a later phase of the same War, was over 110%. Who would not wish for their country the economic success of 19th century Britain and 20th century America? The Rogoff-Reinhart hypothesis is inherently silly.
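For what the counting error amounts to, here is a sketch with invented growth numbers, not the Reinhart-Rogoff data, of how a spreadsheet range that silently drops a few rows can move the headline average for the high-debt group. In the reported case the omitted countries happened to have decent growth at high debt, which made the above-90% average look worse than the full sample.

# Invented growth rates (percent) for countries with debt above 90% of GDP.
growth_by_country = {"A": 1.0, "B": -0.5, "C": 2.0, "D": 3.9, "E": 2.5}

full_average = sum(growth_by_country.values()) / len(growth_by_country)

# Suppose the averaging formula's range stops two rows early and omits D and E.
truncated = {k: v for k, v in growth_by_country.items() if k not in ("D", "E")}
truncated_average = sum(truncated.values()) / len(truncated)

print(f"Average over all five countries: {full_average:.2f}%")
print(f"Average with two rows dropped:   {truncated_average:.2f}%")
# Here the dropped rows shave the average from about 1.8% to about 0.8%,
# the kind of shift that can conjure a growth cliff out of thin air.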
There are as many hypotheses about the reasons for this massive decline as there are attempts at capturing the frenzied atmosphere at gold-trading desks worldwide Monday. At the core, however, is the simple fact that there isn't enough fear among investors worldwide. Anxiety, yes. Uncertainty, absolutely. But blind panic? Nope. And it's panic that pushes investors into gold as a "safe haven" and a store of value. It's what people buy when they think they'll need portable wealth, or when they believe a currency's value can't be sustained, or when they think that other assets are about to lose all their value. (Remember the autumn of 2008?)
The rest of the time, unless you're a gold bug, the metal isn't a logical investment. Most investments provide some kind of potential gain, whether in the form of yield or capital appreciation. Gold, on the other hand, actually costs the investor money to own, at least in its traditional forms. If you own futures contracts, you need to keep rolling them over, incurring trading costs; if you own the physical metal, there are small issues like storage costs. (And gold ETFs don't offer a perfect or sometimes even an easily managed proxy for gold itself.)
Avoiding certain ingredients goes in cycles: Back in the 70s, it was sugar. Then it was fat, then saturated fat. Then fat was in but carbs were out. Gluten is the pariah ingredient du jour, and there are a lot of healthy people shelling out big bucks for gluten-free food they probably don't need.
"Most people must be doing this because they think they feel better, or they do feel better but they're not diagnosed with gluten issues," says Harry Balzer, chief industry analyst at the NPD Group. As TIME Healthland pointed out:
People who have bad reactions to common gluten-containing foods -- pasta, breads, baked goods and breakfast cereal -- may actually be sensitive to something else... It's also possible that some people develop gastrointestinal or other symptoms simply because they believe they're food-sensitive.
None of this would be a huge problem, except that this is an exceptionally pricey food fad. Producing gluten-free items, especially baked goods, is more expensive because manufacturers have to come up with alternatives that will give the finished product the same light, chewy texture that gluten imparts.
Researchers from Dalhousie Medical School at Dalhousie University in Canada compared the prices of 56 ordinary grocery items that contain gluten with their gluten-free counterparts. All of the gluten-free ones were more expensive, and some were much more expensive. On average, gluten-free products were a whopping 242% pricier than the gluten-containing versions.
OPPOSING W'S AMNESTY IS HOW THE GOP CAUSED THE RECESSION IN THE FIRST PLACE:
The Immigration Windfall : A new study shows the potential economic benefits of reform. (WSJ, 4/16/13)
[T]he more important rebuttal is contained in a new study by Douglas Holtz-Eakin, former director of the Congressional Budget Office and now president of the American Action Forum, who finds substantial and "underappreciated" fiscal and economic benefits from more immigration. After surveying the economic literature and Census Bureau data, he sees a multitrillion-dollar gain in long-term U.S. economic output, higher per capita income growth, and lower budget deficits.
Mr. Holtz-Eakin, a free-market-leaning economist, argues that immigrants should be viewed as long-term investments. In the short term, the newcomers may cost more than they contribute, but as their job and language skills improve and their earnings rise over time, the net lifetime impact of most immigrants is positive.
"A benchmark immigration reform"--by which he means more visas to productive workers--"would raise the pace of economic growth by nearly a percentage point over the near term [and] raise GDP per capita by over $1,500," he says. As middle-class Americans who have seen their real incomes decline in the last four years know, that's a huge number.
One reason for this immigration windfall is that young immigrants will compensate for the low U.S. birthrate. The average birth rate in the U.S. has fallen to 1.93 children per couple. The replacement fertility rate is 2.1. America risks following Europe and Japan into the trap of a falling population combined with the spiraling entitlement costs of a graying society. A "population-enhancing" immigration policy is the ideal policy response, Mr. Holtz-Eakin concludes.
Immigrants are also generally productive, entrepreneurial and highly motivated. They have a higher labor-participation rate than native-born Americans, and they also create new businesses at a higher rate. Some have specific technical skills the U.S. needs, while even workers with lesser education or lower skills bring a strong work ethic and a willingness to fill low-wage jobs.
Mr. Holtz-Eakin believes the combination of more immigrants with a tilt in immigration visas away from extended-family unification and in favor of those with greater skills or education would spur faster growth. Some 74% of permanent U.S. immigrants in 2010 were for family unification, "greater by far than any" other developed economy, he says. A major goal of the Senate reform is to reduce family-chain immigration in favor of merit-based economic visas.
Faster economic growth would in turn drive down the budget deficit over the next 10 years by at least $2.5 trillion.
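To see why a percentage point of growth is worth trillions, a quick compounding sketch helps. The starting GDP and growth rates below are round-number assumptions, not figures from the Holtz-Eakin study; the point is only the scale of the compounding.

STARTING_GDP = 16.0        # assumed GDP in trillions of dollars
BASELINE_GROWTH = 0.025    # assumed baseline real growth rate
REFORM_GROWTH = 0.035      # baseline plus roughly one percentage point

baseline = reform = STARTING_GDP
cumulative_extra_output = 0.0
for year in range(10):
    baseline *= 1 + BASELINE_GROWTH
    reform *= 1 + REFORM_GROWTH
    cumulative_extra_output += reform - baseline

print(f"Extra output over ten years: about ${cumulative_extra_output:.1f} trillion")
# With these assumptions the gap compounds to roughly $10 trillion of extra
# output, a scale at which even a modest tax share can plausibly move the
# ten-year deficit by a couple of trillion dollars.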
The Futility of Terrorism : Terrorists are by definition weak. Weaken them further by refusing to be terrorized. (MAX BOOT, 4/16/13, WSJ)
The anarchists called terrorism "propaganda by the deed," and so it remains. If you want to know why terrorism has spread so much since its modern origins in the 19th century, the growth of the mass media is a large part of the explanation. With the rise of the penny press, followed by moving pictures, radio, television, satellite television, mobile telephony and of course the Internet, terrorists have more ways than ever before to get out their message. They strike precisely because they know that their acts will generate feverish coverage--unless, as in the case of Iraq during the past decade, their atrocities become so regular that they cease to shock.
Unlike Iraqis, Americans are still shocked by domestic terrorism, of which, mercifully, we have had precious little since 9/11. So the Boston Marathon bombing got the blanket news coverage its perpetrator(s) presumably wanted. It is doubtful, however, that they will achieve their larger objectives--whatever those might be (so far there has been no claim of responsibility).
...that any potential meaning of the attack was so obscure that no one has any idea who perpetrated it? What sort of symbolism is it when you make folks go, huh?
We'd like to think terrorists are more competent than this--otherwise it really does place the "war on terror" in embarrassing perspective.
It seems just as likely to have been an MIT student's demented final thesis.
It is an argument that [Jeremi Suri, a professor of history and public affairs at the University of Texas, Austin] made in a New York Times op-ed on Friday entitled, "Bomb North Korea, Before It's Too Late."
"President Obama should state clearly and forthrightly that this is an act of self-defense in response to explicit threats from North Korea and clear evidence of a prepared weapon," Suri writes.
"And he should explain that this is a limited defensive strike on a military target - an operation that poses no threat to civilians - and that America does not intend to bring about regime change. The purpose is to neutralize a clear and present danger. That is all," he writes.
...that we removed the regime simply because it was morally abhorrent and because the mere pursuit of nuclear weapons by an enemy will not be tolerated.
In a six-month study, researchers found that offering upfront price information decreased overall use of diagnostic tests by roughly 9 percent, saving hundreds of thousands of dollars.
"There's a lot of waste in medicine because we don't have a sense of the costs of much of what we do," says Leonard S. Feldman, assistant professor at the Johns Hopkins University School of Medicine.
Hospitals typically keep both patients and providers essentially blind to the cost of medical services, contributing to the astronomical cost of health care in the United States, researchers say.
"There's no other area of our lives in which we don't even think about costs," says Feldman, leader of the study published online in JAMA Internal Medicine. "If one test costs three times what another does and provides basically the same information, that's a pretty easy decision. We need to give that information to those who need it, and we really have done a disservice to society by having our head in the sand about costs."
The good news is that several cities have found a way to hunt down their waste heat in some unexpected places. These cities are building systems that deliver heat in much the same way that networks handle electricity and water. Could they point the way to the next energy revolution?
Waste heat is an enormous problem. A report in 2008 by the US Department of Energy found that the energy lost as heat each year by US industry is equal to the annual energy use of 5 million Americans. Power generation is a major culprit; the heat lost from that sector alone dwarfs the total energy use of Japan. The situation in other industrialised countries isn't much better.
The report also estimated that given the right technologies, we could reclaim nearly half of that energy, but that's easier said than done. "We often talk about the quantity of waste heat," says David MacKay, chief scientific adviser to the UK Department of Energy and Climate Change, "but not the quality." Most of what we think of as "waste heat" isn't actually all that hot; about 60 per cent is below 230 °C. While that may sound pretty hot, it is too cold to turn a turbine to generate electricity.
The alternative is to just move the heat directly to where it is needed. That is what "cogeneration plants" do. These are power plants that capture some or all of their waste heat and send it - as steam or hot water - through a network of pipes to nearby cities. There, buildings tap into the network to warm their water supplies or air for central heating.
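The "quality" point has a simple physical basis: the Carnot limit caps the fraction of a heat stream that can ever be turned into work, and the cap collapses as the source temperature approaches ambient. A short sketch (the sample temperatures are just illustrative points, with a 20 C sink assumed):

AMBIENT_C = 20.0   # assumed sink temperature

def carnot_limit(source_c, ambient_c=AMBIENT_C):
    """Maximum theoretical fraction of heat convertible to work (Kelvin ratio)."""
    t_hot = source_c + 273.15
    t_cold = ambient_c + 273.15
    return 1 - t_cold / t_hot

for source_c in (500, 230, 100, 60):
    print(f"{source_c:>3} C source: Carnot ceiling {carnot_limit(source_c):.0%}")
# A 500 C stream could in principle yield about 62% work, a 230 C stream about
# 42%, and a 60 C stream only about 12%; real machines capture well under half
# of those ceilings, which is why low-grade heat is worth more sent straight
# into buildings than pushed through a turbine.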
Though he didn't invent the bar code, the late supermarket executive Alan Haberman was the U.P.C. symbol's most important patron. Haberman chaired the 1973 committee that from a range of alternatives selected the black-and-white pattern of vertical stripes that has become largely invisible today due to its omnipresence. It's just part of the landscape.
"Go back to Genesis and read about the Creation," Haberman once said, according to his New York Times obituary. "God says, 'I will call the night "night"; I will call the heavens "heaven."' Naming was important. Then the Tower of Babel came along and messed everything up. In effect, the U.P.C. has put everything back into one language, a kind of Esperanto, that works for everyone."
American actions are well-intended, although many people sympathetic to American interests do not accept this proposition at face value. To the extent that American national interests must be served, we can continue to make unpopular decisions and execute American grand strategy without broad international support. But we cannot do so indefinitely. America may act unilaterally on a case-by-case basis, weighing costs and benefits. We need to be honest with ourselves when we do so, however. Others may perceive our actions as excessive and bullying.
The cost of military intervention can be high: proponents must establish a legal basis, a jus ad bellum, for action; they must apply force consistent with the laws of armed conflict and possible mandates of the UNSC; the fighting must be controlled both in time and in space; fallout and political reactions must be anticipated; and, lastly, those advocating intervention must expect the unexpected. Murphy's Law applies to all human endeavors. Given the national interest in defeating terrorism and preserving international order, some degree of risk is normal and expected.
...the removal of a regime that lacks democratic legitimacy is always a jus ad bellum.
Here are just a few of the hundreds of messages from a public spreadsheet created within hours of the tragedy by Bostonians offering space in their homes to those in need after the Boston Marathon bombing.
My place has two beds, one air mattress, and a couch. A few blocks from the finish line. My place is small but open to any one that cannot get to their hotels tonight. I'm a local dentist, and I want to contribute.
*
I am so sorry we all have to deal with this. My thoughts are with all people who got hurt. I have a guest room with a double bed. I am right on the orange line 5 stops away from Back Bay station. I used to host marathon runners in my house in the South End, steps away from Copley. I will give you a safe place to stay.
*
If you're traveling with children, we have 2 Pack-n-Play's, toddler tub, diapering needs, etc. Our apt is rather small, but we do have a queen size inflatable mattress, a couch, and a day bed where people can sleep.
*
Anything you need: food, comfy couch to sleep on, and even some wine and a cat to hang out with for as long as you need
I learned of the trial and execution of Thomas Doughty from The World Encompassed by Sir Francis Drake, a history of the second circumnavigation of the globe. Despite its title this book was not written by Drake himself but by a nephew thirty years after Drake's death.
According to The World Encompassed, Doughty and Drake were good friends and companions, but even so, Doughty apparently had been plotting against Drake since "before the voyage began, in England" and sought not only to murder "our general, and such others as were most firm and faithful to him, but also [sought] the final overthrow of the whole action."
Eventually Doughty's transgressions became so egregious that Drake was forced to take action. He ordered Doughty into custody and convened a formal trial. Forty men were chosen as jurors. "Proofs were required and alleged, so many and so evident, that the gentleman himself [Doughty], stricken with remorse of his inconsiderate and unkind dealing, acknowledged himself to have deserved death ... "
Thomas Doughty was convicted of treason by unanimous vote. After the verdict was returned, Drake offered the guilty man three options. "Whether you would take," he asked Doughty, "to be executed in this island? Or to be set a land on the main? Or to return into England, there to answer for your deeds before the lords of her majesty's council?"
Doughty, however, refused this leniency, replying:
"Albeit I have yielded in my heart to entertain so great a sin as whereof now I am condemned, I have a care to die a Christian man . . . If I should be set a land among infidels, how should I be able to maintain this assurance? . . . And if I should return into England, I must first have a ship, and men to conduct it . . . and who would accompany me, in so bad a message? . . . I profess with all my heart that I do embrace the first branch of your offer, desiring only this favor, that you and I might receive the holy communion again together before my death, and that I might not die, other than a gentleman's death."
Drake granted Doughty's request. The next day they celebrated communion with Francis Fletcher, pastor of the fleet, and afterward Drake and Doughty dined together, "each cheering up the other, and taking their leave, by drinking each to other, as if some journey only had been in hand." Then "without any dallying or delaying" Doughty knelt down, preparing his neck for the blade. His final words were instructions to the executioner to "do his office, not to fear nor spare."
My immediate reaction to this story was that it simply was not credible. I could not believe any man would choose death when given the alternative choice of a sea voyage home.
Whatever the short-term turbulence, a more subdued price for gold (and commodities) bodes well for the future economy.
There is no end-of-the-world scenario here, as there was after the financial meltdown. Nor is there an end-of-the-U.S.-dollar scenario, as many investors fear, nor an end to the euro. Nor is there any massive inflation scenario, supposedly from the Fed cranking up all those printing presses.
The reality is that all those QE reserves from the Fed never circulated through the economy. Most of them are on deposit at the central bank. Because everyone is still risk-averse, the demand for cash is so high that the turnover, or velocity, of money keeps falling.
Last I looked, the M2 money measure was growing at less than 7%. The Fed's favorite inflation target, the personal consumption deflator, was only 1.3% over the past 12 months. The much-heralded printing-press/roaring-inflation episode hasn't happened -- at least not yet.
The U.S. economy is growing slowly at between 2% and 3%, but at least it's growth. Profits have propelled stocks to all-time highs. Housing is gradually recovering. Jobs are erratic, but rising. Even the dollar is up over the past year against the broad trade-weighted index of currencies.
Kurdistan is a democracy, though an imperfect one; the territory is peaceful and the economy is booming at the rate of 11 percent a year. Foreign investors are pouring through gleaming new airports to invest, especially in Kurdish-controlled oil fields. Exxon, Chevron, Gazprom and Total are among the multinationals to sign deals with the regional government. A new pipeline from Kurdistan to Turkey could allow exports to soar to 1 million barrels a day within a couple of years.
There was one university for the region's 5.2 million people a decade ago; now there are 30. "Our people," says Hussein, the chief of staff to President Massoud Barzani, "did quite good."
The bigger story is that Kurds, a non-Arab nation of some 30 million deprived of a state and divided among Turkey, Iraq, Iran and Syria, are on the verge of transcending their long, benighted history as the region's perpetual victims and pawns. Twenty-five years ago, Kurds were being slaughtered with chemical weapons by Saddam Hussein and persecuted by Turkey, where nearly half live. A vicious guerrilla war raged between Kurdish insurgents and the Turkish army.
Now Turkey is emerging as the Kurds' closest ally and the potential enabler of a string of adjacent, self-governing Kurdish communities stretching from Syria to the Iraq-Iran border. Having built close ties with the Iraqi Kurdish government, Turkish Prime Minister Recep Tayyip Erdogan is now negotiating a peace deal with the insurgent Kurdistan Workers' Party (PKK) -- a pact that could mean new language and cultural rights, as well as elected local governments, for the Kurdish-populated areas of southeastern Turkey.
Meanwhile, Barzani and the Iraqi Kurds have been trying to foster a Kurdish self-government for northern Syria, where some 2.5 million Kurds live.
[M]y first thought about him to this day was never a play or a famous hit but an idle, almost inexplicable midsummer, mid-game moment at the Polo Grounds in June or July of 1948.
I was sitting in a grandstand seat behind the third-base-side lower boxes, pretty close to the field, there as a Giants fan of long standing but not as yet a baseball writer. Never mind the score or the pitchers; this was a trifling midseason meeting--if any Giants-Dodgers game could be called trifling--with stretches of empty seats in the oblong upper reaches of the stands. Robinson, a Dodger base runner, had reached third and was standing on the bag, not far from me, when he suddenly came apart. I don't know what happened, what brought it on, but it must have been something ugly and far too familiar to him, another racial taunt--I didn't hear it--that reached him from the stands and this time struck home.
I didn't quite hear Jackie, either, but his head was down and a stream of sound and profanity poured out of him. His head was down and his shoulders were barely holding in something more. The game stopped. The Dodgers' third-base coach came over, and then the Giants' third baseman--it must have been Sid Gordon--who talked to him quietly and consolingly. The third-base umpire walked in at last to join them, and put one hand on Robinson's arm. The stands fell silent--what's going on?--but the moment passed too quickly to require any kind of an explanation. The men parted, and Jackie took his lead off third while the Giants pitcher looked in for his sign. The game went on.
Involvement in ongoing policy debates has on occasion brought him into conflict with the GOP. The party, particularly its presidential candidates, repudiated key parts of Bush's record in the 2012 campaigns.
Bush has championed foreign aid, which some Republican contenders pushed to curtail or eliminate. He defends federal accountability in education, a key provision of his No Child Left Behind law, even as some Republicans declared it a failed federal overreach.
The former president has touted tax policy as the first step to economic recovery; other Republicans focused more on spending cuts. He called for a "benevolent spirit" in the immigration debate.
Asked what message he's sending to the GOP, Bush reverted to broad descriptions of freedom. He steered clear of giving his party specifics on how to rebuild, but he said that he stands by "the principles that guided me when I was president."
"These are principles that need to be articulated and defended as time goes on," he said.
For Bush, "compassionate conservatism," much derided by the party's harder-edged tea party adherents, is still a powerful draw.
He predicted a renewed interest in the philosophy, which he described as "the idea that articulating and implementing conservative ideas leads to a better life for all."
Bush touted in particular the Medicare overhaul he signed into law in 2003.
Some Republicans blasted the new prescription drug benefit as too costly and slammed Bush for expanding an entitlement. Bush bristled at that critique, saying the "entitlement was already in place" and that "we were modernizing an antiquated system."
And, he argued, the results have proved him right. [...]
Bush, an avid mountain biker, hits the trails often. He likes to play golf and attend Texas Rangers games. He's also taken up painting, an activity in which he takes "great delight in busting stereotypes."
"People are surprised," he said. "Of course, some people are surprised I can even read."
Asked why a semi-retired 66-year-old is spending his free time on frustrating and potentially humiliating activities like mountain biking, painting and golf, Bush laughed.
"I don't know," he said. "You'll have to call all the people who've written these books about me, who claim they know me, the psycho-babblers."
It's tough to make sense of the economy these days.
The latest jobs numbers show hiring is down. Taxes are up, austerity reigns in Washington and consumers are skittish. Yet the stock market is defying gravity, marching ever higher in spite of it all. [...]
American giants are benefiting from productivity gains and renewed growth in China and other overseas markets, allowing them to increase profits even if business at home remains lackluster.
It would be inexplicable if they weren't doing well when reducing the workforce was having no effect on production.
That doesn't mean all charters are automatically good. They're not. But it's indisputable that the good ones -- most prominently, KIPP -- are onto something. The non-profit company, which now has 125 schools, operates on a model that demands much more of students, parents and teachers than the typical school does. School days are longer, sometimes including Saturday classes. Homework burdens are higher, typically two hours a night. Grading is tougher. Expectations are high, as is the quality of teachers and principals, and so are the results.
KIPP's eighth-grade graduates go to college at twice the national rate for low-income students, according to its own tracking. After three years, scores on math tests rise as if students had four years of schooling, according to an independent study.
The question isn't whether such successful models should be replicated, but how best to do it. In some forward-thinking communities, that reality is altering the stale charter debate.
With the release of the globally popular The Third Wave in 1980, futurist Alvin Toffler presciently predicted a massive shift from an industrial to an information-driven economy and the blurring of traditional economic roles, resulting in a new class of participant he labeled the "prosumer" (producer plus consumer).
We've clearly been transitioning to an info-driven economy in which knowledge workers are in high demand. Much of this has been catalyzed by new classes of systems like social media (Facebook, LinkedIn, G+), social-driven search (Google, Bing, Siri), citizen science (SETI-Live, Cell Slider) and big data that increasingly harness consumer-generated data while returning value to users in the form of organized information, entertainment, and occasionally cash (Google is paying select users up to $25 in Amazon gift cards to monitor Internet activity, Viggle is paying people for their TV viewing data). The result? An emerging prosumer class.
While today's prosumer jobs pay little and fall into the informal category, there is reason to believe that they could grow in volume and value--rapidly. As accelerating change enables companies to get better at capturing, storing, transferring, processing and valuing user data, the prosumer role is poised to proliferate and scale.
New interfaces (iWatch, Google Glass, BCIs, Facetime, Kinect, Surface, search engines), sensors (FitBit, smartphones, data recorders in cars, reactive billboard cameras, camera-equipped drones), compression technologies, and faster computer processors are accelerating human-driven input. Social media, search, info-warehousing, banking, research companies and universities are collecting and mining vast amounts of this input. These developments are expanding the market for data that can be more easily applied across industries and converted into money, which in turn is increasing the demand for prosumers that can input, sort and output this data.
But what about the 1.8 billion jobs that people need now? In one sense, they're already here.
Over the next decade, companies appear ready to pay more users more money for real-time and longitudinal life-streaming, driving, brain, health and genetic data; for product reactions, media reactions, geographic scans, species scans, general behavioral data, and so forth. More informational value will be captured on-the-fly as people socialize, consume media, learn, play games, navigate the world, or even sleep. Considering these near-term likelihoods, we can confidently predict that data-driven, prosumer-centric capitalism will steadily augment or even grab market share from the traditional 9-to-5, single-role, industrial economy.
A more extreme version of this scenario is that this transformation of capital flow and economics, something that Toffler explores in detail in Revolutionary Wealth, could become an outright boom. Billions of formally unemployed or underemployed humans could prove necessary for the rapid development of new technologies spanning medicine, entertainment, transportation, farming, warfare, search and artificial intelligence.
According to the survey by Latino Decisions, 85 percent of undocumented immigrants have family members who are U.S. citizens, including 62 percent who have at least one U.S.-born child. About 95 percent of undocumented immigrants have at least one other family member, of any status, living in the U.S. [...]
More than three-quarters of those surveyed said they came to the United States in search of better economic opportunity or to create a better life, the poll says. Just 12 percent said they came to the U.S. to reunite with family members.
The poll also revealed the deep ties many of the estimated 11 million undocumented immigrants have in the U.S. - 68 percent of the respondents said they have been living in the country for more than a decade, 71 percent live in households that own a car, and 15 percent said they are homeowners.
The President's Free-Trade Path to Prosperity : A new accord with the U.S. brings Japan one step closer to joining the game-changing Trans-Pacific Partnership. ( TOM DONILON, 4/15/13, WSJ)
With the worst of the economic crisis behind us, the war in Iraq over and the war in Afghanistan winding down, the U.S. is regaining the freedom of maneuver that allows us to make a set of strategic investments in our future. As the president's second term begins, the U.S. is at the center of two trade initiatives across the Atlantic and Pacific with the potential to encompass 60% of world trade. Just as our security alliances across two oceans have brought stability that extends far beyond our treaty partners, U.S. economic diplomacy today can advance global prosperity by strengthening the international rules and norms that make trade and growth possible.
To start, the U.S. is deepening economic engagement in an Asia-Pacific region where economic rules of the road are still taking shape. Nearly half of all growth outside the U.S. over the next five years is expected to come from Asia, and the choices that nations across the region make now will shape the character of the entire global system for years to come.
The economic linchpin of the multidimensional U.S. rebalancing strategy is the Trans-Pacific Partnership that the U.S. is negotiating with Asia-Pacific economies from Chile and Peru to New Zealand and Singapore. This TPP agreement is built on its members' shared commitment to high standards, comprehensive market access for goods and services exports, disciplines for 21st-century trade issues and respect for a rules-based trade and investment framework.
We always envisioned the TPP as a growing platform for regional economic integration, open to additional countries willing and able to meet its high standards. Under President Obama, the original seven TPP nations have grown to include Vietnam, Malaysia, Canada and Mexico. The TPP was already as ambitious as any trade negotiation in the world. Now we have completed bilateral work with Japan. This is an important step toward Japan joining TPP negotiations. A TPP that includes Japan, the world's third-largest economy, would represent an annual trading relationship of $1.7 trillion and a strong regional constituency for shared economic values.
A key attribute of the TPP is that it can get done in a timely fashion. The U.S. is working hard with the other parties to complete negotiations in 2013. Alongside the U.S.-Korea Free Trade Agreement and U.S. participation in the Asia-Pacific Economic Cooperation forum and the East Asia Summit, it will serve as a powerful statement of American engagement and staying power in East Asia.
Even as the U.S. is reaching to the east, we are also pursuing a Transatlantic Trade and Investment Partnership, or T-TIP, to deepen ties with our largest and longest-standing trading partners in Europe.
The 2012 election only mattered because someone is going to get political credit for these peace dividends, as Bill Clinton did in the 90s.
Businesses are being told to change their attitude to employees in their 50s and 60s or be left behind, with a looming shortage of qualified staff. Older workers should be seen as the new "untapped source of labour" because ministers are limiting the number of immigrants allowed into Britain, according to a government guide.
"Mother and Dad didn't understand me; I didn't understand them," he told Jim Lehrer on "The NewsHour With Jim Lehrer" in 1999. "So consequently it was a strange kind of arrangement." Alone in his room, he would create characters and interview himself.
The family's fortunes collapsed with the Depression. The Winters National Bank failed, and Jonathan's parents divorced. His mother took him to Springfield, where she did factory work but eventually became the host of a women's program on a local radio station. Her son continued talking to himself and developed a repertory of sound effects. He often entertained his high school friends by imitating a race at the Indianapolis Motor Speedway.
A poor student, Mr. Winters enlisted in the Marines before finishing high school and during World War II served as a gunner on the aircraft carrier Bon Homme Richard in the Pacific.
After the war he completed high school and, hoping to become a political cartoonist, studied art at Kenyon College and the Dayton Art Institute. In 1948 he married Eileen Schauder, a Dayton native who was studying art at Ohio State. She died in 2009. His survivors include their two children, Jonathan Winters IV, of Camarillo, Calif., known as Jay, and Lucinda, of Santa Barbara, Calif.; and several grandchildren.
At the urging of his wife, Mr. Winters, whose art career seemed to be going nowhere, entered a talent contest in Dayton with his eye on the grand prize, a wristwatch, which he needed. He won, and he was hired as a morning disc jockey at WING, where he made up for his inability to attract guests by inventing them. "I'd make up people like Dr. Hardbody of the Atomic Energy Commission, or an Englishman whose blimp had crash-landed in Dayton," he told U.S. News and World Report in 1988.
After two years at a Columbus television station, he left for New York in 1953 to break into network radio. Instead he landed bit parts on television and, with surprising ease, found work as a nightclub comic.
A guest spot on Arthur Godfrey's "Talent Scouts" led to frequent appearances with Jack Paar and Steve Allen, both of them staunch supporters willing to give Mr. Winters free rein. Alistair Cooke, after seeing Mr. Winters at the New York nightclub Le Ruban Bleu, booked him as the first comedian to appear on his arts program "Omnibus."
In his stand-up act, Mr. Winters initially relied heavily on sound effects -- a cracking whip, a creaking door, a hovering U.F.O. -- which he used to spice up his re-enactments of horror films, war films and westerns. Gradually he developed a gallery of characters, which expanded when he had his own television shows, beginning with the 15-minute "Jonathan Winters Show," which ran from 1956 to 1957. He was later seen in a series of specials for NBC in the early 1960s; on an hourlong CBS variety series, "The Jonathan Winters Show," from 1967 to 1969; and on "The Wacky World of Jonathan Winters," in syndication, from 1972 to 1974.
Many of Mr. Winters's characters -- among them B. B. Bindlestiff, a small-town tycoon, and Piggy Bladder, football coach for the State Teachers' Animal Husbandry Institute for the Blind -- were based on people he grew up with. Maude Frickert, for example, whom he played wearing a white wig and a Victorian granny dress, was inspired by an elderly aunt who let him drink wine and taught him to play poker when he was 9 years old.
Other characters, like the couturier Lance Loveguard and Princess Leilani-nani, the world's oldest hula dancer, sprang from a secret compartment deep within Mr. Winters's inventive brain.
As channeled by Mr. Winters, Maude Frickert was a wild card. Reminiscing about her late husband, Pop Frickert, she told a stupefied interviewer: "He was a Spanish dancer in a massage parlor. If somebody came in with a crick in their neck he'd do an orthopedic flamenco all over them. He was tall, dark and out of it."
One of Mr. Winters's most popular characters, she appeared in a series of commercials for Hefty garbage bags, which also featured Mr. Winters as a garbage man dressed in a spotless white uniform and referring, in an upper-class British accent, to gar-BAZH. Carson kidnapped Maude Frickert and simply changed the name to Aunt Blabby, one of his stock characters. Mr. Winters said that the blatant theft did not bother him.
Mr. Winters often called himself a satirist, but the term does not really apply. In "Seriously Funny," his history of 1950s and 1960s comedians, Gerald Nachman described him, a bit floridly, as "part circus clown and part social observer, Red Skelton possessed by the spirit of Daumier."
He was hard to define. "I don't do jokes," he once said. "The characters are my jokes." At the same time, unlike many comedians reacting to the Eisenhower era, he found his source material in human behavior rather than politics or current events, but in him the spectacle of human folly provoked glee rather than righteous anger.
In 1961 Variety wrote, "His humor is more universally acceptable than any of the current New Comics, with the possible exception of Bob Newhart, because he covers the mass experiences of the U.S. common man -- the Army, the gas station, the airport."
Casino Royale: 60 years old today : Ian Fleming's James Bond novel Casino Royale was first published on April 13 1953 and there is an intriguing tale behind the original screenplay of the 007 film adaptation. (Jeremy Duns, 13 Apr 2013, The Telegraph)
Much of the creative renaissance of the past decade stems from the decision to return to the spirit of Fleming's novels. Craig's Casino Royale was an adaptation of Fleming's first novel. The book merged the traditions of vintage British thrillers with the more realistic and brutal style of hardboiled American writers such as Dashiell Hammett.
But Craig's debut was not the first attempt to film the novel; it was the third. The first was a one-hour play performed live on American television in October 1954: Barry Nelson starred as crew-cut American agent "Jimmy Bond" out to defeat villain Le Chiffre, played by Peter Lorre, at baccarat to ensure he will be executed by Soviet agency Smersh for squandering their funds. Due to the format, this was a much-simplified version of Fleming's novel, with little of its extravagance or excitement.
The book features a wince-inducing scene in which Le Chiffre, desperate to discover where Bond has hidden the cheque for 40 million francs that he needs to save his life, ties Bond naked to a cane chair with its seat cut out and proceeds to torture him by repeatedly whacking his testicles with a carpet-beater. This could clearly not be shown on television, so instead Bond was placed in a bath, his shoes removed, and viewers watched him howl with pain as, off-screen, Le Chiffre's men attacked his toenails with pliers. [...]
Of all the Bond books, Casino Royale was one of the more problematic to adapt for film. On the one hand, it's one of Fleming's strongest novels (Raymond Chandler and Kingsley Amis both felt it his best): intense, almost feverishly so, and richer in characterisation and atmosphere than many of the others.
But the novel is also short -- practically a novella -- with little physical action in it other than the infamous torture scene. Bond also falls in love with his fellow agent on the mission, Vesper Lynd, and even considers proposing marriage to her before he discovers she has been coerced into working for Smersh and has betrayed him. She kills herself, and the novel ends with Bond reporting to London savagely that "the bitch is dead".
The Decline of Obama : How to lose friends and influence. (Fred Barnes, April 22, 2013, Weekly Standard)
Under Obama, the presidency has been in decline. His use of the budget as a ploy against Republicans is an example of this. The biggest domestic issue is the looming fiscal crisis, but Obama has addressed it only rhetorically. Instead he's used the budget largely as a political tool that cheapened the presidency.
Other presidents have done this, but far less crassly or brazenly. At least they presented their budgets on time, as required by law. Obama was two months late. He erased one of Washington's oldest adages: The president proposes, Congress disposes. By last week, both the Senate and House had already passed budget resolutions.
Obama's tardiness touches on another aspect of presidential decline: the loss of influence. By long tradition, any release of the budget produced by the White House was a major event. True, the impact of the president's budget has waned in recent years. Obama has made it an afterthought.
On Capitol Hill today, Obama has scarcely any clout at all. One reason: He acts as if spending time with members of Congress, even Democrats, is an unpleasant chore. Another reason: Having deferred to Democrats in his first term, he finds it difficult to pull rank on them in his second. And having ignored or alienated Republicans, he isn't likely to achieve much by courting them over dinner, as he has in recent weeks.
Immigration and gun control are the dominant issues in Congress at the moment, and Obama is a major player on neither of them. The "gang of eight"--four Democrats, four Republicans--is the driving force on immigration in the Senate. Obama is no force at all.
Only the Right and the Left ever thought he mattered as anything more than a symbol of racial progress.
Emanuel drew a gasp from the crowd when he noted that the U.S.'s healthcare spending last year -- $2.87 trillion -- makes it equivalent to the fifth-largest economy in the world. "We spend more on healthcare in this country than the 66 million French spend on everything in their society," he said.
The federal government's share of health spending through programs like Medicare, Medicaid, and Veterans Affairs equates to the 16th largest economy in the world -- bigger than Turkey, the Netherlands, and Switzerland.
The only way to reduce that spending is to move away from a fee-for-service model that rewards volume over value and quantity over quality, Emanuel said, noting that the fee-for-service system has driven the spending growth that we have today.
"We need to take the responsibility now as a group in pushing for payment change," Emanuel said. "It's the only way we can facilitate the transformation and re-engineering that the system needs and we need to care about."
He called on physicians to "get our house in order" and "collectively campaign" for these changes.
If physicians take the lead, they can enhance their autonomy and design the delivery system they want, but that system will also assign them financial risk.
"I see no way of getting out of that," he warned. "It's not a dilemma. It's just the inherent nature of what it means to assume autonomy."
Modern medicine is just another consumer good, one that is subsidized to the enormous degree that Mr. Emanuel notes. The notion that those selling the commodity to consumers will drive money out of the system is ludicrous. Only by unleashing the power of consumers can you effect serious change.
ON Wednesday, Margaret Thatcher's funeral, at St Paul's Cathedral in London, will put her in the company of only three other Britons in more than 200 years: Lord Nelson, the Duke of Wellington and Sir Winston Churchill.
To Mrs. Thatcher's admirers, this is entirely appropriate: she made Britain great again; she was the longest-serving prime minister of the 20th century; and she was the first and only woman to hold that post.
But to her detractors, such elaborate obsequies are wholly inappropriate: imperial pomp that is out of place in post-imperial Britain, a state-funded extravaganza that seems a self-indulgent luxury at a time when government spending is being drastically cut. Hence the demands that the funeral should be privatized by obtaining some form of commercial sponsorship, and hence the threats of protests and demonstrations on the day itself.
Like Churchill before her, Mrs. Thatcher was always more uncritically admired in the United States than in Britain. Here she was a world leader, of unrivaled charisma and authority; there she was a partisan politician, who regarded her enemies as knaves or fools -- or Marxists.
...they are linked by the fact that they won significant victories in the Long War. Eventually, Tony Blair should share their honor.
That Mrs. Thatcher also revolutionized the Welfare State hardly need enter the equation.
With mortgage interest rates low and prices for existing homes rising, builders are coming back into the market. They're bidding up land prices, scrambling to find workers and building bigger, more expensive homes than they did just a few years ago.
The tents in Huntington Beach brought to mind last decade's housing boom, but the market has changed dramatically since then. Tens of thousands of Southland homeowners who would like to sell their homes still can't because they're underwater. That has created a dearth of inventory that's fueling bidding wars and camp-outs for a limited supply of dwellings.
The median price of new homes sold in Southern California jumped 19% year-over-year to $401,000 in February, according to real estate firm DataQuick. Builders started construction on 2,097 new single-family homes in Southern California during the fourth quarter of 2012 -- a 56% increase from the same quarter the previous year, according to research firm Metrostudy.
That sharp increase is perhaps not surprising, given that 2011 was the worst year on record for new construction. Still, the rapid turnaround may be one of the most meaningful developments for the broader economy; new home construction is a powerful driver of jobs and economic activity.
The building will create jobs not only in construction but in related industries including lumber, concrete, heating and air conditioning and more. The economic boost should be even stronger when builders complete once-dormant subdivisions and move on to virgin land.
"They are going to begin with the development of new land, and that is typically about a third of the cost or more of the overall house," said Gerd-Ulf Krueger, principal economist at HousingEcon.com. "That is going to have a pretty big impact on the housing economy."
...because construction is no kind of job for a white man and we haven't made it easy enough for women to do yet.
[The president of Fidelity Investments, Ronald] O'Hanley argued in a speech that workers should be required by default to set aside 6% of their earnings for the long term. Legislation passed in 2006 made 3% the "default" rate for 401(k) plans, a level from which workers can opt out.
"The proverbial four-legged stool -- consisting of Social Security, traditional pension plans, defined contribution plans and personal savings -- is wobbly at best, and by and large does not exist for most Americans," O'Hanley said, although he believes the opt-out clause should remain.
...government will simply do the savings for us, via devices like O'Neill accounts. The reality is that it costs less in the long run to fund your welfare up front, when you're young and the money will build, than on the back end.
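To put rough numbers on both claims, the 6%-versus-3% default and the front-loading point, here is a minimal compound-growth sketch; the $50,000 flat salary, 5% annual return, and retirement at 65 are illustrative assumptions, not figures from Fidelity or the article:

```python
# Illustrative compound-growth comparison: default contribution rate vs. start age.
# Flat salary, steady return, and end-of-year contributions are simplifying assumptions.

def future_value(annual_contribution: float, years: int, annual_return: float = 0.05) -> float:
    """Balance at retirement from equal end-of-year contributions."""
    balance = 0.0
    for _ in range(years):
        balance = balance * (1 + annual_return) + annual_contribution
    return balance

salary = 50_000
for rate, start_age in [(0.03, 25), (0.06, 25), (0.06, 45)]:
    years_saving = 65 - start_age
    fv = future_value(salary * rate, years_saving)
    print(f"{rate:.0%} of salary from age {start_age}: ~${fv:,.0f} at 65")
```

On these assumptions, doubling the default rate doubles the ending balance, while starting two decades earlier at the same rate leaves roughly three and a half times as much as the late start -- the front-loading point in miniature.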
It's painful for me to admit that in my adopted hometown of Beit Shemesh, every year I'm a little less free. As recently as 2006, the only public route I had to Bar Ilan University was a privately run, women-at-the-back bus. I used to laugh at the absurdity of my reality. Twice weekly, I sat in the back half of a bus for twenty minutes while the driver simultaneously puffed away and filled up with illegal gas from a pump that seemingly arose out of the ground. I spent the time imagining all of us (modestly) blowing sky high.
I don't know when I realized it, but at some point I understood that this behavior was not something marginal but rather a force that wished to wind its way into every facet of my life with a desire to expand. And, slowly, it has.
In 2011 I whimsically chose then-journalist Yair Lapid as the prime minister I wanted to make a cameo appearance in my satiric novel "King of the Class," which takes the country's deep internal divisions to their logical dystopian conclusion. I remember my first readers' puzzled expressions: Who is Yair Lapid? No one's asking that question anymore. Last year a government sans Haredim (whether you think it's a good or bad thing) was unimaginable. Today, it's obvious that the last coalition negotiations were based on keeping the Haredim out of the government for the first time ever.
"King of the Class" is set in the near future in a post-civil war Israel that is split into two states, the religious fundamentalist state of Shalem and the militant secular state of Israel. When I wrote my novel a Jewish civil war was a fringe idea. Today, the possibility of such a scenario is an all too common refrain in Israeli media. We are getting closer to a place my great grandparents never imagined living in, a place where the real enemy is not without, but within.
Motivating this racism of exclusion are three principal fears. The first is existential fear, clearly expressed by Ron, a Jerusalem high school student, in a recent interview.
"There is a small part within every Arab, even those who say they want to live with us in peace, that can without warning jump on you and stab you with a knife. There is nothing you can do about it: In their roots they are against Jews."
According to this view, Arabs are fundamentally untrustworthy, constituting a constant "security threat."
They are the enemy. This applies to all Israeli Arabs, regardless of citizenship or how they come across as individuals.
Not surprisingly, Ron does not want to see Arabs in the public sphere: "not in the streets, not at the mall, not on the light rail."
But is it right to call this racism? Not according to Nir, another high school student, for whom "racism is when you hate a person without reason."
His hatred for Arabs is well-grounded, he thinks. He simply doesn't feel safe around them. "They brought it upon themselves, with all the terror attacks," Nir explains. "Because they are here, part of my life is ruined."
The fear motivating these attitudes and beliefs is understandable, yet it is nevertheless irrational and racist, as are the reactions themselves. It is irrational because according to the Shin Bet (Israel Security Agency) the involvement of Israeli Arabs in terrorist attacks remains "relatively minor."
To fear all Israeli Arabs because of the subversive or violent actions of a few is as irrational as it would be for a woman to fear all men because of the sexual assaults committed by some men. Such fear is also racist, for it presupposes that "in their roots" all Arabs hate Jews, thereby promoting the stigmatization and penalization of individuals based solely on group membership.
The heart of racism is not gratuitous hatred, but Manichean essentialism. Modern racism is based on the belief that members of a certain group - whether racial, cultural, national, religious, or some other belonging - possess features that warrant hostility and discrimination and, most important, that these features are rigid and unchangeable, resulting in unbridgeable differences between "us" and "them" so that "there is nothing you can do about it," as Ron said, except separate the two groups.
Minimizing contact between Jews and Arabs also responds to the second motivation underlying the racism of exclusion - the fear of miscegenation. As one person put it, "My blood freezes when I see an Arab man talking to a Jewish girl."
This is not simply the ancient Jewish aversion to assimilation. Nor is it simply concern for the girl's well-being, sincere as it may be. There is also fear of the "demographic threat" of a possible Arab majority in Israel. From this perspective, a Jewish woman (and the focus is predominantly on women) who goes off with an Arab man diminishes the Jewish collective, particularly if she bears children of mixed identity.
But there is a further, more subtle fear that drives the racism of exclusion. This fear is a response to a core tenet of Zionism, namely, that appearances can create reality and that political power comes from appearing in the public sphere.
Theodor Herzl's utopian novel Altneuland ("The Old New Land") made public his dream of a Jewish national homeland and in so doing helped transform the dream into reality. Homa U'Migdal, the Jewish settlements of the 1930s, also showed how the barest appearance of a settlement - a tower and stockade - sufficed to create a new geographic and political reality overnight. Those calling for the exclusion of Israeli Arabs from the public sphere recognize the power of appearances.
The mere presence of an Arab woman, a teacher, in a Jewish neighborhood challenges the claim to exclusive Jewish ownership and control of that space. It reminds us that the Jewish homeland is also home to Arabs who have a right to equal consideration and respect as Israeli citizens. For the woman to pay a condolence call on a Jewish colleague while wearing the hijab is to implicitly reject invisibility and impotence and assert her right to be and act as an individual and an Arab in the public sphere.
WE NOMINATED PETER LAWLER'S POSTMODERN CONSERVATIVE:
The Web Marketing Association has opened its 17th annual international WebAward competition for website development and is looking for individuals who would like to nominate their religiously oriented website for Best Faith-Based Website of 2013.
We are a volunteer organization that has worked since 1997 to establish standards of excellence for website development across all industries. It would be very helpful if you could help us spread the word to the web development community developing faith-based websites.
Please consider including this for your audience and let me know if you have any questions.
Renewables could be the world's primary source of energy if only someone could solve the storage problem--how to store lots of electricity cheaply on a wide scale? Batteries are too expensive and don't last long enough. Pumped hydro is cheap but not feasible for most locations. Thermal storage is promising but still too expensive or hard to scale. Compressed air is cheap and scalable but not yet efficient enough (although LightSail, a new company backed by Peter Thiel, Vinod Khosla and Bill Gates, hopes to change that). And what about flywheels? The biggest player, Beacon Power, went bankrupt in 2011.
Flywheels may be getting a second life, however. Silicon Valley inventor Bill Gray has a new flywheel design that would deliver distributed and highly scalable storage for around $1,333 a kilowatt, making it price competitive with pumped hydro and compressed air. With an efficiency of more than 80 percent, it would rival the best storage alternatives, and come with a 10-year guarantee. And it would make a perfect complement to an off-grid house with a solar photovoltaic (PV) system, able to charge fully in five hours--within the charging time of most solar PV systems--and store 15 kilowatt-hours of power, enough to run a modest house from sunset to sunrise.
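Taking the quoted capacity, charge time, and efficiency at face value, a quick arithmetic sketch shows what such a unit implies for a solar-plus-storage household; the 0.8 kW average overnight draw is an assumption for illustration, not part of Gray's design:

```python
# Sanity check on the quoted flywheel figures.
# Capacity, charge time, and efficiency come from the article; the household
# overnight draw is an illustrative assumption.

capacity_kwh = 15.0         # stored energy
charge_hours = 5.0          # full charge within a solar day
round_trip_efficiency = 0.80

charge_power_kw = capacity_kwh / charge_hours            # PV power needed while charging
deliverable_kwh = capacity_kwh * round_trip_efficiency   # usable energy after losses

overnight_draw_kw = 0.8     # assumed average household load from sunset to sunrise
hours_of_backup = deliverable_kwh / overnight_draw_kw

print(f"Charging at ~{charge_power_kw:.1f} kW fills {capacity_kwh:.0f} kWh in {charge_hours:.0f} h")
print(f"At {round_trip_efficiency:.0%} round-trip efficiency, ~{deliverable_kwh:.0f} kWh is usable, "
      f"roughly {hours_of_backup:.0f} h at a {overnight_draw_kw} kW average draw")
```

A roughly 3 kW charge rate sits comfortably within a typical residential PV array, and a dozen usable kilowatt-hours covers a modest house overnight, which squares with the sunset-to-sunrise claim.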
The Glass Arm : Inside the art and science (but mostly still art) of keeping pitchers from getting hurt. (Will Leitch, Mar 17, 2013, New York)
There's something strange about almost every snapshot ever taken of a professional baseball pitcher while he's in his windup or his release: They look grotesque. A pitcher throwing, when you freeze the action mid-movement, does not look dramatically different from a basketball player spraining his ankle or a football player twisting his knee. His arm is almost hideously contorted.
"It is an unnatural motion," says former Mets pitcher and current MLB Network analyst Al Leiter, who missed roughly three years of his career with arm injuries. "If it were natural, we would all be walking around with our hands above our heads. It's not normal to throw a ball above your head."
Ever since Moneyball, baseball has had just about everything figured out. General managers know that on-base percentage is more important than batting average, that college players are more reliable draft targets than high-school players, that the sacrifice bunt is typically a waste of an out. The game has never been more closely studied or better understood. And yet, even now, no one seems to have a clue about how to keep pitchers from getting hurt.
Pitchers' health has always been a vital part of the game, but it's arguably never been more important than it is today. In the post-Bonds-McGwire-Sosa era (if not necessarily the post-PED era), pitching is dominant to a degree it hasn't been in years. In the past three seasons, MLB teams scored an average of roughly 4.3 runs per game. The last time the average was anywhere near as low was 1992, at 4.12. In 2000, the heyday of Bonds & Co., it was 5.14. A team with great pitching is, in essence, a great team. Pitchers themselves have never stood to gain, or lose, as much as they do now. The last time scoring was this low, the average baseball salary had reached $1 million for the first time and the minimum salary was $109,000. Now that average salary is $3.2 million. Stay healthy, and you're crazy-rich. Blow out your elbow, and it's back to hoping your high-school team needs a coach.
And yet, for all the increased importance of pitching, pitchers are getting hurt more often than they used to. In 2011, according to research by FanGraphs.com, pitchers spent a total of 14,926 days on the disabled list. In 1999, that number was 13,129. No one is sure why this is happening, or what to do about it, but what is certain is that teams are trying desperately to divine answers to those questions. Figuring out which pitchers are least likely to get hurt and helping pitchers keep from getting hurt is the game's next big mystery to solve, the next market inefficiency to be exploited. The modern baseball industry is brilliant at projecting what players will do on the field. The next task is solving the riddle of how to keep them on it.
'Crunchiness' Personified : Margaret Thatcher rejected conventional wisdom in favor of hard truths. (Michael Barone, 4/11/13, National Review)
When Thatcher became prime minister in 1979, the consensus was that Britain was in inevitable decline. She hated that idea and proved that it was wrong. The great and the good never forgave her for it.
They also never forgave her for her suspicion of an ever-closer European Union and opposition to the creation of the Euro currency. Continental elites saw European unity as a way to prevent the horrors of another world war. American elites assumed a United States of Europe would be as benign as the United States of America.
Margaret Thatcher disagreed. She believed that the nation-state, with its long heritage of shared values, democratic governance, and economic practices, was the essential unit in politics and economics. A single European currency, she argued, could not work in a continent whose nations had different economies, cultures, and traditions.
Joe Weisenthal pointed out in Business Insider that, in her 1993 and 1995 autobiographies, Thatcher recounted the arguments she pressed on her successor, John Major. She noted that Germany "would be worried about the weakening of anti-inflation policies" and that the poorer countries would seek subsidies "if they were going to lose their ability to compete on the basis of a currency that reflected their economic performance." This has worked out exactly as she expected and warned. Fortunately for Britain, Thatcher's successors, perhaps fearing her disapproval, stopped short of ditching the pound and lurching into the euro as the great and good almost unanimously advised.
"Crunchiness brings wealth," wrote The Economist's Nico Colchester. "Wealth leads to sogginess. Sogginess brings poverty. Poverty creates crunchiness."
By crunchiness he meant "systems in which small changes have big effects, leaving those affected by them in no doubt whether they are up or down." In contrast, "sogginess is comfortable uncertainty."
Margaret Thatcher was crunchiness personified; that is what reporters are referring to when they say she was "divisive."
So much for that weak dollar, huh? The Bitcoin craze may finally be peaking and now another "alternative" currency is plunging too: gold.
Gold, which in its defense is at least a tangible asset, sank 5% Friday and fell below $1500 an ounce. [...]
[T]he latest producer price index figures, released Friday, showed that wholesale prices fell 0.6% in March. That might be one of the big reasons why gold was tanking.
It is tangible, like a tulip bulb, but, unlike a home, it has no intrinsic value.
"I have always classified [myself] as an actor/comedian, a humorist," he says. "I started out as an artist and what I do is verbal paintings. I paint a picture. Hopefully you'll see the characters and what they're doing and what they're saying."
Take the proprietor of the Used Pet Shop, a character from Winters' 1960 comedy album The Wonderful World of Jonathan Winters. In the album, Winters introduces the proprietor in one of his patented rural voices -- thick with the Ohio countryside -- as an apocryphal uncle who offered customers a real deal.
"I can give you that kangaroo over there for 10-and-a-half," Winters has the uncle say. "Come all the way from Australia, as most of them do. I got him as far as Muncie but he fell off a flatcar and broke his tail. You know, most of them set back on their tails like this. But this one, you have to lean against something."
There's a sense of danger in the idea of slightly imperfect bargain animals, and that danger is present in many of Winters' routines. According to him, that's the way it should be.
And then take a look at the video above. It is thousands of Street Views stitched together to create one big animated Street View, or "hyperlapse." Canadian agency Teehan+Lax has created a whole tool allowing you to make your own adventures. Just plug in two points, and it will do all the stitching for you.
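The underlying trick is simple enough to sketch: sample coordinates along a route, request a Street View frame at each one facing the direction of travel, and play the frames back quickly. The snippet below is a deliberately simplified illustration of that idea, not Teehan+Lax's tool; it interpolates in a straight line rather than along real roads, builds Google Street View Static API URLs, and assumes you supply your own API key (the coordinates are placeholders):

```python
# Minimal "hyperlapse" sketch: sample points between two coordinates and build
# Street View Static API frame URLs that face the direction of travel.
# Straight-line interpolation is a simplification; real tools follow the road network.
import math

def bearing(lat1, lng1, lat2, lng2):
    """Initial compass heading in degrees from point 1 toward point 2."""
    d_lng = math.radians(lng2 - lng1)
    lat1, lat2 = math.radians(lat1), math.radians(lat2)
    y = math.sin(d_lng) * math.cos(lat2)
    x = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(d_lng)
    return (math.degrees(math.atan2(y, x)) + 360) % 360

def hyperlapse_urls(start, end, frames=100, api_key="YOUR_KEY"):
    """Return one Street View image URL per interpolated point along the leg."""
    (lat1, lng1), (lat2, lng2) = start, end
    heading = bearing(lat1, lng1, lat2, lng2)
    urls = []
    for i in range(frames):
        t = i / (frames - 1)
        lat = lat1 + t * (lat2 - lat1)
        lng = lng1 + t * (lng2 - lng1)
        urls.append(
            "https://maps.googleapis.com/maps/api/streetview"
            f"?size=640x360&location={lat:.6f},{lng:.6f}"
            f"&heading={heading:.1f}&fov=90&pitch=0&key={api_key}"
        )
    return urls

# Placeholder endpoints; fetch the URLs and flip through the images to get the effect.
for url in hyperlapse_urls((45.5017, -73.5673), (45.5088, -73.5540), frames=5):
    print(url)
```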
[R]ickey famously says he's looking for a man "with guts enough not to fight back." He needs someone who will resist the temptation to retaliate. Robinson agrees to go along with it.
But where did Rickey get that crazy idea and why did Robinson agree? The film doesn't tell us, but the answers to these questions lie in the devout Christian faith of both men.
For starters, Rickey himself was a "Bible-thumping Methodist" who refused to attend games on Sunday. He sincerely believed it was God's will that he integrate baseball and saw it as an opportunity to intervene in the moral history of the nation, as Lincoln had done.
And Rickey chose Robinson because of the young man's faith and moral character. There were numerous other Negro Leagues players to consider, but Rickey knew integrating the racist world of professional sports would take more than athletic ability. The attacks would be ugly, and the press would fuel the fire. If the player chosen were goaded into retaliating, the grand experiment would be set back a decade or more.
Rickey knew he must find someone whose behavior on and off the field would be exemplary, and who believed "turning the other cheek" was not just the practical thing to do but the right thing. In their historic meeting, to underscore the spiritual dimension of the undertaking, Rickey pulled out a book by Giovanni Papini, titled Life of Christ. He opened to the passage about the Sermon on the Mount and read it aloud.
We know that Robinson's passionate sense of justice had gotten him into trouble earlier in life. But the patient mentoring of pastor Karl Downs convinced him that Christ's command to "resist not evil" wasn't a cowardly way out but a profoundly heroic stance.
When he met Rickey, Robinson was prepared for what lay ahead and agreed. But it was a brutally difficult undertaking. Robinson got down on his knees many nights during those first two years, asking God for the strength to continue resisting the temptation to fight back, or to say something he would regret.
An ambitious new plan, called the Integrated Global Action Plan for the Prevention and Control of Pneumonia and Diarrhea, launched this month by the World Health Organization and UNICEF, aims to step up existing interventions and pool global efforts, with the goal of reducing the number of deaths from pneumonia to less than three children per 1,000, and of diarrhea-related deaths to below one in 1,000. This would effectively end the preventable deaths of more than two million children every year.
For any other infectious disease or global health threat, achieving this kind of reduction in incidence rate and mortality would be nothing short of miraculous. Yet, for pneumonia and diarrhea, we have every reason to believe that we can succeed, because we already know what works.
For example, infants who are not exclusively breastfed for the first six months have a ten-fold increase in the risk of death from diarrhea, and are 15 times more likely to die from pneumonia. Similarly, basic sanitation, such as improved hand washing and access to clean water, and better nutrition can also produce significant risk reduction, much of which can be achieved through simple education programs.
Immunization is also highly effective. Vaccinating children against rotavirus, for example, can protect them from a pathogen that is responsible for 37% of all diarrhea deaths in children under five, thus saving 450,000 lives every year. Similarly, vaccines exist to protect against pneumococcal disease, which accounts for a half-million pneumonia-related deaths annually.
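A back-of-the-envelope check shows how the quoted percentages and totals fit together; it simply reverses the arithmetic in the figures above and assumes, for simplicity, full vaccine coverage and protection:

```python
# Rough reconstruction of the lives-saved arithmetic from the figures quoted above.
# Full vaccine coverage and full protection are simplifying assumptions.

rotavirus_share = 0.37               # rotavirus share of under-five diarrhea deaths
rotavirus_deaths_averted = 450_000   # annual deaths attributed to rotavirus

implied_diarrhea_deaths = rotavirus_deaths_averted / rotavirus_share
print(f"Implied annual under-five diarrhea deaths: ~{implied_diarrhea_deaths:,.0f}")

pneumococcal_deaths = 500_000        # annual pneumonia deaths from pneumococcal disease
vaccine_preventable = rotavirus_deaths_averted + pneumococcal_deaths
print(f"Deaths addressable by these two vaccines alone: ~{vaccine_preventable:,} per year")
```

On the article's own figures, close to a million of the more than two million preventable deaths the plan targets are within reach of existing vaccines alone, which is why immunization sits alongside breastfeeding and sanitation in the plan.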
In 1939 he had pleaded for the long-promised assimilation of native Algerians. At that point, nearly 80 percent of the indigenous population wanted to become French citizens. By 1945 the promise had been too long delayed; scarcely anyone believed in it anymore. Moreover, as Camus pointed out, "hundreds of thousands of Arabs have spent the past two years fighting for the liberation of France." It seems incredible in retrospect that a new French government thought it could simply resume the old colonial relationship with only minor modifications, but that is evidently what it assumed. This was folly, Camus protested. The French would "have to conquer Algeria a second time," and "this second conquest will not be as easy as the first."
Arab public opinion had shifted from assimilation to federation and a modified form of independence. Camus strongly supported this position, though he carefully couched his support in a patriotic appeal to French wisdom and grandeur. He knew all too well the intransigence of the French Algerian community; he also recognized--almost uniquely among French intellectuals--that his fellow pieds-noirs, though dismayingly many of them were pigheaded racists, nevertheless had rights, too, and were just as much oppressed as they were oppressors.
The French government continued to dither. In 1948 it allowed elections for two separate assemblies, French and Muslim, but when it looked like the pro-independence parties would dominate the latter, the colonial administration rigged the elections and began arresting the leaders. Predictably, this led to further Arab protest, which led to further French repression. A National Liberation Front (FLN) formed, demanding complete independence. It was, of course, outlawed. In late 1954, the FLN launched a guerrilla offensive, to which the French government responded by escalating its repression. In August 1955, the FLN massacred 123 French and Muslim civilians, and the French Army (along with paramilitary groups of pieds-noirs) went on a rampage, killing thousands of guerrillas and Arab civilians. The Algerian War had begun in earnest. [...]
Moral imagination is not to be expected, perhaps, from politicians or military commanders. But even the intellectuals of Paris and Algiers failed to respond, preferring partisan commitment. Camus was profoundly discouraged, and moreover bore many scars from earlier Parisian polemics. Further ridicule was in store: At a press conference in Stockholm after the Nobel ceremony, Camus made a statement widely misreported as "I believe in justice, but I will defend my mother before justice." Goldhammer and Alice Kaplan--in her introduction to this edition--perform a considerable service in pointing out that Camus said nothing so simplistic. What he said was: "People are now planting bombs in the tramways of Algiers. My mother might be on one of those tramways. If that is justice, then I prefer my mother." He was not sentimentally exalting his mother above justice; he was rejecting the equation of justice with revolutionary terrorism.
But by the time he put together Chroniques algériennes the following year, his bridges to his fellow intellectuals had been burned.
Nothing so became him as the hatred of the intellectuals.
Beware the DSM-5, the soon-to-be-released fifth edition of the "psychiatric bible," the Diagnostic and Statistical Manual. The odds will probably be greater than 50 percent, according to the new manual, that you'll have a mental disorder in your lifetime.
Although fewer than 6 percent of American adults will have a severe mental illness in a given year, according to a 2005 study, many more--more than a quarter each year--will have some diagnosable mental disorder. That's a lot of people. Almost 50 percent of Americans (46.4 percent to be exact) will have a diagnosable mental illness in their lifetimes, based on the previous edition, the DSM-IV. And the new manual will likely make it even "easier" to get a diagnosis.
Clive Davis, director at recruitment specialist Robert Half UK, agrees that benefits around flexible working are the most desirable: "The ability to work flexibly from home is often valued above a greater salary. Even the option to carry leave days over to the next year is appreciated."
The Chartered Institute of Personnel and Development found in its 2012 Reward Management Survey that flexible working was the most desired employee benefit followed by generous annual leave and training and career development.
...is not having to pretend to be working. The strain of trying to seem busy is what makes the workplace miserable.
While it's difficult to get accurate information about North Korea - a police state that rarely admits foreigners - refugees and other sources of information have helped outsiders sketch the country's bankrupt economy. Here's a snapshot of life in North Korea:
- Annual GDP per capita is about $1,800, which ranks 197th in the world, according to the CIA World Factbook. GDP is 28 times higher in the United States and 18 times higher in South Korea.
- About half of North Korea's population of 24 million lives in "extreme poverty," according to the KUNI report. These people subsist on corn and kimchi and "are severely restricted in access to fuel for cooking and heating."
- One-third of children are stunted, due to malnutrition, according to the World Food Program.
- The average life expectancy, 69, has fallen by five years since the early 1980s, according to the blog North Korea Economy Watch. The blog notes that those figures are based on official statistics, so the real numbers could be even lower.
- Inflation may be as high as 100 percent, due to mismanagement of the currency.
- Most workers earn $2 to $3 per month in pay from the government. Some work on the side or sell goods in local markets, earning an extra $10 per month or so.
- Most homes and apartments are heated by open fireplaces burning wood or briquettes. Many lack flush toilets.
- Electric power is sporadic and unreliable, with homes that have electricity often receiving just a few hours per day.
[I] am changing my policy recommendation from neutrality to something that causes me, as a humanitarian and decades-long foe of the Assad dynasty, to pause before writing: Western governments should support the malign dictatorship of Bashar Assad.
Here is my logic for this reluctant suggestion: Evil forces pose less danger to us when they make war on each other. This (1) keeps them focused locally and (2) prevents either one from emerging victorious (and thereby posing a yet-greater danger). Western powers should guide enemies to stalemate by helping whichever side is losing, so as to prolong the conflict.
Thanks to new projects across the country, solar energy accounted for all new utility electricity generation capacity added to the grid for the first time in March, according to the Federal Energy Regulatory Commission's (FERC) Energy Infrastructure Update. All other energy sources combined added no new generation capacity, the report noted.
Since 2008, the amount of solar energy powering U.S. homes, businesses and military bases has grown by more than 600 percent according to the Solar Energy Industries Association. In 2012 alone, the United States brought more new solar capacity online than in the three prior years combined, underscoring projections that solar will be the nation's largest new source of energy over the next four years.
The continuing economic uncertainty makes the survey's findings of strong support for immigration all the more notable, said Mr. McInturff. Some 54% in the survey agreed with the statement that immigration strengthens the country and "adds to our character," a higher reading than the 47% who agreed with that statement in mid-2010 and the 41% in 2005.
"I think it speaks to something pretty potent that we're seeing this kind of change in that kind of economic uncertainty," Mr. McInturff said.
The recent survey found 36% of respondents said immigration weakens the U.S.
Nearly two-thirds of those surveyed said they favor giving citizenship to those who came here illegally and now hold jobs. Support jumped to 76% for a plan that required immigrants to pay fines, back taxes and pass a security check, among other measures, to gain citizenship.
Researchers have already uncovered worrying signs that exposure to traffic - and the vehicle emissions that come with it - can increase a child's risk of developing asthma and autism. Now comes evidence that it may make children more susceptible to certain kinds of cancers.
Researchers used the California Department of Transportation's computer model of traffic-related air pollution to estimate pollution exposure in communities across the state. They also used the California Cancer Registry to identify 3,590 children born between 1998 and 2007 who were diagnosed with some type of cancer. Then they compared the two to look for links between traffic and cancer incidence.
The one thing that's really clear about the current corporate income tax is that there's a huge divergence between theory and practice. In theory, profits are taxed at a 35 percent rate. In practice, many companies pay much less than that. General Electric, famously, managed to pay no income tax whatsoever in 2010. The murky incidence of corporate taxation means GE's lobbyists can push to preserve its tax breaks in good conscience. But firms that do pay high taxes can plausibly argue, in good conscience, that the current system is unfair and that rates drastically need to be cut. Meanwhile, other companies such as Apple manage to pay a high rate while also squirreling away untold billions in untaxed, allegedly offshore accounts.
Looking at any one of these corporate tax avoidance strategies suggests particular solutions: If you're bothered about companies like GE, scrap various tax breaks. If you're worried about companies paying high rates, lower the rate. If you're annoyed about Apple, go after foreign accounts. But looking at them all simultaneously suggests an alternative to reform. Just give up. Though the corporate income tax as presently constructed supports a small army of accountants, tax lawyers, lobbyists, and CNBC talking heads, it doesn't raise very much revenue. The 1-2 percent of GDP it brings in to the federal government is too much to do without but hardly too much to raise through other means.
Rather than trying to mend the tax, we ought to end it and replace it with something else.
"The only answer to the 'always 30 years in the future' argument is that we simply demonstrate it," Slough said. And that's what he and his colleagues intend to do this summer, at their lab inside a converted warehouse in Redmond, Wash.
It's obvious that nuclear fusion works: A prime example of the phenomenon can be seen every day, just 93 million miles away. Like other stars, our sun generates its power by combining lighter elements (like hydrogen) into heavier elements (like helium) under tremendous gravitational pressure. A tiny bit of mass from each nucleus is converted directly into energy, demonstrating the power of the equation E=mc².
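For a rough sense of scale, here is a back-of-the-envelope sketch of what that equation implies. The one-gram fuel figure and the 0.7 percent mass-to-energy fraction are illustrative assumptions, not numbers from the article:

```python
# Back-of-the-envelope E = m * c^2 (illustrative assumptions, not from the article):
# fusing hydrogen into helium converts roughly 0.7 percent of the fuel's mass to energy.
c = 3.0e8                      # speed of light, meters per second
fuel_mass_kg = 1.0e-3          # one gram of hydrogen fuel (assumed)
mass_converted = 0.007 * fuel_mass_kg
energy_joules = mass_converted * c ** 2
tons_of_tnt = energy_joules / 4.184e9      # one ton of TNT is about 4.184e9 joules
print(f"{energy_joules:.1e} J, roughly {tons_of_tnt:.0f} tons of TNT")
# prints: 6.3e+11 J, roughly 151 tons of TNT
```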
Thermonuclear bombs operate on a similar principle. But it's not practical to set off bombs to produce peaceful energy, so how can the fusion reaction be controlled on a workable scale?
Slough and his colleagues are working on a system that shoots ringlets of metal into a specially designed magnetic field. The ringlets collapse around a tiny droplet of deuterium, a hydrogen isotope, compressing it so tightly that it produces a fusion reaction for a few millionths of a second. The reaction should result in a significant energy gain.
"It has gain, that's why we're doing it," Slough said. "It's just that the form the energy takes at the end is hot, magnetized metal plasma. ... The problem in the past was, what would you use it for? Because it kinda blows up."
That's where the magnetic field plays another role: In addition to compressing the metal rings around the deuterium target, the field would channel the spray of plasma out the back of the chamber, at a speed of up to 67,000 mph (30,000 meters per second). If a rocket ship could do that often enough -- say, at least once a minute -- Slough says you could send a human mission to Mars in one to three months, rather than the eight months it took to send NASA's Curiosity rover.
Philips has halved the power draw of its overhead LED tube light, a sign of continuing improvements in LED lighting aimed at displacing incumbent technologies.
The company says it has built a prototype of a tubular overhead LED light that produces 200 lumens of light with a watt of power. Its current products produce light at 100 lumens per watt, about the same as fluorescent tube lights. Even though the price of LEDs will be higher, Philips thinks they can start to displace more of the fluorescent tube lights that are everywhere from office buildings to parking garages, based on energy savings.
The company plans to commercialize the technology in 2015 and transfer it to other products, including consumer light bulbs. In a consumer LED light bulb, that would mean that a 60-watt replacement would consume about 5 watts. "You could easily see how it will work through the entire retrofit line," says Coen Liedenbaum, the innovation area manager in lighting at Philips Lighting.
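The "about 5 watts" figure is just efficacy arithmetic. A minimal sketch, assuming the common rule of thumb that a 60-watt incandescent bulb puts out roughly 800 lumens (an assumption, not a Philips figure):

```python
# Rough sketch of the lumens-per-watt arithmetic (800-lumen figure is an assumed rule of thumb).
target_lumens = 800              # approximate output of a 60 W incandescent bulb (assumed)
current_led_efficacy = 100       # lumens per watt, today's tubes per the article
prototype_efficacy = 200         # lumens per watt, the Philips prototype

print(target_lumens / current_led_efficacy)    # 8.0 watts with today's LEDs
print(target_lumens / prototype_efficacy)      # 4.0 watts -- "about 5" once driver losses are added
```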
The Liberal Civil War Over Social Security Cuts : President Obama wants to adopt a new measure of inflation that would cut Social Security benefits--and debate over the proposal has split his base. (Erika Eichelberger, Apr. 11, 2013, Mother Jones)
Obama's budget would squeeze old age benefits by changing the way inflation is calculated so that increases in future monthly Social Security payments grow more slowly. Right now, benefit increases are tied to the Consumer Price Index, which tracks the prices of a bunch of consumer products. First Republicans, and now Obama, have proposed changing that to something called chained CPI, a different calculation that ends up producing a lower rate of inflation by accounting for consumers switching to cheaper substitutes when a product's price jumps. The president's proposal includes exemptions for the oldest and poorest beneficiaries, but it would cost all other retirees hundreds of dollars in lost benefits every year. The White House estimates the measure will save $230 billion over 10 years.
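A toy example makes the substitution effect concrete. This is not the government's actual formula (the real chained CPI is computed with a more elaborate chained index), and the goods, prices and quantities below are invented purely for illustration:

```python
# Toy illustration of why a "chained" index, which updates the basket as shoppers
# substitute, registers lower inflation than a fixed-basket index.
# All prices and quantities are invented; this is not the BLS methodology.
old_prices = {"beef": 5.00, "chicken": 3.00}
new_prices = {"beef": 6.50, "chicken": 3.10}   # beef jumps 30%, chicken barely moves
old_basket = {"beef": 10, "chicken": 10}       # what shoppers bought before the price jump
new_basket = {"beef": 6, "chicken": 14}        # after substituting toward chicken

def cost(prices, basket):
    return sum(prices[good] * qty for good, qty in basket.items())

fixed_basket = cost(new_prices, old_basket) / cost(old_prices, old_basket) - 1
updated_basket = cost(new_prices, new_basket) / cost(old_prices, new_basket) - 1
print(f"fixed basket: {fixed_basket:.1%}")       # 20.0%
print(f"updated basket: {updated_basket:.1%}")   # 14.4% -- the lower, "chained"-style rate
```

Smaller measured inflation means smaller annual cost-of-living adjustments, which is the whole point of the dispute.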
Thatcher's long ministry of nearly a dozen years is often mistakenly described as ideological in tone. In fact Thatcherism was (and is) essentially pragmatic and empirical. She tackled the unions not by producing, like Heath, a single comprehensive statute but by a series of measures, each dealing with a particular abuse, such as aggressive picketing. At the same time she, and the police, prepared for trouble by a number of ingenious administrative changes allowing the country's different police forces to concentrate large and mobile columns wherever needed. Then she calmly waited, relying on the stupidity of the union leaders to fall into the trap, which they duly did.
She fought and won two pitched battles with the two strongest unions, the miners and the printers. In both cases, victory came at the cost of weeks of fighting and some loss of life. After the hard men had been vanquished, the other unions surrendered, and the new legislation was meekly accepted, no attempt being made to repeal or change it when Labour eventually returned to power. Britain was transformed from the most strike-ridden country in Europe to a place where industrial action is a rarity. The effect on the freedom of managers to run their businesses and introduce innovations was almost miraculous and has continued.
Thatcher reinforced this essential improvement by a revolutionary simplification of the tax system, reducing a score or more "bands" to two and lowering the top rates from 83% (earned income) and 98% (unearned) to the single band of 40%.
She also reduced Britain's huge and loss-making state-owned industries, nearly a third of the economy, to less than one-tenth, by her new policy of privatization--inviting the public to buy from the state industries, such as coal, steel, utilities and transport by bargain share offers. Hence loss-makers, funded from taxes, became themselves profit-making and so massive tax contributors.
This transformation was soon imitated all over the world. More important than all these specific changes, however, was the feeling Thatcher engendered that Britain was again a country where enterprise was welcomed and rewarded, where businesses small and large had the benign blessing of government, and where investors would make money.
As a result Britain was soon absorbing more than 50% of all inward investment in Europe, the British economy rose from the sixth to the fourth largest in the world, and its production per capita, having been half that of Germany's in the 1970s, became, by the early years of the 21st century, one-third higher.
The kind of services that Thatcher rendered Britain in peace were of a magnitude equal to Winston Churchill's in war.
Central banking is not rocket science, but neither is it a trivial pursuit. Excellent books have continued to be written about the art and craft of central banking, from Walter Bagehot's Lombard Street in 1873 to Alan Blinder's Central Banking in Theory and Practice in 1998. Running a central bank is in one way a little bit like flying a plane or sailing a boat: much of the time standard responses and small adjustments will do just fine, but every so often a situation arises in which fundamental understanding, knowledge of history, and good judgment can make the difference between riding out the storm and crashing. There was no such person in charge in 1929, and the result was disaster. There was one in 2008. [...]
A more general description of what banks do is "maturity transformation." They incur short-term debt (deposits) and acquire longer-term, and therefore riskier, assets (such as loans to start-up businesses). This is a socially useful function: it enables savers who want instant access to their money to finance businesses that need to lock up capital for a long enough time to focus on the design, the production, and the marketing of a product. Human greed and ingenuity being what they are, a vast variety of financial institutions has been created to engage in maturity transformation and, analogously, risk transformation. They are not "banks," so they are not regulated and overseen as banks have been, and they are not required to provide as much public information; but neither do they come under the protection of the FDIC or the lender-of-last-resort function of the Federal Reserve. And since they can be much more complex and opaque than banks, it may be hard even for relative insiders to know what may go wrong, or when, or exactly where.
This complexity and opaqueness matter more than you might think, and not just because they enable the well-informed to fleece the less well-informed. Two aspects are important. First, most of these non-bank financial institutions were very highly leveraged: their assets had been acquired almost entirely with borrowed money, and there was only a thin layer of owners' capital supporting this leaning tower of sometimes risky assets. This meant that even a small gain on the large volume of assets would translate into a huge rate of profit on the small amount of equity capital invested; but it also meant that a small loss on those assets (or even a profit smaller than the interest cost of all that borrowed capital) would eat up the owners' equity and leave the creditors facing possible default. Second, financial institutions had all borrowed and loaned to each other in ways that were not public.
So if A looked shaky, then B, C, and D, who might or might not have lent heavily to A, would perhaps not be repaid, in which case E, F, and G, who might be creditors of B, C, and D, were also possibly in trouble. The natural tendency in scary situations is to pull in whatever debts can be pulled in, and hunker down. The tendency of illiquidity to transform itself into insolvency is again at work. That all this happens in a fog of uncertainty only makes it worse. It is like a little old bank run, only on an enormous scale. A densely interconnected, highly leveraged financial system is intrinsically vulnerable to a collapse of this kind. And if it does implode, it is likely to drag the "real" economy with it as financing dries up for wage and salary payments, inventories, and materials.
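The leverage point is simple arithmetic, which a stylized sketch makes plain. The balance-sheet numbers below are invented for illustration:

```python
# Stylized leverage arithmetic: a thin layer of equity under a large pile of assets
# turns small asset-side moves into very large swings in the owners' return.
assets = 100.0                # total assets, bought almost entirely with borrowed money
equity = 3.0                  # the thin layer of owners' capital (illustrative)

for move in (+0.02, -0.02):   # a 2 percent gain or loss on the assets
    change = assets * move
    print(f"{move:+.0%} on assets -> {change / equity:+.0%} on equity")
# +2% on assets -> +67% on equity
# -2% on assets -> -67% on equity (two-thirds of the cushion gone in one small move)
```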
What actually happened in 2008 and its aftermath was bad enough, and it is still going on, but it could have been worse had the Fed and the Treasury not stepped in as lender of last resort not just to banks but also to the whole financial system, and even beyond the financial system. The case of AIG is an example of a complex financial institution so strongly and opaquely interconnected with others that it had to be rescued, however distasteful that might be, for fear of the collateral damage that would be done by its failure. This is not to deny that AIG's creditors, who were not exactly babes in the wood, might have been allowed to take some losses, just as a learning experience.
Perhaps the biggest tactical mistake of the rescue operation was the decision by the Fed and Treasury to let Lehman Brothers go bankrupt: not because the owners and creditors of Lehman deserved better, but because the collapse of Lehman seemed to intensify the panic dramatically. Bernanke says that he had no choice, because Lehman was insolvent. I am not convinced. Bernanke is very good on the significance of this experience, though necessarily brief. The lesson he teaches is that the Fed can no longer focus so near-exclusively on monetary policy. It, and the other regulatory agencies, have to pay more effective attention to that third mandate, the preservation of financial stability.
As one of the country's largest and oldest conservative advocacy groups, the American Conservative Union has long fought to rein in federal spending and limit the size of government.
But behind the scenes, the group has formed a partnership with business lobbyists to tame the activists who have pushed Republican leaders in Congress to adopt some of the most austere spending limits in decades.
In a draft proposal circulated to defense and transportation industry executives in recent weeks, the union is offering to use its grass-roots organization, annual conference and movement clout to lobby against cuts to federal military and infrastructure spending.
The group is also proposing to incorporate favorable votes on military and infrastructure spending into its widely cited Congressional voting scorecard, "the 'gold standard' for elected officials," according to the proposal, a copy of which was obtained by The New York Times.
By legitimating changes that could lead over time to the conversion of Social Security into a means-tested program for the elderly poor only, Barack Obama has proven himself to be a true and worthy successor of his predecessor, George W. Bush.
This is not intended as hyperbole. In any given era, presidents and Congresses of both parties tend to share a consensus. Eisenhower and Nixon, for example, shared much of the New Deal consensus with Roosevelt and Johnson.
So it is in our time. Future historians will note that Bush and Obama alike successfully expanded access to healthcare goods and services for Americans -- but at the price of massive government subsidies to for-profit corporations as an alternative to the cheaper and more efficient expansion of public, nonprofit, low-cost social insurance. And the historians of the future will note that both the Republican president and his Democratic successor directly attacked the structure of Social Security as a universal program that pays promised benefits.
The two greatest expansions of the welfare state since Medicare's enactment in 1965 -- Medicare Part D under Bush in 2003 and the Affordable Care Act ("Obamacare") under Obama in 2010 -- both repudiated the logic of New Deal/Great Society insurance programs like Social Security and Medicare in favor of a radically different and far less efficient strategy of using tax credits for individuals to indirectly subsidize private provider monopolies and oligopolies.
The West's two big mistakes in the Arab world : a review of Arab Society in Revolt: The West's Mediterranean Challenge | By Cesare Merlini and Olivier Roy (Eds) (Jennifer S. Bryson, 8 April 2013, Mercator)
Roy examines some of the West's key interpretive missteps. For one thing, Roy sees the West hindering itself from development of successful policies due to "[a]n entrenched prejudice in Western public opinion . . . that secularization in Muslim-majority societies must precede any process of democratization" (p. 47). Instead, asserts Roy, "the real issue is institutionalization of democracy, not the secularization of public space" (p. 52). In other words, Western powers are missing opportunities to help democracy set roots by distracting themselves with concern and even fear about public religiosity.
At the same time, Roy sees an opportunity for Western self-reflection to help inform policies. For one thing, he observes, the West is and has long been philosophically, politically, and religiously diverse, and the view of religious actors toward the state has been varied and has changed over time. Yet many Westerners act on an assumption of homogeneity, especially religious homogeneity, among Muslims in the Middle East. He suggests that if perhaps Europeans and North Americans were to consider how it would feel to have outsiders view them as a single culture and treat Western Christianity as a homogeneous block then they might begin to understand why Western policies assuming homogeneity among Arabs, especially among Arab Muslims, are misguided.
Roberto Aliboni maintains that the real choice the West faces is between moderate and conservative Islamist movements, and not supporting the former would be a mistake. The only alternative to these two he sees as "weak and confused Western-style liberals" (p. 204).
In a forthcoming paper in the International Journal for the Psychology of Religion, titled "Atheists Become Emotionally Aroused When Daring God to Do Terrible Things," researchers asked subjects to make the horrible statements mentioned above. Some statements were offensive (puppy kicking), some were malevolent (parents drowning), and some dared God to do awful stuff, to the subjects, their friends, or their families. Of the 29 subjects, 16 were self-described atheists and 13 were religious. It's important to note that the study took place in Finland, which has a much higher proportion of atheists and agnostics than the United States has. According to one estimate, most Finns don't believe in God; contrast that with the United States, where less than 10 percent of us are heathens.
In the study, subjects were first asked to rate the unpleasantness of those statements. Not surprisingly, believers said they were more bothered than atheists were by the thought of daring God to burn down their houses or afflict them with cancer. Then subjects were asked to read aloud the statements while hooked up to a skin-conductance meter, which basically measures how much you sweat. The idea is that the more you perspire, the more worked up you are about a particular statement. (Such tests have been around for a long time. Here's more about them if you're curious.)
This is where it gets interesting.
According to the skin-conductance tests, the atheists found asking God to harm them or others to be just as upsetting as religious folks did. The researchers also compared the reactions of the atheists when making statements like "I wish my parents were paralyzed" and "I dare God to paralyze my parents." Atheists were, like believers, more bothered by the latter statement, if you believe the skin-conductance tests, even though both declarations would be, in theory, equally empty if there were no heavenly overseer.
...whose politics didn't survive his own first term, Margaret Thatcher's politics bestride the Anglosphere, Scandinavia, portions of Eastern Europe, and will only spread further in coming decades. Though, to be fair, her politics were actually first implemented in places like Chile, New Zealand and Australia. She was just the more visible proponent of what became the Third Way, New Labour, the New Democrats, compassionate conservatism, etc. Here's a piece that approaches giving her the pride of place she deserves:
Goldilocks politics: The much-heralded "Third Way" in politics often seems to boil down to refusing porridge too hot for the voters without offering porridge that has actually gone cold. But though it lacks ideological rigour, Tony Blair's project has a serious purpose (The Economist, Dec 17th 1998)
Trying to pin down an exact meaning in all this is like wrestling an inflatable man. If you get a grip on one limb, all the hot air rushes to another. But it is worth persevering; it may be a poor ideology, but as a piece of politics the Third Way needs to be understood.
Despite the obfuscatory fog of generalities, one thing is reasonably obvious. For "a very fundamental paradigm shift in politics" the core ideas of the Third Way sound rather familiar. In Mr Clinton's vision of the Third Way, government does not just provide services: it is an "enabler and catalyst", "a partner with the private sector and community groups". The president wants government to be fiscally disciplined and less bureaucratic. It should not try to solve all of people's problems, but to create the conditions in which people solve their own. For his part, Mr Blair says that the Old Left championed indiscriminate and often ineffective public spending, but that the Third Way concentrates on making sure that the spending produces the desired result. He also says (something of a conceptual breakthrough for the Labour Party) that governments should be friendly to private enterprise (as the workers' class enemies are now known).
In short, these new politicians want to make government smaller and cleverer, fiscally sound, and friendly to business. It is hard to fault these commonsensical objectives. And in Britain's case they mark a clear departure from the big, stupid, overspending, business-hostile Labour governments of the 1960s and 1970s. But hang on. Aren't they precisely the objectives that Labour's Conservative foes tried to achieve before Mr Blair turfed them out of office in 1997?
Having demonised those Tory governments while opposing them, Labour is understandably reluctant to admit that it is following the path they marked out. So a big part of the business of the Third Way consists of making up a story about what the Tories stand for which makes their Labour replacements look clearly different.
Mrs. Thatcher, being an heir to the paternalistic Tory tradition of Disraeli and Churchill, harbored no delusions about doing away with the welfare state (the Second Way) altogether. However, as an heir to Adam Smith, David Hume, Michael Oakeshott and the rest of the great British philosophers, she likewise harbored no delusions that a system as top-down as the Second Way could succeed. Her politics derives from the insight that you can incorporate the First Way--capitalist mechanisms--into the welfare state to make it less expensive as government, make it more lucrative for the citizenry, and ultimately enhance liberty in society.
Alongside this reformist politics, her other great achievement was in breaking inflation by pulverizing the trade union movement. Once the ratchet of consistently rising wages had been removed, globalization and technology were able to bite and drive the deflation of the last 30+ years.
Mrs. Thatcher recognized the great error of socialism. As she put it, the trouble with socialism is that you eventually run out of other people's money. She proposed a twofold solution. First, stop spending other people's money. Second, give them the opportunity to earn it. In short, she sought to reintroduce liberal capitalism to the country that had once been at its vanguard -- from the repeal of the Corn Laws to the Industrial Revolution.
To achieve the first objective, she slowly but surely privatized nationalized industries (though, unfortunately, not the BBC), took on the trade unions and won, and reduced the size of the civil service. She achieved the second objective by lifting onerous regulations on Britain's financial sector -- one of her first acts was to lift capital controls -- and implementing sound monetary policy. And she did all this in defiance of the received economic wisdom of the time.
At the polls, she defeated the error of socialism three times in a row (four if you include her Tory successor John Major's 1992 victory). The result was vindication of the best kind, as the Labour Party, under Tony Blair, rejected a return to nationalization and instead recognized the truth that people did best under capitalism.
The trade unions at the time were busy wreaking havoc on industry. The far left had infiltrated Labour constituencies; Labour candidates were as scared of the militants then as primary Republicans of the Tea Party candidates today. Local union chiefs called wildcat strikes, disrupted production. The union movement, with some Labour ministers in support, threatened a closed shop in the press which would have curtailed free speech. I'd spoken out against it as had the then editor of The Guardian, Alastair Hetherington. At another of those endless London dinners where Maggie was the speaker and still not in government, she referred to me as "one of us." I wasn't. I was just expressing a view on an issue. We had many things in common, both from the north, both educated in state schools, both brought up in a grocer's shop, in my case one my mother started, in hers one her father ran. I admired her. I was one of the millions of voters in the 1979 general election which put her into power as the first woman prime minister. The country was in dreadful shape, fearful and anxious during a winter of discontent in which trade union militants blocked cancer patients getting treatment and garbage piled up in the center of London.
She saved Britain from anarchy and immediately restored a sense of purpose. She could be rough. As Prime Minister, she had a limited tolerance for dissent and an infinite regard for personal loyalty. If you were not with her on everything, she regarded you as disloyal, as unreliable, lacking conviction. I suppose it was the reverse mirror of her indomitable courage. How valiant she was when the IRA terrorists blew up her conference hotel; they tried to murder her and almost succeeded. She was often vindicated. She was impatient with excuses for inertia and woolliness -- vividly represented in Meryl Streep's portrayal of her cutting off a Cabinet member in mid speech. I disappointed her by giving space in The Times to critics, especially one of them, Edward Heath, whom she'd ousted as Prime Minister. The imperatives of news meant we published news stories she didn't like: she'd sunk in the polls and recession deepened. Relations became a little chillier. As an editor, I'd never sought to cosy up to political leaders, but I now understand more of what she was up against - the Tory snobs in the counties, the plotters in the party who eventually betrayed her, the "wets" and the "wimps" who would yield on a principle she considered vital.
When she became Prime Minister I was editor of The Times. We backed her a hundred per cent on trade union reforms, on holding the line on pay, especially in the public sector, and on advocating more competition in the banking industry, on free trade, on resisting terrorism in Northern Ireland. I told her I thought she moved too slowly against trade union anarchy, but she bided her time and planned well. She won a famous victory against the coal miners, badly led by a firebrand who took money from Gaddafi, and it was thanks to her stalwart support of Rupert Murdoch, whom she admired as a free-booting entrepreneur, that he was able to win the battle of Wapping which ended the guerilla warfare of the print unions.
[S]he understood that the biggest threat to any viable future for Britain was a unionized public sector that had awarded itself a lifestyle it wasn't willing to earn. So she picked a fight with it, and made sure she won. In the pre-Thatcher era, union leaders were household names, mainly because they were responsible for everything your household lacked. Britain's system of government was summed up in the unlovely phrase "beer and sandwiches at Number Ten" -- which meant union grandees showing up at Downing Street to discuss what it would take to persuade them not to go on strike, and being plied with the aforementioned refreshments by a prime minister reduced to the proprietor of a seedy pub, with the Cabinet as his barmaids.
In 1990, when Mrs. Thatcher was evicted from office by her ingrate party's act of matricide, the difference she'd made was such that in all the political panel discussions on TV that evening no producer thought to invite any union leaders. No one knew their names anymore.
That's the difference between a real Terminator, and a poseur like Schwarzenegger.
But by the time she left office, the principles known as Thatcherism -- the belief that economic freedom and individual liberty are interdependent, that personal responsibility and hard work are the only ways to national prosperity, and that the free-market democracies must stand firm against aggression -- had won many disciples. Even some of her strongest critics accorded her a grudging respect.
At home, Mrs. Thatcher's political successes were decisive. She broke the power of the labor unions and forced the Labour Party to abandon its commitment to nationalized industry, redefine the role of the welfare state and accept the importance of the free market.
Abroad, she won new esteem for a country that had been in decline since its costly victory in World War II. After leaving office, she was honored as Baroness Thatcher of Kesteven. But during her first years in power, even many Tories feared that her election might prove a terrible mistake.
In October 1980, 17 months into her first term, Mrs. Thatcher faced disaster. More businesses were failing and more people were out of work than at any time since the Great Depression. Racial and class tensions smoldered so ominously that even close advisers worried that her push to stanch inflation, sell off nationalized industry and deregulate the economy was devastating the poor, undermining the middle class and courting chaos.
At the Conservative Party conference that month, the moderates grumbled that they were being led by a free-market ideologue oblivious to life on the street and the exigencies of realpolitik. With electoral defeat staring them in the face, cabinet members warned, now was surely a time for compromise.
To Mrs. Thatcher, they could not be more wrong. "I am not a consensus politician," she had often declared. "I am a conviction politician."
In an address to the party, she played on the title of Christopher Fry's popular play "The Lady's Not for Burning" in insisting that she would press forward with her policies. "You turn if you want to," she told the faltering assembly. "The lady's not for turning."
Her tough stance did the trick. A party revolt was thwarted, the Tories hunkered down, and Mrs. Thatcher went on to achieve great victories. She turned the Conservatives, long associated with the status quo, into the party of reform. Her policies revitalized British business, spurred industrial growth and swelled the middle class.
What really propelled Germany's economy to new heights was the package of market-oriented reforms launched by Chancellor Gerhard Schroeder 10 years ago. Schroeder acknowledged that Germany's safety net had become a bit of a hammock. He restructured and reduced unemployment and welfare benefits while giving employers more freedom to hire and fire.
As it happens, Schroeder's plan reflected his admiration for U.S. capitalism, which, in his view, had surpassed Germany's in dynamism, flexibility and innovation. But it triggered resistance within his own center-left Social Democratic Party -- and from ordinary workers, one-eighth of whom were receiving some form of government aid by 2003. The struggle probably cost Schroeder a third term in the 2005 elections.
History's verdict has been kinder -- the Organization for Economic Cooperation and Development has given Schroeder's reforms much of the credit for Germany's "labour market miracle" ...
Inside Britain she was the woman who sparked riots and ignored the advice of colleagues. But outside Britain -- in the United States, in Eastern Europe, even in the Soviet Union -- she made herself into an icon, a symbol of anti-communism and the transatlantic alliance at a time when neither was fashionable. She stood by Ronald Reagan in his battle against the Evil Empire. She used the same language as he did -- free markets, free people -- and entered into a unique and probably unrepeatable public partnership with him. It was useful to them both: If Reagan wanted to pull away from domestic scandals, he could appear with Thatcher on a podium. If Thatcher wanted to enhance her status, she could pay a visit to Reagan at the White House.
But their partnership was also useful to others, as Thatcher herself understood. When she arrived in Poland in the autumn of 1988, dressed in cossack boots, a full-length fur coat and a fur hat, she decided to visit a farmers' market, one of the few examples of "the free market" then available in Warsaw. She swept through the fruit stalls, swarmed by journalists and startled shoppers while the British ambassador scurried behind her, paying for jars of pickles broken in the fray. Her entourage then proceeded to Gdansk, where she met Lech Walesa. By all accounts, the two conducted an awkward and mutually incomprehensible conversation.
Nevertheless she appeared with him in front of cheering crowds at the Gdansk shipyard and declared, "We shall not be found wanting when Poland makes the progress toward freedom and democracy its people clearly seek." And that gesture, that moment, really mattered: It gave the Poles and others the courage to think they really could someday join the rest of Europe. Someone wanted them there.
Oil-eating bacteria that are abundant in the Gulf of Mexico may have prevented the 2010 Deepwater Horizon spill from being more catastrophic, according to new research discussed Monday.
According to some estimates, the spill pumped nearly 5 million barrels of oil into the Gulf of Mexico over the course of nearly three months, but within several weeks of being plugged, many areas of the Gulf were oil free. According to University of Tennessee researcher Terry Hazen, the Gulf has a "greater-than-believed" ability to clean itself up after an oil spill. He presented his research Monday at the American Chemical Society's national meeting in New Orleans.
"The bottom line from this research may be that the Gulf of Mexico is more resilient and better able to recover from oil spills than anyone thought," Hazen said in a statement. "It shows that we may not need the kinds of heroic measures proposed after the Deepwater Horizon spill, like adding nutrients to speed up the growth of bacteria that breakdown oil, or using genetically engineered bacteria. The Gulf has a broad base of natural bacteria, and they respond to the presence of oil by multiplying quite rapidly."
Over the last thirty years, the share of consumer spending on food has fallen from about 13% in 1982 to less than 9% last year, thanks to falling grocery prices that have resulted from advances in technology, more automation, and greater efficiencies in food production and delivery. For example, between 1982 and 2012, the inflation-adjusted prices of many common food items fell (see table below), and many of those declines in price have been significant, by as much as one-quarter or more in the case of chicken legs, pork chops, steak, bananas, lettuce and butter.
Americans like to compete on a level playing field. All the players should have an equal opportunity to win based on their competitive merits, not on some artificial imbalance that gives someone or some group a special advantage.
We think this idea should be applied to energy producers. They all should bear the full costs of the use of the energy they provide. Most of these costs are included in what it takes to produce the energy in the first place, but they vary greatly in the price imposed on society by the pollution they emit and its impact on human health and well-being, the air we breathe and the climate we create. We should identify these costs and see that they are attributed to the form of energy that causes them.
At the same time, we should seek out the many forms of subsidy that run through the entire energy enterprise and eliminate them. In their place we propose a measure that could go a long way toward leveling the playing field: a revenue-neutral tax on carbon, a major pollutant. A carbon tax would encourage producers and consumers to shift toward energy sources that emit less carbon--such as toward gas-fired power plants and away from coal-fired plants--and generate greater demand for electric and flex-fuel cars and lesser demand for conventional gasoline-powered cars.
On manufacturing, a huge gap separates public perceptions and economic realities, as Marc Levinson of the Congressional Research Service has shown in several reports.
For starters, manufacturing's decline is misunderstood. The truth is that output has continued to climb. In 2010, Levinson reports, U.S. manufacturing production of nearly $1.8 trillion was the largest in the world; it was slightly ahead of China's, about two-thirds higher than Japan's and nearly triple Germany's. China may now be No. 1, but the United States remains a manufacturing powerhouse. In 2011, near-record output was 72 percent more than in 1990 and six times greater than in 1950. Recall some American-made products: commercial jets, earth-moving equipment, gas turbines. (Output refers to "value added," the value of the sector's final products minus its purchased inputs.) [...]
[A]utomation improves the workplace. It replaces exhausting, dangerous or boring jobs. In his book "America's Assembly Line," historian David Nye quotes an early worker at a Ford plant on the demeaning regimentation of factory work: "Henry [Ford] has reduced the complexity of life to a definite number of jerks, twists, and turns. ... When the whistle blows [the worker] starts to jerk and when the whistle blows again he stops jerking." Many electronic assembly jobs outsourced to Asia today are similar: "The assembly line ran very fast," complained one worker for the electronics assembler Foxconn, "and after just one morning we all had blisters."
More important, greater factory efficiency raises living standards. Prices are held down; purchasing power expands. This has enabled Americans to spend more on education, health care, travel, recreation -- and much more. Because these activities typically don't require the huge energy inputs of heavy industry, society becomes less energy intensive. This is happening in all advanced nations; since 1973, manufacturing's share of Sweden's employment dropped from 28 percent to 13 percent.
The link between corrupt politicians and steakhouses would appear to be so obvious that corrupt politicians would avoid them altogether, especially since there are apparently as many hidden microphones as shrimp cocktails at a given table. But still they come. Experts on either side of the napkin offered theories.
"They're men," said Ben Benson, the owner of the former steakhouse that bore his name. "Men go to steakhouses. The power lunches, the power dinners -- it's what the steakhouse is all about." Dim lighting, plush booths; there is an unspoken promise of discretion in a steakhouse. As Mr. Benson put it, they are "clubby." One of his restaurant's most memorable decorations was itself a reminder of a famous steakhouse crime: a large picture of the mobster Paul Castellano shot dead in front of the steakhouse Sparks in 1985. "Eat at Ben Benson's," the poster read. "It Won't Kill You."
Over at Sparks on Friday, a manager, Sal Desai, said probably no one would have noticed the federal agent enjoying two separate meetings on the same Valentine's night.
"We try to leave them alone," he said of his customers. "We are so busy. There is no more room to talk."
The former assemblyman who enjoyed the bribe-free steak at Luger, Rory I. Lancman, said choosing a steakhouse as a setting for illegal activity spoke to ego.
"Such an inflated sense of self that they think they're entitled to these ill-gotten gains, they're beyond getting caught," he said. "Maybe there's something in the steakhouse culture that makes them think they're bigger and more macho than they are. Slicker."
He added, "You never read about an envelope being passed to a politician in the front seat of a Prius or while they're having a salad at Hale and Hearty."
State Senator Liz Krueger's district in Manhattan includes many of the city's best-known steakhouses. She was game for some steakhouse psychoanalysis.
"There's this imagery in mass culture that the big powerful guys eat at steak restaurants," she said. "These guys who are under the belief they are big powerful guys, or are in a desperate attempt to prove they are, may choose steakhouses for the scenery they think it provides."
...two friends came in from Grand Rapids for New Years and wanted to eat at Sparks. When they called and asked for a table in the non-shooting section they were hung up on.
AS a Texas-raised journalist, I can tell you two things with confidence about my native state. One, its economy has been humming nicely for years. Two, this appears to greatly offend a certain breed of Northern writer, several of whom have descended on the state in an attempt to rebut stories of a "Texas miracle." Their reports, Erica Grieder writes, have contributed to "a widespread impression that Texas is corrupt, callous, racist, theocratic, stupid, belligerent, and most of all, dangerous."
This is nothing new, as most any Texan will tell you. But Ms. Grieder, a onetime correspondent for The Economist who now works at Texas Monthly, and a Texan herself, has written a smart little book that counters much of this silliness, and explains why the Texas economy is thriving. It's called "Big, Hot, Cheap and Right: What America Can Learn from the Strange Genius of Texas" (PublicAffairs, $26.99). The sad truth, alas, is that it's probably a lot easier to understand the successes of Texas than it would be to duplicate them.
What might be copied, Ms. Grieder indicates, is the so-called Texas model -- that is, a weak state government with few taxes and fewer regulations and services.
Mr. Muhammad and his followers had been killed by the C.I.A., the first time it had deployed a Predator drone in Pakistan to carry out a "targeted killing." The target was not a top operative of Al Qaeda, but a Pakistani ally of the Taliban who led a tribal rebellion and was marked by Pakistan as an enemy of the state. In a secret deal, the C.I.A. had agreed to kill him in exchange for access to airspace it had long sought so it could use drones to hunt down its own enemies.
That back-room bargain, described in detail for the first time in interviews with more than a dozen officials in Pakistan and the United States, is critical to understanding the origins of a covert drone war that began under the Bush administration, was embraced and expanded by President Obama, and is now the subject of fierce debate. The deal, struck a month after a blistering internal report about abuses in the C.I.A.'s network of secret prisons, paved the way for the C.I.A. to change its focus from capturing terrorists to killing them, and helped transform an agency that began as a cold war espionage service into a paramilitary organization.
You are invited to join my on-line college hoops bracket group! To accept this invitation and join the group, click the link below (or cut and paste the link into your browser's address field). You'll be asked to enter the group's password before you can join. The group password is included in this e-mail.
We finally found a football book to pay off the last NFL pool winner and for this one we have, among others, a couple copies of science writer Mary Roach's newest:
On Thursday, the Associated Press revised the usage of the word "Islamist." But instead of banning the word, as it did with "illegal immigrant," it restricted the use of Islamist to certain situations.
For example, here is how the new Stylebook instructs reporters:
Islamist An advocate or supporter of a political movement that favors reordering government and society in accordance with laws prescribed by Islam. Do not use as a synonym for Islamic fighters, militants, extremists or radicals, who may or may not be Islamists.
The decision is an acknowledgement that a wide swath of Muslims -- ranging from mainstream politicians to violent jihadists -- view the Quran as a legitimate political model, and the AP's restriction is an effort to not conflate those two types of Muslims. But the move doesn't fully satisfy demands by the nation's largest Muslim civil liberties group, CAIR, to "drop the term" altogether. As CAIR's Communications Director Ibrahim Hooper argued in January, using the term in any fashion represents something of a double-standard:
There are few, if any, positive references to "Islamist" in news articles. There are also no -- nor should there be -- references to "Christianists," "Judaists" or "Hinduists" for those who would similarly seek governments "in accord with the laws" of their respective faiths.
No journalist would think of referring to the "Judaist government of Israel," the "Christianist leader Rick Santorum" or "Hinduist Indian politician Narendra Modi," while use of "Islamist" has become ubiquitous.
There's nothing wrong with calling a spade a spade:
When in the Course of human events, it becomes necessary for one people to dissolve the political bands which have connected them with another, and to assume among the powers of the earth, the separate and equal station to which the Laws of Nature and of Nature's God entitle them, a decent respect to the opinions of mankind requires that they should declare the causes which impel them to the separation.
We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.--That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed
Relations between the two countries, once strong allies, have been in tatters since May 2010, when Israeli troops raided a flotilla of ships carrying aid from Turkey to Gaza. The raid killed nine Turkish citizens and prompted the Turkish government to expel its Israeli ambassador and recall its own from Israel. The countries' two powerful militaries, once close partners, began to regard each other as hostile forces, and lucrative trade dried up. Even the number of Israeli tourists to Turkey, who once flocked there, dwindled amid fears that the country was no longer safe for travel.
Turkish Prime Minister Recep Tayyip Erdogan and his Israeli counterpart, Benjamin Netanyahu, meanwhile, failed to ease the tensions, increasing them instead with a bitter war of words. As recently as late February, on the eve of Kerry's last Turkey visit, Erdogan made international headlines when he referred to Zionism as "a crime against humanity."
The two leaders "are both stubborn in their own ways," says Yossi Mekelberg, a Middle East scholar at Chatham House in London. "Instead of dealing with things quietly and behind the scenes, these are politicians who like to hear their own voices and make great statements. And that's escalated the situation over the last few years."
Yet Mekelberg says there are pressing issues that seem to have convinced the two sides to drop their guard--namely, the crisis in Syria, and the fact that the continued spat has hurt each country's regional concerns. "It's important for them to work with one another, as opposed to against one another," Mekelberg says.
The final push for reconciliation came from America.
[W]ith revenues rising more rapidly than spending, deficits are evaporating in state capitals. "It's likely most states will end the year with a slight surplus," said Brian Sigritz, director of state fiscal studies at NASBO.
Surpluses are showing up in places you'd expect. North Dakota, currently enjoying an energy and agricultural boom, is projecting a $1.6 billion surplus over its two-year budgeting cycle. Texas, another resource-rich state, foresees an $8.8 billion surplus over its current two-year budget cycle.
But the Rust Belt is also regaining some of its fiscal shine. Ohio is expecting a $1 billion surplus for the current fiscal year. Wisconsin is looking at $484 million in black ink. Other states with surpluses include Iowa ($800 million) and Tennessee ($580 million). West Virginia completed its 2011-12 fiscal year with a surplus of about $88 million.
Some of the coastal states whose finances were hit hardest by collapsing housing markets and persistently high unemployment are also making a comeback. For the past several years, California's massive, recurring deficits have made life miserable for politicians and inspired comparisons to Greece. Thanks to tough spending cuts, higher taxes, and a general recovery, California's finances are on the mend. "California expects to take in $2.4 billion more in revenue than it will spend this fiscal year, which ends June 30," Tami Luhby of CNN Money reported. "After paying off a shortfall from last year and setting aside funds for upcoming obligations, it's on track to end the year with a $36 million surplus." Florida, another state that has had to deal with harsh cuts to rein in deficits, is also now in the black. The current projection is for a surplus of $437 million.
I ask him about his latest book, What Money Can't Buy: The Moral Limits of Markets (Penguin), in which he argues that the US and other countries are turning from market economies into market societies, as Lionel Jospin, the former French prime minister, once put it. Sandel argues that we live in a time of deepening "market faith" in which fewer and fewer exceptions are permitted to the prevailing culture of transaction. The book has infuriated some economists, whom he sees as practitioners of a "spurious science".
He has been at loggerheads with the profession for many years. In 1997, he enraged economists when he attacked the Kyoto protocol on global warming as having removed "moral stigma" from bad activity by turning the right to pollute into a tradeable permit. Economists said he misunderstood why markets work. Sandel retorts that they know the price of everything and the value of nothing. To judge by his sellout lecture tours, he has clearly tapped into a larger disquiet about the commodification of life.
[H]e gives me a quick sketch of "the rise of market reasoning", from the triumphalism of Ronald Reagan and Margaret Thatcher through Bill Clinton and Tony Blair up to the present day. "What Blair and Clinton did - and I'm using them not to blame them but as emblematic of this tendency - was they moderated but also consolidated the assumption that markets are the primary instrument for achieving the good life," he says. "So we never really had a debate."
Still gently toying with his burger, Sandel takes on a note of regret when I mention Obama, who in the philosopher's view has promised so much and delivered so little. "During the healthcare debate in 2009 there was a long angry summer. I was listening to Obama on C-Span and I heard him make the case for healthcare reform by saying we have to 'bend the cost curve in the out years'. [...]
"I think you could say that the weakness of my argument is that I'm arguing against an overarching singular way of thinking about all questions - 'an economic way of looking at life', as Gary Becker [the Chicago economist and Nobel Prize winner] described it," Sandel replies. "I'm arguing against that not by putting my own overarching singular philosophy but by saying that is a mistake and we must value goods case by case. So the answer may be one thing on the environment and the right way of dealing with nature, and a different one with education and on whether we should offer financial incentive to kids to do their homework, for example, and different still if we're arguing against a free market in kidneys and surrogate pregnancy."
Still not entirely convinced, I ask Sandel whether he does anything in his own life to make the world less money-minded. He begins a couple of answers but peters out. I suggest that he makes all his lectures free online. "Yes, that's one thing," he agrees. After our lunch I see that Sandel is listed on Royce Carlton, a speaker's agency, as one of its big names (without apparent irony, a posting by the agency last year said Sandel was available to lecture "at a reduced fee in conjunction with his new book, What Money Can't Buy").
But it is talking about the free stuff that gets him going. Sandel says he was recently approached by a Silicon Valley tech company, which he did not name, that has developed the technology to support interactive global lectures. He recently did a pilot run with simultaneous live audiences in Cambridge, Massachusetts, Rio de Janeiro, New Delhi, Shanghai and Tokyo. Cisco TelePresence charges hundreds of thousands of dollars per session, says Sandel. This new method costs only a couple of thousand. The drop in price could change everything. "We could see them and they could see us. I could call out to a student in Delhi, and ask a student in the fifth row of a theatre in Harvard to reply to someone in São Paulo and someone in Shanghai - and it worked. The technology worked."
It is, of course, the market that worked to make the commodity that is Mr. Sandel's lecture cheaper.
ON April 9, 2003, Baghdad fell to an American-led coalition. The removal of Saddam Hussein and the toppling of a whole succession of other Arab dictators in 2011 were closely connected -- a fact that has been overlooked largely because of the hostility that the Iraq war engendered.
Few of the brave young men and women behind the Arab Spring have been willing to publicly admit the possibility of a link between their revolutions and the end of Mr. Hussein's bloody reign 10 years ago. These activists have for the most part vigorously denied that their own demands for freedom and democracy, which were organic and homegrown, had anything to do with a war they saw as illegitimate and imperialistic. [...]
For all its bungling, the Bush administration's invasion of Iraq exposed a fundamental truth of modern Arab politics. Washington's longstanding support for autocracy and dictatorship in the Middle East, a core principle of American foreign policy for decades, had helped stoke a deep-seated political malaise in the region that produced both Saddam Hussein and Al Qaeda. By 2003, American support for Arab autocrats was no longer politically sustainable.
The system of beliefs Mr. Hussein represented had ossified and lost the ability to inspire anyone long before 2003. And yet he was still there, in power, the great survivor of so many terrible wars and revolutions. Before the American invasion, it was impossible for Iraqis to see beyond him.
There was hardly any war to speak of in 2003. Mr. Hussein's whole terrible edifice just came crashing down under its own weight.
Payroll-tax revenue covers most of the cost of Part A hospital care. The coverage has a $1,184 annual deductible for care, but beneficiaries don't have to pay further cost-sharing on the first 60 days of a hospital stay. Copayment charges increase sharply after that.
The majority of beneficiaries also sign up for Part B coverage of doctor bills, with basic premiums set at $104.90 a month, a $147 annual deductible and 20% cost-sharing. There are no caps on the amount that they pay out of pocket.
A deficit-reduction panel chaired by Republican Alan Simpson and Democrat Erskine Bowles recommended in late 2010 that the deductibles be combined into a single $550 deductible, with seniors asked to contribute 20% toward the cost of hospital and doctor care until they reach a cap.
Under the cap, beneficiaries would pay a much smaller share after the first $5,500 of costs and no more than $7,500 out of pocket in total.
The combined effect of the changes, the panel predicted, could save the U.S. $10 billion in 2015 and $110 billion through 2020.
That would help seniors who get very sick, but the savings would come because the majority of seniors would pay more out of pocket.
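A short sketch shows how the proposed structure would translate into out-of-pocket bills at different levels of care. The $550 deductible, 20% coinsurance, $5,500 threshold and $7,500 cap are the figures described above; the panel's "much smaller share" above the threshold is not specified here, so the 5% used below is purely an assumption for illustration.

# Rough sketch of cost-sharing under the Simpson-Bowles-style proposal
# described above. The 5% "reduced share" above the first $5,500 of costs
# is an assumption for illustration; the proposal as summarized here does
# not give the exact figure.
def out_of_pocket(total_costs,
                  deductible=550,
                  coinsurance=0.20,
                  coinsurance_ceiling=5_500,
                  reduced_share=0.05,   # assumed, not from the article
                  oop_cap=7_500):
    oop = min(total_costs, deductible)

    # 20% coinsurance applies until total covered costs reach the ceiling.
    band = max(min(total_costs, coinsurance_ceiling) - deductible, 0)
    oop += coinsurance * band

    # Above the ceiling, the beneficiary pays a much smaller share...
    oop += reduced_share * max(total_costs - coinsurance_ceiling, 0)

    # ...and never more than the out-of-pocket cap in total.
    return min(oop, oop_cap)

for costs in (1_000, 5_500, 50_000, 300_000):
    print(f"${costs:>7,} in care -> ${out_of_pocket(costs):,.0f} out of pocket")

Run over those four scenarios, the sketch ranges from $640 on a $1,000 year of care up to the $7,500 cap in a catastrophic year, which is the trade-off described above: modest bills rise somewhat, while the worst case is finally bounded.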
THE news that 11 percent of school-age children now receive a diagnosis of attention deficit hyperactivity disorder -- some 6.4 million -- gave me a chill. My son David was one of those who received that diagnosis.
In his case, he was in the first grade. Indeed, there were psychiatrists who prescribed medication for him even before they met him. One psychiatrist said he would not even see him until he was medicated. For a year I refused to fill the prescription at the pharmacy. Finally, I relented. And so David went on R****n, then A*****ll, and other drugs that were said to be helpful in combating the condition.
In another age, David might have been called "rambunctious." His battery was a little too large for his body. And so he would leap over the couch, spring to reach the ceiling and show an exuberance for life that came in brilliant microbursts.
As a 21-year-old college senior, he was found on the floor of his room, dead from a mix of alcohol and drugs. The date was Oct. 18, 2011.
No one made him take the heroin and alcohol, and yet I cannot help but hold myself and others to account. I had unknowingly colluded with a system that devalues talking therapy and rushes to medicate, inadvertently sending a message that self-medication, too, is perfectly acceptable.
The 'Vigilance' Vigilantes : The tolerance enforcers will not tolerate dissent. (Mark Steyn, 4/05/13, National Review)
He who controls the language shapes the debate: In the same week the Associated Press announced that it would no longer describe illegal immigrants as "illegal immigrants," the star columnist of the New York Times fretted that the Supreme Court seemed to have misplaced the style book on another fashionable minority. "I am worried," wrote Maureen Dowd, "about how the justices can properly debate same-sex marriage when some don't even seem to realize that most Americans use the word 'gay' now instead of 'homosexual.'" She quoted her friend Max Mutchnick, creator of Will & Grace:
"Scalia uses the word 'homosexual' the way George Wallace used the word 'Negro.' There's a tone to it. It's humiliating and hurtful. I don't think I'm being overly sensitive, merely vigilant."
For younger readers, George Wallace was a powerful segregationist Democrat. Whoa, don't be overly sensitive. There's no "tone" to my use of the word "Democrat"; I don't mean to be humiliating and hurtful: It's just what, in pre-sensitive times, we used to call a "fact."
...couldn't we call them "men who engage in anal intercourse"? But, of course, then support for them would disappear.
About a year ago, on March 26, 2012, Sandra Steingraber, an environmental writer and activist against natural-gas fracking, wrote a public letter titled "Breaking Up with the Sierra Club." Breakups are never easy, and the letter, published on the website of the nature magazine Orion, was brutal from the start: "I'm through with you," Steingraber began.
The proximate cause of the split was the revelation that between 2007 and 2010 the nation's oldest environmental organization had clandestinely accepted $26 million from individuals or subsidiaries associated with Chesapeake Energy, a major gas firm that has been at the forefront of the fracking boom. "The largest, most venerable environmental organization in the United States secretly aligned with the very company that seeks to occupy our land, turn it inside out, blow it apart, fill it with poison," Steingraber wrote. "It was as if, on the eve of D-day, the anti-Fascist partisans had discovered that Churchill was actually in cahoots with the Axis forces."
Because, really, what's the difference between fracking and Nazism?
The Digital Public Library of America, to be launched on April 18, is a project to make the holdings of America's research libraries, archives, and museums available to all Americans--and eventually to everyone in the world--online and free of charge. How is that possible? In order to answer that question, I would like to describe the first steps and immediate future of the DPLA. But before going into detail, I think it important to stand back and take a broad view of how such an ambitious undertaking fits into the development of what we commonly call an information society.
Speaking broadly, the DPLA represents the confluence of two currents that have shaped American civilization: utopianism and pragmatism. [...]
How do these two tendencies converge in the Digital Public Library of America? For all its futuristic technology, the DPLA harkens back to the eighteenth century. What could be more utopian than a project to make the cultural heritage of humanity available to all humans? What could be more pragmatic than the designing of a system to link up millions of megabytes and deliver them to readers in the form of easily accessible texts?
Above all, the DPLA expresses an Enlightenment faith in the power of communication. Jefferson and Franklin--the champion of the Library of Congress and the printer turned philosopher-statesman--shared a profound belief that the health of the Republic depended on the free flow of ideas. They knew that the diffusion of ideas depended on the printing press. Yet the technology of printing had hardly changed since the time of Gutenberg, and it was not powerful enough to spread the word throughout a society with a low rate of literacy and a high degree of poverty.
Thanks to the Internet and a pervasive if imperfect system of education, we now can realize the dream of Jefferson and Franklin. We have the technological and economic resources to make all the collections of all our libraries accessible to all our fellow citizens--and to everyone everywhere with access to the World Wide Web. That is the mission of the DPLA.
Imagine taking a college exam, and, instead of handing in a blue book and getting a grade from a professor a few weeks later, clicking the "send" button when you are done and receiving a grade back instantly, your essay scored by a software program.
And then, instead of being done with that exam, imagine that the system would immediately let you rewrite the test to try to improve your grade.
EdX, the nonprofit enterprise founded by Harvard and the Massachusetts Institute of Technology to offer courses on the Internet, has just introduced such a system and will make its automated software available free on the Web to any institution that wants to use it. The software uses artificial intelligence to grade student essays and short written answers, freeing professors for other tasks.
The new service will bring the educational consortium into a growing conflict over the role of automation in education. Although automated grading systems for multiple-choice and true-false tests are now widespread, the use of artificial intelligence technology to grade essay answers has not yet received widespread endorsement by educators and has many critics.
Anant Agarwal, an electrical engineer who is president of EdX, predicted that the instant-grading software would be a useful pedagogical tool, enabling students to take tests and write essays over and over and improve the quality of their answers. He said the technology would offer distinct advantages over the traditional classroom system, where students often wait days or weeks for grades.
"There is a huge value in learning with instant feedback," Dr. Agarwal said. "Students are telling us they learn much better with instant feedback."
[I]n recent months he has surprised his many critics in the West by challenging his enemies, sometimes in ways that are shockingly public.
In February, during a session of Parliament that was broadcast nationwide, he showed a secretly taped video of a meeting between one of his allies and Fazel Larijani, the youngest of five influential brothers closely associated with the traditionalists, who Mr. Ahmadinejad said was proposing fraudulent business deals.
At the funeral of Hugo Chávez, the Venezuelan leader, he was photographed embracing the former president's mother, a display that was denounced by the clerics, who forbid physical contact between unmarried men and women who are not closely related. But urban Iranians, many of whom have moved far beyond the social restrictions set by the Islamic republic, viewed his action as a simple gesture of friendship.
Despite his early advocacy of Islam's role in daily affairs, the president is now positioning himself as a champion of citizens' rights. "He more and more resembles a normal person," said Hamed, a 28-year-old driver in Tehran who did not want his last name used. "He doesn't allow them to tell him what to do."
In speeches, he favors the "nation" and the "people" over the "ummah," or community of believers, a term preferred by Iran's clerics, who constantly guard against any revival of pre-Islamic nationalism. He has also said he is ready for talks with the United States, something other Iranian leaders strongly oppose under current circumstances.
Mr. Ahmadinejad regularly brings up the topic of corruption by other officials, and he hints that they have accumulated wealth and power because of their positions. "Some of the relationships, which had been formed as a result of groupings and power-mongering pursuits in the country, have come to an end, and with the help of God will be purged from the revolution and the holy Islamic republic," he asserted recently.
The president has also taken to using the slogan "long live spring" in his speeches, which some have interpreted as an allusion to the Arab Spring uprisings. "This way of thinking and talking about 'Human Awakening' is political mischief and dangerous," one newspaper wrote in an editorial.
Research by accounting software firm MYOB has found that small businesses that allow their staff to telework were more likely to have seen revenue rise in the last year than businesses that don't allow teleworking.
According to the research, small businesses whose employees worked remotely most or all of the time were 24 per cent more likely to have had a revenue rise in the past year. Twenty-one per cent of firms that allow teleworking had a lift in revenue over the last 12 months, compared to 17 per cent of businesses whose staff work only in the office.
Teleworking is defined as work performed away from a business's main office. It offers benefits to businesses and employees. According to research sponsored by the Federal government, employees who work from home have, on average, an extra hour to themselves each day, leading to improved work/life balance.
Employers who offer telework also find it easier to attract staff outside their local area. Teleworkers are also more productive than their office-based counterparts. Research by Macquarie University into the rising trend of teleworking found that most employees work more intensely away from an office.
Mr Duncan Smith's reforms, far from being a decisive break with the past, are merely the last staging post in the welfare state's long transformation from a system of social insurance to one of straight state handouts, universally applied regardless of contribution. Government-sponsored saving has given way to just another form of redistributional indulgence. Both recipients and donors are left with no discernible stake in the system. Mr Duncan Smith can therefore reasonably be regarded as more the heir of Gordon Brown than William Beveridge.
Available data on voting behavior from Muslim-majority democracies, such as Indonesia, show that the links between being religious and actually voting for religious candidates are weak. In short, religiosity is a poor predictor of whom people vote for and why. While similar data from Arab countries are limited, they suggest that Islam has only a small impact on political attitudes.
What's more, the Islamist policy agenda is indistinguishable from other political platforms. Consider the Ennahda movement in Tunisia, which has had the most detailed economic program of all the Islamic parties in the region. Still, it offered few specifics, besides an endorsement of the market economy and a pledge to fight inequality. Egypt's Freedom and Justice Party (FJP) is even worse. Back in June 2011, the chairman of the FJP tried to shrug off specific questions about his party's economic platform with a smile, saying that he "did not know much about the economy."
At the heart of Islamic politics in the Arab world lies the Muslim Brotherhood, a group originally founded in 1928 in Egypt, and involved in politics, proselytizing and provision of social services. Over time, it has become a loose network of Islamic parties throughout the region, and also a widely emulated model of organization that combines political and religious activism with the provision of social services.
What makes the Brotherhood distinctive is its involvement in the social realm. Arab regimes typically allowed groups like the Brotherhood to run hospitals and schools as well as provide assistance to the poor. As a result, in 2006, the Brotherhood was running schools in every governorate in Egypt, as well as twenty-two hospitals around the country. Islamists have also been among the first and most effective at providing relief during large-scale disasters, such as the earthquake in Algiers in 1989. In other locations, Islamists run sports clubs, perform collective weddings or provide Sharia-friendly business finance.
As a result, Islamic political groups have trustworthy brand names--a unique asset in a political environment where most voters regard politicians as crooks (and for good reason). Election promises in transitional countries are not worth very much. But when a political organization can show a seventy-year record of social-service provision, people listen.
...these parties are untainted by the decades of oppression that people suffered.
Then along came quantum mechanics. When physicists observed that behavior at the atomic level was fundamentally indeterminate, the universal validity of classical physics, as well as philosophical determinism, came into question. Physicists recoiled at the idea that their science could no longer claim to predict all things with infinite precision. But that's what quantum mechanics teaches us. We absolutely cannot know exactly how something will turn out before it happens.
Most physicists eventually accepted this idea as an empirical fact of measurement, but assumed that a flaw in quantum mechanics created the uncertainty. Perhaps, with further insight, some "hidden variable" could allow them to predict things with perfect certainty again.
But that never happened.
John Bell, in a famous 1964 paper, forced everyone to reconsider, both scientifically and philosophically, their support for determinism. His famous theorem, Bell's inequality, is an incredibly profound statement. This relatively simple mathematical proof, when applied to experimental results, gives us a choice: We must either give up determinism or give up the existence of an objective reality explained by science and measurable by humans with instruments. (You can read the gory details about the experiments here.)
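The inequality at the heart of that argument is short enough to state. In the CHSH form that most experiments test, any local hidden-variable account -- any deterministic story in which measurement outcomes are fixed in advance -- requires the correlations E(a, b) between pairs of detector settings to satisfy

|E(a,b) - E(a,b') + E(a',b) + E(a',b')| \le 2,

while quantum mechanics permits, and experiments repeatedly observe, values as large as 2\sqrt{2} \approx 2.83.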
So if experiments on quantum phenomena are reliable, then Bell concludes that determinism is false. Most physicists agree.
Essentially, quantum mechanics tells us that there are things which we cannot know about the future, things which are not predetermined but happen with some factor of chance or randomness. Although many things in the world may be predicted, everything is not predetermined, and our actions do not unfold mechanically in a manner predetermined since the very moment of the Big Bang. Free will is preserved.
According to statistics accompanying that cover story, each generation of Latinos is significantly more Protestant than the last. And that has a direct bearing on their views on social-conservative issues: While 52 percent of Latino Catholics "are opposed to abortion in all or most cases," 70 percent of Latino Protestants hold this view. While Latino Catholics favor same-sex marriage by a lopsided 54 to 31 percent margin, Latino Protestants oppose it by an even greater margin, 66 to 25 percent.
The Latino population is expected to grow from 17 percent of Americans today to 29 percent by midcentury. The growing Evangelical element within the Latino community is energetic and committed: While 58 percent of Americans overall, and 66 percent of Latino Catholics, "say religion is very important in their lives," this number is a dizzying 92 percent among Latino Protestants.
The civilian labor participation rate fell again in March to 63.3%. That's 0.5 percentage points lower than a year ago, and it's a stunning 2.4 points lower than June 2009, when the recovery began. The last time the participation rate was so low was in May 1979, before the surge of women into the workplace in the 1980s and 1990s.
And because white men were doing the hiring, those women did not replace white male workers, they were just added to the employment rolls. Those jobs were a function of public policy, not the private economy.
Fifty years later, in a colorblind/genderblind society, the employment participation rate is naturally returning to historical norms.
The more interesting question is whether--given that those fifty years also saw a computer revolution and globalization of the world economy--it can possibly be sustained at such a high level. Given everything that computers, robots, and foreign workers will do for us, how can we possibly need as many workers as we had in the past?
It's easy to wage a war on drugs when the enemy combatants wear tie-dye and long hair, or gold chains, track suits, and beepers, or, in their current incarnation, saggy drawers and designer t-shirts. When the drug pusher dons a white coat, the lab garb provides a cloak of invisibility. Despite ubiquitous evidence of their malfeasance in overmedicated America, dope dispensing doctors remain largely immune from criticism. Indeed, the indecency resides in the suggestion that writing prescriptions can be habit forming, not in the writing of prescriptions that form habits.
The collective cognitive dissonance on drugs doesn't withstand an honest look at the history of the science. In most instances, today's dirty "street" drugs were introduced as yesterday's clean cure-alls by pharmaceutical companies.
...since they're just giving parents and teachers the means they demand to control boys.
President Obama reportedly is unveiling a budget using the chained CPI inflation measure to cheat elderly Americans out of the benefits they were promised. In two previous posts I've explained the perversity of the current debate about Social Security. The tax-favored private components of America's mixed private-public retirement system -- programs like employer pensions, 401Ks and IRAs -- are inefficient, volatile and subject to manipulation by overcompensated, fee-extracting money managers. In contrast, the Social Security program is simple and efficient, and has low overhead costs. And yet the bipartisan establishment, including many "progressive" Democrats as well as Republicans, wants to cut Social Security -- the part that works -- and expand tax-favored private savings, the inefficient, unstable and inequitable part.
While cutting Social Security makes no sense at all in terms of economics or public policy, it makes excellent sense in terms of the selfish class interests of the super-rich. They have extracted about half the gains from economic growth in the U.S. in the last half-century and recycle some of their profits to fund politicians, and lobbyists, as well as mercenary propagandists who pose as neutral think tank experts. Social Security's contribution to the retirement income of the rich is negligible, while the top 20 percent receives around 80 percent of the income from tax-favored private retirement savings accounts like 401Ks. Naturally many of America's oligarchs want the public discussion to be solely about cutting Social Security benefits for the bottom 80 percent, rather than 401Ks for the top 20 percent. To paraphrase Leona Helmsley, Social Security is for the little people. And if we cannot afford all of our present public-plus-private retirement system ... well, as the saying in Tsarist Russia had it, let any shortage be shared among the peasants.
Leave the metrics aside for a moment and just consider all kinds of examples from day to day life. How about photographs? Twenty years ago, I carried a $100 camera that took pictures on film that had to be both bought and developed, and if I wanted to share the images with others, I'd pay for copies. Now all of that activity is essentially free, and no doubt better. I can even edit the photos myself.
What's the value of friendships kept up through Facebook? A good restaurant found on Yelp? Not getting lost because of GPS on a cell phone? Jobs found or connections made on LinkedIn?
No numbers can measure such gains in living standards. Yet the gains are real for a wide swath of the middle class. These are not toys for the rich.
Such examples can go on and on. Sure, some products and services have remained immune to the grand forces of the past two decades--gas, real estate, airfares. But even when it comes to basics like food and furniture, we're getting more for less. As Boudreaux and Perry point out: "According to the Bureau of Economic Analysis, spending by households on many of modern life's basics--food at home, automobiles, clothing and footwear, household furnishings and equipment, and housing and utilities--fell from 53 percent of disposable income in 1950 to 44 percent in 1970 to 32 percent today."
In 1968, Kim Il Sung hijacked the U.S. intelligence ship Pueblo and held its crew hostage. America, tied down in Vietnam, did nothing. In 1976, North Koreans ax-murdered two U.S. officers in the DMZ. In 1983, Pyongyang tried to assassinate South Korea's president in Burma and blew up three members of his cabinet. In 1987, North Koreans blew up a South Korean airliner.
These unpunished atrocities all occurred during the rule of Kim Il Sung.
Under Kim Jong Il, Pyongyang torpedoed a South Korean patrol boat, killing 47, and shelled a South Korean island, killing four. Neither Washington nor Seoul retaliated.
The danger is that Kim Jong Un believes he, too, can get away with murder and he, too, will be appeased with aid and investments.
Yet neither President Obama nor President Park Geun Hye--whose father, President Park Chung Hee, was the target of assassination attempts and whose mother died in one--can be seen as tolerating another North Korean outrage.
To avoid a collision, a diplomatic path will have to be opened for Kim to back away from the confrontation he has provoked. But, in the longer term, America has to ask herself:
What are we doing, 20 years after the end of the Cold War, with 28,000 troops in Korea and thousands on the DMZ facing the North?
Rather, we ought to ask ourselves why the regime remains in North Korea and why we tolerate so much North Korean blood on our hands as a result. It's time we accept the moral obligation we've been shirking for 65 years and liberate the people.
EXCEPT THAT THEY DON'T HAVE NUKES TO THREATEN WITH...:
The Next Korean War : Conflict With North Korea Could Go Nuclear -- But Washington Can Reduce the Risk (Keir A. Lieber and Daryl G. Press, April 1, 2013, Foreign Affairs)
Ironically, the risk of North Korean nuclear war stems not from weakness on the part of the United States and South Korea but from their strength. If war erupted, the North Korean army, short on training and armed with decrepit equipment, would prove no match for the U.S.-South Korean Combined Forces Command. Make no mistake, Seoul would suffer some damage, but a conventional war would be a rout, and CFC forces would quickly cross the border and head north.
The risk of nuclear war with North Korea is far from remote.
At that point, North Korea's inner circle would face a grave decision: how to avoid the terrible fates of such defeated leaders as Saddam Hussein and Muammar al-Qaddafi. Kim, his family, and his cronies could try to escape to China and plead for a comfortable, lifelong sanctuary there -- an increasingly dim prospect given Beijing's growing frustration with Kim's regime. Pyongyang's only other option would be to try to force a cease-fire by playing its only trump card: nuclear escalation.
It's impossible to know how exactly Kim might employ his nuclear arsenal to stop the CFC from marching to Pyongyang. But the effectiveness of his strategy would not depend on what North Korea initially destroyed, such as a South Korean port or a U.S. airbase in Japan. The key to coercion is the hostage that is still alive: half a dozen South Korean or Japanese cities, which Kim could threaten to attack unless the CFC accepted a cease-fire.
...and the regime would be gone by the end of that first exchange.
In effect the new governor, Haruhiko Kuroda, has imported into Japan the whole of the Federal Reserve's post-Lehman balance sheet strategy, and he will implement it in under two years, instead of the five years or more taken by the Fed. The doubling in the Japanese monetary base over a period of 21 months is in itself remarkable. Taken together with the extension of the duration of bonds purchased from less than 3 years to an average of 7 years, the injection becomes of historic proportions.
The new strategy brings, for the first time, a real prospect of breaking the deflationary psyche which has plagued Japan for so long.
Half of the children in India are chronically malnourished. So imagine the potential benefits if there were a simple way to increase the milk production from cows by, say, a quart per week? Or imagine if there were a better way for small farmers to cultivate rice -- the staple food of half the world -- one that required no costly inputs, used less water, and substantially increased yields?
Wouldn't these things be development miracles?
Well, they are not miracles; they are realistic possibilities. The problem is that most farmers in the developing world don't know about them -- or don't know how to implement them successfully.
Consider: an aquatic fern called azolla, which can be readily cultivated and added to animal feed, can boost production of cow's milk by 15 to 20 percent. An approach known as the System of Rice Intensification (SRI), which involves transplanting rice saplings, spacing them in a grid, keeping the soil drier, and carefully weeding plots, can produce remarkable gains. SRI has been called one of the most important agricultural innovations of the past 50 years, yet it is employed by only a fraction of farmers (pdf). And there are countless other opportunities: for instance, orange-fleshed sweet potatoes from a home garden can avert Vitamin A deficiency, which causes hundreds of thousands of children in the developing world to go blind or die each year.
One of the great paradoxes in today's world is that information is so easy to transmit --few places on earth are beyond the reach of cellphones or televisions -- and yet our efforts to get life-saving, livelihood-boosting information to people in a form that sticks, a form that will actually change behavior, are frequently disappointing.
That was a problem that gripped Rikin Gandhi, a young American-born software engineer, while he was working in Bangalore for Microsoft Research India seven years ago. Rikin was interested in how rural telecenters might be used to spread education and information about health and agriculture in remote areas. A colleague suggested he investigate the application of the Digital StudyHall model in rural Karnataka. Gandhi did just that -- and his experience led to the creation of Digital Green, a platform and process for extending knowledge and influencing behavior that has seized the attention of many development experts.
Slow-Cooker Garlicky Shrimp (Miami Herald, Adapted from "The Slow Cooker Revolution, Volume 2: The Easy Prep Edition" by America's Test Kitchen)
3/4 cup extra-virgin olive oil
6 garlic cloves, thinly sliced
1 teaspoon smoked Spanish paprika (pimenton; may substitute sweet paprika)
1 teaspoon kosher salt
1/4 teaspoon freshly ground pepper
1/4 teaspoon crushed red pepper flakes
2 pounds extra-large (26-30 count) raw shrimp, peeled and deveined
1 tablespoon minced flat-leaf parsley, for garnish
Combine the oil, garlic, paprika, salt, black pepper and crushed red pepper flakes in the slow cooker, stirring until blended. Cover and cook on high for 30 minutes.
Stir in the shrimp to coat evenly; cover and cook on high for about 10 minutes. Stir to ensure the shrimp are cooking evenly, re-cover and cook 10 more minutes, until all of the shrimp are just opaque.
When attacked, they can say that the GOP's budget framework adopted a proposal that came from the Medicare reform commission of a Democratic president (Bill Clinton) and was introduced by a Democratic senator (John Breaux of Louisiana). More important, that proposal--premium support--has been successfully tested for almost a decade.
The GOP proposal to reform Medicare puts consumers in charge by relying on competition and choice instead of centralized government planning and price controls. It was the organizing principle of the successful Medicare Part D prescription drug benefit passed in 2003.
Here's how premium support works. Seniors who participate in Medicare Part D choose among drug coverage plans offered by competing private insurers. The federal government helps them pay for this coverage. The amount of support is a weighted average of the premiums charged by plans in the part of the country where they live.
If seniors want a more-expensive plan, they pay the difference. If they pick a less-expensive plan, then more of its cost is covered by government's premium support. But requiring seniors to pay at least part of the premium encourages them to shop for value when comparing plans and to use generic drugs where possible.
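To make those mechanics concrete, here is a stylized calculation; the plan premiums and enrollment shares are invented for illustration.

# Stylized premium-support arithmetic following the description above. The
# plan premiums and enrollment weights are invented for illustration only.
plans = {            # plan name: (monthly premium, share of local enrollment)
    "Plan A": (25.0, 0.50),
    "Plan B": (30.0, 0.30),
    "Plan C": (45.0, 0.20),
}

# Government support is pegged to the enrollment-weighted average premium.
support = sum(premium * weight for premium, weight in plans.values())
print(f"weighted-average benchmark: ${support:.2f}/month")

for name, (premium, _) in plans.items():
    # Pick a pricier plan and you pay the difference; pick a cheaper one and
    # the support covers more (here, all) of its cost.
    senior_pays = max(premium - support, 0.0)
    print(f"{name}: premium ${premium:.2f} -> senior pays ${senior_pays:.2f}")

With these made-up numbers the benchmark works out to $30.50 a month, so the senior choosing the $45 plan pays the $14.50 difference while the cheaper plans are fully covered -- which is the incentive to shop for value described above.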
Medicare Part D has been in operation for eight years, and the results are extraordinary. In 2003, the Congressional Budget Office projected Part D's cost for its first decade would be $552 billion. The actual cost will be around $358 billion, 35% less than forecast and 64% less than the $1 trillion cost that the CBO estimated for the competing Democratic plan, in which the federal government would decide who got what drugs when and at what price.
The average premium for drug coverage is $30 a month, half what the actuaries estimated it would be this year. A 2011 study in the Journal of the American Medical Association found that the prescription benefit helped reduce hospital stays and delay the need for nursing care, saving Medicare $12 billion a year. The Congressional Budget Office also reported last November that seniors "had fewer hospitalizations and used fewer medical services as a result" of participating in Part D.
The Roadrunner supercomputer, once the fastest computer in the world and the pride of Los Alamos, had a rich and full life. Born in 2008, it broke the Petaflop barrier, paved the way for hybrid supercomputers, and this weekend retired at the ancient age of five.
It turns out Moore's Law, which roughly predicts a doubling of computer power every two years, is especially harsh on the computers at the cutting edge. Within a year, a new supercomputer named Jaguar (sadly, not Coyote) had stolen Roadrunner's title as the fastest machine on the planet. Since then, Jaguar was unseated by Tianhe-1A, a Chinese supercomputer, which in turn fell before the might of Japan's K Supercomputer. Turns out the petaflop speed record game is cutthroat.
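The arithmetic behind that obsolescence is unforgiving. If capability doubles roughly every two years, a machine's standing against the frontier decays exponentially:

\frac{P_{\text{frontier}}(t)}{P_{\text{Roadrunner}}} \approx 2^{t/2}, \qquad 2^{5/2} \approx 5.7,

so by age five a one-time record holder can expect the state of the art to be five or six times faster, even before counting shifts in architecture and energy efficiency.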
President Obama plans to return 5 percent of his salary to the Treasury in solidarity with federal workers who are going to be furloughed as part of the automatic budget cuts known as the sequester, an administration official said Wednesday. [...]
The president makes $400,000 a year, so a pay cut of 5 percent for the whole year amounts to $20,000; an administration official said Mr. Obama would pay back that amount, compressing the total over the remaining months of the fiscal year.
To make the self-caricature complete, oughtn't he don a sweater and turn down the thermostat?
Rachel Carson's Environmental Religion : Review of Silent Spring at 50: The False Crises of Rachel Carson. Edited by Roger Meiners, Pierre Desrochers, and Andrew Morriss (Bruce Edward Walker, Religion & Liberty)
In "The Lady Who Started All This," environmentalist William Kaufman presents an admiring portrait of Carson as a scientist who unfortunately took a left-turn from her previous works--based on objective, empirical research--when she endeavored to write Silent Spring shortly after her cancer diagnosis. For this illconceived approach, Kaufman blames Wallace Shawn, the New Yorker editor who prompted Carson to abandon her "disinterested scientist" voice in favor of a more "adversarial" tone. Since the famous editor signed Carson's check, the author readily complied.
Kaufman--an admitted admirer of Carson's eventual conclusions and penchant for prose-poetry--acknowledges the approach as a misstep: "[Shawn's] words demonstrate a serious flaw in logic and why Silent Spring is so different from Carson's earlier books: 'After all, there are some things one doesn't have to be objective and unbiased about--one doesn't condone murder!' This is classic polarization-- if you're not for us, you're against us. Clearly, objectivity and the open mind of scientific inquiry do not condone or condemn."
Kaufman correctly notes that Carson never advocated for a complete ban on chemical insecticides, but upbraids her for employing inflammatory language exemplified in her chapter titles: "Elixirs of Death," "Needless Havoc," "Rivers of Death" and "Indiscriminately From the Skies." He further notes that she resorts to unnecessary demonization of chemical companies and government agents who spray insecticides as well as infantilization of the American public at large when she wrote: "As matters stand now, we are in little better position than the guests of the Borgias."
Perhaps most damning of all, Kaufman points out that Carson's book includes "sentimentalized line drawings of animals where even the bugs are cute. In fact, she wrote to Dorothy Freeman, 'I consider my contributions to scientific fact far less important than my attempts to awaken an emotional response to the world of nature.'" As Kaufman points out, this is where Carson set the stage for environmentalists to embrace Silent Spring as dogma. For her followers, he notes disapprovingly, "her contribution to the environmental movement was not a respect for science, but nourishment of a faith."
No fewer than two in three Americans want the U.S. to put more emphasis on producing domestic energy using solar power (76%), wind (71%), and natural gas (65%). Far fewer want to emphasize the production of oil (46%) and the use of nuclear power (37%). Least favored is coal, with about one in three Americans wanting to prioritize its domestic production.
Europeans are often mystified at the religiosity of Americans. However much atheism may be gaining ground, America is still far more religious than other economically advanced societies, thanks to our earliest settlers. The Puritans "saw themselves as Israelites fleeing a new Egyptian captivity," Shalev writes, "crossing a sea to reach freedom and taking possession of a promised land."
It is not novel to observe that Americans are religious. But the content of American religious belief--the way they have fused biblical beliefs with nationalist myths--is distinctive. Settlers in the young country believed quite literally that Israel's second coming was in the New World. Even deists such as Thomas Jefferson and Benjamin Franklin reimagined the revolution as an "Exodus-like deliverance from slavery," notes Shalev. Just as the first Israelites were to be a light unto the world, so too have Americans been convinced that they must be a "redeemer nation," in the words of Ernest Lee Tuveson. [...]
[T]he widespread belief that America has a mission to bring democracy to the world--a notion introduced at the policy level by Woodrow Wilson and maximized rhetorically by George W. Bush--is in many ways a secular version of the national beliefs popular prior to the Civil War. It is a testament to the power of American religious views that they can be transmuted into secular terms--virtually divorced from God--with little change in actual policy.
It also helps explain why political disputes in America are so bitter. Consider that when Canada's Supreme Court ruled in late 2004 that same-sex marriage was constitutional, there were few protesters outside the courtroom. In the U.S. this week, conversely, thousands warred over the issue, frequently describing their respective positions in apocalyptic terms. Indeed, everyday issues from taxation to gun control to health care are frequently discussed in religious terms, as if the fate of the universe depends upon the specific level of Amtrak funding. But if one recognizes that Americans see their country in religious terms, the level of acrimony is more easily understandable. If nothing else, Shalev's convincing book reaffirms G. K. Chesterton's notion that America is a nation with the soul of a church.
The housing finance company--whose 2008 taxpayer bailout was one of the biggest firestorms of the financial crisis--announced Tuesday the largest net income in the company's history with $17.2 billion hauled in for 2012, and $7.6 billion alone in the fourth quarter of that year.
That meant the company did not request a draw of cash from Treasury for the fourth quarter of 2012, after drawing more than $116 billion from taxpayers in the years following the financial crisis.
CEO Timothy J. Mayopoulos called those results "terrific" and said they represent a turning point for the company, driven in part by a rebound in the nation's housing market and lower delinquency rates on mortgage loans.
What's more, the company said, it expects to remain profitable for the foreseeable future and to return "significant value" to the taxpayers who bailed it out in the wake of the financial crisis.
Marriage is one of those institutions -- along with religion and military service -- that restricts freedom. Marriage is about making a commitment that binds you for decades to come. It narrows your options on how you will spend your time, money and attention.
Whether they understood it or not, the gays and lesbians represented at the court committed themselves to a certain agenda. They committed themselves to an institution that involves surrendering autonomy. They committed themselves to the idea that these self-restrictions should be reinforced by the state. They committed themselves to the idea that lifestyle choices are not just private affairs but work better when they are embedded in law.
And far from being baffled by this attempt to use state power to restrict individual choice, most Americans seem to be applauding it. Once, gay culture was erroneously associated with bathhouses and nightclubs. Now, the gay and lesbian rights movement is associated with marriage and military service. Once the movement was associated with self-sacrifice, it was bound to become popular.
Americans may no longer have a vocabulary to explain why freedom should sometimes be constricted, but they like it when they see people trying to do it. Once Americans acknowledged gay people exist, then, of course, they wanted them enmeshed in webs of obligation.
At the top of the hill near the area's main mosque, groups of rebels mingled, in newfound amity, with Kurdish fighters from the local People's Defense Units, the armed wing of Syria's main Kurdish group, the Democratic Union Party (P.Y.D.). Until Friday, this area had been controlled by Kurdish fighters but was frequently visited by militias and intelligence agents from the regime of Bashar al-Assad. On Friday, though, in an event that may have momentous consequences for the course of the civil war, the Kurds switched sides, and with their help the rebels overran Sheikh Maksoud, which commands strategic high ground north of the city's center. [...]
The Assad regime had apparently hoped that the presence of Muslim's group, like Ocalan's in the 1990s, would dissuade Turkey from escalating its support to the rebels. But last week's reversal in Sheikh Maksoud suggests that Erdogan's recent overtures to Ocalan are already bearing fruit in his struggle with Assad.
Although the Kurdish groups in Syria are not very significant militarily, their cooperation would free the Turkish government's hands by allowing it to increase its support for the rebels in Syria without fear that the Assad regime could stoke the Kurdish insurgency inside Turkey in response.
It remains to be seen whether the Kurds' newfound cooperation with the rebels in Aleppo is part of a larger realignment by the P.Y.D. But if over the weeks ahead government forces are pushed out of their remaining bases in Kurdish areas, like oil-rich Hasakah in the northeast, then the fall of Sheikh Maksoud on Friday will have marked the beginning of a dramatic shift in Syria's civil war.
[I]t's the catastrophe which makes insurance a good deal. You wouldn't get much value from buying "grocery insurance". At best, you'd be paying an extra administrative fee to route your routine expenses through an insurer, rather than paying them directly. At worst, you'll end up with bills skyrocketing as all sorts of perverse incentives appear. After all, if the insurer is paying all your grocery claims, why not load up on filet mignon instead of ground turkey?
But insurers try very hard never to sell insurance for less than the cost of your expected claims. If you expect to buy $10,000 worth of groceries next year, an insurer will not charge you less than that for a "grocery policy". And if we all drive up the costs of grocery insurance by consuming more, the insurer can do one of two things: raise everyone's "insurance premiums" to cover a filet mignon budget, or create a list of "approved groceries" that it will cover, and start hassling anyone who tries to file an excessively expensive claim.
Sound familiar?
This is why you should always have liability insurance, but should think twice about collision damage coverage. It's why high deductibles are a good idea--for small expenses, it's better to self-insure. And it's why "catastrophic" health plans, which only cover the sort of extremely expensive events that most people would have difficulty financing, are a much better deal than the soup-to-nuts plans that most people get through their employers. Those plans are expensive, both because they're paying for a higher percentage of your expenses, and because they drive up utilization--which means that they drive up next year's premiums even more. Imagine what your car insurance would cost if it covered gasoline, routine maintenance, and those little air freshener trees you hang from the rearview mirror. Then stop asking why health insurance costs so much.
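A back-of-the-envelope calculation, with every number invented for illustration, makes the same point: prepaying predictable expenses through an insurer can only add cost, while coverage against a rare, ruinous bill is comparatively cheap.

# Back-of-the-envelope comparison of "grocery insurance" vs. self-insuring,
# using invented numbers: the insurer must charge at least expected claims
# plus an administrative load, so routing routine expenses through it only
# adds cost -- before counting the extra spending the coverage encourages.
expected_groceries = 10_000      # what you'd spend on your own
admin_load = 0.15                # insurer overhead and profit margin (assumed)
utilization_bump = 0.10          # filet-mignon effect once someone else pays

premium = expected_groceries * (1 + utilization_bump) * (1 + admin_load)
print(f"self-insure: ${expected_groceries:,.0f}")
print(f"'grocery insurance' premium: ${premium:,.0f}")

# Catastrophic coverage is different: a small chance of an unaffordable bill.
p_catastrophe, catastrophe_cost = 0.01, 500_000
catastrophic_premium = p_catastrophe * catastrophe_cost * (1 + admin_load)
print(f"catastrophic premium: ${catastrophic_premium:,.0f} "
      f"to avoid a {p_catastrophe:.0%} chance of a ${catastrophe_cost:,} bill")

With these assumed figures, the "grocery policy" costs $12,650 to cover $10,000 of predictable spending, while the catastrophic policy costs $5,750 to remove a small chance of a $500,000 bill that almost no one could finance out of pocket.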
The newspaper renaissance is part of the reform efforts of President Thein Sein, who, after serving as prime minister in the previous military regime, took office in March 2011 as head of an elected civilian government. Political and economic liberalization were at the top of his agenda, in an effort to boost national development.
The press has been a major beneficiary. The government lifted censorship in August last year, allowing reporters to print material that would have been unthinkable under military rule.
MOOCs of Hazard : Will online education dampen the college experience? Yes. Will it be worth it? Well... (ANDREW DELBANCO, New Republic)
What's driving all this risk-taking and excitement? Many people are convinced that the MOOCs can rein in the rising costs of colleges and universities. For decades, the price of tuition has outstripped the pace of inflation. Over the past ten years, the average sticker price at private colleges has increased by almost 30 percent (though net tuition has risen less because financial aid has grown even faster). At state universities, the problem has been exacerbated by public disinvestment. For example, less than 6 percent of the annual budget of the University of Virginia is covered by state funds. Last fall, I heard the chief financial officer of an urban public university put the matter succinctly: The difficulty, he said, is not so much the cost of college, but the shift of the financial burden from the state to the student.
There are many reasons why college costs continue to soar: the expense of outfitting high-tech science labs, the premium placed on research that lures faculty out of the classroom (and, in turn, requires hiring more faculty to teach classes), the proliferation of staff for everything from handling government regulation to counseling increasingly stressed students. At some institutions, there are also less defensible reasons, such as wasteful duplication, lavish amenities, and excessive pay and perks for top administrators and faculty.
But the most persuasive account of the relentless rise in cost was made nearly 50 years ago by the economist William Baumol and his student William Bowen, who later became president of Princeton. A few months ago, Bowen delivered two lectures in which he revisited his theory of the "cost disease." "In labor-intensive industries," he explained, "such as the performing arts and education, there is less opportunity than in other sectors to increase productivity by, for example, substituting capital for labor." Technological advances have allowed the auto industry, for instance, to produce more cars while using fewer workers. Professors, meanwhile, still do things more or less as they have for centuries: talking to, questioning, and evaluating students (ideally in relatively small groups). As the Ohio University economist Richard Vedder likes to joke, "With the possible exception of prostitution . . . teaching is the only profession that has had no productivity advance in the 2,400 years since Socrates."
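Bowen's point can be put in a single stylized expression. Suppose productivity in the "progressive" sectors grows at rate g and wages everywhere must keep pace, while a professor still teaches roughly the same q students per year. Then the cost of a year of teaching is

c(t) = \frac{w(t)}{q} = \frac{w_0\,e^{gt}}{q},

which rises at the economy-wide wage growth rate even though nothing about the classroom has changed; relative to the sectors whose productivity does grow at g, the price of teaching drifts upward without bound.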
Vedder's quip is true--but it unwittingly undercuts its own point: Most people, I suspect, would agree that there are some activities--teaching and prostitution among them--in which improved productivity and economies of scale are not desirable, at least not from the point of view of the consumer.
Find me someone who is about to purchase either who wouldn't prefer that a college degree or a roll in the hay was less expensive.
Jailed Fatah leader Marwan Barghouti would defeat Palestinian Authority President Mahmoud Abbas and Hamas Prime Minister Ismail Haniyeh in presidential elections, a public opinion poll published on Monday showed.
Conducted by the Palestinian Center for Policy and Survey Research, the poll covered 1,270 Palestinians and has a margin of error of 3%.
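That margin of error is roughly what a sample of this size implies. For a proportion near 50 percent at the conventional 95 percent confidence level,

\text{MOE} \approx 1.96 \sqrt{\frac{0.5 \times 0.5}{1270}} \approx 2.7\%,

which the pollsters have rounded up to 3%.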
My theory is that happy marriages, from the Darwins on down, are made up of a steady, unchanging formula of lust, laughter and loyalty.
The Darwins had lust, certainly - 10 children in 17 years suggests as much anyway - and they had laughter. Emma loved to tease Charles about his passion, already evident in youth, for obsessive theorising.
"After our marriage," she wrote to him early on, "you will be forming theories about me, and if I am cross or out of temper you will only consider: 'What does that prove?' which will be a very philosophical way of considering it."
And loyalty? Well, despite Emma's Christian faith, she stood by him through all the evolutionary wars, and did for him the one thing only a loyal spouse can do - pretend he wasn't in when German journalists came calling.
So, marriages are made of lust, laughter and loyalty - but the three have to be kept in constant passage, transitively, back and forth, so that as one subsides for a time, the others rise.
...that what successful marriages have in common is that the husband acquiesces in arguments with the wife.
"Those are astronomical numbers. I'm floored," said Dr. William Graf, a pediatric neurologist in New Haven and a professor at the Yale School of Medicine. He added, "Mild symptoms are being diagnosed so readily, which goes well beyond the disorder and beyond the zone of ambiguity to pure enhancement of children who are otherwise healthy." [...]
Experts cited several factors in the rising rates. Some doctors are hastily viewing any complaints of inattention as full-blown A.D.H.D., they said, while pharmaceutical advertising emphasizes how medication can substantially improve a child's life. Moreover, they said, some parents are pressuring doctors to help with their children's troublesome behavior and slipping grades.
"There's a tremendous push where if the kid's behavior is thought to be quote-unquote abnormal -- if they're not sitting quietly at their desk -- that's pathological, instead of just childhood," said Dr. Jerome Groopman, a professor of medicine at Harvard Medical School and the author of "How Doctors Think." [...]
Because the pills can vastly improve focus and drive among those with perhaps only traces of the disorder, an A.D.H.D. diagnosis has become a popular shortcut to better grades, some experts said, with many students unaware of or disregarding the medication's health risks.
"There's no way that one in five high-school boys has A.D.H.D.," said James Swanson, a professor of psychiatry at Florida International University and one of the primary A.D.H.D. researchers in the last 20 years. "If we start treating children who do not have the disorder with stimulants, a certain percentage are going to have problems that are predictable -- some of them are going to end up with abuse and dependence. And with all those pills around, how much of that actually goes to friends? Some studies have said it's about 30 percent."
An A.D.H.D. diagnosis often results in a family's paying for a child's repeated visits to doctors for assessments or prescription renewals. Taxpayers assume this cost for children covered by Medicaid, who, according to the C.D.C. data, have among the highest rates of A.D.H.D. diagnoses: 14 percent for school-age children, about one-third higher than the rest of the population.
Tesla Motors is expecting to report its first-ever quarterly profit after sales of its all-electric Model S exceeded expectations.
The announcement about the just-ended first quarter pushed Tesla Motors (TSLA) shares up nearly 20% midmorning.
"There have been many car startups over the past several decades, but profitability is what makes a company real," co-founder and CEO Elon Musk said in a late Sunday statement. "Tesla is here to stay and keep fighting for the electric car revolution."
In this fragile global environment, has America become a beacon of hope? The US is experiencing several positive economic trends: housing is recovering; shale gas and oil will reduce energy costs and boost competitiveness; job creation is improving; rising labor costs in Asia and the advent of robotics and automation are underpinning a manufacturing resurgence; and aggressive quantitative easing is helping both the real economy and financial markets. [...]
In sum, among advanced economies, the US is in the best relative shape, followed by Japan, where Abenomics is boosting confidence. The eurozone and the UK remain mired in recessions made worse by tight monetary and fiscal policies. Among emerging economies, China could face a hard landing by late 2014 if critical structural reforms are postponed, and the other BRICs need to turn away from state capitalism. While other emerging markets in Asia and Latin America are showing more dynamism than the BRICs, their strength will not be enough to turn the global tide.
An unusual controversy has erupted at Emory University over the choice of famed neurosurgeon Ben Carson to deliver this year's commencement address because he does not believe in evolution.
Nearly 500 professors, students and alumni signed a letter expressing concern that Carson, a Seventh-day Adventist, believes in a creationist theory that holds that all life on Earth was created by God about 6,000 years ago. It rejects Darwin's theory of evolution, which is the central principle that animates modern biology, uniting all biological fields under one theoretical tent, and which virtually all modern scientists agree is true.
Darwin's theory of evolution offers a sweeping explanation of the history of life, from the earliest microscopic organisms billions of years ago to all the plants and animals around us today. Much of the evidence that might have established the theory on an unshakable empirical foundation, however, remains lost in the distant past. For instance, Darwin hoped we would discover transitional precursors to the animal forms that appear abruptly in the Cambrian strata. Since then we have found many ancient fossils - even exquisitely preserved soft-bodied creatures - but none are credible ancestors to the Cambrian animals.
Despite this and other difficulties, the modern form of Darwin's theory has been raised to its present high status because it's said to be the cornerstone of modern experimental biology. But is that correct? "While the great majority of biologists would probably agree with Theodosius Dobzhansky's dictum that 'nothing in biology makes sense except in the light of evolution,' most can conduct their work quite happily without particular reference to evolutionary ideas," A.S. Wilkins, editor of the journal BioEssays, wrote in 2000. "Evolution would appear to be the indispensable unifying idea and, at the same time, a highly superfluous one." [...]
Darwinian evolution - whatever its other virtues - does not provide a fruitful heuristic in experimental biology. This becomes especially clear when we compare it with a heuristic framework such as the atomic model, which opens up structural chemistry and leads to advances in the synthesis of a multitude of new molecules of practical benefit. None of this demonstrates that Darwinism is false. It does, however, mean that the claim that it is the cornerstone of modern experimental biology will be met with quiet skepticism from a growing number of scientists in fields where theories actually do serve as cornerstones for tangible breakthroughs.